Should social media platforms do more to protect elections?
Join us and take our poll
What is it?
As the midterms loom, an increase in online misinformation has fueled calls to regulate big tech's use of algorithms. Experts warn that social media giants such as Facebook, Twitter, Instagram, and TikTok are not prepared to handle the expected onslaught of false information.
According to internally leaked documents known as the Facebook Papers, Facebook's automated systems already struggle to catch posts that break the platform's rules in local contexts, as opposed to posts going viral nationwide. Analysts fear that a lack of allocated resources and attention, combined with the timing of the midterms, could further exacerbate the problem.
Sahar Massachi, a former Facebook employee and executive director of the think tank the Integrity Institute, notes that while the resources devoted to election integrity in the United States may seem sparse, they are significantly more robust than those devoted to elections elsewhere.
When enhancing election safeguards, social media platforms must consider both how to combat misinformation and how to handle candidates who question or subvert the legitimacy of democratic processes.
Additionally, to deter extremists from organizing on their platforms, companies must fairly and consistently punish users who violate the rules or spread rhetoric similar to that of users already banned.
Why is it important?
One hundred candidates in the 2022 midterm elections have campaigned on the false premise that the 2020 presidential election was fraudulent and that former President Donald Trump was robbed of a second term in office. The misinformation spread by these candidates also casts doubt on the legitimacy of the November midterm elections.
Concern is growing among advocates and experts that content seeking to delegitimize primaries will spread across platforms if left unchecked.
What can you do?
When reading the news, check your sources. Here is a list of reputable fact-checking resources.