By Lexi Meola
Elm Staff Writer
Facebook plans on blocking political ads starting in October. This move comes in the wake of mass controversy surrounding the site’s alleged spread of political misinformation after the 2016 election, when it was reported that Russian bots were spouting lies through ads and Facebook groups.
Facebook founder and CEO Mark Zuckerberg has been called before Congress numerous times to answer for the controversy.
Zuckerberg stumbled through many of his responses under congressional questioning.
When Congresswoman Alexandria Ocasio-Cortez pressed Zuckerberg about Facebook’s lack of fact-checking, he could not give a direct answer. That should not be comforting to anyone.
“So, you won’t take down lies or you will take down lies?” Ocasio-Cortez said. “It’s a pretty simple yes or no.”
Zuckerberg replied, “In most cases, in a democracy, I believe that people should be able to see for themselves what politicians that they may or may not vote for are saying and judge their character for themselves.”
According to Microsoft, three countries are actively trying to interfere with the 2020 election. In a recent statement, the company said it was “clear that foreign activity groups have stepped up their efforts” in targeting both the Trump and Biden campaigns. The activity traces to groups linked to Russia, China, and Iran, and possibly other countries. Russia and China have both denied the accusations, but Microsoft says the current interference follows a pattern similar to what it observed in the 2016 election.
QAnon and other conspiracy theory groups build their followings on sites like Facebook and Twitter. Twitter has begun fact-checking users’ posts, even flagging posts from President Trump that contained false information about COVID-19.
One of the biggest problems with Facebook is that it has almost no safeguards when it comes to fact-checking. Misinformation spread by Trump, QAnon, and other groups has cost people their lives during the COVID-19 pandemic.
In mid-March, Trump began to tout the drug hydroxychloroquine as a “cure” for COVID-19. An Arizona man died after taking a form of the drug and his wife was left in critical condition. The wife told NBC News she had watched a briefing of Trump talking about the drug.
“We saw his press conference. It was on a lot, actually,” she said. “Trump kept saying it was basically pretty much a cure.”
This kind of misinformation leads to serious harm. Over 193,000 Americans have died from COVID-19, and the more misinformation spreads online, the higher the chances that more people will die.
Considering it owns other popular social media platforms such as WhatsApp and Instagram, Facebook should be taking the spread of misinformation more seriously. However, it clearly learned nothing from the 2016 election.
In October of 2019, multiple Facebook employees sent a letter to Mark Zuckerberg detailing concerns about political ads. Since then, numerous employees have expressed concern that Facebook is becoming an unreliable platform. They recognized the mistakes made in 2016 and argued that the company needed to change its fact-checking policies.
Although Facebook took some of the letter’s recommendations seriously, it is too little, too late. Blocking political ads in October is nearly meaningless, as most voters will have made their decisions by then. Every ad should be fact-checked, and any ad found to contain misinformation should be removed.
If Facebook wants to be trusted, it must show its users that it will not bend its rules for politics. Until that day, Facebook, and any app it runs, is untrustworthy, and consumers should be wary.
Featured Photo caption: Facebook Censoring. Photo Courtesy of Wikimedia Commons.