Although Facebook has been praised for its efforts to thwart election meddling and bullying and increase user safety and security, it’s being challenged by the FCC for changing what constitutes “exclusionary speech” just ahead of the 2020 US Presidential election.
In the case of News Feed, Facebook argued that some posts — what it calls “explicit exclusionary speech” — require review by human moderators rather than a computer algorithm, a position it laid out in an open letter to the FCC. But this policy won’t change until Facebook reaches an agreement with the FCC, and that could take years.
However, Facebook has taken some small steps toward eliminating discriminatory practices that the agreement doesn’t address, such as suppressing news outlet statuses. But are these small steps indicative of a bigger problem, or simply a product of time and expertise?
A few years ago, Facebook hid or blocked statuses that contained the phrase “Don’t f****** vote.” It did this by labeling such statuses “inappropriate” — a label it also applied to calls for safety protests and some anti-politician statements. The social media giant reversed the policy in 2018 after concluding that it didn’t prevent abuse.
Newly proposed bills would exert deeper control over what is acceptable on social media. For example, the Honest Ads Act, now under discussion, would require social media platforms to make public the information they hold on voters, such as demographics, voter registration, and the content they engage with online.
Facebook’s Chief Operating Officer, Sheryl Sandberg, said these new laws would limit online advertising but did not say whether they would result in censorship. Even so, they could still restrict speech in practice — just not through what she would call censorship, since many such laws may simply prove unenforceable in an online context.
Another small step by Facebook was the release of new security software called Sentinel, which is intended to limit Facebook’s ability to track the content users send to its servers.
If you recall, some 50,000 Russian nationals used Facebook to promote fake activism and racial hatred in the U.S. Facebook has said it will also use data and techniques developed by the FBI to improve its systems. That’s a good move: as more websites and mobile apps embed social media functionality into their services, there’s evidence those services are likely to run into the same trouble.