Remove. Reduce. Inform.
These are the three actions Facebook uses to clean up our news feeds.
This is also one of Facebook’s primary strategies for regaining our trust.
When Facebook detects misleading or harmful content, these are the actions they might take on that content. They might remove it, reduce its spread (but not fully remove it), or inform users with additional context.
Sounds simple enough, right?
Not really. Understanding what Facebook considers misleading or harmful content is not always cut and dried.
As Tessa Lyons, Head of News Feed Integrity at Facebook, said last year, bad content comes in “many different flavors.” Those flavors range from minor annoyances like clickbait to more egregious violations of their content policy, like hate speech and violent content.
Here’s how it works.
Facebook’s community standards and ads policies make clear the kinds of content not permitted on the platform. For example, fake accounts, threats, and terroristic content are big no-nos. When Facebook discovers these posts, they’re automatically removed.
Of course, Facebook has to find them first. This usually happens with sophisticated algorithms, user reports, and an army of Facebook content reviewers.
Beyond these kinds of harmful posts, “reducing” content gets a little trickier.
Posts that don’t necessarily violate Facebook policies, but are still deemed “misleading or harmful,” are demoted in the rankings that determine how, when, or where posts appear in our feeds.
Again, clickbait is a good example of content that doesn’t necessarily violate Facebook policy but still annoys users. Clickbait posts contain images or text that entice us to click, but don’t always deliver on their promise or, worse, lead us to harmful content.
Facebook also tries to inform us by adding context to content so that we can decide whether to ignore it, read it, trust it, or share it with friends.
This cleanup strategy has been in place since 2016, but last week Guy Rosen, Facebook’s Vice President of Integrity, and Lyons provided an update on its success.
For example, because content is often removed without any of us knowing it, Facebook created a new section on their Community Standards site where people can see these updates.
“We’re investing in features and products that give people more information to help them decide what to read, trust and share,” Rosen and Lyons report.
“In the past year, we began offering more information on articles...which shows the publisher’s Wikipedia entry, the website’s age, and where and how often the content has been shared.”
Also, starting last week, Facebook expanded the role The Associated Press plays in third-party fact checking. In short, the AP will help the social media giant by debunking false and misleading information in videos and other posts.
Facebook has a long way to go to rebuild our trust, but initiatives like this are moving them in the right direction.
Dr. Adam C. Earnheardt is special assistant to the provost and professor of communication in the department of communication at Youngstown State University in Youngstown, OH, USA where he also directs the graduate program in professional communication. He researches and writes on a variety of topics including communication technologies, relationships, and sports (with an emphasis on fandom). His work has appeared in Mahoning Matters as well as The Vindicator and Tribune-Chronicle newspapers.