Don't worry, militia members will have to wait until after Election Day to be algorithmically pointed to Facebook groups of like-minded individuals.
At Wednesday's Senate hearing on (at least in theory) Section 230, Facebook CEO Mark Zuckerberg let slip a slight behind-the-scenes change his company has made in the lead-up to Nov. 3. Specifically, Zuckerberg offhandedly mentioned that Facebook has temporarily stopped recommending political issue groups to its users.
Of course, Facebook intends to spin this presumably dangerous — or, at the very least, worrisome — recommendation feature right back up again after the election. So reports BuzzFeed News, which was able to confirm that the new policy is only temporary.
"This is a measure we put in place in the lead-up to Election Day," Facebook spokesperson Liz Bourgeois told the publication. "We will assess when to lift them afterwards, but they are temporary."
Because obviously we won't have any social media-juiced instances of violence after the election. Heavens no.
Notably, this move comes at a time when Zuckerberg — as expressed in his Thursday earnings call — is "worried that with our nation so divided and election results potentially taking days or weeks to be finalized, there is a risk of civil unrest across the country."
Facebook, which recently attempted to ban QAnon conspiracy groups, has particular reason to be concerned about the upcoming election and possible associated violence. Well, concern for its reputation, anyway. The platform has served as a breeding ground for violent conspiracy theories for years, and a simple QAnon ban isn't going to change that.
There is a real possibility that the next Kenosha-style tragedy is already being planned, coordinated, or hyped with Facebook tools — only now with an Election Day twist. Facebook's attempt to cool things down by pausing an element of its own recommendation system calls attention to the simple fact that Facebook itself is fundamentally problematic.
Facebook knows this. In May of this year, the Wall Street Journal reported that Facebook had ignored its own internal research showing that its algorithms were making the site more divisive.
"Our algorithms exploit the human brain's attraction to divisiveness," read a slide from a 2018 presentation. "If left unchecked," it warned, Facebook would feed users "more and more divisive content in an effort to gain user attention & increase time on the platform."
No temporary pause of a single recommendation feature, no matter how well intentioned, is going to change that.