When that little white box on Facebook asks you, "What's on your mind?", could Facebook be responsible for what you have to say?
A group representing French Muslims is suing the French branches of Facebook and YouTube for hosting the video of the Christchurch attack in New Zealand. The group, the French Council of the Muslim Faith (CFCM), says that by enabling the live streaming of the mosque shooting, which killed 50 people, the tech platforms were "broadcasting a message with violent content abetting terrorism, or of a nature likely to seriously violate human dignity and liable to be seen by a minor," according to the BBC.
The terrorist who allegedly murdered 50 people and injured 50 more at two mosques in New Zealand on March 15 live streamed the attack on the first mosque on Facebook. Facebook has since said that it removed the video 12 minutes after the live stream ended. But that didn't stop the video from spreading.
Facebook removed 1.5 million versions of the video in just the first 24 hours after the attack. The video also made its way to other sites across the internet, notably YouTube, where users reported that they could still find versions of the video hours after the attack.
Facebook and YouTube have both stressed that they are cooperating with law enforcement and continuing to work to stop the spread of these videos. But that might be too little, too late.
Generally, even though social media sites' terms of service prohibit posting violent material, those same terms also shield the sites from legal liability for what their users post.
That precedent may be changing. The controversial FOSTA-SESTA act in the US, ostensibly intended to combat sex trafficking, made websites potentially liable for illicit sexual activity taking place on their platforms, which in turn pushed them to police sex-related content far more aggressively. Relatedly, European lawmakers recently passed a law that makes internet companies legally and financially liable for the spread of copyrighted material on their platforms.
While these two instances do not directly concern liability for violent content, they both challenge the precedent that social media sites are not ultimately responsible for what their users post. That, combined with growing sentiment that terms of service are insufficient to govern and enforce conduct, content, and privacy on social media, could lend credence to the case.
A spokesman for the Federation of Islamic Associations of New Zealand (FIANZ) told Reuters that he supported the French group's action.
“They have failed big time, this was a person who was looking for an audience and ... you were the platform he chose to advertise himself and his heinous crime,” FIANZ spokesman Anwar Ghani told Reuters. “We haven’t been in touch with the (French) group ... but certainly something which can deter the social media space in terms of these types of crimes, we would be supportive of that.”