A new lawsuit filed against Twitter, Facebook, and Google claims the companies provided “material support” to the Islamic State group, which led to the radicalization of Orlando nightclub shooter Omar Mateen.
The families of three victims of the tragic shooting — Tevin Crosby, Juan Ramon Guerrero, and Javier Jorge-Reyes — filed the complaint on Monday in the Eastern District of Michigan, reports Fox News.
The suit claims the three web giants “knowingly and recklessly” allowed accounts affiliated with IS to spread “extremist propaganda,” raise funds, and actively recruit on their respective platforms.
On June 12, Mateen indiscriminately shot and killed 49 people inside Orlando’s Pulse nightclub before being killed in a shootout with police. Officials later said the 29-year-old had pledged allegiance to IS. The extremist group claimed responsibility for the attack, but further investigation found that Mateen was not a member of the group; rather, he had been inspired by the extremist material he saw online.
Central to the complaint is the interpretation of Section 230 of the Communications Decency Act (CDA), which states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” In essence, this means that social networks and video platforms — such as Facebook, Twitter, or Google’s YouTube — cannot be held accountable for what their users share online.
In August, a similar suit filed against Twitter was dismissed by the judge overseeing the case on the basis of the CDA. This time around, however, the complaint takes a broader approach, targeting the companies’ ad-targeting and revenue-sharing practices.
“The defendants create unique content by matching ISIS postings with advertisements based upon information known about the viewer,” Keith Altman, the lawyer representing the three families, told Fox News. “Furthermore, the defendants finance ISIS’s activities by sharing advertising revenue.”
The suit therefore hinges on the recommendation and ad-targeting algorithms in place on the respective platforms, and on whether they serve up extremist content to people based on their past activity. Twitter, Google, and Facebook have yet to comment on the lawsuit.
The three firms in question recently came together to collaborate on a new database of extremist content pulled from their respective sites. All three have also stepped up their efforts to suspend accounts promoting terrorism, and to counter the spread of extremist propaganda online.