
Dec 9, 2022

Groups file flurry of Section 230 briefs with the Supreme Court

 

  • What’s going on? Section 230 of the Communications Decency Act shields platforms like Google and Twitter from liability for content posted by internet users. Republicans and Democrats both want the rule changed. It’s important to note that Section 230 protects platforms only when they act as publishers of other people’s content. The central question here is: at what point do platforms stop being mere publishers and become creators of content? Once they’re deemed creators, they lose protection under Section 230.
      • Generally, Republicans like Josh Hawley argue that platform liability should be a state issue; they believe tech companies lean progressive and that banning content the platforms deem harmful discriminates against conservatives.
      • Democrats argue that Section 230 doesn’t hold platforms accountable enough, especially in the context of how marketers target children.
  • How is the law being challenged? The Supreme Court is set to decide Gonzalez v. Google, in which the family of a young woman killed in the 2015 Paris terror attacks argues that Google should be held liable for aiding and abetting the attack by hosting terror-related videos on YouTube.
      • There are two parts to this. The first is whether Google should be held liable for merely hosting the terror-related videos that the family alleges groomed the terrorists involved. Google argues that hosting the videos makes it simply a publisher, so it would still be entitled to protection under Section 230.
      • The second is whether recommending content converts a platform into a content creator. The Gonzalez family argues that by recommending terror-related videos to people predisposed to that content, Google acted as a content creator, in which case it wouldn’t be shielded from liability under Section 230.
  • How does this affect you? Keep an eye on what your state is doing to change how platforms moderate content. For example, Texas and Florida passed statutes restricting platforms’ ability to moderate content, citing so-called “anti-conservative bias.” This has a direct impact on what people see and hear, which in turn affects elections, since harmful content, such as Trump’s tweets leading up to the Capitol Hill insurrection, has dominated our politics for years.



Big name advertisers are showing up in white nationalists’ Twitter feeds again

 

  • Why are white nationalists on Twitter? Elon Musk fired Twitter’s entire content moderation team and reinstated the accounts of white nationalists.
  • Which companies showed up in white nationalists’ feeds? Ads for Uber, Amazon, Snap, and even the US Department of Health and Human Services appeared in these accounts. In all, the Washington Post reports that it saw some 40 advertisers appear next to content posted by reinstated white nationalists.
  • What are the policy implications? White supremacist content is an example of the type of content Republicans in states like Texas and Florida think internet platforms shouldn’t be allowed to ban. Right now, only advertisers have the power to discipline Twitter, by pulling their ads from the platform.
  • What are the real-world effects of white supremacists online? The Department of Homeland Security issued a report in late November expressing urgent concern that online and real-world antisemitism are reinforcing each other, leading to an increase in hate crimes.

 

DC Attorney General is suing Amazon over driver tips

 

  • What’s going on? DC Attorney General Karl Racine filed a consumer protection lawsuit on Wednesday alleging that Amazon essentially stole tips from its Flex drivers by hiding from drivers how much customers were tipping and pocketing the money, and then concealed the practice from customers.
  • What is Amazon saying? Amazon says it built the tips into drivers’ hourly compensation, which it says exceeds DC’s minimum wage of $16.10 per hour.
  • What happens next? We’ll see. The court will review Racine’s complaint, and that process will start early next year.

 

 

In other tech law & policy news …

 

Women are suing Twitter, alleging that Elon Musk’s layoffs discriminated against them.

 

A Staten Island union organizer lost his race discrimination lawsuit against Amazon. The court found he was fired for exposing co-workers to COVID during the pandemic lockdowns.

 

The Senate Banking Committee appears likely to subpoena Sam Bankman-Fried after he ignored a request to testify regarding the implosion of crypto-currency exchange FTX.

 

The FTC is suing to block Microsoft’s acquisition of Activision Blizzard, the maker of Call of Duty: Modern Warfare and Candy Crush, as well as Meta’s (formerly Facebook’s) acquisition of virtual reality firm Within.

 

Apple announced that it will offer end-to-end encryption for most iCloud data, raising alarm among law enforcement officials.


States are now joining the federal government in banning government employees from downloading TikTok on government-issued phones because TikTok’s parent company, ByteDance, is based in China. Officials are concerned China could gain access to sensitive data.