Oct 18, 2021
Across the US, many states are considering laws that prohibit online platforms like Facebook, YouTube, Instagram, etc. from enforcing rules against what we call “lawful but awful” online content.
Lawmakers are motivated to do this because they think laws are needed to prevent social media platforms from censoring conservative viewpoints.
As with many laws though, the unintended consequences of these laws could prove to be much more harmful than the behavior the law was intended to regulate.
To help us navigate the craziness of what would and would not be allowed if these laws go through, our guest today is Elizabeth Banker, VP of Legal Advocacy for Chamber of Progress. Chamber of Progress is also a sponsor of this show.
Elizabeth Banker is Vice President of Legal Advocacy for Chamber of Progress. Elizabeth brings twenty-five years of in-house, law firm, and trade association experience on intermediary liability, Section 230, and online safety. Most recently, Elizabeth was Deputy General Counsel at Internet Association where she directed policy on consumer privacy and content moderation. While at IA, Elizabeth conducted a review of 500 Section 230 decisions and testified twice before the Senate on efforts to reform Section 230.
Elizabeth has first-hand experience responding to the challenges that face online services as a veteran of both Twitter and Yahoo!. She was Vice President and Associate General Counsel for Law Enforcement, Security and Safety at Yahoo! Inc. for more than a decade. More recently she was Senior Director and Associate General Counsel for Global Law Enforcement and Safety at Twitter. Elizabeth spent five years as a shareholder at ZwillGen, a boutique law firm focused on privacy and security in Washington, D.C. Elizabeth began her career in government with the President’s Commission on Critical Infrastructure Protection during the Clinton Administration.
Many of the laws being proposed would actually tie the hands of social media platforms on the rules they currently enforce against harassment, bullying, and threatening behavior.
These are all types of content that no social media platform wants to see on its service. Today, platforms enforce rules across their services to keep users free from hateful speech, bullying, and harassment.
These new laws would add many complexities to enforcing those rules and would open platforms up to a constant appeals process from users whose content is removed.
So far in 2021, we have seen over 100 bills proposed in state legislatures all across the nation. There will probably be many more before the end of the year.
The amicus brief that Chamber of Progress filed was a way to explain to the court the real-world implications of these laws if they were to pass and survive the ensuing legal challenges.
The Texas law currently under consideration is one where we filed our objections in the amicus brief.
We believe that all platforms should be able to moderate harmful content so that consumers can stay healthy and safe on their platforms. Additionally, these platforms should be inclusive and widely accessible.
Here are our main objections to the Texas law:
The Texas law would actually prevent platforms from taking the content moderation steps they take today. When it comes to content directed at children, much of what falls under the “lawful but awful” heading would likely be left up. For example, content glorifying suicide or self-harm, content promoting eating disorders, and the like would no longer be something platforms could remove.
Cyberbullying is another area where current protections would be removed. School fight videos, which are normally taken down, would remain accessible. Non-consensual intimate images, often called revenge porn, would not be removed, nor would other types of harassment that can be very harmful to teens.
So parents have every right to be worried, especially if they have already dealt with these sorts of problems, because under this law those problems would only worsen.
The First Amendment does not apply to private companies; it only prohibits government regulation and restriction of speech. In fact, each social media platform has its own First Amendment interest in deciding what it allows on its service.
The argument that social media platforms are violating a person’s right to free speech just doesn’t hold water.
Section 230 plays a critical role in allowing the platforms to remove harmful content without being sued. The platforms rely on this protection.
Recently, a Russian foreign influence campaign sued because its content was removed. The lawsuit failed because of Section 230.
It’s important for us to fight to keep both the First Amendment and the Section 230 protections for content moderation strong in order to keep consumers safe while they enjoy these online platforms.
Follow Elizabeth on Twitter: @elizabethbanker