In May 2020, Donald Trump decided to strike a blow against the major social networks. Incensed after two of his tweets were flagged as “misleading” by Twitter, he ordered a sweeping reinterpretation of Section 230 of the Communications Decency Act of 1996, the provision that shields platforms from liability for content posted by their users. Under his executive order, the web giants could be held liable if they were found to infringe on users’ freedom of expression by deleting or modifying their posts.
The order was never actually enforced after it was signed. It was immediately challenged by activists and think tanks, who feared it would chill freedom of expression and unfairly punish these companies.
The web giants are not out of the woods
The order nevertheless remained on the books until now: on Friday, Joe Biden revoked Executive Order 13925. The President offered no statement explaining his decision.
This does not mean, however, that attempts to amend Section 230 have ended. Democrats and Republicans alike believe, for very different reasons, that the law needs to be updated to reflect current issues. Democrats argue that the web giants are not doing enough to moderate hate speech. Republicans claim the opposite: that the big platforms censor their opinions and comments.
Even Mark Zuckerberg has come around to the idea of reform. Questioned by the US Congress in March last year, the Facebook boss said:
The principles of Section 230 are as relevant today as they were in 1996, but the Internet has changed dramatically. We believe that Congress should consider making platform liability protection for certain types of illegal content dependent on companies’ ability to adhere to best practices to combat the spread of such content.
Social networks would therefore have to prove “that they have systems in place by which illegal content can be identified and removed”. They would not be held responsible if certain content escaped their detection (which would be impractical for platforms handling billions of messages a day), but they would need adequate systems in place to combat illegal content.