Directors of social media giants may be liable for harm
Two Irish senators want to make board members (directors) of social media giants like Facebook and Twitter personally liable for facilitating and allowing online harm.
To hold directors criminally liable
Ireland is to establish an online safety and media commission to regulate social media giants. The new commission will have the power to impose substantial fines on social platforms.
However, senators Malcolm Byrne and Shane Cassells say the commission’s powers aren’t strong enough and want Minister for Culture and Media Catherine Martin to beef up the draft Bill so that directors of social media firms would be open to criminal sanctions for failing to police their platforms.
To hold directors criminally liable for their actions would be a significant change in the regulation of social media, say the senators.
Prosecute directors for breaking the law and failing in their duties
The law would be similar to health and safety legislation, under which directors of drug companies can be prosecuted for breaking the law or causing harm.
Malcolm Byrne, a Fianna Fáil senator who has advocated strengthening the Bill, said that the goal is not to prosecute large numbers of people but to “change the culture of these companies”.
A provision of the Bill already allows criminal liability to extend to directors of social media firms. However, the amendments proposed by the senators would make prosecution more straightforward and more likely.
Britain also wants social media firms to be held accountable for harmful content
Britain is also trying to regulate the social media giants, although progress is slow. In 2019, the British government published an online harms white paper proposing that social media executives could be held personally liable for harmful content distributed on their platforms.
As concerns grow about the role of the internet in the dissemination of terrorism, child abuse, self-harm, suicide, and misinformation, UK ministers have been pressured to act.
Plans call for the government to enact a statutory duty of care, governed by an independent regulator and likely funded through a levy on social media companies.
In 2018, Ofcom found that 45% of adult internet users had experienced some form of online harm, and 21% had taken action to report harmful content.
Facebook co-founder Mark Zuckerberg has called for international regulation of the web on four fronts: political advertising, data portability, privacy, and harmful content. “I have come to believe that we shouldn’t make so many important decisions about speech on our own,” he stated.