From the 2016 US presidential election to conspiracy theories about COVID-19, the weaponisation of social media has emerged as a threat to democracies around the world in recent years.
Companies such as Facebook and Twitter have been in the spotlight, as governments and pro-democracy organisations have pressured them to take steps to crack down on the insidious use of their platforms.
Now, an international coalition focused on these issues has called on countries to enforce regulations on social media companies. Measures could include sanctions, holding individual CEOs liable, and requiring platforms to spend a minimum share of their income on expanding their teams of moderators.
The Forum on Information and Democracy, which is made up of groups from around the world including Reporters Sans Frontières and the Human Rights Center, has come up with 250 recommendations for tackling the problem.
It specifically looks at “infodemics”, which it describes as an overabundance of information, both accurate and false, that makes it hard for users to find trustworthy sources when they need them.
“A structural solution is possible to end the informational chaos that poses a vital threat to democracies,” said Christophe Deloire, Chair of the Forum on Information and Democracy.
Of the 250 recommendations, the organisation highlighted 12 key ones.
Transparency of platforms
- Transparency requirements should relate to all core public functions: content moderation, content ranking, content targeting, and social influence building.
- Regulators in charge of enforcing transparency requirements should have strong democratic oversight and audit processes.
- Sanctions for non-compliance could include large fines, mandatory publicity in the form of banners, liability of the CEO, and administrative sanctions such as closing access to a country’s market.
Content moderation
- Platforms should follow a set of Human Rights Principles for Content Moderation based on international human rights law.
- Social platforms should be obliged to follow rules on fairness similar to those followed by broadcasters in certain countries.
- The number of moderators should be increased, with platforms required to spend a minimum percentage of their income on doing so.
New approaches to platform design
- A Digital Standards Enforcement Agency should be launched.
- Conflicts of interest of platforms should be prohibited, in order to avoid the information and communication space being governed or influenced by commercial, political or any other interests.
- A co-regulatory framework for the promotion of public interest journalistic content should be defined, based on self-regulatory standards such as the Journalism Trust Initiative, and friction should be added to slow down the spread of potentially harmful viral content.
Safeguards for private messaging services
- Measures that limit the virality of misleading content should be implemented through limits on certain functionalities, opt-in features for receiving group messages, and measures to combat bulk messaging and automated behaviour.
- Online service providers should be required to better inform users regarding the origin of the messages they receive, especially by labelling those which have been forwarded.
- Mechanisms for users to report illegal content, and appeal mechanisms for users who have been banned from services, should be reinforced.
“It’s time to end the whack-a-mole approach of the technology platforms to fix what they have broken,” said Maria Ressa, Co-chair of the steering committee of the working group on infodemics.
“Social media, once an enabler, is now the destroyer, building division, ‘us against them’ thinking, into the design of their platforms. It’s not a coincidence that divisive leaders perform best on social media.
“Facebook is now the world’s largest distributor of news. Except there’s a catch: lies laced with anger and hate spread faster and further than the boring facts of news. They create a bandwagon effect of artificial consensus for the lie,” she added.