By Tess Woolfenden.
In the wake of the Cambridge Analytica scandal, the issue of social media regulation – or lack thereof – has been raised in mainstream public debate in the US and, to a lesser extent, here in the UK.
The scandal – in which personal data from Facebook was harvested to create targeted political advertising – has called into question the integrity of elections around the world. This also comes at a time when we are witnessing a significant rise in fake news and online abuse through social media platforms. Regulation is therefore an important topic of discussion for us all.
But what should this regulation look like? And who should be responsible for it? To explore these questions, WebRoots Democracy held an event on 23rd April at Newspeak House titled “Cambridge Analytica and the future of social media”. The event boasted a diverse panel made up of Seyi Akiwowo (Founder, Glitch!UK), Darren Jones MP (Science and Technology Select Committee), Carl Miller (Research Director, Centre for the Analysis of Social Media), and Henna Zamurd-Butt (CEO and Editor of Media Diversified). It was chaired by Areeq Chowdhury, Founder and Chief Executive of WebRoots Democracy.
To kick off, Areeq introduced the panel and summarised some of the key themes that would be discussed during the evening. This was followed by a short video interview between Darren and Areeq, focused on the Cambridge Analytica files and the role of government in regulating social media platforms – you can watch this here. Each of the panel members then spoke briefly about issues they saw as important to the topic, followed by questions from the audience.
The social media experience
A common thread throughout the discussion was the importance of social media in our society today. Darren explained that social media plays an important role in our economic productivity and public services, while Seyi and Henna outlined its role in connecting people, expressing ourselves and engaging all groups.
However, not everyone experiences social media in the same way. Many different groups and individuals are experiencing first-hand the detrimental and harmful outcomes of failing to properly regulate social media. We heard many examples and personal stories from the panel which highlighted this clearly.
Darren explained how many of his female colleagues in Westminster receive harmful online abuse. This is often not the case for his male colleagues, marking a serious gender imbalance in how politicians experience social media. Areeq added that such abuse often deters women from pursuing a career in politics. Seyi also outlined evidence that online abuse is driving women off social media platforms, stopping them from standing for office in the UK, and can even result in suicide.
She went on to discuss her personal experiences of online abuse after a video of her standing up to racism at a European Youth Event in 2016 went viral. She explained that, despite reporting the abuse to Twitter and Facebook, she was unable to get the comments removed without drawing on her connections and making an appearance on ITV News.
“It’s relentless. All day. And social media was not there to protect me so instead I had to use my contacts and the police.”
As Henna explained, these platforms are supposed to be open spaces equally accessible to everyone. But this is not the reality for all users. Both Darren and Seyi outlined that these kinds of attacks reflect wider societal prejudices, which people feel able to express online because they can remain anonymous.
“The online world is a reflection of the offline world.”
This can be evidenced by the fact that marginalised groups have faced this kind of abuse online for years.
So, should we just bail?
In light of the abusive and harmful way some people use social media platforms, one audience member asked whether we should simply abandon these platforms altogether.
Darren responded by reiterating the importance of social media, arguing that rather than abandoning the platforms we need to address the issue of regulation.
Who should be responsible for regulating social media platforms?
With the importance of social media and its regulation defined, the discussion turned to address who should be ultimately responsible for putting in place and enforcing regulation.
Darren explained that social media platforms themselves have some responsibility.
“Platforms will just say that they are just hosting not publishing, but this is weak nowadays, they do have some responsibility.”
Seyi also outlined the ways in which social media platforms should be enforcing regulation through their own terms and conditions, as well as through the use of moderators. She explained, however, that terms and conditions are often barely enforced and that these processes tend to lack transparency. For example, she described a serious lack of transparency around the Facebook moderation process: there is very little information on where moderators are trained, where they are located and who they are. She argued that we need to hold social media platforms to account.
Carl, on the other hand, expressed caution about focusing on social media platforms when discussing regulation. He argued that we should not have tech companies defining what people can or cannot do online – the likes of Facebook and Twitter are corporations with an incentive for profit, not for the protection of their users.
“A private, profitable enterprise should not be defining what is criminal or not.”
Instead, we need to focus on creating legal frameworks for regulating social media, as well as developing ways to ensure that people adhere to the rules. Such frameworks are not yet in place, and where relevant legislation does exist, it is incredibly out of date. For example, Carl explained that there is no clear legal definition of what constitutes a hate crime online. He also explained that cyber-crime teams face real difficulty in protecting citizens from online offences, as evidence travels easily across borders while the remit of the police does not.
This highlights the need for some form of global consensus and cooperation in defining and implementing social media regulation.
“Crime now flows easily across borders, but prosecution doesn’t. The most trivial of online crimes is an issue of international diplomacy.”
Darren highlighted this in the interview with Areeq, in which he outlined that countries currently take very different approaches to social media regulation.
“We need to build a global consensus, because these are global platforms.”
While recognising this as important, Henna expressed caution about a global approach to social media regulation. She outlined how every country will have very different regulatory needs based on culture and social media use; this needs to be part of the conversation. We need to remain context-specific and avoid a colonial imposition of western law on the rest of the world.
“There are groups that need social media in societies where governments are not friendly to their citizens. Regulation needs to be context specific.”
An additional concern raised by a member of the audience was whether governments had the technical capacity to write legislation for the regulation of social media.
“The government is a bunch of old people that needs to catch up. We, the people, need to find an alternative way to protect ourselves.”
In his interview with Areeq, Darren outlined that while government does have a role to play in defining how social media is regulated, we must also recognise that only very few politicians have the necessary technical expertise. He discussed the potential of partnering with, or establishing, external bodies with expert knowledge to assist with the technical details.
Darren also outlined that politicians respond to the priorities of their constituents, most of whom do not place Cambridge Analytica or wider social media regulation at the top of their list. This highlights the need for us, the public, to speak up about what we want and need from social media regulation. As Seyi explained:
“We need to reclaim the space back, it wasn’t always toxic and it has great power in bringing people together.”
A big part of this will come from digital literacy, which can help us not only to navigate the internet but also to know our rights and therefore personally stand against abuse – a point Carl summarised perfectly.
Thanks to all who attended. WebRoots Democracy will be considering all thoughts put forward as part of our new research project, Regulating Social Media.
Tess Woolfenden is a Research Assistant at WebRoots Democracy.