29 October 2020

What is Talking Points?

Cambridge News asks a representative of each local political party to answer a question on a local issue in just 300 words. The answers are published in the print edition of the Cambridge News every fortnight. Here, we share our response to the question.


Today's Question: Given that the default position of social media platforms is the exact opposite of traditional media's (accountable for nothing rather than accountable for every word), should tech giants be held as accountable for what they un-publish as editors-in-chief are for what they publish? Is this enforceable? Do you think there should be an external regulator in this country, comparable to IPSO and Ofcom, monitoring the behaviour of ‘tech giants’ when it comes to what they choose to censor, remove, shadow-ban, etc.? Can you think of other ways of improving the situation? Or should ‘tech giants’ be allowed to do as they wish?

Social media is not designed to serve its users. We use it for entertainment, to keep in touch and to get the latest news, but the real customers are the advertisers. The commodity they are buying is our time. When you visit a social media page, you are presented with personalised content selected by computer algorithms to hold your attention. At a societal level, the consequences range from phone addiction and a rise in body-image problems to the polarisation of political debate and the spread of false news. This may not be deliberate, but it is having measurable impacts in the real world.

Clearly, social media needs to be regulated: the question is how. The UK Government’s proposed approach is set out in the Online Harms White Paper. It would establish a statutory ‘duty of care’ for tech companies, enforced by a regulator. The approach has been criticised by those who claim it will amount to state censorship, curtailing legitimate free speech while failing to address wider problems. I agree. Yes, companies must be required to remove illegal content quickly, to set terms and conditions around lawful content, and to uphold these consistently and transparently. But to be effective, regulation needs to address root causes holistically. These include the business models underlying social media and the asymmetries of power between user and provider. The problem is bigger than which posts are censored: online platforms need to be more transparent and accountable. Any new regulatory body must be robust and independent of both government and industry.

The UK-based Open Rights Group and others are calling for regulatory approaches based around the principles of human rights. As a Green, I am committed to these principles, and this seems to me like a good starting point. Meanwhile, let’s all be conscious of the repercussions of our online behaviour, both for ourselves and for wider society.


Today's Author

Ellie Crane

Ellie lives in Milton and is a long-term member of the Green Party. With a background in ecological science and environmental and farming policy, she is currently a stay-at-home parent and freelance environmental consultant. She is the Social Media co-ordinator for South Cambridgeshire Green Party.

More Information

Please contact press@cambridge.greenparty.org.uk

