Facebook and other social media platforms to be regulated by Ofcom

Ofcom will be put in charge of regulating the internet, the government has announced. This gives the watchdog responsibility for ensuring that social media platforms act on harmful content.

Until now, firms like Facebook, TikTok, YouTube, Snapchat and Twitter have largely been self-regulating.

Ofcom already regulates television and radio broadcasters, including the BBC, and deals with complaints about them.

This is the government's first response to the Online Harms consultation it carried out in the UK in 2019, which received 2,500 replies.

Executives at internet firms could face hefty fines or even prison sentences if they fail to protect users from “harmful and illegal content” online.

The companies have defended their own rules about taking down unacceptable content, but critics say independent rules are needed to keep people safe.

Under the proposals, Ofcom will not have the power to remove specific posts from social media platforms. However, it will require internet companies such as Facebook and Google to publish explicit statements setting out which content and behaviour they deem to be acceptable on their sites.

The media regulator will then ensure internet businesses enforce these standards “consistently and transparently”.

The culture secretary, Nicky Morgan, and the home secretary, Priti Patel, promised that changes to the proposals would guarantee free speech for adults online and target only larger internet businesses.

It has not been made clear what penalties Ofcom will be able to impose in order to tackle violence, cyber-bullying and child abuse.

The regulator has just announced the appointment of a new chief executive, Dame Melanie Dawes, who will take up the role in March.

Dame Melanie will join Ofcom next month after stepping down as permanent secretary at the Ministry of Housing, Communities and Local Government, a post she has held since 2015.

Recently she became Permanent Secretary Champion for Diversity and Inclusion for the Civil Service.

She said that Ofcom plays a crucial role in ensuring that people and businesses across the UK get the best from their communications services, adding that it is a great privilege to become chief executive at a time of significant change in the sectors Ofcom regulates.

Some of the proposals announced on Wednesday:

Any business that enables the sharing of user-generated content – such as online comments or video uploads – is likely to be affected by the new rules.
Internet businesses will be required to publish annual transparency reports explaining what harmful content they have removed and how they are meeting their standards.
Age verification will be brought back for certain websites, following an abandoned attempt last year to introduce it to restrict access to online pornography.

Ms Morgan said that with Ofcom at the helm of a proportionate and strong regulatory regime, there is an incredible opportunity to lead the world in building a thriving digital economy, driven by groundbreaking technology, that is trusted by and protects everyone in the UK.

The regulations are broadly focused on two new sets of requirements: one around illegal content, and the other around reducing content that glorifies self-harm or suicide, which focuses instead on requiring large online platforms to better enforce their own terms of service.

The first set of requirements will give platforms new targets to ensure that such content is removed quickly and, ideally, prevented from being posted in the first place, with a particular focus on terrorist and child sexual abuse content.