Crackdown on harmful social media content agreed
Social media firms will be made more responsible for users’ safety on their platforms under a controversial new bill passed by peers.
It has taken years to agree on the Online Safety Bill, which will force firms to remove illegal content and prevent children from accessing some legal but harmful content.
The NSPCC, a children’s charity, said the law would make the internet a safer place.
Critics, however, argued that it would allow a regulator as well as tech companies to dictate what can and cannot be said online.
As part of the nearly 300-page bill, newly introduced rules will require pornographic sites to check the ages of users before allowing them to view content.
Over 20,000 small businesses will be required to comply with the act, even though it is often viewed as a tool to rein in Big Tech.
Cyber-flashing and sharing of “deepfake” pornography have also been included as new offences in the bill.
It also includes measures aimed at making it easier for bereaved parents to obtain information about their children.
In an interview with the BBC, Michelle Donelan, the technology secretary, said the bill was “extremely comprehensive”.
When asked when evidence of tech firms changing their behaviour would emerge, she replied: “We are already seeing that change in behaviour.
“As soon as this bill receives Royal Assent, the regulator will work even more closely with those social media platforms, and you’ll see them change the way they operate.”