UK focuses on child safety at the start of new online regime

By Paul Sandle

LONDON (Reuters) – Britain’s communications regulator said tech companies must focus on protecting children from abuse, grooming and pro-suicide content, in its first steps as the enforcer of online safety.

Ofcom, which gained new powers when the Online Safety Act became law last month, said children were a key priority.

It said its role would be to force firms, such as Facebook and Instagram owner Meta, to tackle the causes of online harm by making their services safer.

It will not, however, make decisions about individual videos, posts, messages or accounts, or respond to individual complaints.

Chief Executive Melanie Dawes said Ofcom was wasting no time in setting out how it expected tech firms to protect people from illegal harm online.

“Children have told us about the dangers they face, and we’re determined to create a safer life online for young people in particular,” she said.

Its draft code, published on Thursday, included measures such as preventing users who are not in a child’s connection list from messaging them, and ensuring that children’s location information is not made visible.

It said firms should also use a technology called “hash matching” to identify illegal images of child sexual abuse by checking them against a database of known material.
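Hash matching works by computing a compact digital fingerprint of each uploaded image and looking it up in a database of fingerprints of known illegal material. The sketch below is a minimal illustration using a cryptographic hash (SHA-256) as the fingerprint; the function names and the in-memory database are hypothetical, and production systems such as Microsoft’s PhotoDNA instead use perceptual hashes, which also match resized or re-encoded copies.

```python
import hashlib

# Hypothetical in-memory database of fingerprints of known illegal images.
# In practice this would be loaded from a hash list supplied by a trusted
# body such as the Internet Watch Foundation or NCMEC.
KNOWN_HASHES: set[str] = set()


def fingerprint(image_bytes: bytes) -> str:
    """Compute a digital fingerprint (SHA-256 hex digest) of an image."""
    return hashlib.sha256(image_bytes).hexdigest()


def is_known_illegal(image_bytes: bytes) -> bool:
    """Check an uploaded image's fingerprint against the known-hash database.

    Note: a cryptographic hash only matches byte-identical copies; real
    deployments use perceptual hashing to catch slightly altered images.
    """
    return fingerprint(image_bytes) in KNOWN_HASHES


if __name__ == "__main__":
    sample = b"example image bytes"
    KNOWN_HASHES.add(fingerprint(sample))    # simulate one known entry
    print(is_known_illegal(sample))           # True: exact match found
    print(is_known_illegal(b"other bytes"))   # False: no match
```

The key design point is that platforms never need to store or share the illegal images themselves: only the fingerprints are distributed, and matching happens at upload time before content is published.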

It said it would consult on its measures, which also include action to fight fraud and terrorism, before publishing a final version next year that will be subject to parliamentary approval.

If companies do not comply with the new law, they could be fined up to 18 million pounds ($22.1 million) or 10% of their annual global turnover, whichever is greater.

($1 = 0.8156 pounds)