
Social media platforms face fines of up to $22.5 million unless they take steps to ensure their algorithms do not push harmful content to children, according to the UK communications regulator Ofcom.
The new legal obligations are set out in Britain's Online Safety Act, which covers platforms likely to be accessed by children, and Ofcom has released a draft code of conduct specifying how to comply with them.
The proposed rules place the responsibility for keeping children safe firmly on tech companies, according to Ofcom chief executive Melanie Dawes.
She added that they will need to "tame violent algorithms that push dangerous material to children in their personalised feeds" and introduce age-based controls to ensure children receive an experience appropriate for their age.
The regulator's chief executive said the code lists 40 practical steps that will "step-change online safety for children".
She warned that once they are in place, Ofcom won't hesitate to use its full statutory enforcement powers to hold platforms accountable.
The new measures are due to come into force next year, with rule-breakers facing fines of up to £18 million ($22.5 million) or 10 percent of their global revenue.
Rogue platforms, according to Dawes, may be "named and shamed" and could even be barred from being used by children.
The code also requires companies to implement content moderation systems, remove hazardous content rapidly, and carry out robust age checks.
Peter Wanless, chief executive of the children's charity the NSPCC, called the draft code a "welcome step in the right direction".
Once the final code takes effect, tech companies will be legally required to make sure their services are designed to be safe for children, he added.