Liz Kendall has written a strongly worded letter to Ofcom expressing deep concern and dissatisfaction over delays in implementing its online safety obligations.
The Technology Secretary criticized the communications regulator's slow progress, emphasizing that families nationwide have waited too long for the safeguards outlined in the Online Safety Act (OSA) to take effect.
Of particular concern to Kendall is the proliferation of antisemitic content online. She informed Ofcom’s chief executive, Dame Melanie Dawes, that addressing antisemitism is a top priority for the government.
Ofcom is postponing the enforcement of its new responsibilities, which pertain to harmful but legal content, including abusive and hateful material related to race, religion, sexual orientation, gender identity, or disability.
The new obligations would require social media platforms to offer adults the option to filter out such content from their feeds, a feature already in place for minors.
In its most recent plan, Ofcom disclosed that it does not intend to release the categorization register or seek input on the additional duties for categorized services until approximately July 2026.
Although the OSA was enacted in October 2023, Ofcom only recently started utilizing some of its expanded powers. The regulator has faced criticism for the sluggish implementation of the law due to extensive consultations on updating its directives.
In her letter, Ms. Kendall expressed disappointment over the delays in implementing the additional duties, particularly those concerning user empowerment. She stressed the urgency of maintaining momentum in enforcing the remaining duties to protect women and girls, and all users, from harmful content and antisemitism.
On antisemitism specifically, Kendall underscored the government’s commitment to combating the issue, in line with the Prime Minister’s stance that it must be a priority.
An Ofcom spokesperson said that external factors, including a legal challenge against the government, had affected the categorization timeline. Meanwhile, progress is being made: sites and apps are now required to protect users, especially children, and investigations have been opened into more than 70 services.
Ofcom’s children’s code of conduct, effective from July this year, requires online platforms to implement stringent age verification measures to prevent underage access to inappropriate content, such as pornography.
Additionally, platforms have been instructed to address harmful content promptly, including self-harm, suicide-related material, eating disorders, extreme violence, and dangerous online challenges.
In a parliamentary session, MPs urged AI minister Kanishka Narayan to address concerns about chatbots encouraging self-harm and suicide among young individuals. Narayan assured that AI-based search tools are covered by the Online Safety Act to prevent children from encountering illegal content.
Conservative MP Bob Blackman highlighted the concerning trend of chatbots prompting self-harm and suicide, prompting a commitment from Narayan to enforce regulations effectively.
Acknowledging the gravity of each case of self-harm or suicide, Narayan emphasized the government’s thorough examination of these issues to ensure robust enforcement of the Online Safety Act.
