
Tech Giants Respond to UK Parliament on Online Safety Concerns

10/05/25, 16:30

On May 2, 2025, the UK Parliament's Science, Innovation and Technology Committee published responses from TikTok, X (formerly Twitter), Google, and Meta, following their February appearances regarding online safety, misinformation, and harmful algorithms. These responses address the committee's follow-up questions on content moderation and platform accountability.

Key Points from the Tech Companies' Responses

Meta:

  • Acknowledged that recent changes to its Hateful Conduct Policy could be interpreted to allow certain offensive content, including racist statements.

  • Confirmed no immediate plans to end its Third-Party Fact-Checking Program or to roll out Community Notes in the UK, citing the need to consider legal obligations under the Online Safety Act.

  • Did not provide detailed information on research or risk assessments related to policy impacts in the UK, saying there is no single process for making changes to its Community Standards.


X (formerly Twitter):

  • Clarified the functioning of its main feed algorithm and Community Notes, explaining that the latter uses a "bridging algorithm" requiring consensus from users with differing viewpoints (see the sketch after this list).

  • Declined to disclose whether advertisers paused activity during the summer 2024 riots, describing this as "sensitive business information" despite having previously referred to such pauses.

  • Reported that, as of September 2024, only 1,275 individuals were employed globally in content moderation, following significant job cuts.

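To make the "bridging" idea concrete, the sketch below is a minimal, hypothetical illustration: a note is surfaced only when raters who usually disagree both mark it helpful. The cluster labels, scoring function and 0.6 threshold are assumptions for the example; X's published Community Notes scorer reportedly derives viewpoint structure from the ratings data itself (via matrix factorisation) rather than from predefined labels.

    # Toy illustration of a bridging check (not X's actual implementation).
    from collections import defaultdict

    def bridging_score(ratings):
        """ratings: iterable of (viewpoint_cluster, found_helpful) pairs."""
        by_cluster = defaultdict(list)
        for cluster, helpful in ratings:
            by_cluster[cluster].append(1.0 if helpful else 0.0)
        if len(by_cluster) < 2:
            return 0.0  # no cross-viewpoint evidence at all
        # The score is capped by the *least* convinced cluster, so approval
        # from one side alone cannot carry a note over the threshold.
        return min(sum(v) / len(v) for v in by_cluster.values())

    def show_note(ratings, threshold=0.6):
        # Hypothetical threshold, chosen for illustration only.
        return bridging_score(ratings) >= threshold

    # Helpful to cluster A only -> not shown; helpful across clusters -> shown.
    print(show_note([("A", True), ("A", True), ("B", False)]))  # False
    print(show_note([("A", True), ("A", True), ("B", True)]))   # True
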

Google:

  • Stated that it demonetised the website Channel3Now on July 31, 2024, two days after it posted false information about the Southport killings.

  • Did not answer questions regarding earnings from ads on Channel3Now or YouTube's revenue from ads placed on eating disorder content.

  • Provided some details on measures to prevent ads from appearing alongside harmful content, but lacked comprehensive responses on monetisation concerns.


TikTok:

  • Declined to share specific details about its recommender algorithm and trust and safety measures, citing commercial confidentiality and potential misuse by malicious actors.


Committee Chair's Remarks


Chi Onwurah, Chair of the Science, Innovation and Technology Committee, expressed appreciation for the insights provided but criticised the lack of full transparency. She emphasised the importance of tech companies being accountable and transparent, especially in light of their previous commitments to Parliament.


We agree: if platforms are serious about user safety and public trust, they must go beyond PR-sanitised statements and start showing real accountability. Parliament and regulators must now ask: how long will we allow these companies to set their own rules while avoiding meaningful scrutiny?


For the full responses and detailed breakdowns, visit the UK Parliament's official publication: TikTok, X, Google and Meta answer questions on online safety.
