
Britain plans to expand its rules on illegal material online to cover how social media companies respond to crises such as the recent riots in the country, a statement by the UK regulator on Monday showed.
The agency released the first guidelines under the Online Safety Act, a law passed in 2023 to manage illegal material on internet platforms. Ofcom said in a press release that it is planning further measures for next spring, including new recommendations for removing material related to child sexual abuse and terrorism. It will also introduce a “crisis response protocol for emergencies.”
Violent protests broke out throughout the UK in August, fueled by rumors that the attacker in the fatal stabbing of three young girls in Southport was a Muslim asylum seeker. The riots were a major challenge for Prime Minister Keir Starmer’s administration in its second month in office. Starmer called on social media companies to stop what he called violent disorder being whipped up online.
X’s billionaire boss Elon Musk repeatedly criticized Starmer’s handling of the riots. The fallout prompted Starmer’s Labour Party to consider stricter rules to curb inflammatory content online, Bloomberg News reported.
Ofcom said on Monday that under the agency’s first provisions, companies have three months to complete an assessment of the risk of illegal harm on their platforms. The regulator said that non-compliance could result in a fine of up to 10% of a platform’s global revenue, or, “in very serious cases,” a court order blocking its services in the UK.
“These laws mark a fundamental reset in society’s expectations of technology companies,” said British technology secretary Peter Kyle in a statement on Monday. “I expect them to deliver and will be watching closely to make sure they do.”
©2024 Bloomberg LP
(This story has not been edited by Tech Word News’s staff and is automatically generated from a syndicated feed.)