Human rights organisations, academics and writers have called on Ofcom to clarify what the High Court’s decision that the ban on Palestine Action was unlawful will mean for online platforms, pending the Home Secretary’s appeal against the ruling.
The Metropolitan police have said officers will no longer arrest people at protests who express support for the direct action group. But signatories to a letter to Ofcom say it is unclear what it will mean for platforms required to remove terrorist content under the Online Safety Act.
Open Rights Group, Amnesty International UK, Big Brother Watch, Access Now and others have called on the communications regulator to clarify whether platforms are still expected to remove content. They also want to know how new obligations to remove terrorist content will be implemented and whether content can be restored if the government loses its appeal.
Sara Chitseko, pre-crime programme director at Open Rights Group, said: “The UK’s vague definition of terrorism and legal duties under the Online Safety Act already risk content being wrongly defined as illegal and removed. There is now additional confusion over whether tech companies are targeting and removing online content linked to Palestine Action.
“In light of the court’s ruling and comments on freedom of expression, Ofcom must provide immediate guidance to ensure that important public debates about Palestine are not censored.”
Last week, the judges decided that the order proscribing Palestine Action under anti-terrorism laws would remain in force pending Shabana Mahmood’s appeal against the High Court’s decision. It means that the legal position remains that content supporting Palestine Action must be removed when a platform finds it or it is reported to them.
But the letter’s signatories, who also include Statewatch, Netpol, Article 19 and computer forensics expert Duncan Campbell, are urging Ofcom to follow the Met’s lead in clarifying the situation pending the appeal. They say it will become an even more pressing issue if new requirements to proactively scan for illegal content, restrict live streaming and suppress content algorithmically come into effect later this year, as expected.
The letter says the banning of Palestine Action “raised serious concerns about the criminalisation of political expression” and that there had been an escalation in the removal of content on platforms such as Instagram and TikTok.
It adds: “The High Court’s ruling should be a turning point. It demonstrates how easily counter-terrorism powers and platform regulation can be used to silence debate and suppress dissent, and how difficult it is to undo such damage once systems of censorship and surveillance are established.”
When The Guardian contacted Ofcom, it did not directly address the situation pending appeal. “Under the Online Safety Act, technology companies must quickly remove illegal terrorist content when they become aware of it,” a spokesperson said. “There is no requirement for sites and apps to restrict legal content for adult users. In fact, in carrying out their duties to keep people safe, the law requires platforms to give special consideration to the importance of protecting users’ right to freedom of expression.”