
One Hour To Take Down Illegal Content

New EU measures will mean that technology companies have as little as one hour to take down terrorist and other illegal content, or face penalties under new legislation.

Why Only One Hour?

The new measure, which has reportedly been met with dismay by big tech companies such as Google and Facebook (who will arguably be most seriously affected), is focused mainly on terror-related content. The logic is that terrorist content is considered most harmful in the first hours of its appearance online, so, as a general rule, companies will be required to remove such content within one hour of its referral.

Other illegal content targeted by the new measures includes incitement to hatred and violence, child sexual abuse material, counterfeit products and content that infringes copyright.

3 Months To Report Back

As well as requiring tech companies to remove the most serious content within one hour, the European Commission (EC) has announced that any tech company responsible for hosting content posted by users will have only three months from now to report back to the EU on what it is doing to meet the new targets.

Operational Measures

The EC recommends a set of operational measures to ensure faster detection and removal of illegal content online, to reinforce cooperation between companies, trusted flaggers and law enforcement authorities, and to increase transparency and safeguards for citizens. The operational measures are:

  • Clearer ‘notice and action’ procedures. Companies should set out easy and transparent rules for notifying illegal content. These should include fast-track procedures for ‘trusted flaggers’. Also, to avoid unintended removal of content which is not illegal, content providers should be informed about such decisions and have the opportunity to contest them.
  • More efficient tools and proactive technologies. Companies should set out clear notification systems for users, together with proactive tools to detect and remove illegal content, in particular terrorism-related content and content which does not need contextualisation to be deemed illegal, such as child sexual abuse material or counterfeit goods.
  • Stronger safeguards to ensure rights. To ensure that decisions to remove content are accurate and well-founded, companies should put in place effective and appropriate safeguards. These should include human oversight and verification, in full respect of fundamental rights, freedom of expression and data protection rules.
  • Special attention to small companies. The technology industry should, through voluntary arrangements, cooperate and share experiences, best practices and technological solutions, and this shared responsibility should particularly benefit smaller platforms with more limited resources and expertise.
  • Closer cooperation with authorities. If there is evidence of a serious criminal offence or a suspicion that illegal content is posing a threat to life or safety, companies will be required to promptly inform law enforcement authorities, and EU Member States should establish the appropriate legal obligations.

The recommendations add to ongoing work with the technology industry, through voluntary initiatives, to ensure that the internet is free of illegal content, and are intended to reinforce the actions taken under those initiatives.

Response From The Tech Industry

Although Facebook has said that it shares the European Commission’s goal, the industry association EDiMA (whose members include Facebook, Google and Twitter) has stressed that the one-hour turnaround time could harm, rather than help, the effectiveness of service providers’ take-down systems.

What Does This Mean For Your Business?

As the Vice-President for the Digital Single Market, Andrus Ansip, has pointed out, online platforms have become many people’s main gateway to information. For this reason, and if we accept that what is illegal offline is also illegal online, many people feel that these widely used technology platforms now have a responsibility to provide a secure environment for their users. Many businesses advertise on these platforms and are likely to share the desire to rid them of illegal content.

While some popular tech platforms have continued to resist what some see as too much censorship, interference, or over-regulation, the frequency and severity of terrorist attacks in Europe, together with the role and influence of platforms in spreading information, true or false (e.g. during the US election), have given governments the fuel, impetus, and sense of justification to try to apply more pressure to tech companies. The EC’s view is that the spread of illegal content online undermines citizens’ trust in the internet and poses security threats, and the new operational measures, along with any self-regulation, could speed up the process of clearing illegal content.

The scale and frequency of illegal content posting have posed serious cost and resource challenges to tech platforms in recent years.