By Xuan Zhong
The UK’s new internet safety law, recently approved by lawmakers, requires social media platforms to implement several preventive measures, including:
- Monitoring and promptly removing illegal content relating to child sexual abuse, hate speech and terrorism, revenge pornography, and promotion of self-harm
- Blocking users who post such content and imposing other penalties
- Verifying users’ ages, generally to ensure they are at least 13 years old
- Preventing children from accessing content that, while not illegal, may be harmful or inappropriate for their age, including pornography, bullying, content glorifying eating disorders, and guidance on suicide
If a social media platform whose services are accessible to users in the UK fails to enforce these measures, it faces a fine of up to £18 million or 10% of its global revenue, whichever is greater. In addition, its executives may be held criminally liable if they fail to meet the law’s information requirements or fulfill its other obligations.
Because the new online safety law does not explain in detail how social media platforms are to enforce its provisions, many stakeholders believe that complying with the new requirements is likely to infringe on users’ privacy, leading to information leaks, misuse, and other risks.
Photo Credit: Vanguard News