Instagram will discontinue end-to-end encryption for private messages beginning May 8, 2026, a move that will allow Meta to access the content of all user communications on the platform. This decision comes after years of criticism from law enforcement and child safety organizations who argued that encryption hinders efforts to protect children and combat illegal activities online. Meta cited low user adoption of the encryption feature as the primary reason for its removal, offering WhatsApp as an alternative for users seeking end-to-end encrypted messaging. However, some experts suggest that the decision may also be linked to Meta’s broader platform strategy and potential commercial interests in message content for advertising and AI development.
The platform is implementing new default settings to enhance user safety and privacy. These changes will restrict visibility and communication options, with access to age-restricted communities and sensitive content now requiring age verification. This ensures that only adults can engage with potentially mature material, fostering a more controlled online environment.
Following Spain’s announcement of new regulations for social media platforms, including a ban for children under 16 and provisions holding tech executives criminally liable, Elon Musk publicly denounced Prime Minister Pedro Sanchez, calling him a “tyrant and traitor.” These measures come as part of a broader European effort to protect minors online and address concerns about mental health and illegal content, with Spain leading a coalition of six countries to coordinate enforcement against tech giants. This legislative push aligns with similar actions taken by countries like Australia, reflecting growing global anxieties regarding the impact of social media on young users.
Spain is set to ban social media access for minors under 16, as announced by Prime Minister Pedro Sanchez, who cited concerns over the exposure of young people to harmful content like hate speech and disinformation. This initiative follows a similar ban implemented in Australia and aims to shield children from what the Prime Minister described as the “digital Wild West.” Spain is also joining a “Coalition of the Digitally Willing” with five other European nations to coordinate cross-border digital regulation and will introduce legislation next week to hold social media executives accountable for illegal content and algorithmic manipulation.
Spain is implementing stricter measures to shield children from harmful online content, following a trend seen across Europe with similar initiatives in Denmark and France. This legislative proposal includes requiring parental consent for social media access for minors and holding platform executives legally accountable for illegal content. The government aims to combat disinformation and hate speech by investigating algorithms that amplify such content for profit, aligning with the EU’s Digital Services Act which mandates platforms to mitigate online risks. The European Commission, responsible for enforcing these regulations on large platforms, has previously fined X for transparency violations.
Meta has reported removing nearly 550,000 accounts belonging to users under 16 across its platforms in response to Australia’s Online Safety Amendment Act 2024, which went into effect in December. The social media ban restricts access to platforms like Instagram, Facebook, and Threads. While complying with the law, Meta is urging the Australian government to collaborate with the industry for a more effective solution. The company suggests incentivizing the industry to improve safety standards and implement age verification tools to protect young users across all apps, thereby avoiding the need for blanket bans.
A recent amendment proposed by a cross-party group of House of Lords Peers seeks to ban children in the UK from using VPNs. If enacted, VPN providers would be obligated to implement stringent age verification measures for all UK users, and the government would establish a monitoring regime to enforce compliance. This proposal aims to prevent children from circumventing age verification under the Online Safety Act. While supported by various Lords members, the amendment’s future remains uncertain as it requires approval from both the House of Lords and the House of Commons before becoming law.
The European Parliament has passed a resolution advocating for a ban on social media use for children under 16, with an exception for minors whose parents consent. This non-binding resolution aims to address growing concerns about the potential mental health risks associated with unrestricted internet access for minors. The European Commission is currently evaluating Australia’s similar ban, and a panel of experts is expected to advise on the best approach to protect children online. The resolution also calls for the disabling of addictive features on internet platforms used by minors, such as infinite scrolling and excessive notifications.
The European Parliament is considering a proposal to ban children under 16 from social media platforms. Citing concerning data on young people’s internet usage and potential risks, the Parliament’s Committee on the Internal Market and Consumer Protection (IMCO) has drafted a resolution advocating for a harmonized digital age limit across the EU. This resolution, which will be voted on in a plenary session, also suggests applying the same age limit to video-sharing platforms and AI assistants, and raises the possibility of setting the harmonized limit at 13 years. While the European Commission has been working on measures to protect minors online, it has previously resisted imposing an EU-wide digital age, leaving the decision to member states.
Denmark is set to introduce a minimum age of 15 for certain social media platforms, following Prime Minister Mette Frederiksen’s concerns regarding youth mental health. This decision, supported by a majority of the parliament, aims to protect children and young people from harmful content online. The government will also invest 160 million Danish kroner in initiatives to strengthen online child protection and improve the digital landscape. While the specific platforms and enforcement methods are yet to be announced, parental consent may allow access for children as young as 13.