The New Mexico Attorney General’s demands for stricter regulation of technology platforms continue to draw criticism from industry representatives and some non-governmental organizations. Meta and like-minded groups argue that the approach falls far short of delivering the expected benefits and may introduce new risks. These initiatives, which sit at the center of the debate over children’s online safety, have reopened questions about the scope of technology companies’ responsibility and the limits of state intervention.
Maureen Flatley, president of the organization Stop Child Predators, argues that the regulations proposed in New Mexico lack a sound foundation. In her view, the demands are not only inadequate to solve existing problems but could also increase the risk of other forms of abuse. Flatley calls it unrealistic to hold platforms responsible for blocking every harmful user, likening the approach to holding banks responsible for every robbery. The liability being placed on digital platforms, she argues, does not match how the current system actually works.
Meta takes a similar position. Speaking on behalf of the company, Chris Sgro argues that the New Mexico Attorney General’s focus on a single platform is an incomplete strategy, noting that young users move between many different applications throughout the day, so addressing the problem on one platform alone narrows the scope of any solution. The company also warns that the proposed regulations could infringe on parental rights and limit freedom of expression. In addition, Meta says it has rolled out 13 different safety measures over the past year and continues working to make the user experience age-appropriate.
Debate on liability in the technology sector widens
Contrary to the criticism, New Mexico Attorney General Raul Torrez is pushing for broader regulation. Rather than targeting only certain platforms, he argues that more comprehensive legal rules should cover the technology sector as a whole. During a visit to Washington, DC, to discuss new measures for protecting children online, Torrez said that Section 230, the US law that grants technology companies broad protection over user-generated content, should be reconsidered.
According to Torrez, the current legal framework is ambiguous and does not clearly define the scope of technology companies’ responsibility. He argues that if the protection provided by Section 230 were removed, companies would have to be more accountable in legal proceedings, which could in turn push platforms to apply their content moderation and security policies more carefully.
Legal experts, meanwhile, point out that regulatory change in the US is often shaped not through direct legislation but through litigation. According to Chapman, court cases can pave the way for broader policy shifts, as seen in the tobacco, opioid, and e-cigarette industries; in this context, lawsuits filed against technology companies could have a similar effect.
The discussions once again raise the question of how to balance the responsibility of technology platforms against user rights. On one side stand children’s safety and the need to curb harmful content; on the other, concerns about freedom of expression and the functioning of the platforms. Current developments suggest that more comprehensive changes, both in legal regulation and in industry practice, will be debated in the period ahead.