With the iOS 26.4 update, Apple introduced new features for users in the UK alongside a notable rule on account security and age verification. The company now requires users to verify their age before using certain services tied to their iCloud accounts: to access some features or carry out certain account transactions, users must prove they are over 18. The move parallels growing regulatory pressure around digital safety and child protection, and because Apple has built the system directly into the operating system, its reach extends across the platform rather than any single app.
Users can complete age verification through the Settings menu, either by adding credit card information to their account or by scanning an official identity document. For users who have already added a payment method to their Apple account, an automatic check kicks in instead, estimating from the existing data whether the user meets the age requirement. While this method is convenient, it is likely to fuel debate over user privacy and data security. Apple, for its part, says the process stays within its own systems, limiting data sharing with third parties.
Apple automatically activates some measures for those who do not verify their age
Apple automatically activates certain safety features for users who have not completed age verification or who are determined to be under 18. In this case, the Web Content Filter and Communication Safety tools are switched on. These tools can restrict access to certain websites in Safari as well as in third-party browsers, and they warn the user when photos or videos containing nudity are sent or received. Although these features already existed in the Apple ecosystem, the new rules make them mandatory for a much wider group of users.
Ofcom, the UK's communications regulator, welcomed Apple's move as a positive development, emphasizing that the company was under no legal obligation to implement it. Even so, it is notable that Apple chose the UK as one of the first countries to roll out these child safety measures. Ofcom officials say that flexible regulation encourages technology companies to develop innovative solutions, and that measures like these can offer more effective protection against harmful content.
The regulations enacted under the UK's online safety law aim to limit young users' access to risky content, and Apple's step goes beyond that legal framework by strengthening protections within the platform itself. How such systems will balance user experience, data privacy, and freedom of access over the long term, however, remains under close scrutiny.