Apple is preparing to bring notable security features to iPhone users with iOS 26. One of the most prominent is a nudity-detection-based pause system integrated into FaceTime. When inappropriate content is detected during a call, the system immediately freezes the video and mutes the audio. This AI-powered feature is currently being tested in the beta release.
The feature appears to be an expansion of Communication Safety, a function Apple previously developed for child accounts to protect users. According to Apple's earlier statements, that system blurred content or warned the user when it detected nudity. However, observations in the latest beta show the feature is also active on adult accounts, which raises new questions about the system's intended scope.
iOS 26 pauses the call when it detects inappropriate content
In a video shared by the X account idevicehelp, the FaceTime screen can be seen freezing when nudity is detected during a call. The app suspends both audio and video and displays a warning message stating that audio and video have been paused because the user may be showing sensitive content. The user is then asked to choose between resuming the call or ending it.
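FaceTime's internal implementation is not public, but a similar pause-and-ask flow could be built in a third-party calling app roughly as follows. This is a minimal sketch under stated assumptions: the CallViewController class and handleSensitiveContentDetected method are hypothetical names for illustration, not Apple API, and a real app would also signal the pause to the remote party.

```swift
import AVFoundation
import UIKit

// Hypothetical sketch of a pause-and-ask flow in a third-party video-call app.
// None of this reflects FaceTime's actual (private) implementation.
final class CallViewController: UIViewController {
    let captureSession = AVCaptureSession()   // camera + microphone capture

    // Called when on-device analysis flags the outgoing video as sensitive.
    func handleSensitiveContentDetected() {
        // Stop sending audio and video by disabling every capture connection.
        captureSession.connections.forEach { $0.isEnabled = false }

        let alert = UIAlertController(
            title: "Audio and Video Paused",
            message: "You may be showing something sensitive.",
            preferredStyle: .alert)
        alert.addAction(UIAlertAction(title: "Resume", style: .default) { _ in
            // Re-enable transmission if the user chooses to continue.
            self.captureSession.connections.forEach { $0.isEnabled = true }
        })
        alert.addAction(UIAlertAction(title: "End Call", style: .destructive) { _ in
            self.captureSession.stopRunning()  // tear down capture; hang up here
        })
        present(alert, animated: true)
    }
}
```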
As for how the system works, everything reportedly happens on the device. Apple does not send any content to its servers for nudity detection; instead, the analysis is performed by machine learning models running locally. User privacy is thus technically preserved.
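Apple has not documented which model FaceTime itself uses, but its public SensitiveContentAnalysis framework (iOS 17 and later) exposes exactly this kind of on-device nudity check to third-party apps and illustrates the approach. Below is a minimal sketch, assuming a CGImage frame already grabbed from the call; using the framework requires the com.apple.developer.sensitivecontentanalysis.client entitlement, and analysis only runs when the user has enabled Sensitive Content Warnings in Settings.

```swift
import SensitiveContentAnalysis
import CoreGraphics

// Checks a single video frame on-device; nothing is uploaded to a server.
// Assumes the app holds the sensitivecontentanalysis.client entitlement.
func frameIsSensitive(_ frame: CGImage) async throws -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // The system-wide Sensitive Content Warning setting gates analysis.
    guard analyzer.analysisPolicy != .disabled else { return false }

    // Runs Apple's on-device machine learning model against the frame.
    let analysis = try await analyzer.analyzeImage(frame)
    return analysis.isSensitive
}
```

Analyzing every single frame this way would be expensive, so a real implementation would presumably sample frames at intervals rather than checking the full video stream continuously.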
Although the system appears to respect privacy boundaries in this respect, its activation for adult users has raised concerns, because the criteria that trigger such an intervention are not clear. Interrupting a video feed without the user's consent has reignited debates over digital privacy, and this uncertainty underscores the need for Apple to explain exactly which account types the feature applies to.
When introducing iOS 26, Apple said this new safety function was aimed only at children. However, signs in the beta suggest the system may also affect adult users, which increases the need for clarity on how the company applies its user boundaries and safety policies. On top of all this, it is not yet known in what form the feature will ship in the final release.
On the other hand, how sensitively the system's algorithm behaves remains a matter of debate. Does it react only to actual nudity, or is it triggered by any body-themed imagery? These questions are not yet answered, and whether the system produces false positives is still being tested. If calls are interrupted too readily, users' experience will be decisive for the feature's usability.