Technology
Danish Kapoor

Meta also wants to analyze photos that are not shared on Facebook

Meta is rolling out a new Facebook feature that asks users to upload the photos on their phones directly to the cloud. When the feature is enabled, not only shared but also unshared images are transferred to Meta’s systems for analysis. Although the stated purpose is to generate suggestions from users’ photos, the move marks a new stage in the extensive use of personal data.

The new system comes into play via an information window that appears when a user goes to create a Facebook Story. There, users are offered a “cloud processing” option. If the user accepts, the photos on the device can be regularly transferred to Meta’s servers. Meta says this allows it to offer collage suggestions, recaps of past moments, and AI-generated themes. During this process, not only the images themselves but also metadata such as faces, objects, and capture dates are analyzed. By accepting, the user also grants Meta the right to store and use this information.

Meta has already stated that all public content posted on Facebook and Instagram since 2007 has been used to train its artificial intelligence models. However, the scope of what counts as “public content” remains unclear. The company says it uses content from “users over 18” without explaining how age is determined. Now there is the possibility that photos that were never made public or shared at all could be drawn into this process. The terms of the “cloud processing” feature contain no clear provision on whether these images will be used for AI training.

How private photos in the camera roll are processed remains unclear for now

Meta says the process begins only with the user’s consent and can be turned off at any time from the Settings menu. When cloud processing is disabled, Meta removes previously uploaded images from its systems within 30 days. Nevertheless, some users have shared their experiences with the feature on platforms such as Reddit. In one case, a user’s wedding photos were restyled in the manner of Studio Ghibli, raising further questions about which images the system is processing. So far, Meta has not answered questions from outlets such as The Verge and TechCrunch about the system’s scope or operation. It therefore remains unclear exactly how the system works, which data is processed, and how it is stored.

Facebook first began testing the cloud processing system in the United States and Canada. Although the pop-up messages shown to users clearly state that the feature is optional, many may not notice it because of where the setting is located and how it is worded. The setting sits under the title “camera roll sharing suggestions”, and its details are spread across multiple menus, making it hard to find. This creates a user experience that can lead to passive acceptance of the feature. Following the rollout, concerns began to be voiced on privacy-focused platforms. Unless users disable the feature, their photos may be fed into Meta’s artificial intelligence systems.

In addition, the language in Meta’s official documents about what it does with this data is confusing. The statement that “media and facial features may be analyzed”, combined with the “right to store and use” the information, does not make clear how the data is used or for how long it is kept. The fact that these documents are available only in English compounds the transparency problem for users who speak other languages. Given the company’s track record in processing user data and training AI models, it is impossible to make a firm prediction about the new system’s long-term effects. Meta’s cloud processing feature is expected to be activated in other regions once the large-scale testing process concludes.

In any case, this new step makes it possible for Meta’s technical infrastructure to directly analyze content that users have stored on their own devices. Drawing users into this analysis without any content ever being shared creates a new situation for digital privacy principles. In particular, applying processes such as face recognition and time and content labeling to such private data revives the debate over data ownership. Considering that a photograph carries not only an image but also personal information and context, the potential reach of this development becomes clearer. It is not yet known how, or to what extent, Meta will inform the public about this process going forward.
