Apple is preparing to turn the Apple Watch into an AI-powered wearable. According to Bloomberg's Mark Gurman in his Power On newsletter, the company plans to add integrated cameras to future Apple Watch models. Both the standard Series and the more rugged Ultra models are reportedly being considered. According to Gurman, Apple is targeting 2027 for these new models.
With the addition of a camera, the Apple Watch is expected to gain AI-based visual recognition capabilities. The system would provide instant information about an object when the user points the camera at it. This feature, described as Visual Intelligence, could significantly transform the user experience.
For example, it might become possible to identify a plant species, look up the brand of a product, or pull up reviews of a restaurant directly from the watch. Such systems also open the door to deeper integration with augmented reality (AR) applications.
Camera-equipped Apple Watch and AirPods could launch in the same period
Gurman's previous newsletters reported that Apple is working on a similar camera system for AirPods. According to the new information, camera-equipped AirPods models could also arrive in a similar time frame, i.e. in 2027. Launching these two product groups together would make Apple's AI push in wearable technologies more visible and suggests they are being developed under a common vision.
There are different approaches to camera placement. According to Gurman, on the standard Apple Watch Series the camera is planned to be integrated directly into the display, while on Ultra models the larger body allows the camera to be placed on the side of the device. These technical differences could directly affect usage scenarios: a camera embedded in the screen offers a more natural user experience, while a side-mounted camera can provide different shooting angles.
Although Apple has made no official announcement, such hardware updates once again show how much importance the company attaches to its artificial intelligence infrastructure. Apple aims to expand its capacity to deliver instant information to the user by bringing the visual recognition features it offers on iPhone and iPad to more mobile, wearable form factors.
Beyond the advanced AI features themselves, how the new hardware in these devices will manage energy consumption, processing power, and data privacy is also a matter of curiosity. In parallel with the hardware work, Apple is known to be developing its software, including a smarter, more context-aware Siri. As with the Vision Pro, Apple is expected to prioritize on-device processing of artificial intelligence for these products.
The prospect of camera-equipped Apple Watch and AirPods launching in the same period indicates that Apple aims to create a more holistic and interactive experience in wearable technologies. Combining artificial intelligence with camera integration could take these devices well beyond basic functions such as health tracking and notification management. Competition in this area is expected to intensify in the coming years, and attention will be on what combination of hardware and software Apple delivers.