Apple appears to have reached a concrete stage in its smart glasses project, which has long been the subject of rumors. According to Bloomberg, the company's first smart glasses could be available in 2026. The glasses will include a microphone, speakers, and a camera, and the device will be able to interact directly with Apple's digital assistant, Siri.
One of the most remarkable aspects of this new hardware will be its ability to analyze environmental data. In other words, the glasses are said to act not merely as a display but as an interface between the user and their surroundings. The ability to make phone calls, control music playback, and offer directions would take the device beyond an ordinary accessory. In addition, live translation could provide significant convenience for users in different regions.
Apple plans to present its own model, with more advanced hardware, as an alternative to Meta's glasses.
According to Bloomberg, Apple has designed a special chip for the new product. It is not yet clear how much performance or energy efficiency this chip will deliver. However, considering that Apple has successfully used its own chips for a long time, similar performance can be expected from its wearable technology. That said, augmented reality support reportedly will not be included in the first version.
Although advanced features such as augmented reality are not yet ready, even the device's basic functions could give it a distinct position in the market. Siri integration will provide an important advantage in terms of compatibility with Apple's other products, and control via voice commands could be notable for both accessibility and user experience. Claims that the glasses will not include a display may directly shape how the product is used.
On the other hand, the companies Apple will compete with directly are not standing still. Meta has sold more than one million smart glasses produced in partnership with Ray-Ban, and Google is working on the Android XR platform with companies such as Xreal, Warby Parker, Samsung, and Gentle Monster. This shows there will be serious competitors in the market.
Sam Altman and Jony Ive are also expected to introduce an AI-based hardware device in 2026. The fact that Apple's glasses will reportedly arrive around the same time suggests that wearable AI hardware will proliferate in this period. Even so, Apple's ecosystem could be decisive in the adoption of the device: integration with the iPhone, Apple Watch, and AirPods may steer Apple users toward this new product.
Apple's other projects are progressing in parallel with the glasses' development. According to Bloomberg, the company was working on an Apple Watch model with a camera and AI functions, but that initiative was canceled. However, AirPods prototypes featuring cameras are still in development. It is clear that the company has been experimenting with various form factors in this field.
With the new generation of wearable technologies, the very concept of personal computing is being transformed. Users are now expected to interact with devices through natural conversation and gestures rather than screens. How Apple responds to these expectations could be a decisive factor in the glasses' success. In light of all this, 2026 may mark the beginning of a new era in the smart glasses market.