Meta has brought a new artificial intelligence feature to the smart glasses it develops in partnership with Ray-Ban, taking the product a step further in this field. The new feature describes the wearer's surroundings in a more detailed and understandable way, and it aims to help visually impaired individuals live more independently in daily life. On a command from the user, the glasses analyze the surroundings and provide detailed information about visual elements.
Users can activate this AI feature through the Meta AI mobile application; enabling it takes only a short step in the app's device settings. When the user asks a question about their surroundings, the AI begins generating answers, which are produced by scanning the visual content in real time.
A promotional video shared by Meta clearly demonstrates how the feature works. When a visually impaired user asks the AI a question in a park, the system correctly describes the pathway, the trees, and the water in the distance. In another scene, the AI lists the objects in a kitchen and relays them to the user. These examples show how useful the system can be in daily life.
The system is available not only to visually impaired individuals but to anyone who wants to learn more about their environment. A user wearing the glasses can receive spoken descriptions of elements they cannot see for themselves, which makes it easier to find one's way even in complex environments. The feature is currently available to users in the USA and Canada.
Meta announced this innovation as part of Global Accessibility Awareness Day, alongside other accessibility-focused solutions. One of them is a new system called "Call a Volunteer", which connects visually impaired individuals with volunteers in real time.
The volunteer network is organized through the Be My Eyes platform. Users can connect to a volunteer at the press of a single button whenever they need support, for example while shopping or trying to find their way. The service is planned to be operational in 18 countries by the end of this month.
In addition, Meta has updated live caption support for products such as Quest VR headsets and Meta Horizon Worlds. This system lets users follow spoken content by reading it instantly as text, a significant aid especially for hearing impaired individuals. The feature is currently reported to be active on Quest headsets.
All these updates show that Meta is adopting a more holistic approach to accessibility. The glasses' visual description feature is not only a technological innovation but also a harbinger of a more inclusive digital experience. Although it is initially available in limited regions, it is expected to reach a wider audience in the future, and these features are likely to be adapted for other user groups as well.