Technology
Danish Kapoor

Google Lens updated with a simpler interface on iOS and Android

Google Lens is a tool that lets users identify and learn about objects using their camera, and it has become one of the first options that come to mind for visual search. Both Android and iOS users can open Lens by tapping the camera icon in the search bar, then search or translate whatever the camera is pointed at.

In the updated version, Google Lens’s interface has been significantly simplified. Users now see only two main options: “Search” and “Translate”. The previous “Homework” button has been removed entirely. The change makes the app easier to use and gives it a cleaner look.

Google Lens directs users to Gemini for homework

Google’s choice is no coincidence. The company now points users toward its Gemini artificial intelligence system for homework help. Gemini not only offers the correct answer; it also explains step by step how it reached that conclusion. The user thus learns not only the answer but also the solution method.

Gemini’s explanatory answers stand out especially for mathematical problems. By analyzing a question step by step, the system speeds up the learning process. The simplified Lens interface also eases this transition: with unnecessary buttons removed, the user experience is more focused.

Another effect of these interface changes can be seen in the viewfinder area. In the new version, the viewfinder offers a wider imaging area, which makes it possible to examine the objects the camera sees in more detail. Users are now presented with a simple and functional design.

Under the Lens “Search” feature there is now an AI-supported mode. With this mode, deeper information can be obtained about the object shown to the camera; visual recognition is no longer superficial but enriched. Switching to this AI mode is also quite easy.

The new Google Lens interface first appeared on the Galaxy Z Fold 7, where users of the foldable model had the chance to try the updated Lens. The change is not yet active on the Pixel 6 Pro running the Android 16 QPR1 beta. Interestingly, the updated Lens is available on an iPhone 15 Pro Max with iOS 26 Developer Beta 3 installed.

Apple offers a similar function on the iPhone 15 Pro and iPhone 16 series under the name “Visual Intelligence”, which is invoked through the Camera Control button or the Action button. Apple’s approach differs from Google’s, however: its processing is carried out on the device, while Google Lens works in the cloud. Both solutions offer different advantages in visual search.
