While Google continues to expand its artificial intelligence-supported translation technologies, it has made the “Live Translation” feature, which offers instant translation through headphones, available to more users. According to the company, the feature, previously limited to certain regions and to Android devices, is now also available on iOS. The list of supported countries has grown as well: access, once limited to the US, India, and Mexico, now extends to markets including Germany, Spain, France, Italy, the United Kingdom, Japan, Nigeria, Bangladesh, and Thailand.
The feature lets users listen to foreign-language conversations through their headphones in real time. The system does more than translate words: it preserves the speaker’s tone, emphasis, and speaking tempo for a more natural experience. Its main purpose is to ease communication between people who speak different languages. Powered by Google’s Gemini artificial intelligence model, it aims to reduce the language barrier, especially in face-to-face conversation.
Google Translate’s Live Translation feature supports over 70 languages
One of the highlights of Live Translation is that it requires no special hardware: users can try it with the headphones they already own. The system supports more than 70 languages, making it usable across many countries. Translation apps have been part of daily life for a long time, but a real-time, fluent listening experience sets this feature apart.
The usage flow is also designed to be practical: after opening the Google Translate app, users simply tap the “Live Translation” option and connect their headphones, then follow foreign-language conversations in their own language as they happen. Everyday scenarios, such as understanding public transport announcements while traveling or talking with people who speak another language, are among the feature’s main use cases.
Google also announced on the same day that it has expanded a different artificial intelligence feature. The system, called “Live Search,” provides real-time information via the camera and can now be used in more countries and languages. First introduced in July 2025, it lets users get instant information by pointing their phone camera at an object, and, by drawing on visual context, it supports back-and-forth interaction between the user and the system.
Live Search can be accessed through the Google app on both Android and iOS, via the “Live” icon under the search bar, and is available in more than 200 countries and regions. The performance of such AI-supported features may vary depending on factors such as internet connection, device compatibility, and language support. Even so, these steps by Google bring real-time artificial intelligence experiences on mobile devices to a wider user base.