
Google launched a new artificial intelligence (AI) feature on Thursday, called "Expressive Captions." The feature has been added to the Live Caption capability on Android. With it, users will be able to see live captions for videos played across the device in a new format that better conveys the context behind the sound. The AI feature can convey excitement, shouting, and loudness by displaying the text in all caps. Expressive Captions is currently available on Android 14 and Android 15 devices in the United States.
Google’s “Expressive Captions” Feature Relies on AI
In a blog post, the search giant shared details of the new AI feature being added to Android’s Live Caption. Google highlighted that while captions first became accessible to the Deaf and hard-of-hearing communities in the 1970s, they have not changed much over the past 50 years.
Today, many people use captions when streaming content online, in loud public places to better understand what is being said, or to consume content in a foreign language. Google noted the popularity of captions among Android users, and said it is now using AI to improve how much information captions can convey.
With Expressive Captions, Live Caption will be able to convey things such as tone, volume, environmental cues, and human noises. “These small things make a big difference in conveying what goes beyond words, especially for live and social content that doesn’t have preloaded or high-quality captions,” Google said.
One way Expressive Captions improves on standard captions is by displaying text in all capital letters to indicate the intensity of speech, whether it is excitement, loudness, or anger. The captions will also recognise sounds such as sighing, grunting, and breathing, helping users better understand the nuances of a voice. Additionally, the feature will capture ambient sounds such as applause and cheering, whether they occur in the foreground or the background.
Google says Expressive Captions is part of Live Caption, so the feature is built into the operating system and works on Android devices regardless of the app or interface being used. As a result, users can see the real-time AI captions when watching livestreams, memories in social media posts and Google Photos, and videos shared on messaging platforms.
Notably, the AI processing behind Expressive Captions is done on-device, which means users will see the captions even when the device is not connected to the Internet or is in airplane mode.