Live Text in iOS 15 vs Google Lens in Android 12 Comparison

By Sanuj Bhatia

Published 17 Jun 2021


iOS 15 brings a number of new features to the table. One of the key additions to the operating system is the ability to extract text from pictures. Called Live Text, the feature makes it easier to call a phone number or visit a website directly from your camera. Android devices have had a similar feature, Google Lens, for quite a while. So how does Live Text in iOS 15 compare against Google Lens on Android 12? Let's dig in.

After posting a detailed Siri in iOS 15 vs Google Assistant in Android 12 comparison last week, YouTuber In Depth Reviews has published a video comparing Live Text in iOS 15 against Google Lens on Android 12. Before we take a look at the test results, keep in mind that iOS 15 is still in beta; Apple will fine-tune the text extraction feature before the stable rollout in the fall.

Live Text vs Google Lens: Text Recognition

The YouTuber divided the test into six parts. The first, and most important, is how well text recognition works on each platform. iOS 15 has the more convenient text recognition system since you can search for text from Spotlight: there's no need to open the image and select the text, you can simply type what you're looking for into Spotlight.

When extracting text from images of printed material, such as contact cards and pamphlets, both Google Lens and Live Text worked just fine. When extracting text from images of handwriting, however, Google Lens took the lead. Overall, both text extraction services performed well.
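For readers curious about what this kind of on-device OCR looks like under the hood, developers can get comparable text extraction on iOS through Apple's Vision framework. The snippet below is only a minimal sketch, not how Live Text itself is implemented, and it assumes you already have a UIImage in hand:

```swift
import UIKit
import Vision

// Minimal on-device text recognition sketch using Apple's Vision framework.
// Assumes `image` is a UIImage you already have (e.g. picked from Photos).
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    // The request hands back one observation per detected text region.
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Keep the single best candidate string for each region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate   // favor accuracy over speed

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

Both Live Text and Google Lens layer extras on top of this basic step, such as recognizing phone numbers and links in the extracted text, which is where the comparison gets interesting.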

Translating Text

When it came to translating text in images captured from a computer screen, both Google Lens and Live Text worked fine. Once again, though, Google Lens did a better job translating text from handwritten images.

Visual Lookup

Visual Lookup is a feature in iOS 15 that surfaces information about a landmark directly from the Photos app. Google Lens can already do this, with the added ability to search for objects too. As expected, Lens came out on top since Live Text doesn't recognize objects for now. Even when identifying landmarks, Live Text made an error, mistaking Dubai's Tolerance Bridge for Newcastle's Millennium Bridge.

It's clear from this test that Live Text is still in its beta stage. Apple will further fine-tune the software in the weeks ahead, and we'll report more on it as the launch draws closer.

Eager to give iOS 15’s new Live Text feature a try? You can download and install iOS 15 Beta 1 on your iPhone right now.

On a personal note, I'm still a fan of Live Text on iOS 15. The fact that you can search for text from any image directly in Spotlight excites me. Have you tried Live Text yet? Do you prefer Google Lens or Apple's Live Text, and why do you prefer one over the other? Let us know in the comments section below!