Behold, my beautiful … moth orchids. | Image: The Verge

Sometimes, even as a tech reporter, you can be caught out by how quickly technology improves. Case in point: it was only today that I learned that my iPhone has been offering a feature I've long desired: the ability to identify plants and flowers from just a photo.

It's true that various third-party apps have offered this function for years, but last time I tried them I was disappointed by their speed and accuracy. And, yes, there's Google Lens and Snapchat Scan, but it's always less convenient to open up an app I wouldn't otherwise use.

But, since the introduction of iOS 15 last September, Apple has offered its own version of this visual search feature. It's called Visual Look Up, and it's pretty damn good.

It works very simply. Just open up a photo or screenshot in the Photos app and look for the blue "i" icon underneath. If it has a little sparkly ring around it, then iOS has found something in the photo it can identify using machine learning. Tap the icon, then tap "Look Up," and it'll try to dredge up some useful information.

Tapping the ā€œiā€ icon usually gives you more information about when you took the photo and your camera settings. If the ring sparkles, though, thereā€™s Visual Look Up data to see, too.

After you hit the "i" icon, you'll have the option to look up more information based on a few select categories.

It doesn't just work for plants and flowers, either, but for landmarks, art, pets, and "other objects." It's not perfect, of course, but it's surprised me more times than it's let me down. Here are some more examples just from my camera roll:

Visual Look Up works for landmarks, animals, and art, as well as plants and flowers. | Image: The Verge

Although Apple announced this feature last year at WWDC, it hasn't exactly been trumpeting its availability. (I spotted it via a link in one of my favorite tech newsletters, The Overspill.) Even the official support page for Visual Look Up gives mixed messages, telling you in one place it's "U.S. only" then listing other compatible regions on a different page.

Visual Look Up is still limited in its availability, but access has expanded since launch. It's now available in English in the US, Australia, Canada, UK, Singapore, and Indonesia; in French in France; in German in Germany; in Italian in Italy; and in Spanish in Spain, Mexico, and the US.

It's a great feature, but it's also got me wondering what else visual search could do. Imagine snapping a picture of your new houseplant, for example, only for Siri to ask, "Want me to set up reminders for a watering schedule?" Or, if you take a picture of a landmark on holiday, imagine Siri searching the web to find opening hours and where to buy tickets.

I learned long ago that it's foolish to pin your hopes on Siri doing anything too advanced. But these are the sorts of features we might eventually get with future AR or VR headsets. Let's hope that if Apple does introduce this sort of functionality, it makes a bigger splash.
