Behold, my beautiful … moth orchids. | Image: The Verge
Sometimes, even as a tech reporter, you can be caught out by how quickly technology improves. Case in point: it was only today that I learned that my iPhone has been offering a feature I've long desired, the ability to identify plants and flowers from just a photo.
It's true that various third-party apps have offered this function for years, but last time I tried them I was disappointed by their speed and accuracy. And, yes, there's Google Lens and Snapchat Scan, but it's always less convenient to open up an app I wouldn't otherwise use.
But, since the introduction of iOS 15 last September, Apple has offered its own version of this visual search feature. It's called Visual Look Up, and it's pretty damn good.
It works very simply. Just open up a photo or screenshot in the Photos app and look for the blue "i" icon underneath. If it has a little sparkly ring around it, then iOS has found something in the photo it can identify using machine learning. Tap the icon, then tap "Look Up," and it'll try to dredge up some useful information.
Tapping the "i" icon usually gives you more information about when you took the photo and your camera settings. If the ring sparkles, though, there's Visual Look Up data to see, too.
After you hit the "i" icon you'll have the option to look up more information based on a few select categories.
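If you're wondering what's going on under the hood when that ring sparkles, Apple doesn't spell out exactly how Visual Look Up works. But its Vision framework does give developers a general-purpose image classifier, which illustrates the kind of machine learning involved. Here's a rough Swift sketch (the file path is a placeholder, and this is a generic classification example, not Apple's actual Visual Look Up pipeline):

```swift
import Foundation
import Vision

// A rough sketch, not Apple's Visual Look Up itself: the Vision framework's
// general-purpose classifier labels a photo on-device.
// The file path below is just a placeholder.
func classify(photoAt url: URL) {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(url: url, options: [:])
    do {
        try handler.perform([request])
        let observations = (request.results as? [VNClassificationObservation]) ?? []
        // Print the top few labels, things like "plant", "flower", or "dog".
        let top = observations.sorted { $0.confidence > $1.confidence }.prefix(5)
        for observation in top where observation.confidence > 0.1 {
            print(observation.identifier, observation.confidence)
        }
    } catch {
        print("Classification failed: \(error)")
    }
}

classify(photoAt: URL(fileURLWithPath: "/path/to/photo.jpg"))
```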
It doesn't just work for plants and flowers, either, but for landmarks, art, pets, and "other objects." It's not perfect, of course, but it's surprised me more times than it's let me down. Here are some more examples just from my camera roll:
Visual Look Up works for landmarks, animals, and art, as well as plants and flowers. | Image: The Verge
Although Apple announced this feature last year at WWDC, it hasn't exactly been trumpeting its availability. (I spotted it via a link in one of my favorite tech newsletters, The Overspill.) Even the official support page for Visual Look Up gives mixed messages, telling you in one place it's "U.S. only" then listing other compatible regions on a different page.
Visual Look Up is still limited in its availability, but access has expanded since launch. It's now available in English in the US, Australia, Canada, UK, Singapore, and Indonesia; in French in France; in German in Germany; in Italian in Italy; and in Spanish in Spain, Mexico, and the US.
It's a great feature, but it's also got me wondering what else visual search could do. Imagine snapping a picture of your new houseplant, for example, only for Siri to ask "want me to set up reminders for a watering schedule?" Or imagine taking a picture of a landmark on holiday and having Siri search the web for opening hours and where to buy tickets.
I learned long ago that it's foolish to pin your hopes on Siri doing anything too advanced. But these are the sorts of features we might eventually get with future AR or VR headsets. Let's hope that if Apple does introduce this sort of functionality, it makes a bigger splash.