
Ray-Ban's Meta sunglasses can now identify and describe landmarks

Now in beta, it's one of the more useful 'multimodal' AI-powered features.

Engadget

AI-powered visual search features arrived on Ray-Ban's Meta sunglasses last year with some impressive (and worrying) capabilities, but a new one in the latest beta looks quite useful. It identifies landmarks and tells you more about them, acting as a sort of tour guide for travelers, Meta CTO Andrew Bosworth wrote in a Threads post.

Bosworth showed off a couple of sample images, one explaining why the Golden Gate Bridge is orange (it's easier to see in fog), another giving a history of San Francisco's "painted ladies" houses, and more. In those examples, the descriptions appeared as text below the images.

On top of that, Mark Zuckerberg used Instagram to show off the new capabilities in a few videos taken in Montana. This time, the glasses used audio to give a verbal description of Big Sky Mountain and the history of the Roosevelt Arch, and even explained (like a caveman) how snow is formed.

Meta previewed the feature at its Connect event last year, as part of new "multimodal" capabilities that allow it to answer questions based on your environment. That in turn was enabled when all of Meta's smart glasses gained access to real-time info (rather than having a 2022 knowledge cutoff as before), powered in part by Bing Search.

The landmark feature is part of Meta's Google Lens-like visual search, which lets users "show" the AI what they're seeing through the glasses and ask questions about it, whether that's identifying fruit or translating foreign text. It's available to anyone in Meta's early access program, which is still limited in numbers. "For those who still don't have access to the beta, you can add yourself to the waitlist while we work to make this available to more people," Bosworth said in the post.
