Meta’s smart glasses can now tell you where you parked your car

Meta is rolling out some previously announced features to its AI-powered Ray-Ban smart glasses for users in the US and Canada. CTO Andrew Bosworth posted on Threads that today’s update brings more natural language recognition, so the stilted “Hey Meta, look and tell me” phrasing should be gone: users can now engage the AI assistant without the “look and” portion of the invocation.

Most of the other AI tools shown off during last month’s Connect event are also arriving on the frames today, including voice messages, timers, and reminders, the last of which powers the headline feature: asking the glasses to remember where you parked your car.

→ Continue reading at Engadget
