The Ray-Ban Meta Smart Glasses already worked well as a head-mounted camera and a pair of open-ear headphones, and now Meta is updating them with live AI that works without a wake word, live translation between several languages, and Shazam integration for identifying music.
Meta first demoed most of these features at Meta Connect 2024 in September. Live AI lets you start a “live session” with Meta AI that gives the assistant access to
→ Continue reading at Engadget