The Ray-Ban Meta glasses are the only real artificial intelligence wearable success story. In fact, they're actually pretty good. They've got that classic Ray-Ban styling, which means they don't look as goofy as some of the bulkier, heavier attempts at mixed reality face computers. The on-board AI agent can answer questions and even identify what you're looking at using the embedded cameras. People also love using voice commands to capture photos and videos of whatever is right in front of them without whipping out their phone.
Soon, Meta's smart glasses are getting more of those AI-powered voice features. Meta CEO Mark Zuckerberg announced the latest updates to the smart glasses' software at his company's Meta Connect event today.
“The reality is that most of the time you’re not using smart functionality, so people want to have something on their face that they’re proud of and that looks good and that’s, you know, designed in a really nice way,” Zuckerberg said at Connect. “So they’re great glasses. We keep updating the software and building out the ecosystem and they keep on getting smarter and capable of more things.”
The company also used Connect to announce its new Meta Quest 3S, a more budget-friendly version of its mixed reality headsets. It also unveiled a host of other AI capabilities across its various platforms, with new features being added to its Meta AI and Llama large language models.
As far as the Ray-Bans go, Meta isn’t doing too much to mess with a good thing. The smart spectacles got an infusion of AI tech earlier this year, and now Meta is adding more capabilities to the pile, though the enhancements here are fairly minimal. You can already ask Meta AI a question and hear its responses directly from the speakers embedded in the frames’ temple pieces. Now there are a handful of new things you can ask or command it to do.
Probably the most impressive is the ability to set reminders. You can look at something while wearing the glasses and say, “Hey, remind me to buy this book next week,” and the glasses will understand what the book is, then set a reminder. In a week, Meta AI will tell you it’s time to buy that book.
Meta says live transcription services are coming to the glasses soon, meaning people speaking in different languages could see transcribed speech in the moment—or at least in a somewhat timely fashion. It’s not clear exactly how well that will work, given that the Meta glasses’ previous written translation abilities have proven to be hit or miss.
There are new frame and lens colors being added, and customers now have the option of transition lenses that darken or lighten depending on the current level of sunlight.
Meta hasn’t said exactly when these additional AI features will be coming to its Ray-Bans, except that they’ll arrive sometime this year. With only three months of 2024 left, that means very soon.