Athira Sethu
Kochi, 17 December 2024
Meta Platforms, Inc. recently updated its smart glasses under the Ray-Ban Meta label, adding an AI-powered video capability and real-time language translation to the feature set. These features are available only to those enrolled in the “Early Access Program.” The v11 software update rolled out on Monday.
One of the major updates is the addition of AI-powered video. This feature allows the glasses to capture video and work with Meta’s AI chatbot assistant. The glasses can now understand what the wearer is looking at and answer questions in real time. This makes the glasses much smarter, helping users interact more easily with the world around them.
Another exciting addition is real-time language translation. The Ray-Ban smart glasses can now translate spoken language between English and three other languages: Spanish, French, and Italian. If you’re speaking with someone who uses one of these languages, the glasses will either translate what they say into English through the open-ear speakers or show the translation as text on your phone. This makes conversations between people who speak different languages much easier.
Meta has also integrated Shazam into the smart glasses. Shazam is an app that identifies songs by listening to them. With the Ray-Ban smart glasses, users can now identify songs playing around them. The feature is available to users in the U.S. and Canada.
This year, Meta announced even more AI updates for the Ray-Ban glasses, including the ability to set reminders and use voice commands to scan QR codes and phone numbers. These updates make the smart glasses far more useful and versatile, helping users do more without needing to take out their phones.
The new updates from Meta will make smart glasses a powerful tool in everyday life, making it easier to communicate, get information, and stay connected.