Ray-Ban Meta smart glasses have received groundbreaking AI-powered updates that change the way users interact with technology. The smart glasses now offer real-time AI video, live translation, and seamless conversations with the AI assistant. These innovations put Meta at the vanguard of the smart glasses market.
With the latest firmware update (v11), Meta’s Ray-Ban smart glasses now feature “live AI,” which allows users to hold continuous conversations with Meta AI. First introduced earlier this year, live AI eliminates the need for a wake word like “Hey Meta” during interactions. Users can ask follow-up questions, shift topics mid-conversation, and reference earlier discussions effortlessly. This continuous conversational capability represents a significant step toward making AI more intuitive and user-friendly.
Features of Ray-Ban Meta Glasses
One of the most notable new features is live AI video. Wearers can use the glasses’ front-facing camera to ask questions about their surroundings and get instant answers. A user might ask, for example, about the closest landmarks, businesses, or other points of interest in their neighborhood. This was the standout feature announced at Meta’s Connect developer conference, and it is the most direct answer to OpenAI’s Advanced Voice Mode with Vision and Google’s Project Astra. With this upgrade, Meta becomes one of the first major technology companies to bring real-time AI video capabilities to market via smart glasses.
Firmware v11 also includes live translation, which is intended to break down language barriers. Users can translate speech in real time between English and one of three languages: Spanish, French, or Italian. In a conversation with someone speaking one of these languages, users will hear the translated speech in their own language through the glasses’ open-ear speakers, while a text transcript appears on their paired smartphone. This function promises to make cross-cultural communication easier than ever before.
Shazam support is another exciting feature on the Ray-Ban Meta glasses. Users can now identify songs by simply saying, “Hey, Meta, Shazam this song.” The glasses will try to recognize and name the tune, adding another layer of utility to these AI-enhanced wearables.
Future Possibilities and Current Limitations
Meta has hinted at even more sophisticated features to come, including predictive suggestions from the AI before users even ask for help, though it has not elaborated on the specifics. Meta also notes that current features such as live AI and translation may not always be error-free, adding that it is constantly working to enhance the user experience.
Meta’s commitment to innovation can be seen in the continued updates to its Ray-Ban Meta glasses. This release follows the last major update in November, which brought AI capabilities to users in France, Italy, and Spain. Since their launch, the glasses have been a commercial success: according to EssilorLuxottica, Ray-Ban Meta has become the best-selling glasses brand in 60% of Ray-Ban stores across Europe, the Middle East, and Africa.