Multimodal Artificial Intelligence Update Arrives for Ray-Ban Meta Smart Glasses
The multimodal artificial intelligence feature, which entered early access testing on Ray-Ban Meta smart glasses in January, is now available to all users. Multimodal AI allows an AI assistant to process multiple types of input, including photos, video, text, and audio, so it can see and understand the world around you in real time.
Multimodal Meta AI is rolling out widely on Ray-Ban Meta starting today! It’s a huge advancement for wearables & makes using AI more interactive & intuitive.
Excited to share more on our multimodal work w/ Meta AI (& Llama 3), stay tuned for more updates coming soon. pic.twitter.com/DLiCVriMfk
— Ahmad Al-Dahle (@Ahmad_Al_Dahle) April 23, 2024
The glasses have a camera and five microphones that act as the eyes and ears of the artificial intelligence. With these, you can ask the glasses to identify almost anything you're looking at. Meta says the AI can also read signs in different languages, which is handy for travel. For example, you can learn the breed of a dog you're looking at, get information about a work of art, and more. Real-time translation should also make it much easier to communicate with people who speak other languages.
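To illustrate the general idea, here is a minimal Python sketch of how such a multimodal query can be structured: a camera frame and a transcribed voice prompt are bundled into a single request for a vision-language model. The `MultimodalQuery` class and `answer_query` function are hypothetical illustrations, not Meta's actual API.

```python
from dataclasses import dataclass


@dataclass
class MultimodalQuery:
    """One request combining more than one input type.

    Both fields are illustrative; a real assistant would also
    handle video and audio streams.
    """
    image: bytes  # e.g. a JPEG frame captured by the glasses' camera
    prompt: str   # the wearer's spoken question, transcribed to text


def answer_query(query: MultimodalQuery) -> str:
    """Hypothetical stand-in for a multimodal model call.

    A production system would send both the image and the prompt to a
    vision-language model and return its reply; this stub only shows
    the shape of the request/response flow.
    """
    return f"(model reply to {query.prompt!r} about a {len(query.image)}-byte image)"


if __name__ == "__main__":
    # Simulate: the wearer looks at a sign and asks a question.
    frame = b"\xff\xd8\xff..."  # placeholder for real JPEG bytes
    query = MultimodalQuery(image=frame, prompt="What does this sign say?")
    print(answer_query(query))
```

The point of the sketch is simply that image and text arrive as one combined query, which is what lets the assistant answer questions about whatever the camera currently sees.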
Alongside the AI update, Meta also announced hands-free video calling integration with WhatsApp and Messenger, as well as some new frame designs for the smart glasses.