Meta Unveils AI Visual Recognition Features for Ray-Ban Meta Smart Glasses


Meta has introduced an early access program for Ray-Ban Meta smart glasses, inviting users to test new features and provide feedback. The company is currently rolling out experimental AI-driven features to program participants, initially limited to users in the United States, with a global release expected once the testing phase concludes.

Among the new features, Meta has integrated ‘Look and Ask’ capabilities into Ray-Ban Meta smart glasses, enabling them to perceive and interpret their surroundings through the built-in camera. Users can invoke Meta AI to describe objects in their vicinity and offer suggestions based on what the camera sees. The smart glasses can even read text from signs and other visuals, providing real-time translations to users.

The Look and Ask feature lets users ask questions about their surroundings by saying, “Hey Meta, look and…”. Users can also ask about a photo the glasses have captured within 15 seconds of taking it by saying, “Hey Meta…”. Ray-Ban Meta smart glasses take photos via voice command or the capture button.

When users prompt Meta AI with queries regarding their surroundings, the glasses capture and send an image to Meta’s cloud for AI processing. Subsequently, Meta AI delivers an audio response directly to the glasses. Users can conveniently review their inquiries, the corresponding images, and the AI-generated responses within the ‘Your Requests’ section.
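The round trip described above can be sketched as a minimal client flow. Meta has not published an API for the glasses, so every class and method name below is hypothetical; the sketch only mirrors the steps the article describes (capture an image, send it with the query for cloud processing, play back the response, log the exchange for later review).

```python
# Conceptual sketch of the Look and Ask request flow; all names are
# hypothetical, since Meta's on-device and cloud interfaces are not public.
from dataclasses import dataclass, field


@dataclass
class Exchange:
    """One entry in the 'Your Requests' log: query, image, and response."""
    query: str
    image: bytes
    response: str = ""


@dataclass
class GlassesSession:
    history: list = field(default_factory=list)  # reviewable in 'Your Requests'

    def look_and_ask(self, query: str) -> str:
        image = self._capture_photo()                    # built-in camera
        response = self._cloud_inference(query, image)   # processed in Meta's cloud
        self.history.append(Exchange(query, image, response))
        return response                                  # delivered as audio

    def _capture_photo(self) -> bytes:
        # Placeholder for the camera capture step.
        return b"\xff\xd8placeholder-jpeg-bytes"

    def _cloud_inference(self, query: str, image: bytes) -> str:
        # Stands in for the multimodal model running server-side.
        return f"Description for: {query}"


session = GlassesSession()
print(session.look_and_ask("Hey Meta, look and tell me what this sign says"))
```

The design point worth noting is that inference happens off-device: the glasses only capture and transmit, which is why each query, its image, and the AI response can all be reviewed afterward in the ‘Your Requests’ section.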
