Meta has rolled out new features for its Ray-Ban smart glasses, adding live AI and live translation capabilities. The glasses use their built-in camera to see what the wearer sees, enabling real-time conversations with the AI without the need for a wake phrase.
The glasses can assist with hands-free tasks such as meal preparation and navigation, and Meta says the AI will eventually offer proactive suggestions based on the user's context.
The live translation feature enables real-time conversation between English and Spanish, French, or Italian. When someone speaks one of these languages, the glasses translate their words into English through the built-in speakers or a connected smartphone.
Users can also identify songs by asking, "Hey Meta, what is this song?", which prompts Shazam to supply the title. The feature is part of the Early Access Program for Meta glasses users, though access is limited to the United States and Canada.
The launch has generated significant consumer interest and renewed speculation about Apple's potential entry into the smart glasses market, as the company reportedly explores similar devices with integrated features.