Meta's Ray-Ban glasses become the user's eyes

The Meta AI assistant arrived on Ray-Ban smart glasses in France last November. Users can ask it questions on all sorts of topics, with answers delivered through the integrated speakers. That, however, was only a first step.

Augmented vision and instant translation

Starting next week, Meta AI will be able to use the glasses' camera to "see" what is happening in front of the user. This video feed is interpreted by the AI, allowing the assistant to answer questions about the wearer's immediate surroundings, for example: "Hey Meta, what's this building?" The feature will roll out in France as well as in other European countries where Meta AI is available, such as Belgium and Germany.

This visual recognition is essential to strengthening Meta AI's capabilities. It's already a common feature in other chatbots, such as ChatGPT or Gemini, which can analyze images to extract contextual information. By integrating it directly into the glasses, Meta is taking a step towards an increasingly "augmented" personal assistant, capable of interacting with the environment in real time without the user having to take out their smartphone.

Another function that could prove useful this summer during a trip abroad: the glasses will be able to instantly translate English, Italian, Spanish and French, even without an internet connection. You will, however, need to download the relevant language pack beforehand. The update is currently being rolled out.

Meta continues to bet big on its partnership with Ray-Ban, and the company has several major projects in the pipeline, including a pair of glasses equipped with a screen for displaying information and apps.

Source: Meta
