
Ray-Ban Meta Glasses Get Smarter for Low Vision Users. Here’s How

If you haven’t heard, AI now has eyes, and Meta has unveiled some enhancements to its AI-equipped Ray-Ban Meta glasses. Wearers of the smart specs can now customize Meta AI to give detailed responses based on what’s in the surrounding environment, Meta said in a blog post for Global Accessibility Awareness Day.

Artificial intelligence is opening up a whole new world for accessibility, with new features arriving steadily. Tech giants like Google, Apple and Meta are all working to help people with disabilities, such as low or no vision, more easily interact with the world around them.

Though Live AI for the Meta glasses has been available for a while, these additional enhancements for low vision users will undoubtedly be welcome.

Below are some of the other highlights from Meta's accessibility-focused blog post. For more, check out the glimpse of the brain-computer interface accessibility features headed to Apple devices.

'Call a Volunteer' feature expanding to 18 countries

Though it isn't AI-focused, Call a Volunteer, a feature Meta built with Be My Eyes, will soon expand to all 18 countries where Meta AI is available. Launched in November 2024 in the US, Canada, UK, Ireland and Australia, Call a Volunteer will be a very handy (and hands-free) tool for low vision users.

Once set up, Meta glasses users can simply ask Meta AI to "Be My Eyes." From there, they're connected to one of more than 8 million volunteers, who can view the live camera stream from the glasses and provide real-time assistance with whatever the wearer needs help with. The feature will be available in all supported countries later this month.

Meta's additional research and accessibility features

Meta also detailed some of its existing features and research taking place in its effort to expand accessibility for its products, especially in the extended reality space. 

  • Features like live captions and live speech are currently available on Quest headsets and in Meta Horizon and Horizon Worlds.
  • Also highlighted was a WhatsApp chatbot from Sign-Speaks that uses the company's API and Meta's Llama AI models. The chatbot provides live translation between American Sign Language and text, making communication easier for deaf and hard of hearing individuals.

For more, don’t miss the handful of new accessibility features announced by Apple.
