New Tech Feature Review: Envision Glasses

Author: Donna Clemens, Assistive Technology Consultant, Outreach Program, Texas School for the Blind and Visually Impaired (TSBVI)

Keywords: Envision, assistive technology, artificial intelligence, AI, blindness products, low vision products, accessibility, empowerment

Abstract: The author provides a comprehensive review of Envision Glasses, a product from Envision. She describes her personal trial of the glasses, from unboxing of the device to in-depth information on specific features such as the use of AI and capturing visual information in real time. She also offers advice on using the device to access both print and environmental information and provides tips for avoiding fatigue.

For individuals who are blind or visually impaired, navigating print materials and new environments can often be daunting. However, thanks to groundbreaking advancements in Assistive Technology (AT), both physical spaces and print materials are becoming far more accessible. I had the opportunity to preview and try one such technology, Envision Glasses from Envision.

The device arrived quickly and securely, well packaged and stylish. The glasses came in a well-constructed travel case, which made it easy to carry them when I wasn't wearing them. I opted not to have Envision create a QR code for my home connections and instead connected the device to my phone and home network myself. Having Envision create the QR code for you could be quicker and would remove the possibility of user error. The Bluetooth connection between my phone and the glasses took a while to establish, but once established it was secure and stable. I did not have any issues with dropping or inconsistent connections.

The glasses have two different frame options: a thin titanium frame and the Lux Frame. The metal frame is sleek and stylish, with the optics mounted from the hinge to the ear on the right side. I wear glasses, and this frame would not fit well over them, but that issue can be addressed with the Lux Frame. The Lux Frames, made by Smith Optics, are stylish frames that come with blank lenses that can be replaced with your prescription. For the purposes of this article, I wore the titanium frame without my personal glasses, thereby testing the device with approximately 20/200 visual acuity.

Once I installed the app and connected the frames to my app and WiFi, it was time to dig into the features and see what the Envision Glasses could do. Envision Glasses utilize artificial intelligence (AI) to comprehensively understand the surrounding environment. They are equipped with a small camera that scans objects, people, and text, relaying the information to the wearer through a built-in speaker (or through Bluetooth earbuds).

Does any of this sound familiar? To some of you, it may. Envision Glasses are built on the foundation of Google Glass, which was initially introduced in 2013 as a multipurpose wearable device. The wearable nature of Envision Glasses ensures that users can effortlessly capture and comprehend visual information in real time. YES! In real time!

Envision Glasses narrate a continuous stream of information using the onboard AI, allowing users to read signs and documents, identify objects and money, name colors, or analyze a full scene. This can sometimes be overwhelming if the glasses are set to read everything. There are words everywhere in a school or store, and this visual assault of words can become an auditory one, so it may be helpful to turn this feature down at times.

Navigation—Swipes, Taps, and Voice

The Envision Glasses are built with audio in mind. Users do not have to use a phone to complete any of the commands. All swipes, taps, and voice commands are made directly on the glasses themselves, and none of the interactions require vision to complete. The glasses' user interface is essentially a set of displays and menus. All displays talk to the wearer, who can use voice, finger taps, or swipes to navigate them.

The menu has three distinct tiers: the Main menu on level one, the Activate menu on level two, and the Context menu on level three. Swiping forward and backward with one finger moves the wearer through the Main menu. I found I could form a sign language "d" with my fingers (make a circle with your fingers, pointer finger up), rest everything but my pointer finger on my cheekbone, and use the pointer finger to swipe and tap. This reduced my fatigue and kept my finger placement consistent.

The most often used options are found in the Main menu (home): Instant text, Scan text, Call an ally, Describe a scene, Recognize currency, Find objects, Detect light, and More. "More" is actually a submenu that leads to Batch scan, Find individuals, Explore, Scan QR code, and Detect colors. Instead of scrolling to the bottom of the menu in search of the setting or support you need, just swipe back from the home screen and tap the item you want when you hear it. It's an easy concept, right? These are the same basic movements used on phones and tablets, except now they're performed vertically. To quit or return to the previous screen, swipe down with one finger.

Two-finger gestures: swiping down with two fingers on the home screen enters sleep mode, which is useful for avoiding interruptions when listening or watching. Swiping with two fingers on any screen besides Ally or Aira instantly adjusts the volume. As the manual suggests (and I agree), holding the glasses in place at the bridge of the nose makes two-finger swipes more secure.

Tap gestures: a single tap selects, as usual. A double tap with one finger starts or activates the current application; a tap and hold provides additional information about that screen, much like a long press on a smartphone. A single tap with two fingers opens more options for any feature, which is quicker than navigating to that feature's preference menu.

Voice commands are activated by pressing the hinge button and simply speaking the function you want to open. Can it be simpler? Only certain voice commands are currently available when navigating a menu: Instant text, Scan text, Batch scan, Call an Ally, Call an Aira agent, Describe a scene, Detect light, Recognize cash, Detect colors, Find an object, Find people, Explore, Change volume, and Home. All of the voice commands are only available in English at this point. In addition to the commands above, a Quick Switch allows you to switch to any other known network or an active network around you. This is one area where Envision is actively adding more commands with future updates.

On-Device Tutorials

The on-device tutorial is helpful. It allows wearers to train and become oriented to their new glasses. The training includes smart guidance, gesture introduction, and gesture practice training. I recommend exploring this first when working with your new glasses.

Detailed Information on Some Useful Apps

Instant Text: Instant text is the quickest way to access print detected by the camera. It is not ideal for long passages or large amounts of text, but it is good for reading a sign or something short to get information quickly.

Scan and Batch Scan: the Scan and Batch Scan features build on Instant text. To read a whole document, or text that is longer and more involved (including handwriting), use Scan text. Scanned text opens in a reader that allows you to play, pause, and navigate, options that Instant text does not offer. This feature can be used both online and offline, which is really convenient in a testing or classroom situation. The Batch Scan feature builds on Scan and allows more than one document to be scanned at a time, so multiple pages can be read in one sitting. Both features also include word detection, which helps the wearer capture and adjust the framing to ensure that all of the text is included. Both Scan and Batch Scan support multiple languages, so the text is read aloud correctly in the language and voice needed. When the layout detection feature is turned on, wearers can read materials laid out in columns with headings, such as bank statements, textbooks, and things of that nature. Word detection, language detection, and layout detection can be turned on and off as needed.

Handwriting: the handwriting feature is so exciting on these glasses. The Scan text feature allows Envision wearers to capture handwritten messages like birthday cards or simple notes. This feature may not work, however, if the handwriting resembles a doctor's scribble!

The Ally Feature: one of the standout features of the Envision Glasses is the Ally feature, which enables users to initiate video calls with friends and family to seek assistance. A user can initiate a call and have their ally describe what is in the surrounding environment. This allows wearers to navigate more confidently, operate equipment that has no tactile buttons (microwaves, thermostats, etc.), and, in general, become oriented to a new environment.

Aira Agent: the Aira Agent feature builds on the Ally feature and allows wearers to make a video call to a professional agent to visually interpret the environment or task. This feature is available 24/7.

Facial Recognition: the Facial Recognition feature allows wearers to take pictures of friends and family in multiple positions (front, sides, etc.) and then enter them with corresponding names. The glasses will then audibly announce the names of recognized individuals when they come into the user's field of vision.

Final Thoughts

While many existing technologies have made a positive impact on our lives, Envision Glasses aim to take accessibility to new heights. The unique design of the glasses helps users move through the environment safely by freeing up their hands, enabling them to hold a cane or walk with their guide dog more comfortably.

Additionally, the positioning of the camera next to the user’s eyes eliminates the need to hold a phone up for scanning, providing a more seamless and intuitive experience.

Envision is expanding the capabilities of its glasses and partnering with other assistive technology providers to enhance user experiences. Updates are delivered over an internet connection rather than through device replacement, which provides a wonderful opportunity to learn and grow with a device over time. At the time of this writing, the AI was being released in a more active mode.

Overall, Envision Glasses boast an impressive array of features that are specifically designed with accessibility, empowerment, and ease of use in mind. They are a revolutionary device offering a glimpse of the future.
