Overview
GTag aims to help visually impaired and blind people shop for groceries independently in an urban context. It consists of a mobile app and third-party NFC tags. GTag's primary function is to record grocery-related information (e.g., expiration dates, cooking instructions) while in-store and replay it anytime after shopping.
Timeline
08/2021 - 12/2021
Tools
Figma
ProtoPie
Miro
Adobe Illustrator
Otter.ai
Team
Watson Hartsoe
Tommy Ottolin
Meichen Wei
Qiqi Yang
My Role
UX Designer
UX Researcher
My Role
UX Designer
- Designed and iterated on user flows, wireframes, prototypes, and the audio style guide for the digital prototype
- Proposed NFC technology and the idea of combining a physical tag with a digital mobile app
- Designed the Wizard of Oz method for using prototypes in user evaluations, and operated the prototype as a technician during evaluation sessions
UX Researcher
- Developed the user scoping spectrum idea with help from teammates
- Created a hierarchical task analysis based on contextual inquiry findings
- Analyzed data from interviews and contextual inquiries to generate findings and design requirements
Problem
In the context of grocery shopping and consuming purchased food, people with vision loss face two common problems:
- Obtaining information from food packages, especially expiration dates
- Identifying foods of the same size and shape
In the grocery shopping context
How might we minimize cognitive load for visually impaired people by making information accessible?
Solution
Whenever no one is around to be your "eyes," GTag is here to help!
Write
Users can store information about a food package on a designated NFC tag by creating an audio recording. New NFC tags must first be scanned to register them in the app before any information can be stored. (See the sketch after this feature list for one possible tag-linking scheme.)
Read
To access information, users simply scan the tags attached to food packages and listen to the recordings. Tags can be read inside or outside the app.
Erase
Information stored in each tag can be deleted.
Edit
Edit existing information stored in tags.
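The case study doesn't specify how recordings map to tags, but since NFC tags hold only a few hundred bytes, one plausible scheme (an assumption, not GTag's confirmed implementation) is to write a short identifier to each tag and keep the audio file on the phone. Below is a minimal sketch using Android's NFC NDEF API; the function names writeTag and readTag are illustrative.

```kotlin
// Hypothetical sketch of tag linking for a GTag-like app, using Android's
// android.nfc NDEF API. The tag stores only a UUID; the audio recording is
// saved on the phone and keyed under that UUID.
import android.nfc.NdefMessage
import android.nfc.NdefRecord
import android.nfc.Tag
import android.nfc.tech.Ndef
import java.util.UUID

// Write: link a freshly scanned tag to a new recording ID.
fun writeTag(tag: Tag): String? {
    val recordingId = UUID.randomUUID().toString()
    val message = NdefMessage(NdefRecord.createTextRecord("en", recordingId))
    Ndef.get(tag)?.use { ndef ->
        ndef.connect()
        if (!ndef.isWritable || ndef.maxSize < message.byteArrayLength) return null
        ndef.writeNdefMessage(message)          // may throw IOException on tag loss
        return recordingId                      // key the audio file under this ID
    }
    return null                                 // tag does not support NDEF
}

// Read: recover the recording ID so the app can play the matching audio.
fun readTag(tag: Tag): String? {
    Ndef.get(tag)?.use { ndef ->
        ndef.connect()
        val record = ndef.ndefMessage?.records?.firstOrNull() ?: return null
        val payload = record.payload
        val langLength = payload[0].toInt() and 0x3F   // text records prefix a language code
        return String(payload, 1 + langLength, payload.size - 1 - langLength, Charsets.UTF_8)
    }
    return null
}
```

Storing only an identifier keeps writes within the capacity of inexpensive NFC stickers and lets users re-record or edit the audio without rewriting the tag.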
Research
Comparative Analysis
First, we surveyed commonly used mobile assistive applications that help visually impaired people with everyday tasks. This gave us knowledge of the assistive technology ecosystem our users rely on, which helped us design a user-friendly product.
Be My Eyes
A mobile application that connects visually impaired users with sighted volunteers who describe objects and surroundings over a live video call.
Be My Eyes - https://www.bemyeyes.com/
Seeing AI
An AI (artificial intelligence)-powered mobile app from Microsoft that serves a similar purpose, using computer vision instead of human volunteers.
Seeing AI - https://www.microsoft.com/en-us/ai/seeing-ai
Contextual Inquiries & Interviews
Our Visually Impaired Participants:
(Profiles of six participants, 01-06)
Discussions with and observations of participants focused mainly on their grocery shopping experience. We also asked about their perspectives on popular assistive technologies, their medical conditions, daily lives, and careers. We collected and analyzed notes from the contextual inquiries and interviews, which informed later research and design.
User Scoping Spectrums
The spectrums are a high-level analysis of participant behaviors collected from the contextual inquiries and interviews, used to find relationships in the behavioral data. Some behaviors show stronger relationships than others.
The spectrums helped us narrow our user group to visually impaired people living in urban contexts. Our product would sit within the current assistive technology ecosystem to promote independent shopping.