
Overview

GTag aims to help visually impaired and blind people shop for groceries independently in an urban context. It consists of a mobile app and third-party NFC tags. GTag's primary function is to record grocery-related information (e.g., expiration dates, cooking instructions) while in-store and replay it anytime after shopping.

Timeline

08/2021 - 12/2021

Tools

Figma

ProtoPie

Miro

Adobe Illustrator

Otter.ai

Team

Watson Hartsoe

Tommy Ottolin

Meichen Wei

Qiqi Yang

My Role

UX Designer

UX Researcher

My Role

UX Designer

  • Designed and iterated on the digital prototype's user flows, wireframes, prototypes, and audio style guide

  • Proposed NFC technology and the idea of combining a physical tag with a digital mobile app

  • Designed the "Wizard of Oz" method for using prototypes in user evaluations, and operated the prototype as a technician during those evaluations

UX Researcher

  • Developed the user scoping spectrum idea with help from teammates

  • Created a hierarchical task analysis based on contextual inquiry findings

  • Analyzed data from interviews and contextual inquiries to generate findings and design requirements

Problem

In the context of grocery shopping and consuming purchased food products, people with vision loss face some common problems:

Obtaining information from food packages, especially expiration dates


Identifying foods of the same size and shape

In the grocery shopping context

How might we minimize cognitive load for visually impaired people by making information accessible?

Solution


Whenever no one is around to be your "eyes", Gtag is here to help!


Write

Users can store information about food packaging in designated NFC tags by creating audio recordings. A new NFC tag must first be scanned and registered in the app before any information can be stored on it.


Read

To access information, users simply scan the tags hanging on food packages and listen to the recordings. Tags can be read inside or outside the app.


Erase

Information stored in each tag can be deleted.


Edit

Users can edit the information already stored in a tag.

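The Write/Read/Erase/Edit flow above can be captured as a simple data model. The sketch below is an illustration, not the actual implementation: it assumes each NFC tag carries only a stable UID and that audio recordings live in the app, keyed by that UID, with a registration step standing in for scanning a new tag. All names (TagStore, register, etc.) are hypothetical.

```python
class TagStore:
    """Sketch of GTag's four tag operations, assuming tags hold only a UID
    and the app keeps the recordings. Illustrative names throughout."""

    def __init__(self):
        self._recordings = {}    # tag UID -> stored audio recording
        self._registered = set() # tags that have been scanned into the app

    def register(self, uid: str) -> None:
        # A new tag must be scanned once before it can hold a recording.
        self._registered.add(uid)

    def write(self, uid: str, recording: bytes) -> None:
        # Writing is refused until the tag has been registered.
        if uid not in self._registered:
            raise PermissionError("Scan and register the tag first")
        self._recordings[uid] = recording

    def read(self, uid: str) -> bytes:
        # Scanning a written tag plays back its recording.
        return self._recordings[uid]

    def erase(self, uid: str) -> None:
        # Deleting is safe even if nothing is stored.
        self._recordings.pop(uid, None)

    def edit(self, uid: str, new_recording: bytes) -> None:
        # Editing replaces the recording on an already-written tag.
        if uid not in self._recordings:
            raise KeyError("No recording stored on this tag")
        self._recordings[uid] = new_recording
```

Keeping the audio in the app rather than on the tag matches typical NFC constraints, since most tags hold well under a kilobyte of data.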
Research


Comparative Analysis

First, we surveyed commonly used mobile assistive applications that aid visually impaired people with everyday tasks. This gave us an understanding of the assistive-technology ecosystem our users rely on, which informed the design of a user-friendly product.

Be My Eyes

A mobile application that connects visually impaired users with sighted volunteer operators who remotely describe objects and surroundings.

Seeing AI

An AI (artificial intelligence)-powered mobile app that serves a similar purpose to Be My Eyes, using computer vision instead of human volunteers.

Contextual Inquiries & Interviews

We recruited six visually impaired participants.

Discussions and observations centered on participants' grocery shopping experiences. We also asked about their perspectives on popular assistive technologies, their medical conditions, daily lives, and careers. Notes from the contextual inquiries and interviews were collected and analyzed to inform later research and design.

User Scoping Spectrums

The spectrums are a high-level analysis of participant behaviors collected from the contextual inquiries and interviews, used to find relationships within the behavioral data; some relationships proved stronger than others.

 

The spectrums helped us narrow our user group to visually impaired people living in urban contexts. Our product would fit into the existing assistive-technology ecosystem to promote independent shopping.