Mockups of three laptops with avatars and their accompanying image descriptions.
Avatars created by Janet Mac and Patrick Dias 
Commissioned by Google Material Design

Google Avatar Project

Role

Assistant User Experience (UX) Researcher

Teammates

Emory James Edwards
Lead User Experience Researcher, UCI INsite Lab
Kyle Lewis Polster
User Experience Researcher, UCI INsite Lab
Michael Gilbert
User Experience Researcher, Google Material Design
Emily Blank
Senior Art Director, Google Material Design
Stacy Branham
Assistant Professor/Research Advisor, UCI INsite Lab

Tools

Zoom/Google Meet/Skype
Google Workspace
MURAL

Overview

I worked with Google as an Assistant User Experience Researcher at INsite Lab (INclusive Studio for Innovative Technology and Education) to assess a series of images commissioned as part of Google's effort to think more critically — both internally and externally — about representation and inclusivity. This image set, created by illustrators Janet Mac and Patrick Dias, consisted of fictional representations of people of different ability, race, gender, sexuality, age, and other identities. As a research team, we sought to understand: how does one describe identity in the most inclusive way possible, especially for images with fictional subjects who cannot self-identify? We conducted a series of 9 focus groups and 19 interviews with participants with disabilities (or other marginalized identities), presenting the images with their co-designed image descriptions. With this data, we authored a qualitative research paper (ASSETS 2021); informed imagery guidelines at Google; had our research documented in a Google Material Design blog post; and saw the images become the first-ever accessible user representations of disability included in user profile setup, shipped on all Google Chromebooks starting November 2021.

This work is under an NDA.

If you would like to learn more about this case study,
please email me.

Results

Our resulting paper, "'That's in the eye of the beholder': Layers of Interpretation in Image Descriptions for Fictional Representations of People with Disabilities," was accepted by the ACM SIGACCESS Conference on Computers & Accessibility (ASSETS '21), which had an overall acceptance rate of 29%. The paper is completely open access and can be found at the embedded link above. Our paper was selected to be showcased live at ASSETS '21. Below is the video presentation prepared for the conference, in which our lead researcher Emory introduces the paper and summarizes its primary findings.
Unique to this project were its ties to both academia and industry. Our team was well aware of this, and deeply invested in the impact of our research. It was crucial to us that the work extend past academic circles and be accessible to the everyday user. Throughout the project, my labmates and I presented our findings to our industry partners at Google. From this, our team was invited to collaborate on updating Google's imagery guidelines based on our own research. The research we conducted was provided to Google Material Design and further informed guidelines in use internally. These guidelines directly shape imagery that millions of end users interact with daily.

In terms of scale, Google has notified our team that the images and descriptions we assessed and co-created will ship on all Google Chromebooks starting November 2021 — an estimated 40 million computers. This marks the first instance of accessible user representations of disability included in any user profile setup. Our research was also documented in a co-authored article on Google Material Design's blog, which you can read to learn more about our study and its real-world impact.

The rest of the team and I are beyond proud of our research and its impact. Though the end result is nothing I could have ever imagined, I am honored to have served as part of a larger goal of making millions of users feel seen, understood, and included.

Reflections

Working with INsite Lab and Google on the Avatar Project was an inspiring and formative experience for me. It was the first long-term project I worked on as a UX researcher, and my first time working with an established lab and a well-known tech company. This opportunity gave me valuable experience working in both academia and industry. Some key takeaways:

It's Okay to Ask Questions.

Being the only undergraduate on a team of Masters and PhD students definitely didn't help my imposter syndrome. At first, all of the academic language and processes went over my head, and I was too anxious to stop and ask questions. As the project progressed and I became more comfortable with my lab mates, I grew more empowered to ask questions and even make active suggestions. This evolved into a process of mutual learning, with all of us benefiting from one another's perspectives and input. Eventually, I was able to make decisions that shaped the project as a whole, and was knowledgeable enough to teach incoming undergraduates about our work.

Don't Forget the Bigger Picture.

Sometimes I would get so caught up in the small details of my own research that I found my work difficult to complete. I would get stuck on finding just the right code for a participant quote, or stress over producing the perfect finding right off the bat. In these moments, it was important to ground myself and remember the purpose of the project as a whole: to help people feel seen, understood, and safe. Once I kept our objective in view, each step in the process became clearer. Whether that meant learning a quote's larger context before assigning a code, or thinking at a higher level about a potential finding, stepping back to see the bigger picture let me connect with my research and data rather than be overwhelmed by the details.