Sonify

MHCI Capstone Project

Technology Lead
5-Person Team | 8 Months
Sponsor: Bloomberg L.P.

Interactive iOS app that uses pitch modification and screen reader technology to make stock price data accessible to people with visual impairments.

Problem

As the sponsor of my Capstone Project, Bloomberg asked us to design a way to improve the accessibility of desktop applications like the Bloomberg Terminal, enabling Bloomberg to become a leader in computer accessibility.

Solution

After 4 months of research, my team identified 7 key insights about the experience of using computers and desktop applications for people with a range of disabilities and limitations. We narrowed our focus to helping people who are blind understand the gist of data the way a sighted person can through a line chart. Building on past related research, we iteratively prototyped an iOS app that sonifies line graph information through a unique "scrubbing" interaction and text-to-speech output.

Skills

Iterative Prototyping

iOS App Development

User Research / Contextual Inquiry

User Testing

Storytelling


Current Prototype

Research


Literature Review / Domain Analysis


To develop an understanding of computer accessibility and how we could improve it, we created a report to map accessibility-related conditions and limitations, their effect on computer use, and their existing technological solutions. We studied academic and industry research as well as commercial accessibility solutions and accessibility communities.

Through this research, we saw that disabilities are typically divided into four medical categories: visual, auditory, cognitive, and motor. However, we further organized our research into disabilities, their associated limitations, and the solutions that exist to address those limitations. By doing this, we learned that specific limitations are more relevant to computer accessibility solutions than disabilities themselves: many disabilities across the four categories share the same limitations. We also identified which limitations have many solutions and which still lack any.



User Interviews + Contextual Inquiry


To better understand the human experience of computer accessibility, we interviewed 11 computer users with a variety of disabilities from Bloomberg and other organizations. We asked participants to tell us how their conditions or limitations affected their computer usage and about any outstanding challenges they faced while working.

We observed them complete tasks such as writing and responding to emails, or using Excel spreadsheets. Through observation, we could see where accessibility tools helped, and where they fell short. We also saw that all 11 individuals we spoke with had developed their own workarounds in order to effectively use their computers.

After conducting interviews and contextual inquiry, we synthesized our data into work models and an affinity diagram. Our affinity diagram captured the benefits and disadvantages of accessibility tools, how participants discovered and learned to use them, and the way they built those tools into their everyday workflows. We found that participants love and rely on their computers, but that they still face daily frustrations, even with accessibility tools. We also began to understand how participants viewed themselves in a social context.

Empathy-Building Exercises


To complement our interviews, each member of our team simulated using a computer with a disability and recorded our sentiments during the process. I simulated partial vision impairment by using my computer without corrective lenses, which required that I use screen magnification software to see my screen. I was quickly frustrated and disoriented because I couldn't understand the entire layout of my screen. I also noticed that when I typed notes for myself, I overlooked my typing errors, but when I went to write a Facebook comment, I caught myself triple checking the text so nobody would judge me for making mistakes.

This exercise helped us empathize with our users, and we encountered some of the same experiences that interviewees had described or would later describe to us.

Insights

After doing an extensive literature review and conducting 11 interviews with people with disabilities who use desktop computers, we established the following insights:

1 I invent workarounds for even basic tasks


2 Updates derail my work


3 I am afraid of making mistakes other people might see


4 Even if accessibility tools exist, I can't easily find or learn to use them


5 I have to use multiple accessibility tools to accomplish a task


6 Tools that help me also hurt my work


7 It's impossible for me to get the "gist" of a page



We delivered a presentation and print book detailing our findings and research process to our clients. The book and presentation were also available to other employees at the company to help build interest and understanding for computer accessibility in the workplace.

Design

My team and I spent the summer at Bloomberg in New York City, where we moved from research into ideation, prototyping, and testing to deliver a functional prototype to our client. As technology lead, I focused on creating prototypes of increasing fidelity to deliver the highest-quality end product to our client. With my business background, I also sought opportunities to convey the potential value of our project to Bloomberg as we finished the project.

Brainstorming

After our research presentation, we held a visioning session with our clients in order to incorporate their opinions and ideas into our transition to design. After more brainstorming and a focused literature review, we chose to work from our insight, "It's impossible for me to get the gist of a page." From there, we brainstormed 50+ ideas before selecting 11 to prototype for a critique. These prototypes all helped a user with a vision impairment get the gist of content, GUI, or system status.


Prototyping

Because tools for people with vision impairments largely rely on audio and haptic feedback, many common prototyping tools did not suit our needs. We created 11 mockups of our ideas to present to the UX team at Bloomberg for feedback. Then we created rough prototypes of five of our ideas using Keynote, paper, and audio files and tested them with three internal users at Bloomberg. Through these tests, we identified what was working well, what wasn't, and what would be most useful to a Bloomberg user before deciding to focus on graph sonification.



In order to understand how charts and graphs are used in the finance field, we conducted interviews and "think-alouds" with about 20 participants: we interviewed four graph designers/engineers and three expert users, and ran ten quick think-alouds with other Bloomberg employees. We also met with people from a local blind community who explained how they currently use graphs and data. This research powered our continued ideation and prototyping.



From there, we focused on interaction by prototyping three different ways of getting the "gist" of a line graph: playing a single changing tone, issuing keyboard commands, or scrubbing back and forth through an audio version of the graph on a trackpad. I built the trackpad scrubbing prototype in Processing, which let me code an accurate sound mapping of the data while remaining relatively quick and easy to pick up.
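The heart of that prototype is a simple pair of mappings: scrub position to data index, and data value to pitch. The sketch below illustrates the idea in Swift (to match the later iOS work; the actual prototype was written in Processing), with placeholder names and value ranges rather than our real code.

```swift
import Foundation

// Illustrative sketch, not the original Processing code: each data point's value
// is mapped linearly onto an audible frequency range, and the horizontal scrub
// position selects which point is heard.
struct LineGraphSonifier {
    let values: [Double]                                      // e.g. daily closing prices
    let frequencyRange: ClosedRange<Double> = 220.0...880.0   // roughly A3 to A5, chosen arbitrarily

    /// Map a scrub position (0.0 = left edge, 1.0 = right edge) to a data index.
    func index(forScrubPosition position: Double) -> Int {
        let clamped = min(max(position, 0.0), 1.0)
        return Int((clamped * Double(values.count - 1)).rounded())
    }

    /// Map a data value to a tone frequency within the audible range.
    func frequency(forValue value: Double) -> Double {
        guard let low = values.min(), let high = values.max(), high > low else {
            return frequencyRange.lowerBound
        }
        let normalized = (value - low) / (high - low)
        return frequencyRange.lowerBound +
            normalized * (frequencyRange.upperBound - frequencyRange.lowerBound)
    }
}

// Scrubbing halfway across the trackpad plays the tone for the middle of the series.
let sonifier = LineGraphSonifier(values: [101.2, 99.8, 104.5, 108.1, 103.0])
let i = sonifier.index(forScrubPosition: 0.5)
print("Point \(i) maps to \(sonifier.frequency(forValue: sonifier.values[i])) Hz")
```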


User Testing

When we completed our prototypes, we tested with 6 participants in the usability lab at Bloomberg (in addition to a pilot test). We sought to understand:

Which interaction gives participants the most accurate interpretation of a graph?

Which audio sounds are most pleasant?

Which audio sounds most effectively communicate the graph?


We found that scrubbing through a graph was the most accurate interaction, but participants became confused after lifting their finger and were unsure whether they had reached the end of the graph. To address this confusion, we decided to move from a trackpad to a smartphone touch interface, which offers an absolute rather than relative positioning system. Participants also preferred to get exact information through text-to-speech output, so we decided to combine these ideas in the iOS touchscreen version of our prototype.
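As a rough illustration of how the absolute-positioning and text-to-speech pieces fit together, the hypothetical view below maps the finger's x position on screen directly to a point in the series and speaks exact values with AVSpeechSynthesizer; the names and structure are assumptions for the sketch, not our actual implementation.

```swift
import UIKit
import AVFoundation

// Hypothetical sketch of absolute positioning on a touchscreen: the finger's
// x position maps directly to a point in the series, so lifting and replacing
// a finger always lands on a predictable spot, and exact values can be spoken
// on demand with text-to-speech.
final class GraphScrubView: UIView {
    var values: [Double] = []                     // the series being explored
    private let speech = AVSpeechSynthesizer()

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, !values.isEmpty else { return }
        let position = Double(touch.location(in: self).x / bounds.width)   // 0 at the left edge, 1 at the right
        let index = Int((min(max(position, 0), 1) * Double(values.count - 1)).rounded())
        playTone(forValue: values[index])         // pitch output, as in the earlier sketch
    }

    /// Speak the exact value at a point, e.g. in response to a separate gesture.
    func speakValue(at index: Int) {
        speech.speak(AVSpeechUtterance(string: String(format: "%.2f", values[index])))
    }

    private func playTone(forValue value: Double) {
        // Tone generation omitted here; see the pitch-mapping sketch above.
    }
}
```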


Final Development

We tested the iOS prototype with 3 users with blindness and 1 person with a visual impairment and received positive feedback. We redesigned the interactions to be more compatible with VoiceOver (the iOS screen reader) for improved discoverability and fluidity. We tested on both an iPad and an iPhone: we thought that the larger iPad screen might provide more precise data, but we found that blind participants preferred the iPhone screen because they could constantly track their finger position relative to the edges of the screen. We also built and tested playing 2 different tones for 2 "lines" at once, both by varying the sound quality of each tone and by panning the two sounds to the left and right stereo channels.

We achieved our goal of helping users with visual impairments gather the gist of a line graph, and we believe that continued work in this space could enable the 39 million people worldwide who have visual impairments to interact with data in a way they couldn't before.
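As a technical postscript, the two-line stereo idea comes down to rendering one series as a tone in the left channel and the other in the right. The sketch below shows one way to do that with AVAudioEngine and AVAudioSourceNode; it is an approximation of the approach with assumed names and placeholder frequencies, not the code from our prototype.

```swift
import AVFoundation

// Hypothetical sketch of the two-line stereo idea: one series drives a sine
// tone in the left channel and the other a sine tone in the right channel,
// so a listener can follow both lines at once through headphones.
final class StereoToneGenerator {
    private let engine = AVAudioEngine()
    var leftFrequency: Double = 440.0     // line 1, heard in the left ear
    var rightFrequency: Double = 660.0    // line 2, heard in the right ear
    private var leftPhase = 0.0
    private var rightPhase = 0.0

    func start() throws {
        let sampleRate = engine.outputNode.outputFormat(forBus: 0).sampleRate
        let format = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 2)!

        // Render one sine wave per channel; updating the frequencies while the
        // engine runs lets each "line" rise and fall with its data.
        let source = AVAudioSourceNode(format: format) { [self] _, _, frameCount, audioBufferList in
            let buffers = UnsafeMutableAudioBufferListPointer(audioBufferList)
            let left = buffers[0].mData!.assumingMemoryBound(to: Float.self)
            let right = buffers[1].mData!.assumingMemoryBound(to: Float.self)
            for frame in 0..<Int(frameCount) {
                leftPhase += 2.0 * Double.pi * leftFrequency / sampleRate
                rightPhase += 2.0 * Double.pi * rightFrequency / sampleRate
                left[frame] = Float(sin(leftPhase))
                right[frame] = Float(sin(rightPhase))
            }
            return noErr
        }

        engine.attach(source)
        engine.connect(source, to: engine.mainMixerNode, format: format)
        try engine.start()
    }
}
```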

felicia.alfieri@gmail.com | linkedin.com/in/feliciaalfieri