Jacob Wobbrock – UW News /news Tue, 16 Dec 2025 17:30:05 +0000 Video: Drivers struggle to multitask when using dashboard touch screens, study finds /news/2025/12/16/video-drivers-struggle-to-multitask-when-using-dashboard-touch-screens-study-finds/ Tue, 16 Dec 2025 17:00:09 +0000 /news/?p=90099

Once the domain of buttons and knobs, car dashboards are increasingly home to large touch screens. While that makes following a mapping app easier, it also means drivers can’t feel their way to a control; they have to look. But how does that visual component affect driving?

New research from the University of Washington and Toyota Research Institute, or TRI, explores how drivers balance driving and using touch screens while distracted. In the study, participants drove in a vehicle simulator, interacted with a touch screen and completed memory tests that mimic the mental effort demanded by traffic conditions and other distractions. The team found that when people multitasked, their driving and touch screen use both suffered. The car drifted more in the lane while people used touch screens, and their speed and accuracy with the screen declined when driving. The effects increased further when the researchers added the memory task.

These results could help auto manufacturers design safer, more responsive touch screens and in-car interfaces.

The team presented its findings Sept. 30 at the ACM Symposium on User Interface Software and Technology in Busan, Korea.

“We all know …,” said co-senior author James Fogarty, a UW professor in the Paul G. Allen School of Computer Science & Engineering. “But what about the car’s touch screen? We wanted to understand that interaction so we can design interfaces specifically for drivers.”

As the study’s 16 participants drove the simulator, sensors tracked their gaze, finger movements, pupil diameter and electrodermal activity. The last two are common ways to measure mental effort, or “cognitive load.” For instance, pupils tend to grow when people are concentrating. 


While driving, participants had to touch specific targets on a 12-inch touch screen, similar to how they would interact with apps and widgets. They did this while completing three levels of an “N-back task,” a memory test in which the participants hear a series of numbers, 2.5 seconds apart, and have to repeat specific digits. 
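
The pass/fail logic of an N-back task is simple to sketch. The Python below is illustrative only; the digit sequence, the 2-back level and the scoring function are assumptions for the example, not the study's protocol code.

```python
def n_back_targets(stimuli, n):
    """Indices where the current digit matches the digit presented
    n steps earlier -- the items a participant should report."""
    return [i for i in range(n, len(stimuli)) if stimuli[i] == stimuli[i - n]]

def score_responses(stimuli, n, reported):
    """Fraction of true n-back matches the participant reported.
    `reported` is the set of indices the participant flagged."""
    targets = n_back_targets(stimuli, n)
    if not targets:
        return 1.0
    return sum(1 for i in targets if i in reported) / len(targets)

# A 2-back example: digits heard one at a time, 2.5 seconds apart.
digits = [3, 8, 3, 8, 8, 1, 8]
print(n_back_targets(digits, 2))           # positions 2, 3 and 6 are matches
print(score_responses(digits, 2, {2, 6}))  # participant caught 2 of 3
```

Raising `n` forces the listener to hold more digits in working memory at once, which is how the study dialed cognitive load up and down.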

The participants’ performance changed significantly under different conditions:

  • When interacting with the touch screen, participants drifted side to side in their lane 42% more often. Increasing cognitive load did not further affect this drift.
  • Touch screen accuracy and speed decreased 58% when driving, then another 17% under high cognitive load.
  • Each glance at the touch screen was 26.3% shorter under high cognitive load.
  • A “hand-before-eye” phenomenon, in which drivers reached for a control before looking at it, increased from 63% to 71% as memory tasks were introduced.

The team also found that increasing the size of the target areas participants were trying to touch did not improve their performance. 

“If people struggle with accuracy on a screen, usually you want to make bigger buttons,” said a UW doctoral student in the Allen School. “But in this case, since people move their hand to the screen before touching, the thing that takes time is the visual search.”

Based on these findings, the researchers suggest future in-car touch screen systems might use simple sensors in the car — eye tracking, or touch sensors on the steering wheel — to monitor drivers’ attention and cognitive load. The system could then adjust the touch screen’s interface to make important controls more prominent and safer to access.

“Touch screens are widespread today in automobile dashboards, so it is vital to understand how interacting with touch screens affects drivers and driving,” said co-senior author Jacob O. Wobbrock, a UW professor in the Information School. “Our research is some of the first that scientifically examines this issue, suggesting ways for making these interfaces safer and more effective.”

A UW doctoral student in the Information School is co-lead author, and other co-authors are researchers at TRI. This research was funded in part by TRI.

For more information, contact Wobbrock at wobbrock@uw.edu and Fogarty at jfogarty@cs.washington.edu.

A Google Slides extension can make presentation software more accessible for blind users /news/2023/10/30/a11yboard-google-slides-powerpoint-accessible-blind-users/ Mon, 30 Oct 2023 16:34:45 +0000 /news/?p=83353 A user demonstrates creating a presentation slide with A11yBoard on a touchscreen tablet and computer screen.
A team led by researchers at the University of Washington has created A11yBoard for Google Slides, a browser extension and phone or tablet app that allows blind users to navigate through complex slide layouts, objects, images and text. Here, a user demonstrates the touchscreen interface. Photo: University of Washington

Screen readers, which convert digital text to audio, can make computers more accessible to many disabled users — including those who are blind, low vision or dyslexic. Yet slideshow software, such as Microsoft PowerPoint and Google Slides, isn’t designed to make screen reader output coherent. Such programs typically rely on Z-order — which follows the way objects are layered on a slide — when a screen reader navigates through the contents. Since the Z-order doesn’t adequately convey how a slide is laid out in two-dimensional space, slideshow software can be inaccessible to people with disabilities.
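
The mismatch between Z-order and spatial layout can be shown in a few lines. This Python sketch is illustrative only; the object fields and the sample slide are assumed for the example and are not Google Slides' actual data model.

```python
from dataclasses import dataclass

@dataclass
class SlideObject:
    name: str
    x: float  # left edge, in points
    y: float  # top edge, in points
    z: int    # stacking order: higher = added later, drawn on top

slide = [
    SlideObject("footer", x=50, y=680, z=0),   # added first
    SlideObject("chart", x=400, y=200, z=1),
    SlideObject("title", x=50, y=40, z=2),     # added last
]

# What a Z-order traversal visits first vs. a top-to-bottom reading order.
z_order = [o.name for o in sorted(slide, key=lambda o: o.z)]
reading_order = [o.name for o in sorted(slide, key=lambda o: (o.y, o.x))]

print(z_order)        # the footer comes first, because it was added first
print(reading_order)  # the title comes first, matching the visual layout
```

A screen reader following Z-order would announce the footer before the title, even though the title sits at the top of the slide; that disconnect is what makes artboards hard to navigate by audio alone.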

A team led by researchers at the University of Washington has created A11yBoard for Google Slides, a browser extension and phone app that allows blind users to navigate through complex slide layouts and text. Combining a desktop computer with a mobile device, A11yBoard lets users work with audio, touch, gesture, speech recognition and search to understand where different objects are located on a slide and move these objects around to create rich layouts. For instance, a user can touch a textbox on the screen, and the screen reader will describe its color and position. Then, using a voice command, the user can shrink that textbox and left-align it with the slide’s title.

The team presented its paper Oct. 25 at the ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2023) in New York. A11yBoard is not yet available to the public.

“For a long time and even now, accessibility has often been thought of as, ‘We’re doing a good job if we enable blind folks to use modern products.’ Absolutely, that’s a priority,” said senior author Jacob O. Wobbrock, a UW professor in the Information School. “But that is only half of our aim, because that’s only letting blind folks use what others create. We want to empower people to create their own content, beyond a PowerPoint slide that’s just a title and a text box.”

A11yBoard for Google Slides builds on a line of research in Wobbrock’s lab exploring how blind users interact with “artboards” — digital canvases on which users work with objects such as textboxes, shapes, images and diagrams. Slideshow software relies on a series of these artboards. When lead author Zhuohao (Jerry) Zhang, a UW doctoral student in the iSchool, joined Wobbrock’s lab, the two sought a solution to the accessibility flaws in creativity tools like slideshow software. Drawing on earlier research into the problems blind people have using artboards, Wobbrock and Zhang presented an initial version of A11yBoard in April. They then worked to create a solution that’s deployable through existing software, settling on a Google Slides extension.

For the current paper, the researchers worked with a co-author, an undergraduate at Stanford University who is blind, to improve the interface. The team tested it with two other blind users, having them recreate slides. The testers both noted that A11yBoard greatly improved their ability to understand visual content and to create slides themselves without constant back-and-forth iterations with collaborators; they needed to involve a sighted assistant only at the end of the process.

The testers also highlighted spots for improvement: Remaining continuously aware of objects’ positions while trying to edit them still presented a challenge, and users were forced to do each action individually, such as aligning several visual groups from left to right, instead of completing these repeated actions in batches. Because of how Google Slides functions, the app’s current version also does not allow users to undo or redo edits across different devices.

Ultimately, the researchers plan to release the app to the public. But first they plan to integrate a large language model, such as GPT, into the program.

“That will potentially help blind people author slides more efficiently, using natural language commands like, ‘Align these five boxes using their left edge,’” Zhang said. “Even as an accessibility researcher, I’m always amazed at how inaccessible these commonplace tools can be. So with A11yBoard we’ve set out to change that.”

This research was funded in part by the UW’s Center for Research and Education on Accessible Technology and Experiences (CREATE).

For more information, contact Zhang at zhuohao@uw.edu and Wobbrock at wobbrock@uw.edu.

VoxLens: Adding one line of code can make some interactive visualizations accessible to screen-reader users /news/2022/06/01/voxlens-adding-one-line-of-code-can-make-some-interactive-visualizations-accessible-to-screen-reader-users/ Wed, 01 Jun 2022 16:15:09 +0000 /news/?p=78662
University of Washington researchers worked with screen-reader users to design VoxLens, a JavaScript plugin that — with one additional line of code — allows people to interact with visualizations. Millions of Americans use screen readers for a variety of reasons, including complete or partial blindness, learning disabilities or motion sensitivity. Shown here is a screen reader with a refreshable Braille display. Photo:

Interactive visualizations have changed the way we understand our lives. For example, they can showcase the number of COVID-19 cases over time.

But these graphics often are not accessible to people who use screen readers, software programs that scan the contents of a computer screen and make the contents available via a synthesized voice or Braille. Millions of Americans use screen readers for a variety of reasons, including complete or partial blindness, learning disabilities or motion sensitivity.

University of Washington researchers worked with screen-reader users to design VoxLens, a JavaScript plugin that — with one additional line of code — allows people to interact with visualizations. VoxLens users can gain a high-level summary of the information described in a graph, listen to a graph translated into sound or use voice-activated commands to ask specific questions about the data, such as the mean or the minimum value.
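
As a sketch of the kind of query-to-answer mapping described above, the snippet below computes those same summary statistics from a command string. It is illustrative only: VoxLens is a JavaScript plugin, its actual command grammar and API are not shown in this article, and the function and data names here are assumptions.

```python
import statistics

def answer_query(command, labels, values):
    """Map a spoken command to a short answer about the charted data."""
    cmd = command.lower()
    if "mean" in cmd or "average" in cmd:
        return f"The mean value is {statistics.mean(values):g}."
    if "min" in cmd:
        i = values.index(min(values))
        return f"The minimum is {values[i]:g}, at {labels[i]}."
    if "max" in cmd:
        i = values.index(max(values))
        return f"The maximum is {values[i]:g}, at {labels[i]}."
    return "Sorry, I did not understand that query."

days = ["Jul 1", "Jul 8", "Jul 15", "Jul 22", "Jul 29"]
temps = [88, 84, 81, 78, 74]

print(answer_query("What is the mean temperature?", days, temps))
print(answer_query("Where is the minimum?", days, temps))
```

The point of the design is that the listener asks for exactly the statistic they want, rather than hearing the whole data series read aloud.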

The team presented this research May 3 at CHI 2022 in New Orleans.

“If I’m looking at a graph, I can pull out whatever information I am interested in, maybe it’s the overall trend or maybe it’s the maximum,” said lead author Ather Sharif, a UW doctoral student in the Paul G. Allen School of Computer Science & Engineering. “Right now, screen-reader users either get very little or no information about online visualizations, which, in light of the COVID-19 pandemic, can sometimes be a matter of life and death. The goal of our project is to give screen-reader users a platform where they can extract as much or as little information as they want.”

Screen readers can inform users about the text on a screen because it’s what researchers call “one-dimensional information.”

“There is a start and an end of a sentence and everything else comes in between,” said co-senior author Jacob O. Wobbrock, a UW professor in the Information School. “But as soon as you move things into two-dimensional spaces, such as visualizations, there’s no clear start and finish. It’s just not structured in the same way, which means there’s no obvious entry point or sequencing for screen readers.”

The team started the project by working with five screen-reader users with partial or complete blindness to figure out how a potential tool could work.

“In the field of accessibility, it’s really important to follow the principle of ‘nothing about us without us,'” Sharif said. “We’re not going to build something and then see how it works. We’re going to build it taking users’ feedback into account. We want to build what they need.”

The VoxLens code is publicly available.

To implement VoxLens, visualization designers only need to add a single line of code.

“We didn’t want people to jump from one visualization to another and experience inconsistent information,” Sharif said. “We made VoxLens a public library, which means that you’re going to hear the same kind of summary for all visualizations. Designers can just add that one line of code and then we do the rest.”

The researchers evaluated VoxLens by recruiting 22 screen-reader users who were either completely or partially blind. Participants learned how to use VoxLens and then completed nine tasks, each of which involved answering questions about a visualization.

Participants learned how to use VoxLens and then completed nine tasks (one of which is shown here), each of which involved answering questions about a visualization. Each task was divided into three pages. Page 1 (labeled with ‘a’) presented the question a participant would be answering, page 2 (b) displayed the question and the visualization and page 3 (c) showed the question with four multiple choice responses. Photo: Sharif et al./CHI 2022

Compared to participants in an earlier study who did not have access to this tool, VoxLens users completed the tasks with 122% greater accuracy and 36% less interaction time.

“We want people to interact with a graph as much as they want, but we also don’t want them to spend an hour trying to find what the maximum is,” Sharif said. “In our study, interaction time refers to how long it takes to extract information, and that’s why reducing it is a good thing.”

The team also interviewed six participants about their experiences.

“We wanted to make sure that these accuracy and interaction time numbers we saw were reflected in how the participants were feeling about VoxLens,” Sharif said. “We got really positive feedback. Someone told us they’ve been trying to access visualizations for the past 12 years and this was the first time they were able to do so easily.”

Right now, VoxLens works only for visualizations created with JavaScript libraries, such as D3, or with Google Sheets. But the team is working on expanding VoxLens to other popular visualization platforms. The researchers also acknowledged that the voice-recognition system can be frustrating to use.

“This work is part of a much larger agenda for us — removing bias in design,” said co-senior author Katharina Reinecke, a UW associate professor in the Allen School. “When we build technology, we tend to think of people who are like us and who have the same abilities as we do. For example, D3 has really revolutionized access to visualizations online and improved how people can understand information. But there are values ingrained in it and people are left out. It’s really important that we start thinking more about how to make technology useful for everybody.”

Additional co-authors on this paper are two UW undergraduate students: one in the Allen School and one studying human centered design and engineering. This research was funded by the Mani Charitable Foundation and the University of Washington.

For more information, contact Sharif at asharif@cs.washington.edu, Wobbrock at wobbrock@uw.edu and Reinecke at reinecke@cs.washington.edu.

Faculty/staff honors: Energy-efficient computing, Cottrell Scholar, Google Inclusion Awards /news/2021/03/24/faculty-staff-honors-energy-efficient-computing-cottrell-scholar-google-inclusion-awards/ Wed, 24 Mar 2021 22:03:42 +0000 /news/?p=73426 Recent honors and achievements for University of Washington faculty include an Intel Corporation award for work to make computers more energy-efficient, a Research Corporation for Science Advancement award for chemistry research and education, and two Google inclusion awards to create technology for underrepresented populations, including people with disabilities and Syrian refugees.

Electrical engineering professor Visvesh Sathe receives 2020 Intel Outstanding Researcher Award

Visvesh Sathe

Visvesh Sathe, associate professor of electrical and computer engineering, has received a 2020 Outstanding Researcher Award from Intel Corporation for a project seeking to create more energy-efficient computer architecture.

Sathe was one of 18 researchers to receive the award, which annually recognizes exceptional contributions made by those conducting Intel-sponsored research at universities.

Sathe conducts research in a variety of areas applicable to circuits and architectures for low-power computing and biomedical systems. The research that brought him the award seeks to address computing inefficiencies created by “guard bands,” which are added to computer processors to help them keep operating despite changes in temperature and supply voltage.

Read an article on the Department of Electrical and Computer Engineering website.

* * *

Chemistry’s Alexandra Velian named a 2021 Cottrell Scholar

Alexandra Velian

The Research Corporation for Science Advancement has named Alexandra Velian, assistant professor of chemistry, one of its 25 Cottrell Scholars for 2021.

The Cottrell Scholar program honors early-career teacher-scholars in chemistry, physics and astronomy with discretionary awards for research. Each award comes with $100,000 “to foster advancements in research and educational accomplishments.” The scholars are chosen through a peer-review process based on candidates’ innovative research proposals as well as educational programs.

Velian’s project is titled “Synthesis of Functional Metal Chalcogenide Lattices Using Symmetry-Encoded, Atomically Precise Clusters.” Cottrell Scholars are eligible to compete for additional funding later in their careers and meet annually to network and exchange ideas.

Read an article on the Department of Chemistry website.

* * *

iSchool’s Karen Fisher, Jacob O. Wobbrock receive Google Inclusion awards

Karen Fisher

Karen Fisher and Jacob O. Wobbrock, professors in the Information School, are among 16 recipients of the inaugural 2020 Google Awards for Inclusion Research.

The award, to be given annually, supports academic research in computing and technology that addresses the needs of underrepresented populations. Each award comes with $60,000.

Fisher’s grant — which she shares with Yacine Ghamri-Doudane of the University of La Rochelle in France — will support their work in designing culturally sensitive mobile technology for young Syrian refugee women in Jordan. Such devices, and social media, can be crucial lifelines to refugees in the ongoing war.

Jacob Wobbrock

Wobbrock’s award will support his work in creating an ability-based mobile toolkit to help programmers build applications that are aware of and responsive to the user’s abilities. A professor of human-computer interaction, Wobbrock is the founding co-director of the UW Center for Research and Education on Accessible Technology and Experiences, or CREATE.

Read more about Fisher’s and Wobbrock’s awards on the iSchool website.

UW launches new Center for Research and Education on Accessible Technology and Experiences with $2.5 million investment from Microsoft /news/2020/05/28/create-announced/ Thu, 28 May 2020 14:00:48 +0000 /news/?p=68440
Martez Mott works on Smart Touch with Provail participant Ken Frye. Photo: Dennis Wise/University of Washington

The University of Washington today announced the establishment of the Center for Research and Education on Accessible Technology and Experiences (CREATE). Fueled by a $2.5 million inaugural investment from Microsoft, the center is led by an interdisciplinary team whose mission is to make technology accessible and to make the world accessible through technology.

“We are proud to partner with the UW on their journey to build the CREATE center,” said Brad Smith, president of Microsoft. “This is the next step in a longstanding journey to empower people with disabilities with accessibility and technology advancements. The UW has truly embedded accessibility as part of their culture and we’re proud to support their next step to drive thought leadership on accessibility to empower people with disabilities.”

Thirty years after the enactment of the Americans with Disabilities Act, there have been enormous strides in the accessibility of public spaces and the availability of personal mobility technologies. Yet equitable participation in society depends on the successful use of technology, now more than ever.

People with disabilities depend on technology, and if accessibility is not built in from the start of the development process, they can be left behind. Achieving accessibility involves expertise and innovation across a range of disciplines. As a result, the challenge of developing technology for a more accessible world is outpacing even the most talented individual researchers and small teams.

“CREATE will help us take accessible technology research and education from small, incremental gains to true breakthroughs. This chance to advance inclusion and participation for people of all abilities is the kind of opportunity that inspires the entire UW community,” said UW President Ana Mari Cauce.

The UW is a global leader in accessible technology research and design. The center will bring together existing areas of excellence and build upon the university’s ability to catalyze progress in education, research and translation. CREATE faculty bring multiple perspectives, not just in technology but also in disability rights and advocacy.

Read more about CREATE in this Q&A with Jacob O. Wobbrock, professor and inaugural co-director of the center.

The CREATE leadership team hails from six campus departments in three different colleges, including the Paul G. Allen School of Computer Science & Engineering, The Information School, Rehabilitation Medicine in the 91̽School of Medicine, Mechanical Engineering, Human Centered Design & Engineering, and the Disability Studies Program.

The center will build upon current projects in prioritizing and automating personalization; making transportation accessible; augmenting abilities through wearable technologies; developing inclusive, intelligent systems and data sets; and “do-it-yourself” accessible technology production.

The UW and Microsoft have been working together in this space for more than a decade and share the same values and commitment to work with the disability community on driving innovation in accessibility research. The partnership has opened student internship and career opportunities, as well as ongoing research engagements with Microsoft Research. Current projects include developing audio-first representations of websites for smart speakers; understanding how perceptions of software developer job candidates with autism may impact hiring decisions; AI-based sign language recognition and translation, as well as ongoing work on an ASL-to-English dictionary; and data-driven mental health apps.

 


 

In addition to the impact of Microsoft’s funding for this collaboration, the company’s endorsement of the UW’s accessibility work promises to catalyze additional investment, particularly in the Pacific Northwest, which, ultimately, could generate the full funding needed to provide long-term support for the center. The goal is to raise $10 million for CREATE to provide five years of support. The center employs a consortium model for academic, industry and community partners, and it is seeking additional partners who are interested in the deployment of accessible technology and the development of inclusive communities.

“The University of Washington has for many years led the field in cutting-edge accessible technology research and design,” said Jacob O. Wobbrock, professor and inaugural co-director of the center. “Our faculty and students are incredibly motivated to tackle the hard problems of accessibility. Now, with CREATE, we will be able to take on even bigger collaborative challenges in this space. I am honored to work with co-director Jennifer Mankoff, and to be supported by such world-class colleagues in the center.”

 

Faculty/staff honors: Housing association nod, honorary doctorate, distinguished fellow, best conference paper /news/2019/12/02/faculty-staff-honors-housing-association-nod-honorary-doctorate-distinguished-fellow-best-conference-paper/ Mon, 02 Dec 2019 17:41:34 +0000 /news/?p=65055 Recent honors to University of Washington faculty and staff members include an honorary doctorate from the University of Bucharest, membership in an inaugural class of distinguished fellows in pharmacology, and a leadership position in a national student housing association.

Pamela Schreiber

Pam Schreiber, HFS director, named to housing association executive board

Pamela Schreiber, who is executive director of UW Housing & Food Services and assistant vice president in the UW Office of Student Life, has been elected vice president of the executive board of the Association of College and University Housing Officers-International.

The executive board sets policy for the association — which is called ACUHO-I for short — and makes sure there are resources to serve the needs of its 17,000-some members, which represent 1.2 million on-campus students worldwide. Schreiber will be the board’s vice president in 2020, then serve a year as president-elect, and then a year as president.

Schreiber said as a board leader she “will focus on developing relationships grounded in trust and respect, and will practice listening carefully, especially to voices that have historically been silenced.”

She wrote in nomination documents for the position that her work in campus housing connects back to her own “transformational” on-campus experience as a first-generation college student.

“My commitment to the field is unwavering,” she wrote, “and I believe that transforming students’ lives remains our primary purpose.”

Schreiber joined the 91̽in 2009, from Florida Gulf Coast University.


* * *

Dan Chirot of Jackson School, sociology receives honorary doctorate from University of Bucharest

Daniel Chirot

Daniel Chirot, a UW professor in the Jackson School of International Studies, has received an honorary doctorate from the University of Bucharest, in Romania, for scholarship on that country and the Balkans.

He was given the honor in a ceremony on Oct. 10, during a three-day conference on “Thirty Years After: Post Communism, Democracy, and Illiberalism,” at which Chirot was a keynote speaker. His keynote talk was titled “The Fall, Rise, and Decline of Democracy in Europe and the World.”

Chirot also gave a talk upon receiving the honorary degree, addressing an all-Romanian audience; it was titled “Why 20th Century Romanian Sociology and History Are Relevant Today.”

Chirot, who is the Herbert J. Ellison Professor of Russian and Eurasian Studies, is also a professor of sociology. He founded the journal East European Politics and Societies and has written several books since his first, “Social Change in a Peripheral Society,” was published in 1976. His next book, “You Say You Want a Revolution? Radical Idealism and Its Tragic Consequences,” is coming in 2020 from Princeton University Press.

Reşat Kasaba, Jackson School director, said, “This is a wonderful and most appropriate recognition of Dan’s seminal work on Romania and the Balkans that date back to his graduate school days.”

* * *

William Catterall named among inaugural fellows of pharmacology society

William Catterall

The American Society for Pharmacology and Experimental Therapeutics has named William Catterall, professor of pharmacology in the UW School of Medicine, a member of its inaugural class of fellows, established this year to honor its most distinguished members.

The society, called ASPET for short, has 5,000 members worldwide who conduct pharmacological research and work for academia, government or industry; these include neuroscientists, toxicologists, chemical biologists, cardiovascular scientists, pharmacists and more.

Selection as a fellow, the society’s website states, goes to members who have demonstrated excellence in their contributions to “advance pharmacology, through their scientific achievements, mentorship and service to the society.”

Twenty-two individuals were named fellows for 2019, and will be recognized at the society’s business meetings and noted in the society’s quarterly magazine, “The Pharmacologist.”

* * *

Information School’s Jacob O. Wobbrock, student co-authors, honored for paper

Jacob Wobbrock

A paper by Jacob O. Wobbrock, a professor in the UW Information School, and a team of undergraduate researchers has won the Douglas Engelbart Award for Best Paper at ACM Hypertext & Social Media 2019, an annual conference of the Association for Computing Machinery.

Wobbrock collaborated with a team of undergraduate students on the paper, which was selected from 102 submissions in all, 30 of which were accepted for the conference.

Co-authoring the paper were Anya Hsu and Michael Magee of the iSchool and Marijn Burger of UW Bothell, all of whom have since graduated. The paper, Wobbrock said, showed that “the perceived credibility of online news pages is significantly affected by visual design elements even apart from actual content, which has implications for consumers and purveyors of real and fake news.” The conference was held Sept. 17 to 20 in Hof, Germany.

UW Notebook is a section of the UW News site dedicated to telling stories of the good work done by faculty and staff at the University of Washington. Read all posts here.
