Jennifer Mankoff – UW News

ChatGPT is biased against resumes with credentials that imply a disability — but it can improve
/news/2024/06/21/chatgpt-ai-bias-ableism-disability-resume-cv/
Fri, 21 Jun 2024

A hand holds a phone with the ChatGPT app open.
UW researchers found that ChatGPT consistently ranked resumes with disability-related honors and credentials — such as the “Tom Wilson Disability Leadership Award” — lower than the same resumes without those honors and credentials. But when researchers customized the tool with written instructions directing it not to be ableist, the tool reduced this bias for all but one of the disabilities tested. Photo: Solen Feyissa/Unsplash

While seeking research internships last year, University of Washington graduate student Kate Glazko noticed recruiters posting online that they’d used OpenAI’s ChatGPT and other artificial intelligence tools to summarize resumes and rank candidates. Automated screening has become commonplace in hiring. Yet Glazko, a doctoral student in the UW’s Paul G. Allen School of Computer Science & Engineering, studies how generative AI can replicate and amplify real-world biases — such as those against disabled people. How might such a system, she wondered, rank resumes that implied someone had a disability?

In a new study, UW researchers found that ChatGPT consistently ranked resumes with disability-related honors and credentials — such as the “Tom Wilson Disability Leadership Award” — lower than the same resumes without those honors and credentials. When asked to explain the rankings, the system spat out biased perceptions of disabled people. For instance, it claimed a resume with an autism leadership award had “less emphasis on leadership roles” — implying the stereotype that autistic people aren’t good leaders.

But when researchers customized the tool with written instructions directing it not to be ableist, the tool reduced this bias for all but one of the disabilities tested. Five of the six implied disabilities — deafness, blindness, cerebral palsy, autism and the general term “disability” — improved, but only three ranked higher than resumes that didn’t mention disability.

The team presented its findings June 5 at the 2024 ACM Conference on Fairness, Accountability, and Transparency in Rio de Janeiro.

“Ranking resumes with AI is starting to proliferate, yet there’s not much research behind whether it’s safe and effective,” said Glazko, the study’s lead author. “For a disabled job seeker, there’s always this question when you submit a resume of whether you should include disability credentials. I think disabled people consider that even when humans are the reviewers.”

Researchers used one of the study’s authors’ publicly available curriculum vitae (CV), which ran about 10 pages. The team then created six enhanced CVs, each implying a different disability by including four disability-related credentials: a scholarship; an award; a diversity, equity and inclusion (DEI) panel seat; and membership in a student organization.

Researchers then used ChatGPT’s GPT-4 model to rank these enhanced CVs against the original version for a real “student researcher” job listing at a large, U.S.-based software company. They ran each comparison 10 times; in 60 trials, the system ranked the enhanced CVs, which were identical except for the implied disability, first only one quarter of the time.
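The comparison protocol above can be sketched as a small harness. This is a hypothetical reconstruction, not the study’s code: `rank_first` is a stub standing in for a real GPT-4 call, which would receive both CVs along with the job listing and return the model’s top-ranked CV.

```python
import random

def rank_first(enhanced_cv: str, control_cv: str) -> str:
    """Stub for a GPT-4 ranking call. A real harness would send both CVs
    and the job listing to the model and parse which CV it ranks first."""
    return random.choice([enhanced_cv, control_cv])  # placeholder choice

def first_place_rate(enhanced_cv: str, control_cv: str, trials: int = 10) -> float:
    """Repeat the pairwise comparison and report how often the enhanced CV
    (identical to the control except for disability credentials) ranks first."""
    wins = sum(rank_first(enhanced_cv, control_cv) == enhanced_cv
               for _ in range(trials))
    return wins / trials

# The study ran 10 such comparisons for each of six enhanced CVs (60 trials
# total) and found the enhanced CV first only about a quarter of the time.
rate = first_place_rate("enhanced CV text", "control CV text")
```

Since the enhanced CV strictly adds credentials, an unbiased ranker would yield a rate near 1.0; the observed rate of roughly 0.25 is what flags the bias.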

“In a fair world, the enhanced resume should be ranked first every time,” said senior author Jennifer Mankoff, a UW professor in the Allen School. “I can’t think of a job where somebody who’s been recognized for their leadership skills, for example, shouldn’t be ranked ahead of someone with the same background who hasn’t.”

When researchers asked GPT-4 to explain the rankings, its responses exhibited explicit and implicit ableism. For instance, it noted that a candidate with depression had “additional focus on DEI and personal challenges,” which “detract from the core technical and research-oriented aspects of the role.”

“Some of GPT’s descriptions would color a person’s entire resume based on their disability and claimed that involvement with DEI or disability is potentially taking away from other parts of the resume,” Glazko said. “For instance, it hallucinated the concept of ‘challenges’ into the depression resume comparison, even though ‘challenges’ weren’t mentioned at all. So you could see some stereotypes emerge.”

Given this, researchers were interested in whether the system could be trained to be less biased. They turned to the GPTs Editor tool, which allowed them to customize GPT-4 with written instructions (no code required). They instructed this chatbot to not exhibit ableist biases and instead work with disability justice and DEI principles.

They ran the experiment again, this time using the newly trained chatbot. Overall, this system ranked the enhanced CVs higher than the control CV 37 times out of 60. However, for some disabilities, the improvements were minimal or absent: The autism CV ranked first only three out of 10 times, and the depression CV only twice (unchanged from the original GPT-4 results).
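Re-assembling the quoted counts as proportions makes the uneven improvement easier to see. The numbers below are only the ones reported in the article (the overall totals plus the autism and depression counts); this is illustrative arithmetic, not study code:

```python
TOTAL_TRIALS = 60          # six enhanced CVs x 10 comparisons each
TRIALS_PER_CONDITION = 10

# Times the enhanced CV ranked first, before vs. after the written
# anti-ableism instructions ("one quarter" of 60 trials = 15).
overall = {"baseline GPT-4": 15, "instructed GPT": 37}
instructed_per_condition = {"autism": 3, "depression": 2}  # out of 10 each

for label, wins in overall.items():
    print(f"{label}: {wins}/{TOTAL_TRIALS} = {wins / TOTAL_TRIALS:.0%}")
# baseline GPT-4: 15/60 = 25%
# instructed GPT: 37/60 = 62%
```

Even after the instructions, the depression condition sits at 2 out of 10, the same as before, which is why the researchers caution that the written instructions reduced but did not remove the bias.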

“People need to be aware of the system’s biases when using AI for these real-world tasks,” Glazko said. “Otherwise, a recruiter using ChatGPT can’t make these corrections, or be aware that, even with instructions, bias can persist.”

Researchers note that some organizations are working to improve outcomes for disabled job seekers, who face biases whether or not AI is used for hiring. They also emphasize that more research is needed to document and remedy AI biases. Future work includes testing other systems, such as Google’s Gemini and Meta’s Llama; including other disabilities; studying how the system’s bias against disabilities intersects with other attributes, such as gender and race; exploring whether further customization could reduce biases more consistently across disabilities; and seeing whether the base version of GPT-4 can be made less biased.

“It is so important that we study and document these biases,” Mankoff said. “We’ve learned a lot from and will hopefully contribute back to a larger conversation — not only regarding disability, but also other minoritized identities — around making sure technology is implemented and deployed in ways that are equitable and fair.”

Additional co-authors were a UW undergraduate in the Allen School; a UW doctoral student in the Allen School; and a researcher who completed this research as a UW undergraduate in the Allen School and is an incoming doctoral student at the University of Wisconsin–Madison. This research was funded by the National Science Foundation; by donors to the UW’s Center for Research and Education on Accessible Technology and Experiences (CREATE); and by Microsoft.

For more information, contact Glazko at glazko@cs.washington.edu and Mankoff at jmankoff@cs.washington.edu.

Can AI help boost accessibility? These researchers tested it for themselves
/news/2023/11/02/ai-accessibility-chatgpt-midjourney-ableist/
Thu, 02 Nov 2023

Four AI-generated images show different interpretations of a doll-sized “crocheted lavender husky wearing ski goggles,” including two pictured outdoors and one against a white background.
Seven researchers at the University of Washington tested AI tools’ utility for accessibility. Though researchers found cases in which the tools were helpful, they also found significant problems. These AI-generated images helped one researcher with aphantasia (an inability to visualize) interpret imagery from books and visualize concept sketches of crafts, yet other images perpetuated ableist biases. Photo: University of Washington/Midjourney — AI GENERATED IMAGE

Generative artificial intelligence tools like ChatGPT, an AI-powered language tool, and Midjourney, an AI-powered image generator, can potentially assist people with various disabilities. These tools could summarize content, compose messages or describe images. Yet the degree of this potential is an open question, since, in addition to regularly producing inaccuracies and “hallucinating” information, these tools can reproduce ableist biases.

This year, seven researchers at the University of Washington conducted a three-month autoethnographic study — drawing on their own experiences as people with and without disabilities — to test AI tools’ utility for accessibility. Though researchers found cases in which the tools were helpful, they also found significant problems with AI tools in most use cases, whether they were generating images, writing Slack messages, summarizing writing or trying to improve the accessibility of documents.

The team presented its findings Oct. 22 at the ASSETS 2023 conference in New York.

“When technology changes rapidly, there’s always a risk that disabled people get left behind,” said senior author Jennifer Mankoff, a UW professor in the Paul G. Allen School of Computer Science & Engineering. “I’m a really strong believer in the value of first-person accounts to help us understand things. Because our group had a large number of folks who could experience AI as disabled people and see what worked and what didn’t, we thought we had a unique opportunity to tell a story and learn about this.”

The group presented its research in seven vignettes, often amalgamating experiences into single accounts to preserve anonymity. For instance, in the first account, “Mia,” who has intermittent brain fog, deployed ChatPDF.com, which summarizes PDFs, to help with work. While the tool was occasionally accurate, it often gave “completely incorrect answers.” In one case, the tool was both inaccurate and ableist, changing a paper’s argument to sound like researchers should talk to caregivers instead of to chronically ill people. “Mia” was able to catch this, since the researcher knew the paper well, but Mankoff said such subtle errors are some of the “most insidious” problems with using AI, since they can easily go unnoticed.

Yet in the same vignette, “Mia” used chatbots to create and format references for a paper they were working on while experiencing brain fog. The AI models still made mistakes, but the technology proved useful in this case.

Mankoff, who’s spoken publicly about having Lyme disease, contributed to this account. “Using AI for this task still required work, but it lessened the cognitive load. By switching from a ‘generation’ task to a ‘verification’ task, I was able to avoid some of the accessibility issues I was facing,” Mankoff said.

The results of the other tests researchers selected were equally mixed:

  • One author, who is autistic, found AI helped to write Slack messages at work without spending too much time troubling over the wording. Peers found the messages “robotic,” yet the tool still made the author feel more confident in these interactions.
  • Three authors tried using AI tools to increase the accessibility of content such as tables for a research paper or a slideshow for a class. The AI programs were able to state accessibility rules but couldn’t apply them consistently when creating content.
  • Image-generating AI tools helped an author with aphantasia (an inability to visualize) interpret imagery from books. Yet when they used the AI tool to create an illustration of “people with a variety of disabilities looking happy but not at a party,” the program could conjure only fraught images of people at a party that included ableist incongruities, such as a disembodied hand resting on a disembodied prosthetic leg.

“I was surprised at just how dramatically the results and outcomes varied, depending on the task,” said lead author Kate Glazko, a UW doctoral student in the Allen School. “In some cases, such as creating a picture of people with disabilities looking happy, even with specific prompting — can you make it this way? — the results didn’t achieve what the authors wanted.”

The researchers note that more work is needed to develop solutions to problems the study revealed. One particularly complex problem involves developing new ways for people with disabilities to validate the products of AI tools, because in many cases when AI is used for accessibility, either the source document or the AI-generated result is inaccessible. This happened in the ableist summary ChatPDF gave “Mia” and when “Jay,” who is legally blind, used an AI tool to generate code for a data visualization. He could not verify the result himself, but a colleague said it “didn’t make any sense at all.” The frequency of AI-caused errors, Mankoff said, “makes research into accessible validation especially important.”

Mankoff also plans to research ways to document the kinds of ableism and inaccessibility present in AI-generated content, as well as investigate problems in other areas, such as AI-written code.

“Whenever software engineering practices change, there is a risk that apps and websites become less accessible if good defaults are not in place,” Glazko said. “For example, if AI-generated code were accessible by default, this could help developers to learn about and improve the accessibility of their apps and websites.”

Co-authors on this paper are a researcher who completed this work as a UW postdoctoral scholar in the Allen School and is now at Rice University; several UW doctoral students in the Allen School; and a researcher who completed this work as a UW doctoral student in the Information School and is now at the Massachusetts Institute of Technology. This research was funded by Meta, the UW Center for Research and Education on Accessible Technology and Experiences (CREATE), Google, an NIDILRR ARRT grant and the National Science Foundation.

For more information, contact Glazko at glazko@cs.washington.edu and Mankoff at jmankoff@cs.washington.edu.

From ‘distress’ to ‘unscathed’ — mental health of UW students during spring 2020
/news/2021/07/13/mental-health-of-uw-students-during-spring-2020/
Tue, 13 Jul 2021
To understand how the UW’s transition to online-only classes affected college students’ mental health in the spring of 2020, UW researchers surveyed 147 UW undergraduates over the 2020 spring quarter.

In early March 2020, the University of Washington became the first four-year U.S. university to transition to online-only classes due to the COVID-19 pandemic.

Many anticipated severe consequences from these physical distancing measures. To understand how this change affected college students’ mental health, UW researchers surveyed 147 UW students over the 2020 spring quarter, which began shortly after the university transitioned to online-only classes. The team compared the students’ responses to a previous survey of 253 students in spring quarter 2019.

The researchers didn’t see much change in average levels of students’ depressive symptoms, anxiety, stress or loneliness between 2019 and 2020 or between the beginning and the end of spring quarter 2020. But these average values were masking large differences in students’ individual pandemic experiences. In general, students who used more problem-focused forms of coping — creating plans, focusing on positive aspects, etc. — experienced fewer mental health symptoms than those who disengaged or ignored a situation that was bothering them.

The researchers published these findings June 28 in PLOS ONE.

“During the pandemic, the challenges of online learning were entwined with social isolation, family demands and socioeconomic pressures,” said lead author Margaret Morris, an affiliate associate professor in the UW Information School. “There’s not a simple answer to the question of how students were affected: Some experienced intense distress while others were unscathed.”

For the past four years, this team has spent spring quarter studying what factors contribute to undergraduates’ overall mental health and well-being. Students are invited to continue participating in each spring quarter study, and the researchers also recruit new students each time. In a previous paper, the researchers found that experiencing discrimination events altered student behavior, such as the amount of sleep or exercise a student got following the event.

For the 2020 cohort, the team used three different survey methods to monitor student health. First, they sent large surveys at the beginning and end of spring quarter. Then participants received two shorter surveys each week that asked them to reflect on how they felt — in terms of stress, loneliness, depressive symptoms — in the moment.

In general, students who reported more mental health symptoms at the beginning of the pandemic continued to experience elevated symptoms during the pandemic.

“Problem-focused coping protected students from the harmful effects of stress (anxiety and depression, for example), even though students who used more problem-focused strategies reported more stress,” said co-author Kevin Kuehn, a UW doctoral student in clinical psychology.

“What these findings suggest is that students who coped by actively confronting their challenges, rather than avoiding them, still experienced highly stressful events over the course of the pandemic. However, they were protected from the mental health consequences,” Kuehn said. “It may not always feel pleasant or easy to confront the challenges of daily life, particularly during a pandemic, but doing so is likely to be highly beneficial in terms of reducing anxiety and depressive symptoms.”

Finally, at the end of spring quarter, the team conducted 90-minute in-depth interviews over Zoom with a subset of participants to gain deeper insight into their experiences.

The students described a range of challenges that interfered with learning:

  • Decreased interaction with faculty and peers — students mentioned that having fewer opportunities to interact with faculty and peers left them feeling less engaged. Some students said they felt like part-time students, even when they had full course loads
  • No shared learning environments — students spoke longingly of a table in a dorm or a spot in the library where they used to gather with classmates for impromptu study sessions
  • Family needs — family members’ requests or noise often interrupted studying and even test-taking. Family needs, such as caregiving, were a particular challenge to learning for first-generation college students
  • Interrupted autonomy — some students felt “trapped” back at home and described difficult “power dynamics” with their parents
  • Well-being and mental health — many students described disrupted sleep, decreased motivation, and said that they felt depressed or anxious for periods of time. Students’ feelings of detachment from school sometimes contributed to depression. Similarly, worry about grades sometimes cascaded into anxiety and insomnia that, in turn, made it harder to focus

Students also developed strategies to combat these challenges, including:

  • Self-learning — students used independent online research to figure out answers to their questions and made up their own experiments to explore what they were learning in class
  • Structuring routines and environments — many students created fixed schedules for studying or used physical calendars to mark timelines and assignments
  • Learning with peers — students created remote study groups and held informal remote co-working sessions that combined homework with personal conversations, which helped keep them on task
  • Participating more in online spaces — many students found it less daunting to ask questions in online classes than in large lecture halls, while others found it easier to participate in online office hours and meetings with advisers
  • Using communication platforms for emotional wellbeing — some students used telehealth or meditation apps, but almost all of them used video communication to check in with their friends. Students emphasized that these connections were critical for their mental health

“On an optimistic note, students are emerging with critical skills for learning and maintaining connectedness with peers over a distance,” Morris said. “These active coping skills, which include things such as initiating virtual co-working sessions, leveraging online functions to participate in class and checking in on friends in an emotionally sensitive way, will have continued value as we resume in-person and hybrid models of education.”

The team plans to follow students through all four years of their time at the UW. The first study cohort graduated this year, and the second cohort will graduate in spring 2022.

Additional co-authors are Jennifer Brown, an alumna of the 91Ƶ of Public Health who is the research coordinator for this project; a professor in the 91Ƶ of Social Work; UW doctoral students in the Paul G. Allen School of Computer Science & Engineering; a doctoral student in the Information School; a UW professor of electrical and computer engineering; the professor and dean of the UW Information School; a researcher at Google; and a professor in the Allen School. This research was funded by the National Science Foundation, the National Institute of Mental Health, Google, the Allen School, the UW Department of Electrical & Computer Engineering, the UW College of Engineering and the UW Population Health Initiative.

For more information, contact Morris at margiemm@uw.edu.

Grant numbers: EDA-2009977, CHS-2016365, CHS-1941537, F31MH117827

UW launches new Center for Research and Education on Accessible Technology and Experiences with $2.5 million investment from Microsoft
/news/2020/05/28/create-announced/
Thu, 28 May 2020
Martez Mott works on Smart Touch with Provail participant Ken Frye. Photo: Dennis Wise/University of Washington

The University of Washington today announced the establishment of the Center for Research and Education on Accessible Technology and Experiences (CREATE). Fueled by a $2.5 million inaugural investment from Microsoft, CREATE is led by an interdisciplinary team whose mission is to make technology accessible and to make the world accessible through technology.

“We are proud to partner with the 91̽on their journey to build the CREATE center,” said Brad Smith, president of Microsoft. “This is the next step in a longstanding journey to empower people with disabilities with accessibility and technology advancements. 91̽has truly embedded accessibility as part of their culture and we’re proud to support their next step to drive thought leadership on accessibility to empower people with disabilities.”

In the 30 years since the Americans with Disabilities Act’s enactment, there have been enormous strides in the accessibility of public spaces and the availability of personal mobility technologies. Yet equitable participation in society depends on the successful use of technology, now more than ever.

People with disabilities depend on technology, and if accessibility is not embedded into the development process from the start, people can be left behind. Achieving accessibility requires expertise and innovation across a range of disciplines. As a result, the challenge of developing technology for a more accessible world is outpacing even the most talented individual researchers and small teams.

“CREATE will help us take accessible technology research and education from small, incremental gains to true breakthroughs. This chance to advance inclusion and participation for people of all abilities is the kind of opportunity that inspires the entire UW community,” said UW President Ana Mari Cauce.

The UW is a global leader in accessible technology research and design. The center will bring together existing areas of excellence and build upon the university’s ability to catalyze progress in education, research and translation. CREATE faculty bring multiple perspectives not just in technology but also in disability rights and advocacy.

Read more about CREATE in this Q&A with Jacob O. Wobbrock, professor and inaugural co-director of the center.

The CREATE leadership team hails from six campus departments in three different colleges, including the Paul G. Allen School of Computer Science & Engineering, The Information School, Rehabilitation Medicine in the 91̽School of Medicine, Mechanical Engineering, Human Centered Design & Engineering, and the Disability Studies Program.

The center will build upon current projects in prioritizing and automating personalization; transitioning transportation to be accessible; augmenting abilities through wearable technologies; developing inclusive, intelligent systems and data sets; and “do-it-yourself” accessible technology production.

The UW and Microsoft have been working together in this space for more than a decade and share the same values and commitment to work with the disability community on driving innovation in accessibility research. This partnership has opened student internship and career opportunities, as well as ongoing research engagements with Microsoft Research. Current projects include developing audio-first representations of websites for smart speakers; understanding how perceptions of software developer job candidates with autism may impact hiring decisions; AI-based sign language recognition and translation, as well as ongoing work on an ASL-to-English dictionary; and data-driven mental health apps.

 


In addition to the impact of Microsoft’s funding for this collaboration, the company’s endorsement of the UW’s accessibility work promises to catalyze additional investment, particularly in the Pacific Northwest, which ultimately could generate the full funding needed to provide long-term support for the center. The goal is to raise $10 million for CREATE to provide five years of support. The center employs a consortium model for academic, industry and community partners. CREATE is seeking additional partners who are interested in the deployment of accessible technology and the development of inclusive communities.

“The University of Washington has for many years led the field in cutting-edge accessible technology research and design,” said Jacob O. Wobbrock, professor and inaugural co-director of the center. “Our faculty and students are incredibly motivated to tackle the hard problems of accessibility. Now, with CREATE, we will be able to take on even bigger collaborative challenges in this space. I am honored to work with co-director Jennifer Mankoff, and to be supported by such world-class colleagues in the center.”

 

Single discrimination events alter college students’ daily behavior
/news/2019/11/04/single-discrimination-events-alter-college-students-daily-behavior/
Mon, 04 Nov 2019
UW researchers used data from Fitbit activity trackers to compare how students’ daily activities change when the students experience unfair treatment. Photo: Addie Bjornson/University of Washington

Discrimination — differential treatment based on an aspect of someone’s identity, such as nationality, race, sexual orientation or gender — is linked to lower success in careers and poorer health. But there is little information about how individual discrimination events affect people in the short term and then lead to these longer-term disparities.

University of Washington researchers aimed to understand both the prevalence of discrimination events and how these events affect college students in their daily lives.

Over the course of two academic quarters, the team compared students’ self-reports of unfair treatment to passively tracked changes in daily activities, such as hours slept, steps taken or time spent on the phone. On average, students who encountered unfair treatment were more physically active, interacted with their phones more and spent less time in bed on the day of the event. The team will present its findings Nov. 12 at the ACM Conference on Computer-Supported Cooperative Work in Austin, Texas.

“We looked at objective measures of behavior to try to really understand how this experience changed students’ daily life,” said lead author Yasaman Sefidgar, a doctoral student in the UW’s Paul G. Allen School of Computer Science & Engineering. “The ultimate goal is to use this information to develop changes that we can make both in terms of the educational structure and individual support systems for students to help them succeed both during and after their time in college.”

The project started out as a way to monitor students’ mental health during college.

“I was struck by how many students suffered from mental health issues and depression, due in part to the increased stress of college and being away from home,” said co-author Anind Dey, professor and dean of the UW Information School. “Our approach in this paper, using passive sensing and data modeling, really lends itself to studying frequent events. Unfair treatment, or discrimination, might happen repeatedly in a quarter.”

The team recruited 209 first-year UW students from across campus for a study over the 2018 winter and spring academic quarters. Of the 176 students who completed the study, 41% were in the College of Engineering while the rest were spread among various academic colleges, 65% identified as women and 29% identified as first-generation college students.

Participants wore Fitbit Flex 2 devices to track daily activities like time asleep and physical activity. The students’ phones also tracked location, activity, screen unlocking events and phone call length.

The team sent the students a series of surveys throughout the six-month study, including short “check-in” surveys at least twice a week. During the weeks before midterm and final exams, the students got a variation of this survey four times every day. Among the survey questions: Had the student, in the past 24 hours, been unfairly treated because of “ancestry or national origin, gender, sexual orientation, intelligence, major, learning disability, education or income level, age, religion, physical disability, height, weight or other aspect of one’s physical appearance?”

“We had a very large table comparing everything, such as the number of steps that you’ve had for each day,” Sefidgar said. “We also marked the days for the reports when they exist. Then it’s a matter of determining for each individual whether there are changes for days with discrimination events compared to days with no events.”

Overall, the researchers collected around 450 discrimination events and about one terabyte of data. The team analyzed people’s actions on days when they were and weren’t experiencing discrimination. On average, when students reported an unfair event they walked 500 more steps, had one more phone call in the evening, interacted five more times with their phones in the morning and spent about 15 fewer minutes in bed compared to days when they didn’t experience discrimination.
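The per-student contrast described above amounts to a paired comparison over daily logs, sketched below. The field names and numbers are hypothetical stand-ins for the Fitbit and phone-sensing streams, chosen so the toy output mirrors the reported 500-step difference:

```python
from statistics import mean

def event_day_contrast(days, metric):
    """Mean difference (event days minus non-event days) in one behavioral
    metric for a single student, mirroring the per-individual comparison."""
    event = [d[metric] for d in days if d["reported_event"]]
    quiet = [d[metric] for d in days if not d["reported_event"]]
    return float(mean(event) - mean(quiet))

# Toy log: one student's daily step counts.
days = [
    {"steps": 7500, "reported_event": True},
    {"steps": 7100, "reported_event": False},
    {"steps": 6900, "reported_event": False},
]
print(event_day_contrast(days, "steps"))  # 500.0 more steps on event days
```

The real analysis repeated this kind of contrast per individual across many metrics, which is why some students show increases and others decreases rather than one uniform effect.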

“It’s so hard to summarize the impact of something like this in a few statistics,” said senior author Jennifer Mankoff, a professor in the Allen School. “Some people move more, sleep more or talk on the phone more, while some people do less. Maybe one student is reacting by playing games all day and another student put down their phone and went to hang out with a friend. It’s giving us a lot of questions to follow up on.”

Students listed ancestry or national origin, intelligence and gender as the top three reasons for experiencing unfair treatment.

The study likely didn’t capture all discrimination events, according to the researchers. For example, the survey didn’t include race as a reason for unfair treatment, and the students weren’t surveyed every day.

“This was just a snapshot of some of the things the students experienced on the 40 days we surveyed them,” Mankoff said. “But more than half of them reported experiencing at least one discrimination event, often four or five events.”

The team repeated this study in the 2019 spring quarter, and it plans to continue to gather data on students over the next few years. The researchers have also started interviewing students to get a better understanding of how unfair treatment happens in the context of their other experiences.

“This project is helping us better understand challenges that our students face in real time,” said co-author Eve Riskin, the associate dean of diversity and access for the UW College of Engineering and the principal investigator for the program. “With this understanding we should be able to design better interventions to improve the climate for all students.”

To learn more about the project, check out the team’s website.

The researchers also found that discrimination is associated with increased depression and loneliness, but less so for people with better social support.

“These results help underscore the deep impacts of discrimination on mental health, and the importance of resources like social support in helping to reduce the impact of discrimination in the long term,” said a co-author, a professor in the UW School of Social Work.

Students who completed the study received up to $245 and were allowed to keep their Fitbits.

“These students are not just giving us data, which sounds like some abstract, unemotional term. They are sharing deeply personal information with us,” Mankoff said. “It’s very important to me that we honor that gift by finding ways to help that don’t place the responsibility to deal with discrimination all on the individual. I’m not going to be satisfied if all we do is say, ‘If you just did X differently…’ Coping strategies are really important, but we also need to ask how we can change the structural things that are leading to these experiences.”

Additional co-authors are a doctoral student at the University of Michigan who helped run the study after completing his undergraduate degree at the UW; a clinical psychology doctoral student at the UW; a professor in the Allen School; and the founding director of the UW Resilience Lab. This research was funded by the National Science Foundation; the National Institute on Disability, Independent Living and Rehabilitation Research; the UW College of Engineering; the Allen School; and the UW Department of Electrical & Computer Engineering.

For more information, contact the team at uwexperience@uw.edu and Mankoff at jmankoff@cs.washington.edu.

Grant numbers: IIS1816687, IIS7974751, 90DPGE0003-01

Researchers develop 3D printed objects that can track and store how they are used
Published Oct. 9, 2018 | /news/2018/10/09/3-d-printed-analytics/
Researchers at the University of Washington have developed 3D printed assistive devices that can track and store their own use — without batteries or electronics. Photo: Mark Stone/University of Washington

Cheap and easily customizable, 3D printed devices are perfect for assistive technology, like prosthetics or “smart” pill bottles that can help patients remember to take their daily medications.

But these plastic parts don’t have electronics, which means they can’t monitor how patients are using them.


Now engineers at the University of Washington have developed 3D printed devices that can track and store their own use — without using batteries or electronics. Instead, this system uses a method called backscatter, through which a device can share information by reflecting signals that have been transmitted to it with an antenna.

“We’re interested in making accessible assistive technology with 3D printing, but we have no easy way to know how people are using it,” said co-author Jennifer Mankoff, a professor in the UW’s Paul G. Allen School of Computer Science & Engineering. “Could we come up with a circuitless solution that could be printed on consumer-grade, off-the-shelf printers and allow the device itself to collect information? That’s what we showed was possible in this paper.”

The team behind the 3D printed wireless analytics project. Back row (left to right): Vikram Iyer, Jennifer Mankoff, Ian Culhane; front row: Shyam Gollakota, Justin Chan. Photo: Mark Stone/University of Washington

The UW team will present its findings Oct. 15 at a conference in Berlin.

The team previously developed purely plastic devices that can measure whether a detergent bottle is running low and then automatically order more online.

“Using plastic for these applications means you don’t have to worry about batteries running out or your device getting wet. That can transform the way we think of computing,” said senior author Shyam Gollakota, an associate professor in the Allen School. “But if we really want to transform 3D printed objects into smart objects, we need mechanisms to monitor and store data.”

The researchers tackled the monitoring problem first. Their previous system tracked movement in only one direction, which works well for monitoring laundry detergent levels or measuring wind or water speed. Now they needed to make objects that could monitor bidirectional motion, like the opening and closing of a pill bottle.

“Last time, we had a gear that turned in one direction. As liquid flowed through the gear, it would push a switch down to contact the antenna,” said lead author Vikram Iyer, a doctoral student in the UW Department of Electrical & Computer Engineering. “This time we have two antennas, one on top and one on bottom, that can be contacted by a switch attached to a gear. So opening a pill bottle cap moves the gear in one direction, which pushes the switch to contact one of the two antennas. And then closing the pill bottle cap turns the gear in the opposite direction, and the switch hits the other antenna.”

Movement is captured when the switch contacts one of the two antennas.

Both of the antennas are identical, so the team had to devise a way to decode which direction the cap was moving.

These gears’ teeth encode specific messages. Photo: Kiyomi Taguchi/University of Washington

“The gear’s teeth have a specific sequencing that encodes a message. It’s like Morse code,” said co-author Justin Chan, a doctoral student in the Allen School. “So when you turn the cap in one direction, you see the message going forward. But when you turn the cap in the other direction, you get a reverse message.”
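The forward-versus-reversed decoding idea can be sketched as follows. The tooth pattern and direction labels here are hypothetical illustrations, not the encoding the researchers used; the only requirement is that the pattern read forward differs from the pattern read backward:

```python
# TOOTH_PATTERN is a hypothetical, non-palindromic bit sequence standing in
# for the gear's tooth encoding.
TOOTH_PATTERN = [1, 1, 0, 1, 0, 0, 1, 0]

def decode_direction(observed):
    """Return which way the cap turned, or None for an unrecognized pulse train."""
    if observed == TOOTH_PATTERN:
        return "opening"                      # message read forward
    if observed == list(reversed(TOOTH_PATTERN)):
        return "closing"                      # same message, read backward
    return None

print(decode_direction([1, 1, 0, 1, 0, 0, 1, 0]))  # opening
print(decode_direction([0, 1, 0, 0, 1, 0, 1, 1]))  # closing
```

Because the sequence is not a palindrome, a single set of teeth distinguishes both directions even though the two antennas are identical.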

In addition to tracking, for example, pill bottle cap movement, this same method can be used to monitor how people use prosthetics, such as 3D printed e-NABLE hands. These mechanical hands, which attach at the wrist, are designed to help children with hand abnormalities grasp objects. When children flex their wrists, cables on the hand tighten to make the fingers close. So the team 3D printed an e-NABLE arm with a prototype of their bidirectional sensor that monitors the hand opening and closing by determining the angle of the wrist.

The researchers also wanted to create a 3D printed object that could store its usage information while out of Wi-Fi range. For this application, they chose an insulin pen that could monitor its use and then signal when it was getting low.

“You can still take insulin even if you don’t have a Wi-Fi connection,” Gollakota said. “So we needed a mechanism that stores how many times you used it. Once you’re back in the range, you can upload that stored data into the cloud.”

This method requires a mechanical motion, like the pressing of a button, and stores that information by rolling up a spring inside a ratchet that can only move in one direction. Each time someone pushes the button, the spring gets tighter. It can’t unwind until the user releases the ratchet, hopefully when in range of the backscatter sensor. Then, as the spring unwinds, it moves a gear that triggers a switch to contact an antenna repeatedly as the gear turns. Each contact is counted to determine how many times the user pressed the button.
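A minimal sketch of the counting step described above, assuming a hypothetical number of gear teeth per button press and a toy sampled switch signal (neither figure comes from the paper):

```python
# TEETH_PER_PRESS is a hypothetical figure: one button press winds the spring
# by this many gear teeth, and each tooth yields one antenna contact on unwind.
TEETH_PER_PRESS = 4

def count_contacts(signal):
    """Count rising edges (0 -> 1 transitions) in a sampled switch signal."""
    return sum(1 for a, b in zip(signal, signal[1:]) if a == 0 and b == 1)

def stored_presses(signal):
    """Recover how many button presses the spring had stored."""
    return count_contacts(signal) // TEETH_PER_PRESS

# A toy pulse train with eight contacts -> two stored presses
signal = [0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0]
print(stored_presses(signal))  # 2
```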

Each time someone pushes the button, a spring inside the ratchet gets tighter.

These devices are only prototypes to show that it is possible for 3D printed materials to sense bidirectional movement and store data. The next challenge will be to take these concepts and shrink them so that they can be embedded in real pill bottles, prosthetics or insulin pens, Mankoff said.

“This system will give us a higher-fidelity picture of what is going on,” she said. “For example, right now we don’t have a way of tracking if and how people are using e-NABLE hands. Ultimately what I’d like to do with these data is predict whether or not people are going to abandon a device based on how they’re using it.”

Ian Culhane, an undergraduate student in the UW Department of Mechanical Engineering, is also a co-author on this paper. This research was funded by the National Science Foundation and Google Faculty Awards.

###

For more information, contact the research team at printedanalytics@cs.washington.edu.

Screen reader plus keyboard helps blind, low-vision users browse modern webpages
Published April 18, 2018 | /news/2018/04/18/screen-reader-plus-keyboard-helps-blind-low-vision-users-browse-modern-webpages/

Browsing through offerings on Airbnb means clicking on rows of photos to compare options from prospective hosts. This kind of table-based navigation is increasingly central to our digital lives, but navigating these modern webpages with traditional screen readers can be tedious or impossible for people who are blind or have low vision.

A new approach developed by engineers at the University of Washington and Carnegie Mellon University uses the keyboard as a two-dimensional way to access tables, maps and nested lists. Results to be presented April 25 at the CHI conference in Montreal find this tool lets blind and low-vision users navigate these kinds of sites much more successfully than screen readers alone.

A mockup shows how a user could press keys to select a top-level menu and submenu, and then click through options on a nested list to book a sightseeing activity through Airbnb. Photo: University of Washington

“We’re not trying to replace screen readers, or the things that they do really well,” said senior author Jennifer Mankoff, a professor in the UW’s Paul G. Allen School of Computer Science & Engineering. “But tables are one place that it’s possible to do better. This study demonstrates that we can use the keyboard to bring tangible, structured information back, and the benefits are enormous.”

The new tool, Spatial Recognition Interaction Techniques, or SPRITEs, maps different parts of the keyboard to areas or functions on the screen. A research trial asked 10 people, eight of whom were blind and two of whom had low vision, to complete a series of tasks using their favorite screen reader technology, and then using that technology plus SPRITEs. After a 15-minute tutorial, three times as many participants were able to complete spatial web-browsing tasks within the given time limit using SPRITEs, even though all were experienced with screen readers.

The SPRITEs tool uses keys to navigate a webpage. The top three rows activate menu and submenu items. The keys along the top row and outside edges act as horizontal and vertical coordinates for a table or map. Photo: University of Washington

The tool has users press keys to prompt the screen reader to move to certain parts of the website. For instance, number keys, along the top of the keyboard, map to menu buttons. Double-clicking on a number opens that menu item’s submenu, and then the top row of letters lets the user select each item in the submenu. For tables and maps, the keys on the outside edge of the keyboard act like coordinates that let the user navigate to different areas of the two-dimensional feature.

Tapping a number key might open an icon for each Airbnb menu option, for example. Then tapping the letter “u” could read out the entry that says whether this host will accept pets. (The Airbnb example illustrates how the system could work; the system’s current implementation is confined to wiki-style webpages.)
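The coordinate-style mapping described above could be sketched like this. The key assignments and table contents are illustrative inventions, not SPRITEs’ actual layout:

```python
# Hypothetical key layout and table: top-row keys index columns, left-edge
# keys index rows, so one key pair addresses one cell for the screen reader.
COL_KEYS = "1234567890"
ROW_KEYS = ["q", "a", "z"]

table = [
    ["Host",  "Pets allowed", "Price"],
    ["Alice", "Yes",          "$120"],
    ["Bob",   "No",           "$95"],
]

def cell_for_keys(row_key, col_key):
    """Return the table cell addressed by a left-edge key and a top-row key."""
    row = ROW_KEYS.index(row_key)
    col = COL_KEYS.index(col_key)
    return table[row][col]

print(cell_for_keys("a", "2"))  # "Yes" -- the cell the reader would speak
```

The point of the design is that a user who has learned the layout can jump straight to a cell with two keystrokes instead of stepping through every option linearly.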

“Rather than having to browse linearly through all the options, our tool lets people learn the structure of the site and then go right there,” Mankoff said. “You can learn which part of the keyboard you need to jump right down and check, say, whether dogs are allowed.”

Most of the test participants couldn’t complete tasks such as finding an item in a submenu or locating specific information in a table using their favorite screen reader, but could complete them using SPRITEs.

More study participants could complete tasks involving menus, tables and maps by using SPRITEs (orange bars) compared to using a screen reader alone (blue bars). Image: University of Washington

“A lot more people were able to understand the structure of the webpage if we gave them a tactile feedback,” said a co-author, a doctoral student at Carnegie Mellon University, who conducted the tests in Pittsburgh. “We’re not trying to replace the screen reader, we’re trying to work in conjunction with it.”

For straightforward text-based tasks such as finding a given section header, counting headings in a page or finding a specific word, participants were able to complete them successfully using either tool.

SPRITEs is one of a suite of tools that Mankoff’s group is developing to help visually impaired users navigate items on a two-dimensional screen. An ethnographic study in 2016, led by doctoral student Mark Baldwin and faculty member Gillian Hayes, both at the University of California, Irvine, observed about a dozen students over four months while they learned to use accessible computing tools, in order to find areas for improvement in screen reading technology.

Now that the team has developed and tested SPRITEs, it plans to make the system more robust for any website and then add it to WebAnywhere, a free online screen reader. Adding SPRITEs would let users navigate with their keyboard while using the WebAnywhere plugin to read information displayed on a webpage. The team also plans to develop a similar technique that would augment screen-reading technology on mobile devices.

“We hope to deploy something that will make a difference in people’s lives,” Mankoff said.

Other co-authors of the paper presented at the CHI meeting are Duncan McIsaac and Elliot Lockerman at Carnegie Mellon University. The research was funded by the U.S. Department of Health and Human Services.

###

For more information, contact Mankoff at jmankoff@cs.washington.edu.

HHS grant: 90DP5004-01-00
