Technology for older adults and their care partners, typically marketed for safety and security, has been evolving for years, and now includes everything from location trackers to companion robots.
But often left out of decisions about how and why to use that technology are the older adults themselves, says Clara Berridge, associate professor of social work at the University of Washington. Berridge studies issues facing older adults, in particular technology that can support care or a person's ability to live independently. She recently published two articles on older adults and artificial intelligence, exploring older adults' opinions of companion robots and finding that such devices may not provide the blanket comfort or utility that creators presume, and that older adults have an interest in data protections.
"Older adults have been learning about, adapting and integrating technology solutions into their lives for longer than anyone," said Berridge. "Older adults' feelings about technologies on offer to them for care and living at home, and their creative use, resistance and other interactions with these technologies, should be taken seriously. So much research, time and money has been focused on pushing acceptance of technologies; it could be better spent giving older adults control over direction, purpose and design."
Berridge spoke with UW News about the importance of involving older adults in the design and use of technology.
What do you think is important for people outside the field to understand?
CB: One of the themes in my research is that older adults are rarely empowered to refuse or negotiate how technology is used in their care. That doesn't prevent many from resisting and negotiating nevertheless (older adults are not passive users), but this kind of engagement is often discouraged in design and implementation. My research on long-term care in people's homes and in residential facilities has found that people are not meaningfully engaged in decisions about how and what data should be collected about them. There's a misperception that most don't want to be involved or consulted. Older adults are often configured as passive data points.

This matters because embedded in these technologies are certain values (e.g., that safety justifies invasion of privacy) or limitations on the older adult (i.e., they can't deviate from routine, like younger adults can, without triggering an intervention). With the algorithmic management of care, the person being targeted by technology may not share the priorities embedded in the devices themselves. When the technology practice enables older adults to be controlled, rather than enabling them to have control, it intensifies unbalanced power dynamics in care. It can mean restrictions on, or exercise of control over, their lives. There's a lot of ageism at play in how technologies used in care are developed, hyped and implemented. And ableism, particularly when it comes to dementia.
People might assume that surveillance technology is a means of protection. Can you talk about the positive and negative aspects of this kind of technology?
CB: Different people are going to weigh the range of implications differently. In a single day, I have talked with a person who likened her daughter using sensors to track her routines to the Gestapo, and another person without a support network who "wish[ed] someone cared enough about me to watch me on camera!"
As many scholars, including faculty at the UW, have shown, power is implicated in all the decisions: what data is collected, what's excluded, how categories are determined, what behavior is allowed or prohibited, and the very definition of the problem the technology is targeting. We have been in a wake-up call moment, as COVID-19 further exposed the dramatic racial disparities in health among older people and the lack of access to home care. Where's the meaningful policy action? Where's the influx of resources needed to combat social exclusion and racial inequities in health and long-term care access? What we see are efforts to give lonely older adults AI companionship, deploy digital contact tracing in nursing homes, and employ AI to manage people more efficiently. Whose problem is that addressing? What are the opportunity costs of pursuing sideshow solutions that don't touch the underlying problems?
You note in a recent study that some state aging agencies are distributing companion robots to assist with loneliness. Are they a solution?
CB: Prompted by the pandemic, many states have distributed furry pet-like or speaking robots with the goal of mitigating loneliness.
Together with researchers at Oregon Health & Science University, I surveyed more than 800 people, half of whom were over age 64, about whether they thought an artificial companion robot (one that can "talk and listen," as opposed to Paro or other plush non-talking robots) would help them feel less lonely, if they were lonely. Only 3% replied "definitely yes." A quarter answered "definitely no," and a significantly greater percentage answered "probably no" than "probably yes." We also asked about comfort with small companion robots and different types of data use on the horizon.
One participant's comment echoed a common sentiment in the study: "One of the problems I see with how we care for the elderly is the lack of contact with others. I am afraid that these measures would lead to less and less human contact for these folks. It might become easier and cheaper for the care system to use these measures and for our elderly to become more and more isolated." Others acknowledged positive potential uses of AI-enabled robots to assist with physical tasks (a capacity that is limited at the moment) but drew a line at social interaction.
But the most-raised issue was invasion of privacy, and the perception that artificial companion robot-based data collection is excessive monitoring. Many participants also raised concerns about data security, third-party use, or exploitation of data. For example, one participant explained that an AI companion or device could be especially helpful in alerting emergency personnel, but they were concerned about how data from such a device would be stored and used by third-party companies. Clearly, older adults would benefit from new regulation of AI, though their interests haven't been represented very well so far in those discussions.
How might older adults be more included in designing technologies?
CB: Last spring, I had the pleasure of working with design students from the UW School of Art + Art History + Design alongside my colleague, Audrey Desjardins, associate professor of interaction design at the UW. Her Interface Design 2 students designed fun and creative IoT (Internet of Things) system prototypes, many of which were better targeted to the breadth of desires and needs of older adults than what the current "AgeTech" space is producing. Dr. Desjardins had each student interview older people, then share back as a class and collectively identify themes, before conceiving of their products. And unlike many marketed devices, the students provided their imagined users with transparent information about data flows.
You've developed a prototype web app, Let's Talk Tech, that would help older adults with dementia and their caregivers make joint decisions about technology. What are you finding from that approach?
CB: This is a first-of-its-kind tool to help families meaningfully engage people living with dementia in decisions about technology use. I worked with the UW Clinical Informatics Research Group (CIRG) to develop Let's Talk Tech as a web app, and with doctoral student Natalie Turner to pilot the app with volunteers from the UW Alzheimer's Disease Research Center.
Families and others supporting people living with dementia often don't have these conversations about what the person would or would not want used with them, for understandable reasons. So families need help: they're left to navigate a complex technological landscape without a map. Having an informed conversation with the person living with mild-stage Alzheimer's disease about their hopes, concerns and feelings regarding various technologies and other support options made a significant, positive difference for our pilot study participants. They told us it provided the structure and direction for hard conversations that may not have otherwise happened. Many reported that it was not an easy conversation, but all noted that any discomfort was worth what was gained.
For more information, contact Berridge at clarawb@uw.edu.