Department of Industrial & Systems Engineering – UW News

Statewide effort to put more whole grains on shelves and plates gets $19M boost
Oct. 29, 2025

A worker at WSU's Breadlab shapes dough into a loaf. Credit: Washington State University

A statewide initiative to put more healthy, climate-friendly grains on people's plates has received a $19 million boost, which will sustain every step in building a network from the field to the fork.

The initiative, a public-private partnership led by Washington State University with support from the University of Washington, received a $10 million BioInnovation Grant from the and matching funds from several other organizations, including more than $3 million from the Washington Grain Commission.

It targets a global health concern: the lack of whole grains in people's diets, which contributes to widespread health problems.

The funding will allow WSU researchers to continue developing new crop varieties for farmers. It will fuel efforts to bring more whole grains to the public, including into school lunchrooms, and will expand Washington state's commercial infrastructure for storing, transporting, milling and marketing whole grains. The funding will also support the establishment of a commercial kitchen at the University of Washington to help entrepreneurs bring whole-grain foods to market.

"This work is about making sure that nutritious grains reach the people who need them," said Jennifer Otten, a professor of environmental and occupational health sciences at the UW. "By understanding the policies, systems, and human decisions that shape food production and the supply chain for school meals, we can help bridge the gap between innovation and impact."

Two teams of UW researchers will contribute to this effort.

Ashis Banerjee, professor of industrial & systems engineering and of mechanical engineering at the UW, will help lead development of ready-to-eat meals and will support private organizations using UW facilities to produce sample meals for school breakfast and lunch.

The UW's implementation science team, which includes Otten, assistant professor of environmental and occupational health sciences , and assistant professor of epidemiology , will examine how innovations in grain breeding and food product development can be successfully adopted in school settings. They will study what policy, budgetary and social factors help ensure that new whole grain and legume varieties are embraced across the supply chain and, ultimately, by school-aged children who rely on them for the nutrition they need to grow and thrive.

This team will also lead study-away programs, where students can learn about new whole grains and legumes in both urban and rural areas of Washington state. Curriculum from these five-week summer programs will be made publicly available.

"The timing of the grant is perfect," said Stephen Jones, a WSU professor of international seed and cropping systems and director of the WSU Breadlab, who will lead the grant work. "We're right at the stage where we've got a critical mass of cross-disciplinary research, encompassing a range of agricultural, food and health sciences. Now we can start commercializing, getting these crop varieties to farmers, getting whole grains on our plates and into schools."

The grant funding will be matched by contributions from the Washington Grain Commission, the USA Dry Pea and Lentil Council, the American Heart Association, The Land Institute, and food and technology companies.

"This is truly a historic investment for Washington farmers," said Casey Chumrau, CEO of the Washington Grain Commission.

Adapted from . For more information or to reach the project team, contact Alden Woods at acwoods@uw.edu or WSU's Shawn Vestal at shawn.vestal@wsu.edu.

12 UW professors elected to Washington State Academy of Sciences
July 21, 2025

Pictured in order, starting from the top left: Rona Levy, Horacio de la Iglesia, Jashvant Unadkat, Eric Steig, Kai-Mei Fu, Julie Kientz, Magdalena Balazinska, David Hertzog, Cynthia Chen, Shelly Sakiyama-Elbert, Scott Ramsey, Donald Chi. Photo collage credit: Alex Bartick

Twelve faculty members at the University of Washington have been elected to the Washington State Academy of Sciences. They are among 36 scientists and educators from across the state announced July 17 as new members. Election recognizes the new member's "outstanding record of scientific and technical achievement and willingness to assist the Academy in providing the best available scientific information and technical understanding to inform complex policy decisions in Washington."

The UW faculty members were selected by current WSAS members or by their election to national science academies. Eleven were voted on by current WSAS members:

Magdalena Balazinska, professor, Bill & Melinda Gates Chair, and director of the Paul G. Allen School of Computer Science & Engineering, for "contributions in data management for data science, big data systems, cloud computing and image/video analytics and leadership in data science education."

Cynthia Chen, professor of civil & environmental engineering and of industrial & systems engineering, for "pioneering work in human mobility analysis and infrastructure resilience, which has transformed transportation systems in terms of both demand and supply, and shaped the future directions of transportation systems research on community-based solutions and disaster resilience."

Donald Chi, Lloyd and Kay Chapman Endowed Chair for Oral Health and associate dean for research in the UW School of Dentistry, and professor in the Department of Health Systems & Population Health, for "leadership in understanding and addressing children's oral health inequities through community-based socio-behavioral interventions and evidence-based policies."

Horacio de la Iglesia, professor of biology, for "internationally recognized leadership in the biology of sleep, including groundbreaking research on molecular and genetic aspects of the brain, human behavioral studies on learning under varied sleep schedules, and contributions that have shaped policy on school schedules and standard time."

Kai-Mei Fu, the Virginia and Prentice Bloedel professor of physics and of electrical & computer engineering, for "foundational contributions to fundamental and applied research on the optical and spin properties of quantum point defects in crystals and for service and leadership in the quantum community."

Julie Kientz, professor and chair of human centered design and engineering, for "award-winning leadership in HCI computing, whose research has advanced health and education technology, influenced policy, and shaped the HCI field through impactful scholarship, interdisciplinary collaboration and inclusive, real-world technology design."

Rona Levy, professor and associate dean for research in the UW School of Social Work, for "contributions to understanding psychosocial and physiological factors that moderate the effectiveness of interventions and ultimately improve the health of children with abdominal pain disorders."

Scott Ramsey, professor of medicine in the UW School of Medicine and of pharmacy, for "leadership in health economics and cancer research, including work on financial toxicity, cost-effectiveness, and healthcare policy that has influenced national discussions, improved cancer care access, and shaped policies for equitable and sustainable healthcare." Ramsey is also director of the Cancer Outcomes Research Program at Fred Hutch.

Shelly Sakiyama-Elbert, professor of bioengineering and vice dean of research and graduate education in the UW School of Medicine, for "national leadership in biomedical research, research policy, and graduate education, including pioneering novel drug delivery approaches for regenerative medicine applications in the nervous system and other tissues such as bone, cartilage, tendon and skin."

Eric Steig, Rabinowitz Endowed Professor of Earth and space sciences, for "revolutionizing our understanding of climate change in Antarctica through pioneering ice core extractions under hazardous Antarctic conditions and their subsequent analyses over two decades, and for applying that expertise to advance climate research in Washington State."

Jashvant Unadkat, professor of pharmaceutics, for "pioneering contributions to pharmaceutical and translational sciences, including groundbreaking research on drug transporters, PBPK modeling and maternal-fetal pharmacology that have helped shape drug safety policies."

The Academy also welcomed new members who were selected by virtue of their election to the National Academies of Science, Engineering or Medicine. Among them is David Hertzog, the Arthur B. McDonald professor of physics and director of the Center for Experimental Nuclear Physics and Astrophysics. Hertzog was elected to the National Academy of Sciences last year.

Q&A: Helping robots identify objects in cluttered spaces
Feb. 7, 2024
Researchers at the University of Washington have developed a method that teaches a low-cost robot to identify objects on a cluttered shelf. For the test, the robot (shown here in the center of the photo) was asked to identify all objects on the shelf in front of it. Photo: Samani and Banerjee/IEEE Transactions on Robotics

Imagine a coffee cup sitting on a table. Now, imagine a book partially obscuring the cup. As humans, we still know what the coffee cup is even though we can’t see all of it. But a robot might be confused.

Robots in warehouses and even around our houses struggle to identify and pick up objects if they are too close together, or if a space is cluttered. This is because robots lack what psychologists call “object unity,” or our ability to identify things even when we can’t see all of them.

Researchers at the University of Washington have developed a way to teach robots this skill. The method, called THOR for short, allowed a low-cost robot to identify objects – including a mustard bottle, a Pringles can and a tennis ball – on a cluttered shelf. In a study published in IEEE Transactions on Robotics, the team demonstrated that THOR outperformed current state-of-the-art models.

UW News reached out to senior author Ashis Banerjee, a UW associate professor in both the industrial & systems engineering and mechanical engineering departments, for details about how robots identify objects and how THOR works.

Ashis Banerjee. Photo: University of Washington

How do robots sense their surroundings?

Ashis Banerjee: We sense the world around us using vision, sound, smell, taste and touch. Robots sense their surroundings using one or more types of sensors. Robots “see” things using either standard color cameras or more complex stereo or depth cameras. While standard cameras simply record colored and textured images of the surroundings, stereo and depth cameras also provide information on how far away the objects are, just like our eyes do.

On their own, however, the sensors cannot enable the robots to make “sense” of their surroundings. Robots need a visual perception system, similar to the visual cortex of the human brain, to process images and detect where all the objects are, estimate their orientations, identify what the objects might be and parse any text written on them.

Why is it hard for robots to identify objects in cluttered spaces?

AB: There are two main challenges here. First, there are likely a large number of objects of varying shapes and sizes. This makes it difficult for the robot's perception system to distinguish between the different object types. Second, when several objects are located close to each other, they obstruct the views of other objects. Robots have trouble recognizing objects when they don't have a full view of the object.

Are there any types of objects that are especially hard to identify in cluttered spaces?

AB: A lot of that depends on what objects are present. For example, it is challenging to recognize smaller objects if there are a variety of sizes present. It is also more challenging to differentiate between objects with similar or identical shapes, such as different kinds of balls, or boxes. Additional challenges occur with soft or squishy objects that can change shape as the robot collects images from different vantage points in the room.

So how does THOR work and why is it better than previous attempts to solve this problem?

AB: THOR is really the brainchild of lead author Ekta Samani, who completed this research as a UW doctoral student. The core of THOR is that it allows the robot to mimic how we as humans know that partially visible objects aren't broken or entirely new objects.

THOR does this by using the shape of objects in a scene to create a 3D representation of each object. From there it uses topology, an area of mathematics that studies the connectivity between different parts of objects, to assign each object to a “most likely” object class. It does this by comparing its 3D representation to a library of stored representations.
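The "most likely class" step described above can be sketched in a few lines. This is a minimal illustration of nearest-class matching against a library of stored shape descriptors, not THOR's actual topological representation; the descriptor vectors and class names below are made up for the example.

```python
import numpy as np

def classify_object(descriptor, library):
    """Assign a segmented object to its 'most likely' class by comparing
    its shape descriptor against a library of stored class descriptors."""
    best_class, best_dist = None, float("inf")
    for label, stored in library.items():
        # Euclidean distance stands in for whatever similarity measure
        # the real system uses between representations.
        dist = np.linalg.norm(descriptor - stored)
        if dist < best_dist:
            best_class, best_dist = label, dist
    return best_class

# Hypothetical library: one stored descriptor per object class.
library = {
    "mustard_bottle": np.array([0.9, 0.1, 0.3]),
    "pringles_can":   np.array([0.2, 0.8, 0.5]),
    "tennis_ball":    np.array([0.1, 0.1, 0.9]),
}

print(classify_object(np.array([0.85, 0.15, 0.25]), library))  # closest to mustard_bottle
```

Because the library holds one descriptor per known class, classifying a partially occluded object reduces to finding the stored representation its descriptor sits nearest to.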


THOR does not rely on training machine learning models with images of cluttered rooms. It just needs images of each of the different objects by themselves. THOR does not require the robot to have specialized and expensive sensors or processors, and it also works well with commodity cameras.

This means that THOR is very easy to build, and is, more importantly, readily useful for completely new spaces with diverse backgrounds, lighting conditions, object arrangements and degree of clutter. It also works better than the existing 3D shape-based recognition methods because its 3D representation of the objects is more detailed, which helps identify the objects in real time.

How could THOR be used?

AB: THOR could be used with any indoor service robot, regardless of whether the robot operates in someone's home, an office, a store, a warehouse facility or a manufacturing plant. In fact, our experimental evaluation shows that THOR is equally effective for warehouse, lounge and family room-type spaces.

While THOR performs significantly better than the other existing methods for all kinds of objects in these cluttered spaces, it does the best at identifying kitchen-style objects, such as a mug or a pitcher, that typically have distinctive but regular shapes and moderate size variations.

Green boxes shown here surround the objects that the robot correctly identified. Red boxes surround incorrectly identified items. Photo: Samani and Banerjee/IEEE Transactions on Robotics

What’s next?

AB: There are several additional problems that need to be addressed, and we are working on some of them. For example, right now, THOR considers only the shape of the objects, but future versions could also pay attention to other aspects of appearance, such as color, texture or text labels. It is also worth looking into how THOR could be used to deal with squishy or damaged objects, which have shapes that are different from their expected configurations.

Also, some spaces may be so cluttered that certain objects might not be visible at all. In these scenarios, a robot needs to be able to decide to move around to “see” the objects better, or, if allowed, move around some of the objects to get better views of the obstructed objects.

Last but not least, the robot needs to be able to deal with objects it hasn’t seen before. In these scenarios, the robot should be able to place these objects into a “miscellaneous” or “unknown” object category, and then seek help from a human to correctly identify these objects.

This research was funded in part by an Amazon Research Award.

For more information, contact Banerjee at ashisb@uw.edu.

Research led by UW undergrad shows ultrafine air pollution reflects Seattle's redlining history
July 5, 2023
DEOHS student Magali Blanco, a co-author of the ultrafine particle study, checks mobile monitoring equipment used to gather air samples in the Seattle area. Photo: Sarah Fish

Despite their invisibly small size, ultrafine particles have become a massive concern for air pollution experts. These tiny pollutants – typically spread through wildfire smoke, vehicle exhaust, industrial emissions and airplane fumes – can bypass some of the body's built-in defenses, carrying toxins to every organ or burrowing deep in the lungs.

New research from the University of Washington found that those effects aren't felt equitably in Seattle. The most comprehensive study yet of long-term ultrafine particle exposure found that concentrations of this tiny pollutant reflect the city's decades-old racial and economic divides.

The study, published in Environmental Health Perspectives, also found that racial and socioeconomic disparities in ultrafine particle exposure are larger than those observed in more commonly studied pollutants, like fine particles (PM2.5) and nitrogen dioxide (NO2).

The study used mobile monitoring – a car loaded with air pollution sensors driving around the city for the better part of a year – to examine long-term average levels of four pollutants: soot (or black carbon), fine particles (PM2.5), nitrogen dioxide (NO2) and ultrafine particles. Researchers found the highest concentrations of all four pollutants on census blocks with median household incomes under $20,000 and those with proportionately larger Black populations.

Disparities in concentrations of ultrafine particles – which are less than 0.1 micron in diameter, or 700 times thinner than the width of a single human hair – were especially stark. Blocks with median incomes under $20,000 had long-term UFP concentrations 40% higher than average. Blocks where median incomes are over $110,000, meanwhile, saw UFP concentrations 16% lower than average.
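Figures like "40% higher than average" compare a group of blocks' mean concentration against the citywide average. A minimal sketch of that calculation, using made-up concentration values rather than the study's data:

```python
def percent_from_average(group_concentrations, all_concentrations):
    """Express a group's mean pollutant concentration as a percent
    difference from the overall (citywide) average."""
    group_mean = sum(group_concentrations) / len(group_concentrations)
    overall_mean = sum(all_concentrations) / len(all_concentrations)
    return 100.0 * (group_mean - overall_mean) / overall_mean

# Illustrative UFP concentrations (particles/cm^3) per census block -- not study data.
low_income_blocks = [14000, 13500, 14500]
high_income_blocks = [8000, 8500, 9000]
all_blocks = low_income_blocks + high_income_blocks + [10000, 11000]

print(round(percent_from_average(low_income_blocks, all_blocks), 1))
```

A positive result means the group's long-term exposure sits above the citywide mean; a negative one means below it.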

"We found greater disparities with this pollutant of emerging interest, a pollutant that hasn't been well-characterized. That's very interesting," said senior author Lianne Sheppard, a UW professor in the Department of Environmental and Occupational Health Sciences. "Our work has shown the highest ultrafine particle concentrations are north of the airport and below common aircraft landing paths, downtown, and south of downtown where there are port and other industrial activities."

The study also found that modern-day air pollution disparities mirror Seattle's history of redlining, the racist practice that denied racial minorities and low-income residents access to bank loans, homeownership and other wealth-building opportunities in more "desirable" areas. The practice shaped American cities throughout the early 20th century, building a foundation of segregation and environmental racism.

Today, neighborhoods once classified as "hazardous" are still exposed to higher concentrations of pollution than those once labeled "desirable," the study found. This was true for all sizes of particles. The spatial disparities were largest, however, in Seattle neighborhoods that received no label because they were once considered industrial areas.

In those previously industrial areas, ultrafine particle concentrations were 49% above average.

"These results are important because air pollution exposure has been shown to lead to detrimental health effects, and these health effects disproportionately impact racialized and low-income communities," said the study's lead author, Bramble, who graduated from the University of Washington in 2022 with a degree in industrial and systems engineering. "Notably, air pollution is just one factor, and there are plenty of other examples of how systemic racism is detrimental to people's health and well-being."

Bramble said the results didn't surprise her. She was raised in Tacoma, in a neighborhood near Interstate 5, where the constant crush of cars and diesel trucks spewed pollution into the air. And as a student journalist at the UW, she researched the relationship between redlining, green spaces, heat and air pollution.

"In the case of air pollution exposures, these policies affect the health of real people. I think at a time where the teaching of systemic racism is a controversial topic in this country, being ignorant is not going to reduce the number of children who suffer from asthma due to air pollution," Bramble said. "Instead, I hope we can have conversations about how past policies affect us today, to drive efforts toward a healthier, sustainable society."

Bramble proposed and carried out this study for the grant program, which provides National Institute of Environmental Health Sciences funding and mentorship to undergraduates from underrepresented backgrounds to pursue research. She joined the program in June 2020 under Sheppard's mentorship.

Other UW authors are Magali Blanco, Annie Doubleday and Amanda Gassett of the Department of Environmental and Occupational Health Sciences, Anjum Hajat of the Department of Epidemiology and Julian Marshall of the Department of Civil and Environmental Engineering.

For more information, contact Sheppard at sheppard@uw.edu.

Video: Using 'Street View' to track pandemic in Seattle over time
Oct. 5, 2020

As the city of Seattle shut down in March 2020 to try to slow the spread of COVID-19, a group of University of Washington researchers decided to track how the city would react.

Driving a pre-determined route around the city with a 360-degree camera, the team collected thousands of images, not unlike Google Street View. By doing this regularly over 15 months, they hope to literally map Seattle’s recovery. The route goes through economically diverse neighborhoods, passing businesses, schools, churches and hospitals. The project began in May and is expected to go through fall of 2021.

The team developed a computer program to identify and sort objects like pedestrians, cars and buildings. Using artificial intelligence to count objects across the massive collection of photos turns the images into quantifiable data.
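The counting step can be sketched simply. This is a toy illustration of aggregating per-image detection labels into a time series, not the team's actual pipeline; the label strings and drive dates are hypothetical stand-ins for a real detector's output.

```python
from collections import Counter

def count_objects(detections):
    """Aggregate a list of detection labels (one entry per detected
    object, as a hypothetical detector might emit) into counts."""
    return Counter(detections)

def pedestrians_over_time(drives):
    """Build a simple time series: pedestrian count per drive date."""
    return {date: count_objects(labels)["pedestrian"]
            for date, labels in drives.items()}

# Hypothetical detections from two drives.
drives = {
    "2020-05-01": ["car", "pedestrian", "car", "building"],
    "2020-06-12": ["pedestrian", "pedestrian", "car"],
}
print(pedestrians_over_time(drives))  # {'2020-05-01': 1, '2020-06-12': 2}
```

Once every image is reduced to counts like these, questions such as "how many people are out at different points in time" become straightforward comparisons across drive dates.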

With these numbers, the study hopes to answer questions such as: How many people are out and about at different points in time? Are restaurants open, and where? What kinds of vehicles are on the road?

Different neighborhoods may recover at different rates. Researchers hope they’ll get insight into what factors make communities resilient and how to better prepare for potential future pandemics and other disasters.

More on the Seattle Street View Campaign here.

UW researchers driving around Seattle to track COVID-19 response over time
Sept. 30, 2020
UW researchers developed a project that scans the streets every few weeks to document how Seattle has reacted to the pandemic and what recovery looks like. The team is developing algorithms to help identify things such as cars, people and whether they are physically distancing in each frame. Photo: University of Washington

As the city of Seattle shut down in March 2020 to try to slow the spread of COVID-19, a group of University of Washington researchers got to work.


The team developed a project that scans the streets every few weeks to document what's happening around the city – answering questions such as: Are people outside? Are restaurants open? This project, which began in May and will continue until at least fall of 2021, collects images of how Seattle has reacted to the pandemic and what recovery looks like. This creates a massive dataset that documents what was happening at any particular point in time. The researchers hope the data will help answer questions about what makes a city resilient and how to better prepare for potential future pandemics and other disasters.

The team will present this project Oct. 1 at the through the UW School of Public Health.

"We talk about resilience a lot in disaster sciences. There are lots of theories about what makes a community resilient to natural hazards, but we don't fully understand resilience to pandemics, partially because we just haven't been through these events at this scale," said co-lead researcher Nicole Errett, an assistant professor of environmental and occupational health sciences. "This project provided us with an opportunity to see what's important for resilience in this context. What are people doing? Where are they recreating? Are they following distancing and mask-wearing recommendations? And how do their activities change as the pandemic progresses?"

Video footage taken from the team’s first drive on May 1, 2020.

To track what’s happening in Seattle, the researchers drive a car with a camera similar to Google Street View on top throughout the city.

"This is an amazing tool for quickly gathering highly perishable data from across the city," said co-lead researcher Joseph Wartman, a professor of civil and environmental engineering. "Unless we capture these scenes now, these sights – and the rich data they contain – will be lost forever. I can already see a significant difference between the May dataset and what's happening now. For example, when we first drove past Harborview Medical Center, no one was present on the block. Now it's beginning to look like it used to."


The team captured this series of photos from outside Harborview Medical Center between June and August 2020. The June photo shows very few people in the area. In July, there are people waiting at the bus stop. By August, there are more people at the bus stop and the surrounding areas. Credit: University of Washington

The team’s route takes between eight and 11 hours to drive each time.

"We wanted the route to capture different aspects of the city – such as restaurants, hospitals, schools, parks and museums – and also make sure we had an equal representation across a variety of neighborhoods," said co-lead researcher Scott Miles, a senior principal research scientist in the human centered design and engineering department.

The researchers try to start the drive at 8 a.m. on a Friday every few weeks to maintain a consistent schedule, but timing depends on the weather; the camera doesn't work in the rain. They also drive on some Sundays to try to capture any variation between weekdays and weekends.

The Street-View-like camera creates huge datasets – each drive is turned into tens of thousands of images that make up an almost 2-terabyte file. So the researchers are developing algorithms to help them identify things such as cars, people and whether they are physically distancing in each frame. Identities – such as human faces and vehicle license plates – will be blurred.
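The distancing check itself can be sketched as a pairwise distance test. This is a minimal illustration, not the team's algorithm: it assumes an upstream detector has already estimated each person's ground-plane position in meters from the frame, and the 1.8-meter threshold is a made-up stand-in for a public health guideline.

```python
import math

def distancing_violations(positions, min_distance=1.8):
    """Count pairs of detected people closer than `min_distance` meters.

    `positions` is a list of (x, y) ground-plane coordinates that an
    upstream detector would have to estimate for each person in a frame.
    """
    violations = 0
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            # math.dist gives the straight-line distance between two points.
            if math.dist(positions[i], positions[j]) < min_distance:
                violations += 1
    return violations

print(distancing_violations([(0, 0), (1, 0), (10, 10)]))  # one pair within 1.8 m
```

In practice the hard part is upstream: detecting people and converting pixel locations to real-world distances; once positions exist, the check itself is this simple.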

"When people study disaster recovery, they often look at location data from smartphones or transaction data from debit or credit cards," said co-lead researcher Youngjun Choe, an assistant professor of industrial and systems engineering. "But these data points do not necessarily capture everyone in a community. By looking at our images, I hope we are creating a dataset that better represents all people who live and work in Seattle."

Any insights gained from this project, such as how people respond to mask recommendations or which populations might need more resources, can help other cities better understand their own recovery trends, the researchers said.

"People talk about this as a 100-year pandemic, because the last major pandemic was in 1918," Errett said. "Now conditions are much different – we have increased population density, climate change and more. I don't think we're going to be waiting another hundred years. So whatever we can do to learn from this experience will help us develop better policies and plans for the future."

Jaqueline Peltier, an operations specialist in civil and environmental engineering; , a doctoral student in industrial and systems engineering; Christopher Salazar, a master’s student in industrial and systems engineering; and Vanessa Yang, an undergraduate student in statistics and informatics, are also part of this project. This research is funded by the National Science Foundation.

For more information, contact Errett at nerrett@uw.edu, Wartman at wartman@uw.edu, Miles at milessb@uw.edu and Choe at ychoe@uw.edu.

Grant number: CMMI-2031119

UW team developing model to help lower COVID-19 infections in King County, guide eventual vaccine distribution
Aug. 14, 2020
The novel coronavirus has created a lot of uncertainty about how to keep people safe. Here a UW Medicine employee interacts with a patient. Photo: Dennis Wise/University of Washington

Policymakers continue to face uncertainty about how to answer important questions about the novel coronavirus – such as when and how to reopen businesses and schools, and how to distribute a vaccine once one becomes available.

Now a University of Washington team has received funding to develop a model that uses local data to generate policy recommendations that could help lower COVID-19 infections in King County.

“We will be simulating the impact of various interventions 鈥 including social distancing measures, school closure policies, testing capacity, contact-tracing strategies and mask wearing 鈥 on population health outcomes,” said lead researcher , a 91探花associate professor of industrial and systems engineering. “Once a vaccine becomes available, we plan to expand the model to simulate vaccination rollout and coverage, and optimize for the best delivery configuration, such as vaccination priority if supply is limited.”

The UW team is one of nine groups from around the world that received a grant from the . These projects will last four to six months and focus on developing models to help inform policymakers.

Although the UW’s model will be specific to King County, the team’s methods and policy insights should be generalizable to other urban areas, Liu said. The researchers plan to make their final model accessible to other researchers online.

“There are quite a few good COVID-19 forecasting models that provide useful predictions on future trends, but our model adds the decision-making capability from an operations research perspective,” Liu said. “Our approach optimizes public health policies using a large-scale simulation model and provides actionable insights on the best interventions to save lives and minimize social disruptions in King County and beyond.”

Additional researchers on this grant are , an industrial and systems engineering professor; , an assistant professor in the Foster School of Business; , professor and chair of global health; and , an acting professor of global health.

For more information, contact Liu at liushan@uw.edu.

How ergonomic is your warehouse job? Soon, an app might be able to tell you
Aug. 19, 2019
UW researchers have used deep learning to develop a new system that can monitor factory or warehouse workers and tell them how risky their behaviors are in real time.

In 2017 there were nearly 350,000 incidents of workers taking sick leave due to injuries affecting muscles, nerves, ligaments or tendons – like carpal tunnel syndrome – according to the . Among the workers with the highest number of incidents: people who work in factories and warehouses.


Musculoskeletal disorders happen at work when people use awkward postures or perform repeated tasks. These behaviors generate strain on the body over time. So it’s important to point out and minimize risky behaviors to keep workers healthy on the job.

Researchers at the University of Washington have used machine learning to develop a new system that can monitor factory and warehouse workers and tell them how risky their behaviors are in real time. The algorithm divides up a series of activities – such as lifting a box off a high shelf, carrying it to a table and setting it down – into individual actions and then calculates a risk score associated with each action.

The team published its findings June 26 in IEEE Robotics and Automation Letters and will present them Aug. 23 at a conference in Vancouver, British Columbia.

“Right now workers can do a self-assessment where they fill out their daily tasks on a table to estimate how risky their activities are,” said senior author , an assistant professor in both the industrial & systems engineering and mechanical engineering departments at the UW. “But that’s time consuming, and it’s hard for people to see how it’s directly benefiting them. Now we have made this whole process fully automated. Our plan is to put it in a smartphone app so that workers can even monitor themselves and get immediate feedback.”

For these self-assessments, people currently use a snapshot of a task being performed. The position of each joint gets a score, and the sum of all the scores determines how risky that pose is. But workers usually perform a series of motions for a specific task, and the researchers wanted their algorithm to be able to compute an overall score for the entire action.
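The snapshot-based scoring described above – score each joint's position, then sum the scores to rate the pose – can be sketched roughly as follows. The joint names and angle thresholds are invented stand-ins for the real ergonomic assessment tables.

```python
# Hypothetical sketch of snapshot-based ergonomic scoring: each joint
# angle maps to a small score, and the sum rates the whole pose.
# Thresholds are illustrative, not actual assessment-table values.

def joint_score(angle_deg: float) -> int:
    """Score one joint: larger deviations from neutral cost more."""
    if angle_deg < 20:
        return 1
    if angle_deg < 45:
        return 2
    if angle_deg < 90:
        return 3
    return 4

def pose_risk(joint_angles: dict[str, float]) -> int:
    """Sum per-joint scores to rate a single snapshot of a pose."""
    return sum(joint_score(a) for a in joint_angles.values())

snapshot = {"trunk": 30.0, "neck": 10.0, "upper_arm": 95.0, "lower_arm": 50.0}
print(pose_risk(snapshot))  # 2 + 1 + 4 + 3 = 10
```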

Moving to video is more accurate, but it requires a new way to add up the scores. To train and test the algorithm, the team created a dataset containing 20 three-minute videos of people doing 17 activities that are common in warehouses or factories.

A GIF of people moving boxes from shelves to a table
To train and test the algorithm, the team created a dataset containing 20 three-minute videos of people doing 17 activities that are common in warehouses or factories. Photo: University of Washington

“One of the tasks we had people do was pick up a box from a rack and place it on a table,” said first author , a UW mechanical engineering doctoral student. “We wanted to capture different scenarios, so sometimes they would have to stretch their arms, twist their bodies or bend to pick something up.”

The researchers captured their dataset using a Microsoft Kinect camera, which recorded 3D videos that allowed them to map out what was happening to the participants’ joints during each task.

Using the Kinect data, the algorithm first learned to compute risk scores for each video frame. Then it progressed to identifying when a task started and ended so that it could calculate a risk score for an entire action.
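That two-stage idea – score every frame, then segment frames into actions and aggregate each segment into one score – might look like this in miniature. Here hand-picked segment boundaries and a simple average stand in for the learned start/end detection and aggregation in the paper.

```python
# Illustrative second stage: given per-frame risk scores and detected
# action boundaries, collapse each segment into one action-level score.
# Boundaries are fixed here; the real system learns them from video.

def action_scores(frame_scores, boundaries):
    """Average per-frame risk over each (start, end) action segment."""
    scores = []
    for start, end in boundaries:
        segment = frame_scores[start:end]
        scores.append(sum(segment) / len(segment))
    return scores

frames = [2, 2, 3, 7, 8, 9, 3, 2]      # per-frame risk scores
actions = [(0, 3), (3, 6), (6, 8)]     # detected action start/end frames
print(action_scores(frames, actions))  # middle action scores highest
```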

The algorithm labeled three actions in the dataset as risky behaviors: picking up a box from a high shelf, and placing either a box or a rod onto a high shelf.

Now the team is developing an app that factory workers and supervisors can use to monitor in real time the risks of their daily actions. The app will provide warnings for moderately risky actions and alerts for high-risk actions.
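A minimal sketch of that tiered feedback, with invented score cutoffs standing in for whatever thresholds the app will actually use:

```python
# Hypothetical mapping from an action's risk score to an app feedback
# level; the cutoff values are illustrative only.
def feedback(score: float) -> str:
    if score >= 8.0:
        return "alert"    # high-risk action
    if score >= 5.0:
        return "warning"  # moderately risky action
    return "ok"

print([feedback(s) for s in (2.5, 6.0, 9.1)])  # ['ok', 'warning', 'alert']
```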

Eventually the researchers want robots in warehouses or factories to be able to use the algorithm to help keep workers healthy. To see how well the algorithm could work in a hypothetical warehouse, the researchers had a robot monitor two participants performing the same activities. Within three seconds of the end of each activity, the robot showed a score on its display.

A robot arm with an output display that shows the scores of each activity a human subject performed
The researchers had a robot (white arm) monitor participants performing activities in a warehouse-like setting. At the end of each activity, the robot showed a score on its display (right). Photo: Parsa et al./IEEE Robotics and Automation Letters

“Factories and warehouses have used automation for several decades. Now that people are starting to work in settings where robots are used, we have a unique opportunity to split up the work so that the robots are doing the risky jobs,” Banerjee said. “Robots and humans could have an active collaboration, where a robot can say, ‘I see that you are picking up these heavy objects from the top shelf and I think you may be doing that a lot of times. Let me help you.'”

Additional co-authors are , and , who are UW mechanical engineering doctoral students; , who completed this research as a summer intern at the UW; and , a professor in the UW mechanical engineering department. Funding and support for this project have been provided by the State of Washington, Department of Labor and Industries, Safety and Health Investment Projects. This research was also funded by a gift from Amazon Robotics.

###

For more information, contact Banerjee at ashisb@uw.edu and Parsa at behnoosh@uw.edu.

]]>
Two UW professors elevated to IEEE Fellows /news/2017/11/28/two-uw-professors-elevated-to-ieee-fellows/ Tue, 28 Nov 2017 16:44:00 +0000 /news/?p=55585 Two faculty members in the University of Washington College of Engineering have been elected as 2018 fellows of the Institute of Electrical and Electronics Engineers (IEEE).

, professor of industrial and systems engineering, was honored for “leadership in virtual and augmented reality” and , professor in the Paul G. Allen School of Computer Science & Engineering, was recognized for “contributions to robotic manipulation and human-robot interaction.”

The IEEE Fellow distinction is reserved for select members who exhibit an extraordinary record of accomplishments in any of the IEEE fields of interest, which include aerospace systems, biomedical engineering, computing, consumer electronics, energy, telecommunications and more. Nominated by peers and conferred by the IEEE Board of Directors, fellowship is considered both a prestigious honor and a noteworthy career achievement within the technical community. The total number selected in any one year does not exceed one-tenth of 1 percent of the Institute’s total voting membership.

Tom Furness

Furness is a pioneer in human interface technology and grandfather of virtual reality. In addition to his ISE professorship, he holds adjunct professorships in electrical engineering, mechanical engineering and human-centered design and engineering. He is the founder of the Human Interface Technology Laboratory (HIT Lab) at the UW and sister HIT Labs at the University of Canterbury in Christchurch, New Zealand, and the University of Tasmania in Australia. He is also the founder of the Virtual World Society, which is dedicated to bringing together hearts and minds through virtual reality to solve pervasive problems in the world.

Prior to joining the faculty at the 91探花in 1989, Furness served a combined 23 years as a U.S. Air Force officer and civilian scientist developing advanced cockpits and virtual interfaces for the Department of Defense. Furness lectures and speaks widely on virtual reality innovations and holds 21 patents in advanced sensor, display and interface technologies.

Siddhartha Srinivasa

Srinivasa joined the UW this past fall as the Boeing Endowed Professor from the faculty of Carnegie Mellon University, where he was a member of the Robotics Institute and founding director of the Personal Robotics Lab. He has made pioneering contributions to two fundamental areas of robotics, robotic manipulation and human-robot interaction (HRI), with the aim of enabling robots to perform complex tasks with and around people. A full-stack roboticist, Srinivasa has built several end-to-end systems that integrate perception, planning and control in the real world.

Srinivasa’s work in manipulation has enabled robots to push, pull and sweep objects under conditions of clutter and uncertainty through non-prehensile, physics-based interactions. He also is credited with having created the field of algorithmic HRI through his efforts to build the formal mathematical foundations of human-robot interaction. To that end, Srinivasa and his team built HERB, the Home Exploring Robot Butler, to serve as a realistic testbed for new algorithms enabling human-robot collaboration. In addition to his role in the lab, HERB has become an ambassador of sorts for Srinivasa and his team – and for the field of robotics, generally.

]]>