American Sign Language – UW News

Crowdsourcing site compiles new sign language for math and science
/news/2012/12/07/crowdsourcing-sit-compiles-new-sign-language-for-math-and-science/ (Fri, 07 Dec 2012)

A multimedia feature published this week in the New York Times outlines efforts in the United States and Europe to develop sign language versions of specialized terms used in science, technology, engineering and mathematics.

The article shares newly defined signs for terms like "light-year," "organism" and "photosynthesis." It also describes a successful crowdsourcing effort started at the University of Washington in 2008 that lets members of the deaf and hard-of-hearing community build their own guide to the evolving lexicon of science.

A screen capture from the ASL-STEM Forum.

"It's not a dictionary," explained Richard Ladner, a UW professor of computer science and engineering. "The goal of the forum is to be constantly changing, a reflection of the current use."

A scientific and technical dictionary for American Sign Language has existed since the late 1990s. Called the Science Signs Lexicon, it was launched by Lang, an early proponent of science in the deaf community and a professor at the National Technical Institute for the Deaf at Rochester Institute of Technology.

But a dictionary can't include the newest terms, Ladner said, and many graduate students won't find the specialized terms used in their chosen fields. For example, Ladner helped organize a 2008 workshop where a deaf scientist said only about one-quarter of his field's specialized terms existed in his native language, American Sign Language, or ASL. Many workshop participants reported that at some point they had had to work with their interpreters to develop their own code words.

That year, with funding from Google Corp. and the National Science Foundation, Ladner's group launched the ASL-STEM Forum, an online compilation of signs used in science, technology, engineering and math that is more like Wikipedia or Urban Dictionary.

"The goal was to have one place where all these signs could be," Ladner said. "We're not trying to decide on new signs but just collect the ones that are in current use."

The site lists 6,755 terms from biology, chemistry, engineering, math and computer science textbooks. Of those, about 2,800 have video entries, some with more than one. Partnerships with the country's two largest higher-education institutions for deaf and hard-of-hearing students have helped provide content.

Collaborators include a UW alumna who is a biology professor at Gallaudet University in Washington, D.C., and Lang and colleagues at Rochester's National Technical Institute for the Deaf.

Visit the forum to see recently contributed signs that are not listed in the Science Signs dictionary, along with terms still awaiting an ASL translation.

Anyone can visit the forum, but to add signs a user must create a free account, then record a short video with a computer's camera; the video can be reviewed before it is uploaded. Users can also rate and comment on signs uploaded by others.

Richard Ladner with students in a 2007 summer computing program. Photo: Mary Levin, UW

Ladner hopes the recent article will spur interest and encourage people to contribute entries for the remaining terms. He is seeking funding to update the site and hopes it will reach critical mass among ASL users in scientific and technical fields.

Between 2006 and 2010, U.S. institutions awarded 301 doctoral degrees in STEM fields to people who are deaf or hard of hearing, Ladner said. Because that figure includes hard-of-hearing graduates, the number of science Ph.D.s who use ASL is likely much lower. Many members of that community are geographically scattered, and to complicate matters further, American Sign Language and British Sign Language each have their own technical lexicons.

“I hope ASL-STEM Forum helps more deaf students become scientists and engineers,” Ladner said. “And as more deaf students enter these fields they will hopefully contribute to the forum, making it sustainable and useful over time.”

A related paper was presented at CHI 2010.

Now working on the forum at the UW are a doctoral student in computer science and engineering and John Norberg, a UW undergraduate in math who is minoring in ASL. Early members of the UW team include two computer science and engineering doctoral students, one now working on accessibility projects at Google and the other now an accessibility researcher at the University of Rochester in New York, as well as former UW undergraduates Michelle Shepardson and Jessica Dewitt.

Ladner runs a national summer academy to encourage deaf and hard-of-hearing students to pursue careers in computer science, and he leads a larger UW-based national effort to encourage people with disabilities to enter computing fields. His group is also involved in a number of projects that combine computing, mobile technology and accessibility.

###

For more information, contact Ladner at 206-543-9347 or ladner@cs.washington.edu.

Deaf, hard-of-hearing students do first test of sign language by cell phone
/news/2010/08/19/deaf-hard-of-hearing-students-do-first-test-of-sign-language-by-cell-phone-2/ (Thu, 19 Aug 2010)

Editor's note: Each year the summer academy hosts a premiere of the students' animated short films.

Josiah Cheslik, a UW junior and volunteer in the MobileASL field study, demonstrates using the phone to communicate in his native language. He is signing with Pete Michor, seen in the background, another participant in the study.

The UW field test is using phones imported a couple of years ago from Europe, but MobileASL software could potentially run on any device.

University of Washington engineers are developing the first device able to transmit American Sign Language over U.S. cellular networks. The tool is just completing its initial field test with participants in a UW summer program for deaf and hard-of-hearing students.

"This is the first study of how deaf people in the United States use mobile video phones," said project leader Eve Riskin, a UW professor of electrical engineering.

The MobileASL team has been working to optimize compressed video signals for sign language. By increasing image quality around the face and hands, researchers have brought the data rate down to 30 kilobytes per second while still delivering intelligible sign language. MobileASL also uses motion detection to identify whether a person is signing or not, in order to extend the phones’ battery life during video use.
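The signing-or-not detection described above can be caricatured with simple frame differencing. This is a hypothetical sketch, not the project's actual algorithm: the pixel and activity thresholds are invented for illustration, and the real system works inside a video encoder rather than on raw pixel lists.

```python
# Hedged sketch of activity detection for battery saving: compare two
# grayscale frames (flat lists of 0-255 intensities) and treat the user
# as signing when enough pixels changed. All thresholds are illustrative.

def changed_fraction(prev, curr, pixel_delta=16):
    """Fraction of pixels whose intensity moved by more than pixel_delta."""
    changed = sum(1 for a, b in zip(prev, curr) if abs(a - b) > pixel_delta)
    return changed / len(curr)

def is_signing(prev, curr, activity_threshold=0.05):
    """Heuristic: signing if more than 5% of pixels changed between frames."""
    return changed_fraction(prev, curr) > activity_threshold
```

When `is_signing()` stays false across a stretch of frames, an encoder could drop to a very low frame rate to extend battery life, which is the behavior the article describes.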

Transmitting sign language as efficiently as possible increases affordability, improves reliability on slower networks and extends battery life, even on devices that might have the capacity to deliver higher quality video.

This summer's field test is letting the team see how people use the tool in their daily lives and what obstacles they encounter. Eleven participants are testing the phones for three weeks. They meet with the research team for interviews, and survey questions occasionally pop up after a call asking about its quality.

The field test began July 28 and concludes this Wednesday. In the first two and a half weeks of the study, some 200 calls were made with an average call duration of a minute and a half, researchers said. A larger field study will begin this winter.

“We know these phones work in a lab setting, but conditions are different in people’s everyday lives,” Riskin said. “The field study is an important step toward putting this technology into practice.”

Participants in the current field test are students in the UW Summer Academy for Advancing Deaf and Hard of Hearing in Computing. The academy accepts academically gifted deaf and hard-of-hearing students interested in pursuing computing careers. Students spend nine weeks at the UW taking computer programming and animation classes, meeting with deaf and hard-of-hearing role models who already work in computing fields and with UW graduate students, and visiting local computer software and hardware companies.

Most study participants say texting or e-mail is currently their preferred method for distance communication. Their experiences with the MobileASL phone are, in general, positive.

“It is good for fast communication,” said Tong Song, a Chinese national who is studying at Gallaudet University in Washington, D.C. “Texting sometimes is very slow, because you send the message and you’re not sure that the person is going to get it right away. If you’re using this kind of phone then you’re either able to get in touch with the person or not right away, and you can save a lot of time.”

Josiah Cheslik, a UW undergraduate and past participant in the summer academy who is now a teaching assistant, agreed.

"Texting is for short things, like 'I'm here,' or, 'What do you need at the grocery store?'" he said. "This is like making a real phone call."

As everyone knows, text-based communication can also lead to mix-ups.

“Sometimes with texting people will be confused about what it really means,” Song said. “With the MobileASL phone people can see each other eye to eye, face to face, and really have better understanding.”

Some students also use video chat on a laptop, home computer or video phone terminal, but none of these existing technologies for transmitting sign language fits in your pocket.

Cheslik recounts that during the study one participant got lost riding a Seattle city bus, and the two communicated using MobileASL: the student on the bus described what he was seeing, and Cheslik helped him navigate to where he wanted to go.

Newly released high-end phones, such as the iPhone 4 and the HTC Evo, offer video conferencing. But users are already running into hitches — broadband companies have blocked the bandwidth-hogging video conferencing from their networks, and are rolling out tiered pricing plans that would charge more to heavy data users.

The UW team estimates that the iPhone's FaceTime video-conferencing service uses nearly 10 times the bandwidth of MobileASL. Even after the anticipated release of an iPhone app to transmit sign language, people would need to own an iPhone 4 and be in an area with very fast network speeds to use the service. The MobileASL system could be integrated with the iPhone 4, the HTC Evo or any device that has a video camera on the same side as the screen.

“We want to deliver affordable, reliable ASL on as many devices as possible,” Riskin said. “It’s a question of equal access to mobile communication technology.”

Jessica Tran, a doctoral student in electrical engineering who is running the field study, is experimenting with different compression systems to extend the life of the battery under heavy video use. Electrical engineering doctoral student Jaehong Chon made MobileASL compatible with H.264, an industry standard for video compression. Tressa Johnson, a master’s student in library and information science and a certified ASL interpreter, is studying the phones’ impact on the deaf community.

The MobileASL research is primarily funded by the National Science Foundation, with additional gifts from Sprint Nextel Corp., Sorenson Communications and Microsoft Corp. Collaborators at the UW are Richard Ladner, professor of computer science and engineering, and Jacob Wobbrock, assistant professor in the Information School.

The Summer Academy for Advancing Deaf and Hard of Hearing in Computing is applying for a third round of funding from the National Science Foundation. Additional support for this year’s program came from the Johnson Family Foundation, the Bill and Melinda Gates Foundation, Cray Corp., Oracle Corp., Google Corp. and SignOn Inc.
'Can you see me now?' Sign language over cell phones comes to United States
/news/2008/08/21/can-you-see-me-now-sign-language-over-cell-phones-comes-to-united-states/ (Thu, 21 Aug 2008)

Doctoral student Anna Cavender, who learned sign language after joining the MobileASL group, demonstrates the device. Users can hold the phone in front of them and sign with one hand, but most prefer to set the phone on a table and sign with both hands.

A group at the University of Washington has developed software that for the first time enables deaf and hard-of-hearing Americans to use sign language over a mobile phone. UW engineers got the phones working together this spring, and recently received a National Science Foundation grant for a 20-person field project that will begin next year in Seattle.

This is the first time two-way, real-time video communication has been demonstrated over cell phones in the United States. Since the team posted a video of the working prototype on YouTube, deaf people around the country have been writing in daily.

"A lot of people are excited about this," said principal investigator Eve Riskin, a UW professor of electrical engineering.


For mobile communication, deaf people now rely on text messages. "But the point is you want to be able to communicate in your native language," Riskin said. "For deaf people that's American Sign Language."

Video is much better than text messaging because it's faster and better at conveying emotion, said Jessica DeWitt, a UW undergraduate in psychology who is deaf and a collaborator on the MobileASL project. She says a large part of her communication happens through facial expressions, which the video phones transmit.

Low data-transmission rates on U.S. cellular networks, combined with limited processing power on mobile devices, have so far prevented real-time video transmission at enough frames per second to carry sign language. U.S. cellular networks allow about one-tenth of the data rates common in places such as Europe and Asia (sign language over cell phones is already possible in Sweden and Japan).
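The constraint is easy to quantify with back-of-the-envelope arithmetic. Assuming the 30-kilobyte-per-second rate quoted in the 2010 story above, and an assumed frame rate (the 10 fps here is an illustration, not a MobileASL specification), each compressed frame gets only a few kilobytes:

```python
# Rough frame budget under a constrained cellular link. The 30 kB/s
# figure comes from the article; the frame rate is assumed.

def bytes_per_frame(channel_kbytes_per_sec, frames_per_sec):
    """Average bytes available to each compressed video frame."""
    return channel_kbytes_per_sec * 1024 / frames_per_sec

budget = bytes_per_frame(30, 10)   # about 3 kB per frame at 10 fps
```

A few kilobytes per frame is far below what general-purpose codecs of the era assumed, which is why spending those bytes where sign language needs them mattered.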

Even as faster networks are becoming more common in the United States, there is still a need for phones that would operate on the slower systems.

“The faster networks are not available everywhere,” said doctoral student Anna Cavender. “They also cost more. We don’t think it’s fair for someone who’s deaf to have to pay more for his or her cell phone than someone who’s hearing.”

The team tried different ways to make sign language comprehensible on low-resolution video. They discovered that the most important region to transmit in high resolution is around the face. This is not surprising: eye-tracking studies have shown that people watching sign language spend most of their time looking at the signer's face.

The current version of MobileASL uses a standard video-compression tool to stay within the data-transmission limit. Future versions will incorporate custom tools to get better quality. The team developed a scheme that transmits the person's face and hands in high resolution and the background in lower resolution. Now they are working on another feature that identifies when people are moving their hands, to reduce battery consumption and processing power when the person is not signing.
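The face-and-hands scheme described above can be sketched as region-of-interest quantization: blocks inside the region of interest get a fine quantizer (more levels, better quality, more bits) and background blocks a coarse one. This standalone sketch is illustrative only; the actual work happens inside a standard video encoder, and the step sizes here are invented.

```python
# Hypothetical region-of-interest coding sketch: quantize ROI blocks
# (face, hands) finely and background blocks coarsely. Coarser steps
# mean fewer distinct levels, hence fewer bits after entropy coding.

def quantize_block(pixels, step):
    """Uniform quantization of a block of 0-255 intensities."""
    return [round(p / step) * step for p in pixels]

def encode_frame(blocks, roi_flags, fine_step=4, coarse_step=32):
    """Quantize each block with a step size chosen by its ROI flag."""
    return [quantize_block(block, fine_step if in_roi else coarse_step)
            for block, in_roi in zip(blocks, roi_flags)]
```

The design choice is the one the article names: spend the scarce bits where comprehension depends on them, and accept visible degradation in the background.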

The team is currently using phones imported from Europe, the only ones they could find that are compatible with the software and have the camera and video screen on the same side of the phone, so that people can film themselves while watching the screen.

Mobile video sign language won’t be widely available until the service is provided through a commercial cell-phone manufacturer, Riskin said. The team has already been in discussion with a major cellular network provider that has expressed interest in the project.

The MobileASL team includes Richard Ladner, a UW professor of computer science and engineering; Sheila Hemami, a professor of electrical engineering at Cornell University; Jacob Wobbrock, an assistant professor in the UW's Information School; UW graduate students Neva Cherniavsky, Jaehong Chon and Rahul Vanam; and Cornell graduate student Frank Ciaramello.

###

For more information, contact Riskin at (206) 685-2313 or riskin@u.washington.edu.

More details on the MobileASL project and a video demonstration are posted on the project's website.

Sign of caring: Ladner learns parents' language, contributes to their community
/news/2004/05/20/sign-of-caring-ladner-learns-parents-language-contributes-to-their-community/ (Thu, 20 May 2004)

Richard Ladner converses in American Sign Language with student Patty Liang.

In a way, Richard Ladner inherited his volunteer interests.

The hearing son of deaf parents, the UW computer science and engineering professor has donated considerable time to the deaf community for the last 24 years. But it was a legacy he didn't pick up until long after he'd left home and started a career of his own.

That's when he started to talk to his parents in their language for the first time.

"I didn't learn sign language growing up," Ladner said. "That was in the age of oralism, when all deaf people were encouraged to speak and to read the lips of hearing people instead of sign. There wasn't the recognition of signing as a language then, or the encouragement to learn it."

So, both his parents spoke. His father hadn't become deaf until the age of 4, so he had already learned to speak, Ladner explained, and his mother, deaf from 6 months, learned to speak and read lips. Both parents used American Sign Language as well, and taught at the California School for the Deaf in Berkeley, but only one of their four children, Ladner's older sister, learned how to sign.

That made her the interpreter for the family, especially between Ladner's father and his kids, because he didn't read lips. But at the time there was no thought of teaching all the children sign language.

"In those days, deaf parents just wanted their hearing kids to be like other kids," Ladner said.

And indeed, his parents succeeded at that. Ladner said he didn't really feel "different" until adolescence, when he began to take notice of the difficulties his parents encountered in the hearing world.

Still, when he left home, not knowing sign language and with no deaf friends, he wasn't really anticipating connections to the deaf community.

It was in 1980, when Ladner was early in his career at the UW, that he began to reconsider learning to sign. His parents were getting older, and he wanted to be able to communicate with them more fully.

"Knowing the power of language, it became more important to me," he said. "Having a common language between people is incredible. I think that goes across cultures. If my parents spoke Japanese, it would be the same thing."

So Ladner enrolled in American Sign Language classes at Seattle Central Community College. And although the language came easier to him because of his early exposure to his parents and their friends, he still found it challenging. That's because you receive it with your eyes, he said, and hearing people are not used to doing that. Even today, he calls his fluency only "moderate."

"I have a hard time with finger spelling and colloquialisms," he said.

Ladner's fluency was challenged a few years later when he applied for and received a Guggenheim fellowship to teach at Gallaudet, the only university in the country that is primarily for the deaf. He was teaching the theory of computation, which requires the use of mathematical concepts.

"There are signs for technical terms in classes like calculus, but not for something like this," he said. "I had to work with the students to invent some signs to use."

But he enjoyed the experience. Gallaudet was his parents' alma mater, so he'd heard about it for years. He's returning to the school this month to see a friend receive an honorary degree.

The friend is Marilyn Smith, who was one of Ladner's first sign language teachers and also the person who recruited him to several volunteer causes.

"I did some fundraising for the American Association of the Deaf-Blind convention that was here in Seattle in 1984," Ladner recalled. "I was on a committee that worked on that. It was quite exciting."

More recently, he's become involved with fundraising for the Abused Deaf Women's Advocacy Services (ADWAS). As its name implies, the organization helps abused women who are also deaf get out of their abusive situations. Domestic violence is a particular problem for deaf women, Ladner said, because they are likely to feel isolated, and if they go to a regular domestic-violence agency, an interpreter may not always be available to help them communicate.

Right now, ADWAS is trying to raise $7.6 million to build transitional housing that will be run by the YWCA. When completed, the facility will be the first of its kind in the nation. Ladner is a co-chair of the steering committee for the capital campaign.

Except for his stint at Gallaudet, Ladner's interest in deaf concerns hasn't intersected much with his career. As a theorist (his degree is in math), he isn't usually involved with technological advances that can help the deaf, although back in the '80s he worked on an IBM project to develop a network for the deaf-blind using Braille displays and large print.

And now another possibility is in the offing. 鈥淎 couple of my colleagues came to me with the idea that we should try to do data compression for video over cell phones,鈥 Ladner said. 鈥淰ideo already exists for cell phones but it鈥檚 low quality. What we鈥檇 like to do is work on higher quality.鈥

High quality video would permit signing over the phone. The group has applied to the National Science Foundation for a grant to fund the project.

In the meantime, Ladner continues to work on ADWAS fundraising and lobbies hard to get the UW to offer classes in American Sign Language.

Currently, the University accepts American Sign Language as a foreign language for admission and for graduation but doesn鈥檛 offer the language itself.

It's not surprising Ladner would be passionate about sign language. He still remembers the emotional day he went home and used the language with his parents for the first time.

"It cemented a bond between us that hadn't really been there before," he said.
