It should be just as easy to use a robotic arm as it is to use your own hand. That's the thinking behind UW startup BluHaptics, which is taking telerobotics — controlling robots from a distance — to a new level: underwater.

Using technology developed by Chizeck's lab in the Department of Electrical Engineering, a team of UW scientists and engineers working at the Applied Physics Laboratory is creating a control system for underwater remotely operated vehicles, or ROVs.
These instruments can perform a variety of undersea tasks too dangerous — or even impossible — for humans, including oil and gas exploration, biohazard clean-up and mining, and environmentally sensitive scientific research.
In June, the UW and BluHaptics team will travel to Washington, D.C., to showcase this technology at the SmartAmerica Challenge as part of the Smart Emergency Response Systems team. The challenge will be a three-day event, including a White House presentation, a technology exposition and a technical-level meeting.
The UW research team is working with a "submersible manipulator test bed" at the APL, which is made up of specialized, submersible equipment similar to what's used in the oil and gas industry for offshore operations. This equipment is submerged in a large water tank for a realistic test environment.
"Essentially, we're combining the spatial awareness of a computer system with the perceptive capability of a human operator," said Stewart, a senior engineer in the Department of Ocean Engineering and part of the BluHaptics team. "To do this, we use what's called a haptic device."
Haptics refers to feedback technology that engages the sense of touch by applying forces, vibrations or motions to the user. The haptic device both controls the robot and provides force feedback to the user. This feedback guides the human operator to the desired location, pushing back on the hand to avoid collisions or other mistakes.
The haptic input device is similar to using a mouse with a computer, Stewart said, "but it's giving three-dimensional input, so you're actually defining a point in space where you want the robotic arm to go."
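The guidance Stewart describes, a commanded target point in space plus forces that push the operator's hand away from trouble, can be sketched as a simple potential-field model. The function name, gains and the spring-plus-repulsion form below are illustrative assumptions for this article, not BluHaptics' actual control law:

```python
import math

def haptic_force(hand, target, obstacles,
                 k_attract=1.0, k_repel=0.5, safe_dist=0.2):
    """Toy model of the force rendered on a haptic device.

    A spring term pulls the hand toward the commanded target point,
    and a repulsive term pushes it away from any obstacle closer
    than `safe_dist` (all units and gains are illustrative).
    """
    # Attractive component: spring pulling the hand toward the target.
    force = [k_attract * (t - h) for h, t in zip(hand, target)]
    for obs in obstacles:
        diff = [h - o for h, o in zip(hand, obs)]
        dist = math.sqrt(sum(d * d for d in diff))
        if 0 < dist < safe_dist:
            # Repulsive component grows sharply as the hand nears the obstacle,
            # producing the "push back to avoid collisions" effect.
            scale = k_repel * (1.0 / dist - 1.0 / safe_dist) / dist
            force = [f + scale * d for f, d in zip(force, diff)]
    return force
```

With no obstacles the device simply pulls toward the target; with an obstacle between hand and target, the repulsive term dominates at close range and the operator feels resistance.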
"Haptics does for the sense of touch what computer graphics do for vision," said Chizeck, who co-directs the UW BioRobotics Lab.
The technology creates a virtual representation of the workspace based on a combination of sonar, video and laser inputs — sensory feedback that enhances the human-robot interface and speeds up operations. This translates into tackling the task at hand more safely and efficiently, while greatly reducing the risk of damage to the environment.
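One generic way a system could merge position estimates from several sensors such as sonar, video and laser is inverse-variance weighting, where more reliable readings count for more. This is a textbook fusion sketch under assumed noise variances, not the actual BluHaptics pipeline:

```python
def fuse_estimates(estimates):
    """Inverse-variance weighted fusion of position estimates.

    `estimates` is a list of (position, variance) pairs, e.g. one
    each from sonar, video and laser. Positions are tuples of equal
    length; lower variance means a more trusted sensor.
    """
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    dims = len(estimates[0][0])
    # Each coordinate of the fused position is the weighted mean
    # of that coordinate across all sensors.
    return [
        sum(w * pos[i] for (pos, _), w in zip(estimates, weights)) / total
        for i in range(dims)
    ]
```

Two equally trusted sensors average their readings; a sensor with a quarter of the variance pulls the fused estimate four times as hard.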
The BluHaptics robotic control system is based on key algorithms developed by Fredrik Ryden in electrical engineering as part of his doctoral work. This work was originally developed for robotic surgery, which allows surgeons to operate remotely via a computer connected to a robot — a surgical alternative for certain medical procedures that can mean enhanced precision, less trauma for the patient and decreased fatigue for the surgeon. BluHaptics is now adapting these same algorithms to underwater robotics.
Read more about BluHaptics on the Center for Commercialization's website.