Featured Researchers

Andy Stewart

Assistant Director, Defense and Industry Programs and Associate Director, NNMREC

OE Department

APL-UW

Assistant Professor, Mechanical Engineering

Howard Chizeck

Professor

UW Electrical Engineering

Fredrik Ryden

VP for Engineering

BluHaptics, Inc.

Don Pickering

CEO

BluHaptics, Inc.

Funding

NSF

Grant #1416444

SERDP

BluHaptics

Safer and More Efficient Undersea Robotic Operations

Dockside Demonstrations
Telerobotics & Haptics

It shouldn't be harder to use a robot than to use your hand. We would like to bring some of that sense of touch back from the robot to the hand so it can be used to guide precision operations.

When you control the robot with the haptic device, you receive force feedback through that same device. The feedback guides you toward where you should go by pushing your hand in that direction, and it prevents mistakes and collisions by pushing back on your hand.
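The guidance-plus-resistance behavior described above can be sketched as a simple force law: a spring-like pull toward the target pose, plus a repulsive push that grows as the tool nears an obstacle. This is a minimal illustration, not the BluHaptics implementation; the gains, safety distance, and function names are assumptions.

```python
import numpy as np

def haptic_force(tool_pos, target_pos, obstacle_pos,
                 k_guide=50.0, k_repel=200.0, d_safe=0.05):
    """Compute an illustrative haptic feedback force (N).

    - A spring-like guidance force pulls the operator's hand toward
      the target ("pushing you there").
    - A repulsive force pushes back once the tool comes within the
      safety distance d_safe of an obstacle, before any contact occurs.
    Gains and distances are illustrative values, not real system parameters.
    """
    tool_pos = np.asarray(tool_pos, dtype=float)
    # Guidance: proportional pull toward the target position.
    f = k_guide * (np.asarray(target_pos, dtype=float) - tool_pos)

    # Repulsion: grows linearly as the tool penetrates the safety margin.
    to_tool = tool_pos - np.asarray(obstacle_pos, dtype=float)
    dist = np.linalg.norm(to_tool)
    if 0.0 < dist < d_safe:
        f += k_repel * (d_safe - dist) * (to_tool / dist)
    return f
```

With the obstacle far away, only the guidance spring acts; once the tool enters the safety margin, the repulsive term adds a push away from the obstacle, so the operator feels resistance before any collision.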

Haptics

ROV Applications

Haptics describes tactile feedback technology that takes advantage of the sense of touch by applying forces, vibrations, or motions to the user.

It does for the sense of touch what computer graphics do for vision.

The global energy industry is developing ever more sophisticated underwater Remotely Operated Vehicles (ROVs). BluHaptics improves the ROV–pilot interface, making operations safer and more efficient.

Current systems that derive haptic feedback from contact or force sensors require actual contact with sensitive and expensive equipment before the operator receives any feedback. BluHaptics innovations solve this problem.

Innovations

BluHaptics gives the operator a more immersive experience in underwater operations, improving spatial awareness and the sense of telepresence, and thereby enabling better decisions and better operations.

BluHaptics improves the ROV pilot's operational performance by improving spatial perception of the work area and then providing haptic feedback for manipulator guidance. The system creates a virtual representation based on a combination of sonar, video, and laser inputs. The perceptive capability of the human pilot is leveraged by the system's 3-D video spatial representation and by the forces and torques fed back to the pilot's hand.
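One way the non-contact sensing described above can drive haptic rendering is to treat the fused sonar/video/laser reconstruction as a point cloud and push back on the pilot's hand as the tool approaches the nearest sensed surface. The sketch below is a hedged illustration of that idea only; the gain, safety distance, and function names are assumptions, not the patented methods listed below.

```python
import numpy as np

def nearest_point_force(tool_pos, cloud, k=300.0, d_safe=0.1):
    """Render a repulsive force from a sensed point cloud (illustrative).

    `cloud` stands in for the fused sonar/video/laser reconstruction of
    the work area. The force rises as the tool nears the closest sensed
    point, so the pilot feels the environment before physical contact.
    Gain k and safety distance d_safe are illustrative assumptions.
    """
    cloud = np.asarray(cloud, dtype=float)
    tool = np.asarray(tool_pos, dtype=float)
    diffs = tool - cloud                 # vectors from each sensed point to tool
    dists = np.linalg.norm(diffs, axis=1)
    i = int(np.argmin(dists))            # index of the nearest sensed point
    d = dists[i]
    if d >= d_safe or d == 0.0:
        return np.zeros(3)               # outside the margin: no force
    # Push the hand away from the surface, scaled by margin penetration.
    return k * (d_safe - d) * (diffs[i] / d)
```

Because the force comes from the sensed geometry rather than a contact sensor, the operator receives feedback before the manipulator ever touches the equipment.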

More

Virtual Haptic Fixture Tools

Record of Invention Number: 46853

21 Feb 2014

Methods for Underwater Haptic Rendering Using Non-contact Sensors

Record of Invention Number: 46396

7 Feb 2013

Virtual Fixtures for Subsea Technology

Record of Invention Number: 46397

7 Feb 2013
