Helping the Blind Feel a City

Computer scientists in Greece are incorporating haptic technology into touchable maps for the blind. To create a map, researchers shoot video of the subject, either an architectural model of a building or a real city block. Software evaluates the video frame by frame to determine the shape and location of every object, and the resulting data is converted into a three-dimensional grid of force fields, one for each structure. Using a haptic interface device, a blind person can feel these forces and, combined with audio cues, form a much clearer picture of a city's or building's layout.
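
The article doesn't spell out the researchers' algorithm, but a rough sketch of the general idea, assuming a simple occupancy grid and a spring-like penalty force (the grid size, stiffness and helper names below are illustrative, not the Greek team's actual system), might look like this:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

# Illustrative sketch: structures detected in the video become occupied cells
# in a coarse 3-D grid, and a penalty force pushes the haptic probe back
# toward free space whenever it penetrates a structure.

GRID = 64          # cells per axis (assumed resolution)
CELL = 0.05        # metres per cell
STIFFNESS = 300.0  # N/m, spring constant of the penalty force

occupied = np.zeros((GRID, GRID, GRID), dtype=bool)
occupied[10:20, 10:40, 0:15] = True   # one "building" from the reconstruction

# For every occupied cell, find the nearest free cell so we know which way
# to push the probe out (zeros are the targets of the distance transform).
_, nearest_free = distance_transform_edt(occupied, return_indices=True)

def probe_force(position_m):
    """Force (N) to send to the haptic device for a probe at position_m."""
    idx = np.clip((np.asarray(position_m) / CELL).astype(int), 0, GRID - 1)
    if not occupied[tuple(idx)]:
        return np.zeros(3)                      # free space: no resistance
    exit_cell = nearest_free[:, idx[0], idx[1], idx[2]]
    push = (exit_cell - idx) * CELL             # vector back to free space
    return STIFFNESS * push                     # spring-like repulsion

# Example: the probe is inside the building, so it feels a restoring force.
print(probe_force([0.7, 1.0, 0.3]))
```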

Applications of Haptic Technology

It's not difficult to think of ways to apply haptics. Video game makers have been early adopters of passive haptics, which takes advantage of vibrating joysticks, controllers and steering wheels to reinforce on-screen activity. But future video games will enable players to feel and manipulate virtual solids, fluids, tools and avatars. The Novint Falcon haptics controller is already making this promise a reality. The 3-D force feedback controller allows you to tell the difference between a pistol report and a shotgun blast, or to feel the resistance of a longbow's string as you pull back an arrow.
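
As a rough illustration of the kind of force a game loop might compute each frame for that bowstring and those recoil effects (the spring law, constants and function names are assumptions for demonstration, not Novint's SDK):

```python
# Hypothetical per-frame force feedback for the longbow and gunfire examples.
# A real game would pass these forces to the controller vendor's SDK.

DRAW_STIFFNESS = 60.0   # N per metre of draw (made-up bow characteristic)
MAX_DRAW = 0.7          # metres of usable draw length

def bowstring_force(draw_distance_m):
    """Resistance (N) the player should feel for a given draw distance."""
    draw = max(0.0, min(draw_distance_m, MAX_DRAW))
    return DRAW_STIFFNESS * draw        # Hooke's law: pull farther, feel more

def recoil_impulse(weapon):
    """Short force pulses that make a pistol and a shotgun feel different."""
    profiles = {
        "pistol":  {"peak_newtons": 4.0, "duration_ms": 15},
        "shotgun": {"peak_newtons": 9.0, "duration_ms": 40},
    }
    return profiles[weapon]

print(bowstring_force(0.5))      # -> 30.0 N of resistance at half a metre
print(recoil_impulse("shotgun"))
```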

Graphical user interfaces, like those that define Windows and Mac operating environments, will also benefit greatly from haptic interactions. Imagine being able to feel graphic buttons and receive force feedback as you depress a button. Some touchscreen manufacturers are already experimenting with this technology. Nokia phone designers have perfected a tactile touchscreen that makes on-screen buttons behave as if they were real buttons. When a user presses a button, they feel it move in and then back out, and they hear an audible click. Nokia engineers accomplished this by placing two small piezoelectric sensor pads under the screen and designing the screen so it could move slightly when pressed. Everything -- movement and sound -- is synchronized perfectly to simulate real button manipulation.
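
A minimal sketch of that press-in, press-out behaviour, with placeholder functions standing in for the real display driver and audio layer (these are assumptions, not Nokia's actual APIs):

```python
import time

def fire_piezo_pulse(amplitude, duration_ms):
    """Placeholder: drive the piezo pads under the screen for duration_ms."""
    time.sleep(duration_ms / 1000.0)

def play_click(pitch_hz):
    """Placeholder: play a short click through the phone's speaker."""
    pass

def on_button_down():
    # Pulse and sound start together so the finger and ear agree.
    fire_piezo_pulse(amplitude=0.8, duration_ms=10)   # "movement in"
    play_click(pitch_hz=1200)

def on_button_up():
    fire_piezo_pulse(amplitude=0.5, duration_ms=8)    # "movement out"
    play_click(pitch_hz=900)

on_button_down()
on_button_up()
```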

Although several companies are joining Novint and Nokia in the push to incorporate haptic interfaces into mainstream products, cost is still an obstacle. The most sophisticated touch technology is found in industrial, military and medical applications. Training with haptics is becoming more and more common. For example, medical students can now perfect delicate surgical techniques on the computer, feeling what it's like to suture blood vessels in an anastomosis or inject BOTOX into the muscle tissue of a virtual face. Aircraft mechanics can work with complex parts and service procedures, touching everything that they see on the computer screen. And soldiers can prepare for battle in a variety of ways, from learning how to defuse a bomb to operating a helicopter, tank or fighter jet in virtual combat scenarios.

Haptic technology is also widely used in teleoperation, or telerobotics. In a telerobotic system, a human operator controls the movements of a robot that is located some distance away. Some teleoperated robots are limited to very simple tasks, such as aiming a camera and sending back visual images. In a more sophisticated form of teleoperation known as telepresence, the human operator has a sense of being located in the robot's environment. Haptics now makes it possible to include touch cues in addition to audio and visual cues in telepresence systems. It won't be long before astronomers and planetary scientists actually hold and manipulate a Martian rock through an advanced haptics-enabled telerobot -- a high-touch version of the Mars Exploration Rover.
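
A minimal sketch of such a bilateral teleoperation loop, with the position-forward, force-back channels, the gains and the simulated rock all invented for illustration:

```python
import numpy as np

POSITION_GAIN = 1.0     # how far the robot moves per metre of hand motion
FORCE_GAIN = 0.5        # how strongly remote contact is reflected back

class RemoteArm:
    """Stand-in for the far-away robot touching a rock surface at x >= 0.1 m."""
    def measured_force(self, commanded_pos):
        penetration = commanded_pos[0] - 0.1
        if penetration <= 0:
            return np.zeros(3)                          # not touching yet
        return np.array([-800.0 * penetration, 0, 0])   # stiff rock surface

def teleoperation_step(operator_pos, robot):
    robot_target = POSITION_GAIN * operator_pos     # forward channel: position
    contact = robot.measured_force(robot_target)    # remote force sensing
    return FORCE_GAIN * contact                     # feedback channel: force

arm = RemoteArm()
for x in (0.05, 0.10, 0.12):
    felt = teleoperation_step(np.array([x, 0.0, 0.0]), arm)
    print(f"hand at {x:.2f} m -> operator feels {felt[0]:.1f} N")
```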

On the next page, we'll take a look at how haptic technology has grown in importance and is becoming essential in some applications.