Visteon Arms Drivers with Gestures to Control Nearly Everything Onboard

By Oliver Kirsch, Innovation Project Manager

When a new technology is introduced in a vehicle cockpit, it usually requires a new set of controls. That means drivers must divert their attention from the road to manage yet another touch screen option or group of buttons. However, a recent advancement in gesture technology offers a seemingly magical way to control systems—from media players and ventilation to windows and glove boxes—in a manner that allows drivers to keep their eyes steadily on the road.

This new approach, called Time of Flight (ToF) Gesture Control, uses a 3-D sensor, a camera and infrared light from several LEDs to detect gestures made by the driver or passenger. Similar technology is employed in the Microsoft Kinect 3-D gaming camera and Google Tango motion tracking, and is now appearing in the automotive space.

Time of Flight refers to the physical principle on which the system is based. It measures the distance between the sensor and the gesturing hand from the time the infrared light takes to travel from the LED source to the hand and back to the sensor (i.e., its time of flight).
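The principle reduces to a single relationship between travel time and distance. The sketch below illustrates the basic pulse-based calculation; production ToF cameras typically modulate the LED light and measure a phase shift per pixel, but the underlying math is the same.

```python
# Illustrative sketch of the Time of Flight distance principle.
# Real ToF cameras usually measure a phase shift of modulated light
# rather than timing a single pulse, but the relationship is the same.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(t_seconds: float) -> float:
    """Distance to the object, given the light's round-trip travel time.

    The light covers the sensor-to-hand gap twice (out and back),
    so the one-way distance is half the total path length.
    """
    return SPEED_OF_LIGHT * t_seconds / 2.0

# A hand roughly 1.5 m from a headliner-mounted camera reflects the
# light back in about 10 nanoseconds:
print(round(distance_from_round_trip(10e-9), 3))  # -> 1.499 (meters)
```

The tiny times involved are why ToF sensing needs specialized hardware: resolving centimeters requires timing light to within tens of picoseconds.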

Typically, the ToF camera is mounted in the vehicle headliner so it has a 3-D view of the center portion of the instrument panel and console. The sensor is exceptionally accurate in detecting the position of not just hands but also each finger. Much like the way sign language can be translated into words, ToF interprets finger, hand and arm motions or static hand poses to command various functions inside the car. The video below demonstrates how it all works.

With ToF, any surface, or even a plane in the air, can suddenly perform like a mechanical button or touch screen. By touching certain areas of the surface, the driver can activate infotainment and other systems. The surface itself is not issuing these commands; they are initiated by the 3-D sensor picking up the reflected infrared light and the associated processing unit.
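As a rough sketch of how such a virtual button could be resolved in software (the zone names, coordinates, and threshold below are hypothetical illustrations, not Visteon's actual implementation), the processing unit can test the fingertip position derived from the distance image against predefined regions of the surface:

```python
# Hypothetical sketch: resolving a fingertip position from the 3-D
# sensor into a virtual button on an instrument-panel surface.

from typing import Optional, Tuple

# Button zones as (x_min, x_max, y_min, y_max) in surface coordinates
# (centimeters); the names and layout are illustrative only.
BUTTON_ZONES = {
    "volume": (0.0, 4.0, 0.0, 4.0),
    "fan_speed": (5.0, 9.0, 0.0, 4.0),
    "hazard_lights": (10.0, 14.0, 0.0, 4.0),
}

def resolve_virtual_button(
    fingertip: Tuple[float, float, float],
    touch_threshold_cm: float = 0.5,
) -> Optional[str]:
    """Return the button under the fingertip, if it is touching the surface.

    `fingertip` is (x, y, z), where z is the height above the surface
    as derived from the ToF distance image.
    """
    x, y, z = fingertip
    if z > touch_threshold_cm:
        return None  # hovering in the air, not touching
    for name, (x0, x1, y0, y1) in BUTTON_ZONES.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

print(resolve_virtual_button((6.0, 2.0, 0.2)))  # -> fan_speed
print(resolve_virtual_button((6.0, 2.0, 3.0)))  # -> None (finger in the air)
```

Because the surface itself is passive, the same logic can be retargeted to any area in the camera's line of sight simply by redefining the zones.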

ToF can also be programmed so that touching a display issues one set of commands, while moving a finger in the air a few centimeters in front of it issues an entirely different set. A driver in an adjacent lane might mistake the ToF user for an orchestra conductor practicing for a concert: raising an extended hand increases the media player volume, while lowering it makes the music pianissimo. The same gesture applied to the climate control raises or lowers the fan speed or temperature. A “stop” gesture pauses a function, while a hand wave overhead turns the dome lights on or off.
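One way to realize this “same gesture, different function” behavior is a dispatch table keyed on both the recognized gesture and the subsystem the driver is currently addressing. The gesture and command names below are hypothetical illustrations, not the production gesture set:

```python
# Hypothetical sketch: one gesture maps to different commands
# depending on which subsystem the driver is addressing.

GESTURE_COMMANDS = {
    ("raise_hand", "media"): "volume_up",
    ("lower_hand", "media"): "volume_down",
    ("raise_hand", "climate"): "fan_speed_up",
    ("lower_hand", "climate"): "fan_speed_down",
    ("stop_pose", "media"): "pause",
    ("wave_overhead", "cabin"): "toggle_dome_lights",
}

def dispatch(gesture: str, context: str) -> str:
    """Look up the command for a recognized gesture in the active context."""
    return GESTURE_COMMANDS.get((gesture, context), "no_op")

print(dispatch("raise_hand", "media"))    # -> volume_up
print(dispatch("raise_hand", "climate"))  # -> fan_speed_up
print(dispatch("stop_pose", "climate"))   # -> no_op (not mapped)
```

Keeping the mapping in data rather than code also makes it easy for an automaker to tune or localize the gesture vocabulary without touching the recognition pipeline.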

Images: upper left, Time of Flight camera amplitude image; upper right, Time of Flight camera distance image augmented with a feature overlay (segmented hand, finger lines and arm graph); lower left, overhead view of the 3-D point cloud of a segmented hand; lower right, gesture interaction in front of the center information display.

ToF further enables surfaces within the vehicle to act as virtual buttons. Any area within the infrared system’s line of sight can become a smart surface, activating a function with the touch of a finger.

The real magic of Time of Flight becomes evident when drivers use it to control objects well beyond arm’s reach. Opening a hand toward the glove box opens its door; pointing an outstretched arm toward the passenger-side window opens or closes it.

Time of Flight technology is expected to significantly change the way drivers interact with their cars, opening new doors for human machine interaction. Similar to how finger swipes became intuitive for smartphone owners, Time of Flight gestures will become intuitive within the vehicle cockpit.

By the time the next generation of vehicles is designed in 2020, Time of Flight technology will likely be giving drivers a new, safer way to command and control their vehicles.

Oliver Kirsch applies over 20 years of automotive engineering experience to his current position, in which he investigates advanced camera technologies for interior applications with a focus on computer vision algorithms for Time of Flight based 3-D hand gesture recognition. Previously, he held roles in other areas of cockpit electronics including instrument clusters and head-up displays. Oliver earned a degree in electrical engineering from Bergische Universität Wuppertal in Germany.