Researchers have been trying all sorts of ways to obviate the need for motorists to touch auxiliary controls, including voice actuation for tasks such as operating climate controls, turning on the radio or even switching channels. Brian Byrne reports.
The latest approach comes out of a GM-backed programme at Carnegie Mellon University in Pittsburgh, and it uses hand gestures. Programme head Prof Tsuhan Chen believes an appropriate gesture, using an equivalent of sign language, could be a much better way of operating many auxiliary systems in a car.
The "gesture" pilot was developed using small, cheap video cameras to record a series of hand gestures. From that footage the researchers write software algorithms so that similar mini cameras can be used to recognise the gestures in the car.
The researchers began with very simple gestures, such as pointing with the index finger. Then they gradually expanded the "vocabulary" of the system to recognise things like an open palm and a fist. Chen says once an "alphabet" of gestures is in place, combinations can be used to do a wider variety of jobs.
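The article does not describe the team's software, but the idea of building commands from a small gesture "alphabet" can be sketched roughly as a lookup from gesture sequences to actions. Everything below is hypothetical: the gesture names, the command strings and the mapping are illustrative, not Carnegie Mellon's actual design.

```python
# Hypothetical sketch: combining a small "alphabet" of recognised
# gestures into commands for auxiliary car systems.

# Base alphabet of gestures the cameras are assumed to recognise
POINT, PALM, FIST = "point", "palm", "fist"

# Illustrative mapping from gesture sequences to commands;
# single gestures and combinations both carry meaning
COMMANDS = {
    (POINT,): "select highlighted item",
    (PALM,): "cancel",
    (FIST,): "confirm",
    (PALM, POINT): "radio: next channel",
    (FIST, POINT): "climate: raise temperature",
}

def interpret(gestures):
    """Map a recognised gesture sequence to a command.

    Unknown sequences are reported as unrecognised rather than
    acted on, since acting on a misread gesture would be unsafe.
    """
    return COMMANDS.get(tuple(gestures), "unrecognised")

print(interpret([PALM, POINT]))  # → radio: next channel
print(interpret([PALM, PALM]))  # → unrecognised
```

Expanding the "vocabulary", in this framing, simply means adding entries to the mapping, which is why Chen's alphabet-then-combinations approach scales to a wider variety of jobs without needing a new gesture for every task.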
Ed Schlesinger, co-director of the GM Collaborative Laboratory at Carnegie Mellon, says gesturing is much better than "fumbling with dials", which often distracts drivers. Among the issues still to be resolved is inadvertent operation of the system by passengers, and both Chen and Schlesinger envisage that the winning approach could well be a combination of voice and gesture commands.