The seeing car: how sensors and drivers become a great team

Sensors such as radar and laser systems measure precisely, react at lightning speed and do not get tired. Driver assistance systems already help to avoid collisions, and step by step they will be able to handle more situations. That is why, in the future, cars will also have to keep an eye on their drivers.

Sometimes it seems like a miracle that traffic flows smoothly at all. People use only a few of their senses when driving. Touch, taste and smell play no role, and hearing only a limited one. It is sight alone that we rely on to recognise obstacles and to measure distances and speeds.

Measure is exactly the right word. “People often get the relative speed between two cars very wrong,” says Karl-Heinz Glander, director of the automated driving programmes at the automotive supplier ZF. His team develops sensors and data fusion systems for the “seeing car”.
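
To put rough numbers on it (the figures are ours, chosen purely for illustration): if a car travelling at 130 km/h closes in on one doing 90 km/h, the relative speed is 40 km/h, roughly 11 metres per second, so a 50-metre gap disappears in under five seconds.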

The more technologies are employed, the better

“We are not trying to teach cars to see, but to recognise the relevant objects in their surroundings,” says Glander. Cameras, ultrasound, radar, infrared technology and laser sensors scan the area around the car to detect objects, exploiting every possibility that physics offers, even though some of these systems are still too expensive for mass production.

“A single sensor on its own can also deliver erroneous information,” says Glander. It is not the individual sensor but the sensor system as a whole that has to be one hundred percent reliable: “If the camera has a problem with glare, the radar reports, ‘I see a car!’ The camera then looks for the reason and answers, ‘I can’t see a thing because of the sun, better stop the car!’”
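
To make the idea of such a plausibility check concrete, here is a minimal sketch in Python; the sensor names, confidence values and the braking decision are illustrative assumptions for the example, not ZF's actual fusion logic.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One sensor's view of a possible obstacle (illustrative only)."""
    sensor: str          # e.g. "radar" or "camera"
    sees_object: bool    # did this sensor detect something ahead?
    confidence: float    # self-reported reliability, 0.0 .. 1.0

def fuse(radar: Detection, camera: Detection) -> str:
    """Combine two detections into a driving decision.

    If the radar reports an object while the camera is blinded (low
    confidence), the system trusts the radar rather than ignoring it.
    """
    if radar.sees_object and camera.sees_object:
        return "brake"                 # both agree: obstacle ahead
    if radar.sees_object and camera.confidence < 0.3:
        return "brake"                 # camera blinded, e.g. by sun glare
    if radar.sees_object != camera.sees_object:
        return "warn_driver"           # sensors disagree, hand back control
    return "continue"                  # nothing detected

# Example: the radar sees a car, the camera is dazzled by low sun.
print(fuse(Detection("radar", True, 0.9), Detection("camera", False, 0.1)))
# -> "brake"
```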

Driver and assistance system need to communicate

Emergency braking systems that react when the driver's attention lapses are widespread today, and lane departure warning systems are becoming increasingly common: if the car drifts over a lane marking without the indicator being set, vibrations simulate rumble strips or the car steers itself straight back into the lane.
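
As a rough illustration of that decision logic, here is a small sketch; the threshold, signal names and the split between a warning and corrective steering are assumptions made for the example, not a description of any particular product.

```python
def lane_keeping_action(distance_to_line_m: float,
                        indicator_on: bool,
                        corrective_steering_available: bool) -> str:
    """Decide how to react as the car approaches a lane marking.

    distance_to_line_m: lateral distance to the lane boundary in metres
                        (negative once the marking has been crossed).
    """
    if indicator_on:
        return "none"                  # lane change is intentional, stay quiet
    if distance_to_line_m > 0.3:
        return "none"                  # safely inside the lane
    if corrective_steering_available:
        return "steer_back_into_lane"  # assist nudges the car back
    return "vibrate_steering_wheel"    # warning only: simulated rumble strip

# Example: drifting over the line with no indicator set.
print(lane_keeping_action(-0.05, indicator_on=False,
                          corrective_steering_available=False))
# -> "vibrate_steering_wheel"
```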

In difficult situations it must be clear who is in command: the system or the driver. Research is under way into the best combinations of visual, acoustic and haptic signals with unambiguous symbols. “It should not end up looking like an airplane cockpit,” says Glander. “It would be nice, of course, if the car could talk to us, as in ‘Knight Rider’. That may be less science fiction than one thinks.”

According to Glander, one of the next steps will be to use information about traffic lights and junctions: the car waits automatically before turning until the oncoming traffic has passed. The more the car can do, the more important it becomes that the driver keeps doing his part. Glander expects that once the automation of longitudinal and lateral control (speed and steering) is combined in the future, the car's interior will also have to be monitored with cameras and sensors. Even if advanced automation someday allows the driver to attend to other things during the ride, he must still be able to intervene at any moment.
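
To make the idea of interior monitoring concrete, here is a hedged sketch of how such a check might decide whether combined automation can stay engaged; the attentiveness flag, the four-second timeout and the takeover request are illustrative assumptions, not a description of ZF's system.

```python
def supervise_automation(driver_attentive: bool,
                         eyes_off_road_s: float,
                         hands_near_wheel: bool) -> str:
    """Decide whether combined speed-and-steering automation may stay engaged.

    An interior camera supplies an attentiveness flag and the time the
    driver's eyes have been off the road; a steering-wheel sensor reports
    whether the hands are within reach (all names are illustrative).
    """
    if driver_attentive and hands_near_wheel:
        return "automation_active"   # driver is supervising: keep driving
    if eyes_off_road_s > 4.0 or not hands_near_wheel:
        return "request_takeover"    # escalate with sound, display and vibration
    return "warn_driver"             # brief glance away: gentle reminder

# Example: the driver has been looking at a phone for six seconds.
print(supervise_automation(False, 6.0, True))
# -> "request_takeover"
```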

Getting out of the car as if stepping off a train

The aim is not to do away with the driver, but to avoid accidents caused by human error. Karl-Heinz Glander puts it this way: “If I drive for 500 kilometres without braking, accelerating or steering much, my leg and neck muscles stay relaxed. I get out of the car as if I had taken the train.” Driving for pleasure will still be possible in the future, on winding roads, perhaps along the coast. And Glander may be right when he says, “Driving for hours on the highway cannot be all that relaxing.”
