The unintended safety problems with partially automated vehicles

Monday, January 21, 2019

TORONTO — The weak link when it comes to partially automated cars might be you. Humans aren’t good at paying attention when it feels like there’s nothing much to do, in part because we’re too quick to trust computers to do things for us.

That presents a safety problem for current Level 2 (partially automated) vehicles, which combine lane-centring steering with adaptive cruise control. While such systems — including Tesla’s Autopilot and similar technology from Audi, BMW, Mercedes, Infiniti and others — likely help make cars safer, they also introduce new dangers.

“The danger is that people might tune out from the task of driving because they get misled that the system is better than it actually is. That’s over-trust in a system, and there is research showing that problem does exist,” said Krzysztof Czarnecki, professor of electrical and computer engineering and co-lead of autonomous driving research at the University of Waterloo.

Read the full story published in the January 18 edition of The Chronicle Herald.