Driverless cars

Following the news item that a pedestrian had been killed by a driverless car on test in the USA, Nic Houslip outlines his thoughts on driverless cars.

Posted: 180328

Photo: Uber
Driverless cars have worried me for some time. I know that the technical content of an autonomous vehicle is very complex, but mankind stands alone in its ability to learn and store apparently unrelated information and then use that information in a completely different context. Computers cannot do this: to decide, they must be told in advance what to look out for.

If you pull the Ethernet cable out of your PC and try to access a website, the error message you get will just tell you there is a loss of connectivity. The browser cannot know the cable is unplugged, because no one thought to write a few lines of code that looked for an unplugged Ethernet cable and add it to the browser software. And even then it would only work if the hardware engineers had provided a "bit" in an accessible register that flagged the unplugged cable as the problem.
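As it happens, on Linux the kernel does expose that hardware "bit": the physical link state of a network interface is published under `/sys/class/net/<iface>/carrier`. A minimal sketch of checking it (the interface name is my own illustration; this assumes a Linux system and an interface that is administratively up):

```python
from typing import Optional

def cable_plugged_in(iface: str) -> Optional[bool]:
    """Return True if the link is up, False if the cable appears
    unplugged, or None if the state cannot be read at all.

    On Linux the kernel exposes physical link state ("carrier")
    at /sys/class/net/<iface>/carrier: "1" means a link is
    detected, "0" means no carrier, e.g. an unplugged cable.
    """
    path = f"/sys/class/net/{iface}/carrier"
    try:
        with open(path) as f:
            return f.read().strip() == "1"
    except OSError:
        # Interface doesn't exist, or the link is down in a way
        # that makes the carrier file unreadable.
        return None
```

The point stands, of course: the information was always available in the hardware, but unless someone writes the few lines above and wires the answer into the error message, the software simply cannot "know".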
The best description of the problem came from a TV programme on artificial intelligence that I watched many years ago. A computer programmed to be heuristic and able to answer questions would struggle with the question "Victor is in New York; where is his left leg?" Unless it had been programmed to know that your left leg is attached to you, which we all know, mostly by instinct and from childhood play, it couldn't know where his leg was. And if Victor had a prosthetic leg, that would require a whole new approach in software.

It is interesting that the police and the developers are trying to fend off accusations by saying the lady was crossing the road between traffic lights. But that is just the sort of unexpected thing that pedestrians do. Nor does it say much for the test driver in the vehicle: he didn't expect or see her, probably because he wasn't paying attention. Which makes me think that if you are going to ride in a driverless car and you still need to pay attention to avoid the unexpected, why bother to have a driverless car at all?
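The "Victor's left leg" question can be made concrete with a toy knowledge base. The names and the single "part-of" rule below are my own illustration, not taken from any real system; the sketch just shows that the program only answers because someone encoded the rule in advance:

```python
# Hand-coded facts: where each known thing is located.
facts = {"victor": "New York"}

# Hand-coded rule data: a part belongs to an owner.
part_of = {"left_leg": "victor"}

def locate(thing: str):
    """Return the location of a thing, following 'part-of' links
    to the owner when the thing itself has no known location."""
    if thing in facts:
        return facts[thing]
    owner = part_of.get(thing)
    if owner is not None:
        # A part is wherever its owner is - but only because
        # someone wrote that rule down in advance.
        return locate(owner)
    return None  # the program simply has no idea

print(locate("left_leg"))   # "New York" - only because the rule exists
print(locate("wallet"))     # None - nobody encoded anything about wallets
```

Swap in a prosthetic leg that has been left at home, and the rule above gives the wrong answer; as I say, that would require a whole new approach in software.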

But getting back to the accident in Arizona, I quote from an article in EE Times, a respected electronics journal in the USA, under the headline "Robo-Uber: What Went Wrong". It says: "for the tech community, it is past time to start thinking about what could have prevented the autonomous car from killing a woman crossing the street. Footage of the collision shows that the self-driving car not only did not stop; it didn't even slow down when a human - whose movements were neither fast nor sudden - crossed its path". Mike Demler, senior analyst at The Linley Group, calls it "the million-dollar question". He asked: "Uber needs to answer - what was the purpose of the driver being on the road at that hour? Was it a night time test? Were the radar/lidar functioning? Is their software just totally incapable of responding to emergency conditions?"

Shocking to many automotive experts is that none of the sensors embedded inside Uber's self-driving car, a Volvo XC90 - including radars, lidars and vision - seemed to have their eyes on the road. Nor, as indicated by the fully functional driver-facing camera, did the so-called "safety driver". It would appear that there are many questions to answer on this subject, something I find rather worrying as a former electronic engineer with an understanding of the reliability of complex electronic systems.

You can read the article here. The readers' comments are interesting. More information here.
See an interesting item on the BBC Radio 4 programme Law in Action.