Technology
The road self-travelled
No-one is expecting the competence of Knight Rider, but there is still distance to cover on the road to autonomous vehicles
It’s a bright spring afternoon in Williston, Florida, when tragedy strikes. A tractor trailer turns across the highway, perpendicular to oncoming traffic, and Joshua Brown’s car does not brake. The windshield of his car impacts the bottom of the trailer, and the car passes underneath, veering off the road through two fences before finally colliding with a power pole. Tragically, Brown had not seen the white trailer against the bright sky. Neither had the car, a Tesla Model S electric sedan, which had been in Autopilot mode at the time of the accident. On the 7th of May 2016, Joshua Brown, 40, of Canton, Ohio, passed away, the first known fatality involving a self-driving vehicle.
The Autopilot feature of the Model S uses a forward-facing camera, a forward RADAR, ultrasonic sensors and GPS to facilitate navigation by identifying road markings, signs, objects and distances (NYT). Disabled by default, and making frequent checks to confirm that the driver’s hands remain on the wheel, the Autopilot system is, despite its name, currently only an ‘assist feature’ in a public beta test phase. While the exact details of the accident will likely not be clear until the National Highway Traffic Safety Administration (NHTSA) completes its investigation, statements made by Tesla and its CEO, Elon Musk, do provide some insight (Tesla). Due to the truck’s ride height and the positioning of Brown’s car, perpendicular to the trailer and headed between its wheels, the camera failed to visually identify the white truck against the bright sky, while the RADAR misidentified the truck as an overhead road sign. The car, as a result, failed to brake. Had the positioning been slightly different, the result, even in the event of a collision, would have been far less severe. The accident on May 7th may have been a combination of unlikely events, but autonomous navigation systems must be able to deal with the unfortunate and infinite variation of the road if they are to become the norm.
And they will see widespread, mainstream use. Autonomous vehicle testing is already permitted in parts of the UK and the US, as well as in the Netherlands, Germany and Japan, and both manufacturers and technology companies, such as Nissan and Google, have stated that they expect to produce entirely self-driving vehicles within the next decade (IEEE Spectrum). At this point self-driving vehicles are an inevitability. What’s left to judge is their practicality and liability.
Tesla’s Autopilot system requires explicit acknowledgement from the user, who assumes legal liability and is expected to be prepared to take control, with their hands on the wheel. While this accident was precipitated by the actions of the truck driver, it has reignited the debate around blame. Human fallibility is something we are all too familiar with. But who is to blame when a machine misjudges? Earlier this year, the first instance of an accident caused by an autonomous vehicle was recorded, when a Google Car managed to drive itself into a bus at a speed of less than two miles per hour (you can read all about the low-speed, low-stakes fender-bender in the DMV report). In California, vehicles being tested for self-driving capabilities are already required, in the event of a collision, to provide the Department of Motor Vehicles (DMV) with all sensor data obtained in the 30 seconds prior to the event (IEEE Spectrum). It seems certain that future lawyers and litigators will be defending and critiquing the safety standards of autonomous vehicles in excruciating detail, and the legal landscape will have to evolve to keep pace. The roads of the future will have self-driven and human-driven vehicles, both making mistakes.
It is safe to assume that Tesla will issue a software fix to specifically address the issues experienced in Brown’s accident. At the very least, it is all but certain that Tesla will adjust the RADAR’s behaviour around overhead signs. But what else should be done? Even with sophisticated image processing software, cameras are often susceptible to ambient lighting conditions. It’s one of the reasons that nearly all other autonomous vehicle projects incorporate LIDAR, a form of laser-based range-finding technology, into their sensor array. LIDAR provides high accuracy over substantial distances, and is also far less prone to lighting issues. This does, however, come at substantial, though decreasing, financial and power costs, and Musk has expressed skepticism regarding LIDAR’s necessity.
For full autonomy you’d really want to have a more comprehensive sensor suite and computer systems that are fail proof. That said, I don’t think you need LIDAR. I think you can do this all with passive optical and then with maybe one forward RADAR.
Elon Musk, CEO of Tesla
It is entirely possible that improvements to the existing sensor suite and navigation software will be sufficient. However, given the widespread industry acceptance of LIDAR, and the fact that it may have averted this particular accident, it seems worthwhile for Tesla to consider front-facing LIDAR to complement its existing sensor network. LIDAR aside, increased redundancy across sensors, systems and software processes is an upgrade that all autonomous vehicle manufacturers should be considering in order to continue improving safety standards (IEEE Spectrum).
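To make the redundancy point concrete, the toy sketch below contrasts two ways of combining a camera and a RADAR into a single brake decision. It is purely illustrative Python, not Tesla’s software: the sensor fields, classifications and thresholds are invented for the example. In the first, single-veto design, a RADAR return classified as an overhead sign suppresses braking on its own; in the second, more redundant design, a nearby return can only be dismissed if an independent sensor confidently rules out an obstacle.

```python
# Illustrative only: a toy comparison of two ways to fuse camera and RADAR
# readings into a brake/no-brake decision. The sensor fields, classes and
# thresholds are invented for this example.

from dataclasses import dataclass


@dataclass
class RadarReading:
    distance_m: float        # range to the detected object, in metres
    classified_as: str       # e.g. "vehicle" or "overhead_sign"


@dataclass
class CameraReading:
    obstacle_detected: bool  # did image processing find an obstacle?
    confidence: float        # 0.0 (no idea) to 1.0 (certain)


def brake_single_veto(radar: RadarReading, camera: CameraReading) -> bool:
    """One classification can dismiss everything else: a return labelled as
    an overhead sign suppresses braking even with no independent check."""
    if radar.classified_as == "overhead_sign":
        return False
    return radar.distance_m < 30.0 or camera.obstacle_detected


def brake_redundant(radar: RadarReading, camera: CameraReading) -> bool:
    """Dismissing a close RADAR return requires independent confirmation:
    if the camera cannot confidently rule out an obstacle, brake anyway."""
    if radar.distance_m < 30.0:
        camera_rules_it_out = (not camera.obstacle_detected
                               and camera.confidence > 0.9)
        if radar.classified_as == "overhead_sign" and camera_rules_it_out:
            return False
        return True
    return camera.obstacle_detected


if __name__ == "__main__":
    # A bright sky washes out the camera; the RADAR sees something close
    # but labels it an overhead sign.
    radar = RadarReading(distance_m=25.0, classified_as="overhead_sign")
    camera = CameraReading(obstacle_detected=False, confidence=0.3)
    print("single veto brakes:", brake_single_veto(radar, camera))  # False
    print("redundant brakes:  ", brake_redundant(radar, camera))    # True
```

The particular numbers are beside the point; the design principle is that no single misreading should be able to veto a safety response on its own.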
Earlier this year, the Autopilot system was described by Tesla CEO Elon Musk as “probably better than a person right now.” Statistically speaking, he is not wrong. The safety record of self-driven vehicles is better than that of manually driven vehicles: Brown’s fatal accident was the first in 210 million km of active Autopilot driving, compared with a US average of one fatality per 145 million km and a worldwide average of one per 100 million km (IEEE Spectrum). There will undoubtedly be other autonomous accidents on the road ahead, but self-driving systems, already safer than we are, will also continue to improve. A software fix prompted by a single accident has the potential to make every other car safer. Would that we humans could learn from these tragedies too.
If you are interested in the topic, it is worth checking out the IEEE Spectrum Self-Driving Section.

Nikhil Mathew is a Sydney-based writer and the creator of the Prolix zine. He first published this on 06 Jul 2016 in Prolix.