In a 2016 article in the Newsletter about driverless cars, I noted that Google co-founder Sergey Brin had predicted that driverless cars would be available to consumers by 2017. When this failed to transpire, Ford claimed that it would be selling a self-driving vehicle with “no gas pedal” and “no steering wheel” by 2021.
Halfway through 2021, every car sold is still designed to be controlled primarily by a human driver. And whilst many things can be blamed on the pandemic, the failure to achieve the hubristic timelines for getting autonomous automobiles on the roads cannot be attributed to a strain of Covid.
Although Tesla claims that it’s been manufacturing cars with driverless hardware since the end of 2016 – and demonstrated their impressive capability in a video in 2019 – the features are currently considered to be only Level 2 autonomous. Level 2, on a scale of 0 to 5 (as defined by the Society of Automotive Engineers), means that the car can automatically steer, accelerate and brake – but the human driver must constantly supervise these functions and be ready to take control at all times. There have been multiple instances of drivers being prosecuted for failing to stay attentive, or even climbing into the passenger seat, whilst Autopilot mode is operational. And there have been several fatal accidents.
One of the few truly driverless cars currently in use on public roads is the taxi service run by Waymo, a subsidiary of Alphabet – and perhaps Brin’s attempt to atone for his woeful predictions. But it’s limited to a meticulously mapped area of roughly 50 square miles in the Phoenix Metro area, in optimal weather conditions.
What’s the hold-up?
A few years ago, many people thought that regulatory approval would pose the most significant roadblock to realising the vision of a driverless future. However, despite the likes of Tesla motoring along with autonomous ambitions, it now seems that it’s the technological constraints which are keeping things in the slow lane. Although autonomous systems work pretty well on motorways on a dry clear day, city roads and inclement weather can confuse the AI used in the technology. Poor lighting and reflections can impede the sensors, whilst complex roads, graffitied signs and any obstacles which have not been programmed for can throw the system out of whack. It can even get thwarted by orange cones.
The challenges facing driverless AI can be extrapolated to most AI technologies. The fact is that artificial intelligence is simply not very intelligent in its current forms. The “intelligence” only works under specific conditions which can be foreseen and programmed for. Ask a legal AI chatbot to order a pizza and it will be stumped. The problem with “stupid” AI in a driverless context is that it can be fatal – such as when it fails to distinguish between a brightly lit sky and a white tractor-trailer.
But despite Level 5 fully autonomous driverless cars (which work outside the Phoenix Metro area) still looking like a distant mirage, governments around the world have nevertheless been gearing up for the autonomous revolution by drawing up regulations.
The driving force of the law
In the UK, there are two noteworthy regulatory initiatives currently under way:
Automated Lane Keeping System
In a pragmatic move, legislators are introducing rules specifically dealing with Automated Lane Keeping System (ALKS) technology, essentially paving the way for autonomous features which are currently available. The consultation paper – Rules on safe use of automated vehicles on GB roads – notes that ALKS “will be the first commercially available system designed to enable the driver to safely hand over control to the vehicle” and proposes changes to The Highway Code to support its safe use. Interestingly, the proposals appear to provide for Level 3 automation:
“While an automated vehicle is driving itself, you are not responsible for how it drives, and you do not need to pay attention to the road … If the vehicle is designed to require you to resume driving after being prompted to, while the vehicle is driving itself, you MUST remain in a position to be able to take control. For example, you should not move out of the driving seat. You should not be so distracted that you cannot take back control when prompted by the vehicle.”
Considering that Teslas are currently only considered Level 2, it would appear that these ALKS proposals are attempting to create a regulatory environment which caters for future technological progress. But there is a danger that drivers may assume that a car sold with ALKS is Level 3 rather than Level 2, so it may be necessary for the final legislation to clarify this point.
Law Commission review
A wider regulatory framework for driverless cars is also being worked on by the Law Commission of England and Wales and the Scottish Law Commission, as part of a multi-stage consultation project which started in 2018 and is due to conclude at the end of 2021. Various issues have been considered by this project, including:
- categorisation of automated vehicles;
- safety of driverless cars;
- legal responsibilities related to autonomous vehicles, including civil and criminal liability for accidents; and
- regulation of driverless taxis, known as Highly Automated Road Passenger Services (HARPS).
Law Commission: Automated Vehicles consultation
Department for Transport: Rules on safe use of automated vehicles on GB roads
Society for Computers and Law: Autonomous Vehicles Consultation: are we in the right lane?
Image CC BY Mike Mackenzie via www.vpnsrus.com.