Fatal confusion

Image caption: Walter Huang, 38, was killed in a crash while using Tesla's Autopilot function.
The confusion between fully autonomous self-driving cars and those that simply offer driver assistance technologies is leading to
deaths on the road. Who is to blame, and what should be done about it?

Self-driving cars already exist and there will be more of them in future, but the chances are that you won't be driven by one any time soon. You may, however, already be using a car that can steer, brake or park by itself.
The fear is that the hype around driverless cars has led some drivers to test the limits of existing technology in ways that are downright reckless.

A Tesla driver in the UK, for example, was recently prosecuted for climbing into the passenger seat of his car while it was moving at around 40mph (64km/h) in motorway traffic. He was using Tesla's Autopilot, a system that does allow the car to accelerate, brake and steer by itself on major roads, but is not designed to completely replace a driver.

Media caption: Bhavesh Patel was filmed by a passenger in another car.

Other manufacturers such as Volvo, Mercedes-Benz and Cadillac offer similar systems, but none of them is designed to be fully autonomous.
Hands on the wheel

Using standard criteria established by the US engineering organisation SAE International, cars can be placed into six broad categories depending on the level of automation they contain.

They range from Level 0, where the vehicle is not automated at all, to Level 5, which means it can drive itself on all roads and in all conditions, making the human behind the wheel - and the steering wheel itself - redundant.

Current "driver assistance" systems are Level 2 in the jargon, and the driver is meant to keep his or her hands firmly on the wheel. But getting that message across has clearly been a challenge.
Image caption: Drivers who rely too much on driver assistance tech risk killing themselves and others.

Tesla's Autopilot in particular has been implicated in a number of high-profile crashes, two of them fatal.
The company denies claims that the Autopilot name itself encourages drivers to hand over control, and has rejected demands from the German government to stop using the term.

It says feedback from its customers has shown that "they have a very clear understanding of what Autopilot is, how to properly use it and what features it consists of".
Nevertheless, since 2016 Tesla's systems have warned drivers to keep their hands on the wheel, and can turn the system off entirely if they fail to do so.

That same year, Mercedes faced criticism over advertising that suggested its E-Class was a "self-driving car". It later withdrew the adverts, saying it did so in response to claims that customers could find them confusing.

Eliminating humans

Although ride-sharing firms like Lyft and Uber have been working hard on developing fully autonomous technology - as have many mainstream manufacturers - Level 5 cars are still some way off.
Waymo appears to be closer than most. Later this year, Google's sister company is planning to introduce a driverless taxi service in Phoenix, Arizona. Unlike several other autonomous taxi services being trialled around the world, this one will not need a safety driver in the car.

Image caption: China's Tencent is now licensed to test its self-driving car on public roads in Shenzhen.

But the service will only operate in a relatively small "geo-fenced" area of Phoenix that the company has intensively mapped. It is still, in effect, a test-bed.
There is a big step between this kind of limited service and something that can safely negotiate a densely populated mega-city in all weathers.

Test drive

"Testing and development is different from bringing onto the market," explains Prof Walter Brenner of the University of St Gallen in Switzerland and co-author of Autonomous Driving - How the Driverless Revolution Will Change the World.

"They are completely different worlds. The tests are useful because they show both the strengths and the limits of this technology, but they are just tests."

Media caption: Uber dashcam footage shows the moment before the fatal impact.

Earlier this year, a woman was killed in Arizona by an Uber test car being driven in autonomous mode.
It failed to stop when she moved into its path.

Clearly, despite all the research being carried out and the money being spent, there is still a lot of work to do before full autonomy becomes a safe, everyday reality.

Responsibility

Safety experts believe car companies need to take more responsibility for ensuring consumers don't make mistakes.

"Calling this kind of technology Autopilot… that's very misleading for consumers," says Matthew Avery of Thatcham Research - a group that tests vehicles on behalf of the UK insurance industry.
"They might think 'I just need to push this button and I can let the car drive'."More Technology of BusinessImage copyrightMagnum PhotosHe
also thinks manufacturers should take further steps to ensure the technology isn't abused, such as having cameras monitoring the driver
But he remains convinced that automation itself has vital safety benefits.There is already evidence that automatic emergency braking and
pedestrian detection systems are reducing the number of accidents
But more sophisticated systems can take that process a step further.

Media caption: Theo Leggett: 'The car brought us to a controlled halt'.

"What the best systems are doing is integrating lane control, stopping people veering out of their lane, with braking control and distance control.

"That can really help keep people out of trouble," he says.

'Harsh punishments'

Walter Brenner believes there's a need for drivers - and people selling cars - to be better educated about what semi-automated systems can do.
There is a risk, he concedes, that even with that knowledge some people might deliberately choose to let the technology do more than it should - to experiment with it, or even to show off. In those cases, he thinks, punishments should be harsh.

"There's a big difference between trying out a new feature on an iPhone and playing with technology in a car when you're travelling at 100km/h (62mph) on a public road," he says.

"Those people have to be punished because they're risking other people's lives."