Fleet News

US to investigate death of driver killed in Tesla on Autopilot mode

Tesla Model S

US transport authorities are to investigate after a driver was killed while driving a Tesla with Autopilot mode engaged.

A Tesla statement said: "What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.

"The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S.

"Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents."

The manufacturer's Autopilot system is currently in its 'beta' development phase, is disabled by default, and requires drivers to keep their hands on the wheel at all times.

This is the first death involving the Autopilot system, according to Tesla.



  • Winston - 01/07/2016 11:03

    Tragic loss of life because the driver relied on computers to "drive" the vehicle. Surely all safety features are there to enhance and support the driver's capabilities, not replace them?

  • Derek webb - 01/07/2016 11:26

    First of many "rare" circumstances that result in death or serious injury in a self-driving car. It's time somebody called a halt to this technological nonsense and spent the savings on training, or removing the licence of, poor/incompetent drivers.

  • Robberg - 01/07/2016 13:48

    Winston - you're spot on as, unfortunately, too many people, and I include mature adults in this, are happy to let technology take over their brain and common sense.

  • Buckets - 01/07/2016 15:02

    Every journey involves "extremely rare circumstances"; our brains just make adjustments and it is no longer an incident. Software cannot use its experiences of near misses to learn like our brains can. Stop this wasteful investment in an ultimate goal that is unnecessary and dangerous. Insurance companies should declare "never will these types of vehicles/systems be covered by our company". Having systems that add security layers is OK as it supplements the intelligence of the driver, and not the other way around.

  • Ste - 04/07/2016 11:36

    Until the investigation is over you should all stop making assumptions and judgements on what might have happened and whether the driver or the tech or something else was at fault.
