
Autopilot Causes First Fatal Accident For Tesla Driver

We have already spoken on TweaksForGeeks.com about the main risks involving self-driving cars. At some point, a self-driving car will be put into a situation where, in an unavoidable accident, it must choose between the pedestrian dying or the driver dying. Which is the right choice, if there even is a right choice in such a horrible situation?

Self-driving cars seem to be the future. But they took a fairly large blow in the past week, when it was announced that a self-driving car had caused the first fatal accident of its kind, killing the driver.


Tesla’s Autopilot

For those who do not know, Tesla cars are all equipped with a software feature called ‘Autopilot’, which uses a multitude of sensors, cameras and telemetry to keep the car within the lines of the road and at a safe distance from the car in front (it can even change lanes at a flick of the indicator stalk). As it stands, it is the closest thing we have seen to a self-driving car on the road.

However, while Joshua Brown’s Tesla Model S was in Autopilot mode, it came up behind a lorry with a trailer. On a sunny Florida day, the light reflected off the trailer at such an angle that the cameras and sensors on the Model S presumed there was no trailer there at all. The Model S therefore sped up, assuming nothing was in front of it, and the front of the car slotted underneath the trailer, causing the first ever self-driving fatality.


Of course, this is an awful time for self-driving cars, especially given that it was software that caused Joshua’s death. However, Tesla have since said that a driver using the Autopilot feature needs to maintain control of the car and keep both hands on the wheel at all times. It is software still under development and should not be blindly trusted.

It may have been that Joshua had fallen asleep at the wheel and had no direct control over the car. It may have been that he was in control and the car still overrode his inputs. Either way, I think it is clear that Autopilot is software that is going to make people over-trust their cars – after all, if a car can drive itself, why does the driver need to be fully aware of what is going on around the car and on the road?


This is the main problem I have with self-driving cars. They invoke a sense of ‘I don’t need to concentrate as hard while driving’, no matter how much OEMs insist that drivers must stay alert. We saw the first example of this when cruise control was introduced. Now that we have self-driving cars, we are going to see this unnatural trust in software far more often.
