My morality says both are accountable: the driver, and Tesla. Tesla for damage caused by their system, and the driver if he does not retake control of the vehicle when given the chance.
But does the driver have a reasonable chance, with an adequate timeframe, to regain control?
Like in the Boeing 737 MAX MCAS incidents: Boeing expected the pilot to disengage the trim motor within a mere 4 seconds, which, according to one pilot, was “a lot to ask in an overwhelming situation,” or something similar.
Normal people in a soon-to-crash situation are likely to freeze for a second or two before the fear kicks in. How the driver reacts next is hard to predict. Yet at the speeds most US drivers love to go (from what I’ve seen, 70+ mph is the norm on the freeway), the time available to make a well-thought-out decision is, I’d guess, quite short.
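To put rough numbers on that (the speeds and reaction times here are my own back-of-envelope assumptions, not measured values):

```python
# Back-of-envelope: how far does a car travel while the driver is
# still reacting? All inputs are assumptions for illustration.

MPH_TO_MPS = 0.44704  # miles per hour -> meters per second

def distance_during_reaction(speed_mph: float, reaction_s: float) -> float:
    """Distance in meters covered before the driver even starts to act."""
    return speed_mph * MPH_TO_MPS * reaction_s

for reaction_s in (1.0, 2.0, 4.0):  # assumed freeze + decision time
    d = distance_during_reaction(70, reaction_s)
    print(f"70 mph, {reaction_s:.0f} s reaction: ~{d:.0f} m traveled")
```

Even a two-second freeze at 70 mph eats up roughly 60 m of road before any corrective input happens.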
In my head, the reason is not specifically to punish the driver, but to make drivers always stay aware and ready to take control again. Yes, 100 people will have 1000 different ways to react to such a software error, but you need people to pay attention, and in law the only way to enforce that is punishment. Obviously this needs to be well calibrated, but either you have multiple lines of defense (the software, the driver, maybe even additional safety features) or you have to remove the autonomous system.
It doesn’t matter; for practical purposes, you can’t make people pay attention as if they were driving without the actual engagement of driving. There is going to be a delay in taking over, and in a lot of cases, by the time the human is effectively in control, it won’t matter.
Imagine you are going along a straight road with not too much traffic; the speed limit is high and you are enjoying it. Suddenly your assistance software decides to turn your steering wheel hard to the left.
You will have no chance.
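A rough sketch of why (the 0.3 g swerve and ~1 m of lane clearance are assumed numbers, nothing measured):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def time_to_drift(lateral_g: float, lateral_dist_m: float) -> float:
    """Seconds to drift a given lateral distance under constant
    lateral acceleration: d = 0.5 * a * t^2  =>  t = sqrt(2d / a)."""
    return math.sqrt(2 * lateral_dist_m / (lateral_g * G))

# Assumed: a hard swerve pulling ~0.3 g, ~1 m before you cross the lane line.
print(f"~{time_to_drift(0.3, 1.0):.2f} s to leave the lane")
```

That comes out well under a second, faster than most people’s startle response, let alone a corrective steer.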
What have you done wrong? What is it that you are accountable for?
You made me think about this for a second.
So, did the car think there was an impending collision? That should be obvious in the logs, and it’s the only reason for sudden maneuvers.
Cars do not think LOL