Self driving car kills pedestrian.

fastpedaller
Posts: 2119
Joined: 10 Jul 2014, 1:12pm
Location: Norfolk

Re: Self driving car kills pedestrian.

Postby fastpedaller » 22 Mar 2018, 2:25pm

Having now looked at the video... I'd have at least slammed the brakes on, maybe even swerved left as well. It's been reported that the Uber didn't brake at all. Some serious questions over this tech!

Vorpal
Moderator
Posts: 17316
Joined: 19 Jan 2009, 3:34pm
Location: Not there ;)

Re: Self driving car kills pedestrian.

Postby Vorpal » 22 Mar 2018, 2:45pm

Stevek76 wrote:Well if this was the UK potentially both, the individual is at fault but also the company might have insufficient hse practices. No idea how US law works here.

There is no US federal regulation that covers stuff like this. One must refer to Arizona law. In general, in Arizona pedestrians who are crossing outside a crosswalk (pedestrian crossing) should give way to vehicles. At the same time, the drivers of motor vehicles should generally give way to pedestrians in the roadway (whether or not they are using a formal crossing). There are similar laws in place in most states, and interpretation can vary significantly. In Midwestern states, the driver of a vehicle that hit a pedestrian in such circumstances would be held liable and penalised in some way (fine, points on licence, revocation of driving licence, etc.), but the pedestrian would probably be considered to have contributed to the incident through negligence (not using the pedestrian crossing). The proportion of fault assigned to the pedestrian would depend upon the circumstances. Some states have limitations on the amount of contributory negligence a vulnerable road user can be assigned.
“In some ways, it is easier to be a dissident, for then one is without responsibility.”
― Nelson Mandela, Long Walk to Freedom

User avatar
RickH
Posts: 4542
Joined: 5 Mar 2012, 6:39pm
Location: Horwich, Lancs.

Re: Self driving car kills pedestrian.

Postby RickH » 22 Mar 2018, 4:45pm

thirdcrank wrote:It's in this morning's Daily Telegraph business section that Toyota is taking what's spun as "time out" from the testing, largely to reduce the current stress on the test drivers. :roll:

Reading between the lines, the marketing bods are getting cold feet till they know which way public opinion goes.

(Sorry I can't link to the original.)

There was someone on Radio 4 earlier in the week being interviewed about the incident and car automation in general. They commented that partial automation, where you still have to pay attention, is actually more stressful and harder than being in complete control, because the person in charge tends to let their attention slip from the things they should still be doing as well as from the things they no longer need to bother with.

reohn2
Posts: 36400
Joined: 26 Jun 2009, 8:21pm

Re: Self driving car kills pedestrian.

Postby reohn2 » 22 Mar 2018, 4:51pm

Vorpal wrote:She didn't run out. She was pushing a bike with shopping on it. And she was most of the way across, as well. We cannot know at all what would have happened with a human fully in control of the car, but I expect in this particular circumstance, a human would have had an easier time recognising the pedestrian for what she was. Yes, it was dark, but that's what headlights are for. I would like to think that if a human did not recognise something in the road, they would slow down and be prepared to stop. That's what competent drivers should do. Even in the dark.

I don't think that this is a reason to halt the development. I suspect that the safety record of automated cars is better than human controlled ones, and will continue to improve. I do think that this incident deserves full investigation, in order to prevent similar incidents in the future.

I agree.
I'll trot out this personal story once again.
On an unlit dual carriageway at 2am on a very dark night, no moonlight and cloud cover, I saw the faintest red glow appear then disappear on my side of the road some 400 to 600m up the road. I flicked on my main beam only to see a ninja cyclist pedalling toward me in my lane, i.e. his wrong side of the road.
I indicated, slowed down and moved over into lane two to avoid a collision.
The red glow was a fag in his gob, and I concluded it got brighter when he took a drag!

I'm confident a capable and alert driver would've seen this woman crossing in front of the vehicle in enough time to avoid a collision, and I'm equally confident the driver wasn't capable or alert and was letting the car drive itself whilst she did something else!

That doesn't mean driverless cars are doomed, but it does highlight the fact that the technology isn't perfected yet.
Last edited by reohn2 on 22 Mar 2018, 5:16pm, edited 1 time in total.
-----------------------------------------------------------
I cycle therefore I am.

User avatar
gaz
Posts: 13745
Joined: 9 Mar 2007, 12:09pm
Location: Kent, car park of England

Re: Self driving car kills pedestrian.

Postby gaz » 22 Mar 2018, 5:00pm

gaz wrote:The human eye sees much better in poor light/darkness than a camera, ...

Link to picture at the site, presumably at a similar time of day with a better camera. If that picture is a better representation of the lighting then IMO the driver failed to see a person on the road because of their own inattention, nothing more. I still can't give an opinion on the failure of the robot.
Hand wash only. Do not iron.

Mr Evil
Posts: 174
Joined: 21 Feb 2016, 11:42pm
Contact:

Re: Self driving car kills pedestrian.

Postby Mr Evil » 22 Mar 2018, 5:42pm

According to various people on the internet who claim to drive along that road, it's not as dark as the video. This comparison image* taken in the same location is purportedly more representative of what humans can see:
eSre3hL.jpg

If accurate, then it looks like the accident occurred because the car's sensors are hopelessly inadequate.

* Source: https://i.imgur.com/eSre3hL.png and https://www.youtube.com/watch?v=1XOVxSCG8u0
Last edited by Mr Evil on 22 Mar 2018, 5:48pm, edited 1 time in total.

brynpoeth
Posts: 11285
Joined: 30 Nov 2013, 11:26am

Re: Self driving car kills pedestrian.

Postby brynpoeth » 22 Mar 2018, 5:47pm

Was the vehicle really exceeding the maximum speed limit?
Entertainer, kidult, curmudgeon
Cycling-of course, but it is far better on a Gillott
We love safety cameras, we love life

Stevek76
Posts: 510
Joined: 28 Jul 2015, 11:23am

Re: Self driving car kills pedestrian.

Postby Stevek76 » 22 Mar 2018, 6:09pm

The picture is slightly misleading in that it suggests the dash cam footage was used as an input to the driving software, which would be very surprising. However, the lighting in those images seems a far more realistic representation of how it would have looked in reality.

The various lidar, radar, etc. sensors would certainly have picked her up, even in pitch black; clearly this was a processing and identification problem. An issue with AVs is getting them to tell the difference between something like this, which should be stopped for or avoided, and a plastic bag floating over the road, which should not. This will take much learning, and the whole point of the driver is to intervene when the system gets it wrong.*
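That identification dilemma can be sketched as a toy decision rule. Everything here is invented for illustration (the function, the class names, the confidence threshold); it is not anything from Uber's actual stack, just the shape of the trade-off: uncertainty should default to braking, not to carrying on.

```python
def should_brake(object_class: str, confidence: float, on_collision_path: bool) -> bool:
    """Decide whether to brake for a detected object.

    A conservative (fail-safe) policy brakes for anything on the
    collision path unless it is confidently identified as harmless.
    """
    if not on_collision_path:
        return False
    harmless = {"plastic_bag", "leaf", "exhaust_plume"}
    # Only ignore the object if we are *sure* it is harmless;
    # an uncertain classification falls through to braking.
    if object_class in harmless and confidence > 0.95:
        return False
    return True

# An uncertain detection on the collision path triggers braking:
print(should_brake("unknown", 0.30, True))        # True
# A confidently identified plastic bag does not:
print(should_brake("plastic_bag", 0.99, True))    # False
```

The hard engineering problem, of course, is that a rule this cautious brakes for every windblown bag, which is exactly the tuning pressure that can end with the threshold set the wrong way.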

I'm not sure what the various companies' policies are regarding what they expect of these drivers, or what training they get, but I'd bet a few quid that Uber comes bottom here: pays minimum wage and puts any old person behind the wheel just to tick that box. If that is the case (and it rather fits their general MO) then it's probably short-sighted from a development point of view as well.

Frankly, the quality of the dash cam footage seems bad even for dash cams. It looks like output from a cheap fleabay model from a few years ago. If you've got a car with many $$$s of sensor tech on it, is it really that much more to spend a few hundred on some good cameras? My old Galaxy S2 in a windscreen holder took better videos in the dark than that! Or, edging into the conspiracy side, did Uber digitally downgrade the footage before providing it to the police, to make it look less incriminating?

*Side note on this, but Google is clearly using its CAPTCHA service to help with this: note how it often asks you to identify road signs, vehicles, etc. Every time you do those you're helping its image recognition work, just as in the past they were used to help process the parts of old books that the OCR was struggling with.

User avatar
661-Pete
Posts: 9183
Joined: 22 Nov 2012, 8:45pm
Location: Sussex

Re: Self driving car kills pedestrian.

Postby 661-Pete » 22 Mar 2018, 9:30pm

Whatever emerges from the arguments - I suggest that all self-driving motor vehicles, whatever their purpose, in whatever country they're operating (are there any in Britain? In Europe?) should be grounded forthwith. No exceptions. That's my view, and I don't say this lightly.

Speaking as someone who's spent nearly 40 years of his working career in embedded software - for much of that time in safety-critical systems (not transport-related) - I have to say that no software - at least no software as complex as this must be - can be absolutely 100% bug-free. I've allowed bugs to 'escape' in the systems I've worked on. Although most are detected at the development and testing phases, it's possible for a bug to remain in the delivered product. Luckily, none of my mistakes have led to any deaths or injuries. Not yet! Even in my retirement, I feel a heavy burden of responsibility, though. I may not be working on them any more, but many of the systems I did contribute to, are still in operation.

If it does emerge that a bug 'escaped' in the Uber system, and it's pinned down to a specific software engineer, or team of engineers, they won't be sleeping easily in their beds!
Suppose that this room is a lift. The support breaks and down we go with ever-increasing velocity.
Let us pass the time by performing physical experiments...
--- Arthur Eddington (creator of the Eddington Number).

User avatar
RickH
Posts: 4542
Joined: 5 Mar 2012, 6:39pm
Location: Horwich, Lancs.

Re: Self driving car kills pedestrian.

Postby RickH » 22 Mar 2018, 10:42pm

661-Pete wrote:Whatever emerges from the arguments - I suggest that all self-driving motor vehicles, whatever their purpose, in whatever country they're operating (are there any in Britain? In Europe?) should be grounded forthwith. No exceptions. That's my view, and I don't say this lightly.

Speaking as someone who's spent nearly 40 years of his working career in embedded software - for much of that time in safety-critical systems (not transport-related) - I have to say that no software - at least no software as complex as this must be - can be absolutely 100% bug-free. I've allowed bugs to 'escape' in the systems I've worked on. Although most are detected at the development and testing phases, it's possible for a bug to remain in the delivered product. Luckily, none of my mistakes have led to any deaths or injuries. Not yet! Even in my retirement, I feel a heavy burden of responsibility, though. I may not be working on them any more, but many of the systems I did contribute to, are still in operation.

If it does emerge that a bug 'escaped' in the Uber system, and it's pinned down to a specific software engineer, or team of engineers, they won't be sleeping easily in their beds!

The trouble is that human drivers are not "bug-free" either. The evidence I have seen suggests that autonomous cars are far less likely to get it wrong than human-driven ones. If they are much safer, then what argument do you have for not "ground[ing] forthwith" all the human-controlled ones as well? Is it better to stop the autonomous cars & carry on as we are (over 1 million deaths a year worldwide), or to accept that they won't be perfect, and will steadily get better, but will cut the rate of incidents to a huge extent? (I would expect more than 90%, but even if it were only, say, 50%, would that not be an improvement?)

I suspect in this instance there will have been a particular set of circumstances where the software did, tragically, get it wrong. It may in some ways be like the fatality of the Tesla driver (who expected the Autopilot software to allow him not to pay attention, something it had expressly not been cleared to do). In that instance I understand the Tesla system failed to distinguish between the side of a white truck & the sky.

The big difference between autonomous systems & human driven ones is that the system can be modified so that all vehicles will not make the same error in future.

On the basics - knowing what traffic (including pedestrians & cyclists) is around, obeying road signs & speed limits, keeping a safe distance, etc. - the autonomous vehicles seem to have it pretty much nailed. It is a few "outliers" - fairly rare events - that the systems are largely having to be refined for.

brynpoeth
Posts: 11285
Joined: 30 Nov 2013, 11:26am

Re: Self driving car kills pedestrian.

Postby brynpoeth » 23 Mar 2018, 5:06am

661-Pete wrote:Whatever emerges from the arguments - I suggest that all self-driving motor vehicles, whatever their purpose, in whatever country they're operating (are there any in Britain? In Europe?) should be grounded forthwith. No exceptions. That's my view, and I don't say this lightly.

Speaking as someone who's spent nearly 40 years of his working career in embedded software - for much of that time in safety-critical systems (not transport-related) - I have to say that no software - at least no software as complex as this must be - can be absolutely 100% bug-free. I've allowed bugs to 'escape' in the systems I've worked on. Although most are detected at the development and testing phases, it's possible for a bug to remain in the delivered product. Luckily, none of my mistakes have led to any deaths or injuries. Not yet! Even in my retirement, I feel a heavy burden of responsibility, though. I may not be working on them any more, but many of the systems I did contribute to, are still in operation.

If it does emerge that a bug 'escaped' in the Uber system, and it's pinned down to a specific software engineer, or team of engineers, they won't be sleeping easily in their beds!

On the railways, systems are fail-safe: if a driver dies at the controls the train is stopped, and other trains are stopped too.
Is the fail-safe principle applied to software?
Entertainer, kidult, curmudgeon
Cycling-of course, but it is far better on a Gillott
We love safety cameras, we love life

Vorpal
Moderator
Posts: 17316
Joined: 19 Jan 2009, 3:34pm
Location: Not there ;)

Re: Self driving car kills pedestrian.

Postby Vorpal » 23 Mar 2018, 5:56am

brynpoeth wrote:On the railways, systems are fail-safe: if a driver dies at the controls the train is stopped, and other trains are stopped too.
Is the fail-safe principle applied to software?

It depends on the software & system. As a general rule, engineers are taught to consider whether fail-safe behaviour is appropriate. It is a requirement for safety systems. For example, an accelerator that fails should fail 'off' rather than 'on'. If the software design methods are robust, failures that end users experience are more likely to come from the mechanical and electrical components than from the software.
Induced failure testing (what happens when things go wrong) is an important part of evaluating safety-related systems. To go back to the accelerator example, if the sensor for an accelerator pedal stops working, what happens? How does the software detect this failure and respond?
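A minimal sketch of that fail-safe accelerator idea might look like this. The sensor names, plausibility limits and the limp-home behaviour are all invented for illustration; real throttle-by-wire systems are far more involved than this.

```python
IDLE = 0.0  # fail-safe state: no throttle demand

def read_throttle(sensor_a: float, sensor_b: float) -> float:
    """Return a throttle demand (0.0-1.0) from two redundant pedal sensors.

    Fail safe: any implausible reading forces the demand to idle
    ('off') rather than leaving it wherever it last was ('on').
    """
    for value in (sensor_a, sensor_b):
        if not 0.0 <= value <= 1.0:
            return IDLE              # out-of-range reading => sensor fault
    if abs(sensor_a - sensor_b) > 0.05:
        return IDLE                  # sensors disagree => fault
    return (sensor_a + sensor_b) / 2

print(read_throttle(0.40, 0.41))   # healthy: roughly 0.405
print(read_throttle(0.40, 0.90))   # disagreement: fails to idle
print(read_throttle(-0.2, 0.40))   # out of range: fails to idle
```

Induced failure testing is then just feeding the faults in deliberately, as in the last two calls above, and checking the system lands in the safe state.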
“In some ways, it is easier to be a dissident, for then one is without responsibility.”
― Nelson Mandela, Long Walk to Freedom

brynpoeth
Posts: 11285
Joined: 30 Nov 2013, 11:26am

Re: Self driving car kills pedestrian.

Postby brynpoeth » 23 Mar 2018, 5:59am

-1?!
Just going to drive the Autobahn to work, most of the throttles are jammed on there :(
Entertainer, kidult, curmudgeon
Cycling-of course, but it is far better on a Gillott
We love safety cameras, we love life

thirdcrank
Posts: 28648
Joined: 9 Jan 2007, 2:44pm

Re: Self driving car kills pedestrian.

Postby thirdcrank » 23 Mar 2018, 7:13am

Society is already used to the idea that motor traffic is not risk-free. All that's going to happen is a balancing act to see how much risk is acceptable with this developing technology. Cue for somebody to mention Adams Risk. Many users of these vehicles won't want them slowing to a crawl every time they approach some potential problem with vulnerable users. They'll expect the vehicle to be as selfish as they are, with the added advantage of being able to blame the vehicle if something goes wrong.

I could easily imagine the emergence of a two-tier system where your basic autonomous vehicle, including most goods vehicles, would be "deferential" to something more forceful. Emergency vehicles would need priority, but there will almost certainly be a method, formal or otherwise, for other important traffic to have priority. I'm thinking of a variation of the Russian migalka, the blue light fitted to the limos of important officials, also available to the self-important rich. No need for all that nee-naw, just a system for one car to tell another to get out of the *********ing road.

User avatar
gaz
Posts: 13745
Joined: 9 Mar 2007, 12:09pm
Location: Kent, car park of England

Re: Self driving car kills pedestrian.

Postby gaz » 23 Mar 2018, 8:44am

Hand wash only. Do not iron.