
General Chat


AVFCLaura


The car I'm driving at the moment (not literally) is semi-autonomous. It has adaptive cruise control, which will slow down and speed up if a car appears in front; it'll keep you in the lane by automatically steering the car, and change lane just by indicating; it also brakes far quicker than I can in an emergency. I can literally go on the M6 from Birmingham to Manchester, for example, without doing a thing, although you still have to hold the steering wheel, as it senses when you take your hands off, which at the moment isn't allowed.
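
To give an idea of the sort of decision the adaptive cruise control is making, here's a toy sketch in Python - purely illustrative, with made-up names and thresholds, not how any manufacturer actually implements it:

```python
# Toy sketch of the adaptive cruise control decision described above.
# Purely illustrative: real systems use radar/camera fusion and proper
# controllers, and every name and threshold here is invented.

SET_SPEED_MPH = 70         # the speed the driver asked for
SAFE_GAP_SECONDS = 2.0     # desired time gap to the car in front

def target_speed(lead_speed_mph=None, gap_seconds=None):
    """Return the speed to aim for, given what is (or isn't) in front."""
    if gap_seconds is None:                 # nothing detected ahead
        return SET_SPEED_MPH
    if gap_seconds < SAFE_GAP_SECONDS:      # too close: match the car in front
        return min(lead_speed_mph, SET_SPEED_MPH)
    return SET_SPEED_MPH                    # big enough gap: hold the set speed

print(target_speed(55, 1.4))   # slower car close ahead -> slow to 55
print(target_speed())          # clear road -> hold 70
```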

  • Like 1

1 hour ago, Genie said:

The software is exactly the issue. We’ve seen lots of examples where autonomous cars don’t recognise a lorry, or the sun’s reflection off the wet ground freaks them out. There are thousands of these scenarios waiting to be discovered every day.

 

Humans get dazzled by the Sun, get distracted, even fall asleep. That’s why I said “better than humans”, not “perfect”, is the right question to ask.

As an aside, software does not “fail”. It can’t. The issue with software “failure” is errors in compiling or writing it, or corruption. Pedantry, I know, but there are astoundingly rigorous processes and procedures for validation and verification of software at the highest design assurance level (DAL A). Ditto for the hardware used in safety critical functions.

None of which means it’s perfect, and the Boeing crash scandal exposed how, if those standards are not followed and the V&V is not properly done, very bad things can happen.

But then of course similar failings can make “normally driven” cars equally dangerous. Toyota and (was it?) the Ford Pinto are examples of that.

  • Like 2

I had a pretty average-Joe-accessible car 5 years ago, a first generation Hyundai Ioniq. If the thing could’ve schmoozed clients I could have sent it to meetings on its own.

It basically had a safety device in that it could sense if I’d taken my hands off the wheel for an extended period of time. Other than that, it stayed in lane and adapted its speed to road conditions. It decided when it needed the petrol engine or when to switch to electric. It had sat nav. It had auto braking and auto stop on collision detection. It clearly had lots of tech that was ‘close’ to being self-drive.

Having said that, heavy rain or snow and it basically flashed up a message that said ‘over to you, big boy, I’m out’.

 

  • Like 1

It must be a great weight, validating these systems - I've worked on a few software medical devices, and the idea of being responsible for a death if we missed a bug used to keep me up at night.

  • Like 1

10 minutes ago, blandy said:

Humans get dazzled by the Sun, get distracted, even fall asleep. That’s why I said “better than humans”, not “perfect”, is the right question to ask.

As an aside, software does not “fail”. It can’t. The issue with software “failure” is errors in compiling or writing it, or corruption. Pedantry, I know, but there are astoundingly rigorous processes and procedures for validation and verification of software at the highest design assurance level (DAL A). Ditto for the hardware used in safety critical functions.

None of which means it’s perfect, and the Boeing crash scandal exposed how, if those standards are not followed and the V&V is not properly done, very bad things can happen.

But then of course similar failings can make “normally driven” cars equally dangerous. Toyota and (was it?) the Ford Pinto are examples of that.

I thank those developers compiling or writing code every day for keeping me in wine and cheese. Without them I would have nothing to QA. 

  • Like 1

Look at the biggest tech companies in the world, Apple, Google, Microsoft etc, pushing patch after patch after patch at their products. Fixing bugs on the fly.

Cars losing their shit and driving into oncoming traffic because the temporary roadworks markings are freaking out the cameras… they’ll catch that in the next update.

The rear-end crash detection system getting lost when it should do something, but it’s conflicted by a data race condition on the ASIC and instead piles into the car; they’ll get that one next time too… and a hundred others.
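
As an aside, for anyone wondering what a data race actually looks like, here's a deliberately toy Python illustration - nothing to do with any real car ECU or ASIC, just the general idea of two threads tripping over shared data:

```python
# Contrived data race: two threads increment a shared counter without a lock.
# The read-modify-write below is not atomic, so increments can be lost and
# the final total often comes out wrong (and differs from run to run).
import threading

counter = 0

def bump(n):
    global counter
    for _ in range(n):
        value = counter      # read
        value += 1           # modify
        counter = value      # write (may overwrite another thread's update)

threads = [threading.Thread(target=bump, args=(100_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # expected 200000, usually less because updates were lost
```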

 

  • Haha 1

1 minute ago, Genie said:

Look at the biggest tech companies in the world, Apple, Google, Microsoft etc, pushing patch after patch after patch at their products. Fixing bugs on the fly.

Cars losing their shit and driving into oncoming traffic because the temporary roadworks markings are freaking out the cameras… they’ll catch that in the next update.

The rear-end crash detection system getting lost when it should do something, but it’s conflicted by a data race condition on the ASIC and instead piles into the car; they’ll get that one next time too… and a hundred others.

 

There’s a world of difference between personal device consumer or business software and safety critical software. Same for hardware.

Secondly, I don’t think anyone is saying it’s ready yet, at least I’m not, but it’s not too far off being better than a human in most conditions. When manufacturers believe it is good to go, it then has to pass the regulators. Then it’ll come into our world. In many ways regulatory approval is the biggest hurdle, because regulators are always terrified of something going wrong and of getting the blame.


6 minutes ago, blandy said:

There’s a world of difference between personal device consumer or business software and safety critical software. Same for hardware.

Not really. They think they have considered and tested against every scenario, but they haven’t, and have to patch on the fly.

It’ll be the same with cars. 


10 minutes ago, Genie said:

Not really. They think they have considered and tested against every scenario, but they haven’t, and have to patch on the fly.

Yes really. Completely different. If you want to know how different, google something like “DO-178B DAL A software”, then do the same for “DAL D” or “no DAL software”.

  • Like 1

35 minutes ago, blandy said:

Yes really. Completely different. If you want to know how different, google something like “DO-178B DAL A software”, then do the same for “DAL D” or “no DAL software”.

So you say, but you won't catch me in an aeroplane approaching midnight in the year 2999. You know the buggers won't have factored in a number 3 at the beginning.

I am assuming mankind has solved the ageing gene by then. 

Edited by sidcow
  • Haha 1

I've never worked in automated driving/plane tech, but I can tell you that there is absolutely no similarity between the quality gates in place for a medical device and those for a consumer web widget. Even when it comes to something "critical" like finance apps, there's a "suck it and see" philosophy that just does not fly when you're working with software which directly affects patient safety, and I expect things are even tighter with these products, where there could be mass casualties as a direct consequence of a software bug.

The triple-redundancy tech in planes is actually what we used as inspiration for some of our most critical prescribing and dosing logic - three completely separately designed algorithms written by different teams with no shared resources, all executed, and an exception if a single one of them didn't come up with the same result as the others.
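
To make that last pattern concrete, here's a minimal sketch in Python of the "three independent implementations must agree or we stop" idea - the function names and the dose calculation are invented for illustration, not our actual logic:

```python
# Minimal sketch of an agreement check across three independently
# written implementations: run all three and refuse to proceed unless
# they agree. Names and the dose calculation are invented stand-ins.

class RedundancyError(Exception):
    """Raised when the independently written implementations disagree."""

def dose_team_a(weight_kg, mg_per_kg):
    # Stand-in for team A's algorithm.
    return weight_kg * mg_per_kg

def dose_team_b(weight_kg, mg_per_kg):
    # Stand-in for team B's independently designed algorithm.
    return mg_per_kg * weight_kg

def dose_team_c(weight_kg, mg_per_kg):
    # Stand-in for team C's independently designed algorithm.
    return float(weight_kg) * float(mg_per_kg)

def agreed_dose(weight_kg, mg_per_kg, tolerance=1e-9):
    """Run all three implementations; raise if any one disagrees."""
    results = [f(weight_kg, mg_per_kg)
               for f in (dose_team_a, dose_team_b, dose_team_c)]
    if max(results) - min(results) > tolerance:
        # Any disagreement stops the workflow rather than guessing.
        raise RedundancyError(f"implementations disagree: {results}")
    return results[0]

print(agreed_dose(70, 0.5))  # 35.0 when all three agree
```

In the real thing each implementation was genuinely independent, with no shared resources; the point of the pattern is that one team's mistake can't silently produce a result on its own.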

The big tech companies essentially use their users as beta testers for their mobile devices, apps, productivity software, etc, and there's nowhere near the same emphasis on building in quality and safety from the ground up.

No software more complex than a single-purpose console app is bug free, of course, but there are orders of magnitude of difference in the levels of design and testing that go into assuring the safety of critical systems like this.

  • Like 2

1 hour ago, Genie said:

 

That's a pretty poor example, as that car didn't have "Pedestrian Detection" functionality. So is that the car's fault or the dumbass driver's?


1 hour ago, Tegis said:

That's a pretty poor example, as that car didn't have "Pedestrian Detection" functionality. So is that the car's fault or the dumbass driver's?

The engineers believed it was safe to use.


I've had some involvement with automated drones. In order to have widespread autonomous use, they will have to be able to demonstrate that they are as safe as commercial aircraft and that will likely need whole new frameworks for traffic control and accident investigation.

It took us three years to get legal approval to fly autonomously over one motorway on one route (not as a regular service). IIRC the final step to get the consent was automatic airbag and parachute deployment if the triple-redundant compute platform failed to achieve quorum. I think that brought us to one predicted failure per 1m flying hours. There are several more orders of magnitude of improvement required before they can be fully autonomous, but this was about three years ago, so I don't know how far it has progressed.
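
As a rough worked example of why the quorum arrangement matters - using an illustrative per-channel failure rate, not our actual numbers, and assuming the channels fail independently:

```python
# Back-of-envelope for why a 2-out-of-3 quorum helps.
# p is an ILLUSTRATIVE per-channel failure probability per flying hour,
# not the project's real figure, and channels are assumed independent.

p = 6e-4  # illustrative per-channel failure rate per flying hour

# Quorum is lost when at least 2 of the 3 channels fail in the same hour:
# P = 3 * p^2 * (1 - p) + p^3
p_loss_of_quorum = 3 * p**2 * (1 - p) + p**3

print(p_loss_of_quorum)       # ~1.08e-06 per flying hour
print(1 / p_loss_of_quorum)   # roughly one predicted failure per ~0.9m hours
```

Because quorum only fails when two channels go down together, the combined figure ends up orders of magnitude better than any single channel on its own.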

They'll be in cities within 10 years and probably flying in beacon-defined corridors by the end of the decade. The use cases will rarely be delivering parcels to people's houses, despite the media regularly getting excited about this.

  • Like 4

7 hours ago, Genie said:

You think cars will be able to perfectly read every single scenario and react perfectly? 

 

The selfishness, idiocy, arrogance and foolishness I see every day on the road from humans, compared to a possible once-in-a-while software bug - I'll take it, to be honest.

I was nearly driven into yesterday while I was stationary waiting to make a turn. That was caused by two people doing dumb things, and I would have been seriously injured if the car that nearly hit me had had slightly less wear left on its brakes.

What you have to remember is that if every car is fully automated, the software won't have to deal with humans acting like they are Nigel Mansell, or an 80-year-old woman deciding it's OK to pull out at 4 miles per hour into oncoming traffic.

  • Like 1

I took that video at face value, but it seems like something is odd about it, as nobody seems to know the people in it yet can confidently say it didn't have the optional extra for crash avoidance.

Who would do a crash test without knowing whether the car had the feature?

Anyway, luckily Volvo have more than one autonomous braking epic fail in their archive.

 


12 minutes ago, TheAuthority said:

What you have to remember is that if every car is fully automated, the software won't have to deal with humans acting like they are Nigel Mansell

They won't suddenly bring in automated cars and ban everyone from driving.

The automated car debate always forgets the political aspect of its potential implementation.

