Researchers Find a Malicious Way to Meddle with Autonomous Cars
What's a life?

Joined: Thu Apr 23, 2009 6:27 pm
Posts: 12251
Quote:
While automakers focus on defending the systems in their cars against hackers, there may be other ways for the malicious to mess with self-driving cars. Security researchers at the University of Washington have shown they can get computer vision systems to misidentify road signs using nothing more than stickers made on a home printer.


http://blog.caranddriver.com/researcher ... mous-cars/

_________________
All the best,
Paul
brataccas wrote:
your posts are just combo chains of funny win

I’m on Twitter, tweeting away... My Photos Random Avatar Explanation


Fri Aug 11, 2017 2:48 pm
Legend

Joined: Sun Apr 26, 2009 12:30 pm
Posts: 45931
Location: Belfast
Yeah, there's a lot of work to be done.

_________________
Plain English advice on everything money, purchase and service related:

http://www.moneysavingexpert.com/


Fri Aug 11, 2017 5:17 pm
What's a life?

Joined: Thu Apr 23, 2009 7:26 pm
Posts: 17040
In theory, properly autonomous cars barely need road signs at all. Signs are for people, because people forget the route, or don't know quite where they are, or what restrictions are in place at the next junction. Automated cars never forget any of that. The nav system would be pre-programmed with the information the sign is giving out, so the sign is effectively redundant. The only time you'd need to do sign recognition is on roads outside its navigation database (which by 2040 or whatever won't be many) or with temporary signage. I'd like to see them vandalise a matrix sign 30 feet up in the air on a slippery metal pole.

Any system can be broken. Literally. It's always a question of risk vs utility. If they can find a way to make an autonomous car misrecognise traffic lights, then maybe we can talk.


Sat Aug 12, 2017 12:17 am
What's a life?

Joined: Thu Apr 23, 2009 8:46 pm
Posts: 10022
I would have thought that a better way would be to have a central GPS map that gets updated, which is then downloaded as an update to all cars. I don't use Google Maps for satnav, but I gather it can use data from multiple users to build up a picture of traffic flow and hence suggest alternative routes. That is the sort of thing I envisaged - shared information - so your autonomous vehicle travels along the optimal route.
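Something like this, I imagine - a toy sketch only, with invented road segments and numbers, just to show the "many cars report, everyone downloads the shared picture" idea:

Code:
# Toy sketch: crowd-sourced speed reports per road segment, averaged into a
# shared traffic picture that every car could download. Segment names invented.
from collections import defaultdict
from statistics import mean

reports = defaultdict(list)   # segment -> speeds (mph) reported by passing cars

def report_speed(segment, mph):
    reports[segment].append(mph)

def average_speed(segment):
    return mean(reports[segment]) if reports[segment] else None

# Two cars crawl along the A57; one sails round the M60 alternative:
report_speed("A57 eastbound", 8)
report_speed("A57 eastbound", 11)
report_speed("M60 clockwise", 62)

routes = {"via A57": ["A57 eastbound"], "via M60": ["M60 clockwise"]}
best = max(routes, key=lambda r: min(average_speed(s) for s in routes[r]))
print(best)   # -> via M60 (the congested A57 is averaging under 10 mph)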

_________________
He fights for the users.


Sat Aug 12, 2017 6:19 am
What's a life?

Joined: Thu Apr 23, 2009 6:27 pm
Posts: 12251
jonbwfc wrote:
In theory, properly autonomous cars barely need road signs at all. Signs are for people, because people forget the route, or don't know quite where they are, or what restrictions are in place at the next junction. Automated cars never forget any of that. The nav system would be pre-programmed with the information the sign is giving out, so the sign is effectively redundant. The only time you'd need to do sign recognition is on roads outside its navigation database (which by 2040 or whatever won't be many) or with temporary signage. I'd like to see them vandalise a matrix sign 30 feet up in the air on a slippery metal pole.

Any system can be broken. Literally. It's always a question of risk vs utility. If they can find a way to make an autonomous car misrecognise traffic lights, then maybe we can talk.


There are some very complex junctions in some places. Either they would need to be simplified so that software can cope, or there would need to be some kind of markers in those locations to help autonomous vehicles find their way around, be in the right lane, etc.

Currently, though, it seems that the simple option is to have the software read existing signage, rather than rely on GPS to tell them what the signs say.



Sat Aug 12, 2017 8:31 pm
What's a life?

Joined: Thu Apr 23, 2009 7:26 pm
Posts: 17040
paulzolo wrote:
There are some very complex junctions in some places. Either they would need to be simplified so that software can cope, or there would need to be some kind of markers in those locations to help autonomous vehicles find their way around, be in the right lane, etc.

Even the most complex junction works according to rules set down in the traffic laws and the highway code. All of them do. That's not that complex. What makes them complex is that humans don't know the rules, or forget the rules, or just choose to ignore the rules because they are inconvenient. This is why the transitional phase between partial automation and full automation is so tricky - because you've got robot cars that ALWAYS obey the rules and humans who SOMETIMES obey the rules and that's a recipe for conflicts. The cars still don't have to 'understand' road signs even then though, they have to understand that humans don't always obey road signs and how to react to that. Which is itself incredibly complex, admittedly.

paulzolo wrote:
Currently, though, it seems that the simple option is to have the software read existing signage, rather than rely on GPS to tell them what the signs say.

Sorry, but no. I've been involved in some aspects of this stuff professionally in the past, and what you're suggesting might feel intuitively correct, but it's actually completely arse backwards. Computer vision is immensely complex, as is the AI to interpret the results ("Is that a thing? Is it a road sign? How far away is it? What does it say? What does that mean?"). As opposed to 'the database says the right-hand junction in thirty yards (according to the GPS) has priority for turning traffic'. Obviously the database needs to be complete and accurate, but we can nevertheless do that now. We can't do 'recognise road signs and react to them' now.
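To illustrate the difference (a deliberately trivial sketch - the road IDs and table contents are made up, and nobody's real system looks like this), the database approach boils down to a lookup:

Code:
# Hypothetical sketch: look up sign data for the road ahead in a pre-built
# map database, instead of recognising signs with a camera. The road IDs and
# entries below are invented purely for illustration.
SIGN_DB = {
    # (road_id, distance_along_road_in_metres): sign information
    ("A6042", 1200): {"type": "speed_limit", "value_mph": 30},
    ("A6042", 1500): {"type": "give_way", "note": "right-hand junction has priority"},
}

def upcoming_signs(road_id, position_m, lookahead_m=200):
    """Return the sign data the car will reach within lookahead_m metres."""
    return [
        info
        for (road, dist), info in SIGN_DB.items()
        if road == road_id and position_m <= dist <= position_m + lookahead_m
    ]

# GPS says we are 1,350 m along the A6042, so warn about the junction ahead.
print(upcoming_signs("A6042", 1350))
# -> [{'type': 'give_way', 'note': 'right-hand junction has priority'}]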

Computer vision (and interpretation of the results) has masses of uses - think about all the times you 'look at things' in a day and what information all those events give you - and implementations for driving are a nice, defined area to try stuff out in. However, they're just not the most viable solution to the 'roads have signs' problem. Knowing what the signs are meant to tell you in advance - even if, as has been said, that's 'downloaded from the internet just before you get there' rather than 'programmed in at the factory' - is still a much easier solution.

The fact is, for a large number of cities round the world, we already have that data anyway - it's Google Street View. It would actually be a very feasible task for someone (with a spare few hundred grand, say) to set up some cloud compute to 'walk' each city's streets in Street View, find any road sign images and grab them, then scan them with a dumb pattern-recognition system to figure out what they say. There you go: a database of the location and details of every road sign in, say, London - done.
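Very roughly, the pipeline would be something like the sketch below. (fetch_street_images() and detect_signs() are placeholders I've invented - the real work sits in whatever imagery source and pattern-recognition step are behind them, not in the glue.)

Code:
# Hypothetical sketch of the 'walk the streets and harvest the signs' idea.
# fetch_street_images() and detect_signs() are stand-ins, not a real API.
import json

def fetch_street_images(city):
    """Yield (lat, lon, image) tuples of street-level photos for a city.
    Placeholder: in reality this is your street-imagery source."""
    return []

def detect_signs(image):
    """Return a list of (sign_type, value) pairs found in one image.
    Placeholder for the dumb pattern-recognition step."""
    return []

def build_sign_database(city):
    db = []
    for lat, lon, image in fetch_street_images(city):
        for sign_type, value in detect_signs(image):
            db.append({"lat": lat, "lon": lon, "sign": sign_type, "value": value})
    return db

print(json.dumps(build_sign_database("London"), indent=2))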

In fact, it would really, really surprise me if Google weren't already doing that and feeding the data into their autonomous car program.


Sun Aug 13, 2017 9:59 am
Has a life

Joined: Fri Dec 13, 2013 6:44 pm
Posts: 90
Location: Polesworth
The problem is when the signs don't match the reality. A human can usually tell what to do pretty quickly, whereas the robot will likely just blindly follow what was "supposed" to be there.

_________________
I cannot remember the last time I forgot something ;)


Sun Aug 13, 2017 3:29 pm
I haven't seen my friends in so long

Joined: Fri Apr 24, 2009 6:37 am
Posts: 6954
Location: Peebo
Burn_IT wrote:
The problem is when the signs don't match the reality. A human can usually tell what to do pretty quickly, whereas the robot will likely just blindly follow what was "supposed" to be there.

Just don't mention the idiots who blindly follow their SATNav, especially trucks.

_________________
When they put teeth in your mouth, they spoiled a perfectly good bum.
-Billy Connolly (to a heckler)


Sun Aug 13, 2017 4:25 pm
What's a life?

Joined: Thu Apr 23, 2009 6:27 pm
Posts: 12251
Burn_IT wrote:
The problem is when the signs don't match the reality. A human can usually tell what to do pretty quickly, whereas the robot will likely just blindly follow what was "supposed" to be there.

Or what was there when the picture was taken. While things may not change between each photo run, they can, and updates to what's on the map need to happen when the signs, road markings, etc. change.



Sun Aug 13, 2017 6:28 pm
What's a life?

Joined: Thu Apr 23, 2009 7:26 pm
Posts: 17040
Burn_IT wrote:
The problem is when the signs don't match the reality.

In that case being able to read the signs won't help much.

Burn_IT wrote:
A human can usually tell what to do pretty quickly, whereas the robot will likely just blindly follow what was "supposed" to be there.

Well, no. Nobody is claiming that's ALL an autonomous car will do. Such vehicles will still have to be aware of their surroundings, because the universe is indeed infinitely variable. There's no road sign for 'child will run out into the road here', nor is that going to be on Street View. So they will still have RADAR/LIDAR and vision-based systems, and complex rules to follow given the data returned from those systems. And they most probably will have dynamically updated navigation systems - heck, your phone can do that for you today.

I was merely pointing out that spending a bunch of time and money getting cars to recognise and process road signs - which in the vast majority of cases convey data that is static over any reasonable timescale - is rather a poor example of anything to do with what autonomous cars will spend the vast majority of their time doing. And therefore sabotaging it probably isn't going to make that much difference to anything.

If you can muck up recognition of traffic lights, or make an autonomous car not obey a speed limit, then it'll be interesting.

As it is, the practical upshot of the story is something equivalent to them coming out and saying "hey, we can fix it so every time you hear the word 'archipelago' something will go wrong". Well, you know, I might just survive.


Sun Aug 13, 2017 7:45 pm
What's a life?

Joined: Thu Apr 23, 2009 8:25 pm
Posts: 10691
Location: Bramsche
You don't need to photograph the signs and create a database that way. The state and municipalities already know exactly what signs are where and how those streets are laid out. This is already in databases. When roadworks are planned, the signage usually has to be agreed in advance - the workers who set up the roadworks have to book out the signs and take them with them.

That covers 99% of signs. Accidents and mobile road/verge clearance are the exceptions.

That means that all the information about junctions, lanes for different exits, etc. is pre-defined. The car just needs this accurate information, which saves it from having to read most signs. Only emergency signage needs to be taken note of - but even that means that "hacking" the car through signs is still possible, just as using GPS blockers or GPS overrides to shift positioning can cause havoc now. The same is true for wireless tyre pressure sensors: they communicate with the central computer over an unencrypted link, and this can be hijacked to get into the system. You need to be very close - the range is only a couple of metres - but you can still do it on the move.

BMW had a major problem with the security of their mobile-data-enabled systems, allowing anyone to access the vehicle. That is a bigger problem today than reading signs.

I have driven a number of vehicles with sign recognition - my Qashqai does it, and I've driven a couple of Fords and Kias with similar systems. What is interesting is that the Nissan and Ford systems seem to use a combination of roadside cameras and the navigation system to display the correct signage on the cockpit display. The Kia, on the other hand, seems to rely on the signs, or its navigation database is not very accurate - it showed the wrong speeds and directions a lot more often than the others.

_________________
"Do you know what this is? Hmm? No, I can see you do not. You have that vacant look in your eyes, which says hold my head to your ear, you will hear the sea!" - Londo Molari

Executive Producer No Agenda Show 246


Sun Aug 13, 2017 8:46 pm
What's a life?

Joined: Thu Apr 23, 2009 8:25 pm
Posts: 10691
Location: Bramsche
I dug a little deeper. There are several problems with the story.

1. They couldn't get the level of access they needed to any existing sign-recognition machine learning system from the car manufacturers, so they wrote their own.
2. That system was deliberately not 100% accurate to start with (91% accuracy on signs taken from the US equivalent of the Highway Code).
3. You can't "hack" the car in any way (well, there wasn't any car to hack - it was a model running on a computer being fed JPEG images); the attack just confused the classifier.

By placing black and white stickers on a stop sign and moving them around, they got the model to mistake the stop sign for a 45 mph speed limit sign. They also managed to get a right-turn sign confused with an added-lane sign or a stop sign.

As I said in my previous post, this should not confuse a vehicle, because it should not be relying on posted signs, but comparing them to what is in the database. This means that, in general, the system is only looking for "emergency" signage, and fooling a vehicle into thinking the speed limit is 45 mph shouldn't affect it stopping at the junction, because it already knows the junction is there and it should be using a mixture of GPS plus mapping software, plus sign recognition, plus road markings.

In the case of the stop sign being mistaken for a 45 mph sign, you would need to ensure that the GPS and mapping software were disabled, and you would have to remove the road markings, in order for the system to ignore the fact that it should stop. The only thing that might happen is that you could get it to speed at 45 mph in a 30 mph zone after it has cleared the junction, for example - but the mapping software should let it know that there is a discrepancy there, and the vehicle should stick to the proper speed limit and, if it is done properly, query the master database as to whether an undocumented change in speed limit has been made there.
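That cross-check is trivial to express - a sketch only, with the report_discrepancy() hook and the numbers invented for illustration:

Code:
# Hypothetical sketch: reconcile a camera-read speed limit with the limit in
# the map database. The car trusts the map and flags the mismatch for review,
# rather than suddenly driving at whatever the doctored sign appears to say.
def report_discrepancy(location, seen_mph, mapped_mph):
    # Placeholder: a real system would query/notify the master database here.
    print(f"Discrepancy at {location}: camera saw {seen_mph} mph, map says {mapped_mph} mph")

def effective_speed_limit(location, seen_mph, mapped_mph):
    if seen_mph != mapped_mph:
        report_discrepancy(location, seen_mph, mapped_mph)
    return mapped_mph   # stick to the documented limit until the map is updated

# The doctored stop sign is 'read' as 45 mph in a mapped 30 mph zone:
print(effective_speed_limit("junction 12", seen_mph=45, mapped_mph=30))   # -> 30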

The researchers were at pains to point out that the work was done in a lab, that they did not use any real-world vehicles, and that they do not believe real autonomous vehicles can be fooled by the signs they tampered with. It was just a laboratory proof of concept, using a non-robust machine learning system they wrote themselves to prove the concept...
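For anyone curious what "non-robust" means in practice, here is a deliberately silly illustration - a nearest-template "classifier" and a couple of high-contrast patches that flip its answer. It is not the researchers' actual attack (they trained a neural network and optimised the sticker placement against it); it just shows why a brittle model's decision can be tipped over by a few stickers:

Code:
# Toy illustration only: a naive nearest-template 'classifier' with no
# robustness at all, and a few black/white 'sticker' patches that flip its
# output. The real paper attacked a trained convolutional network and
# optimised sticker placement; none of that is reproduced here.
import numpy as np

TEMPLATES = {
    "stop":           np.ones((32, 32)),   # stand-in for a bright red sign face
    "speed_limit_45": np.zeros((32, 32)),  # stand-in for a mostly white sign face
}

def classify(image):
    """Label an image by whichever template it is closest to, pixel-wise."""
    return min(TEMPLATES, key=lambda name: np.abs(image - TEMPLATES[name]).sum())

sign = np.ones((32, 32))           # a clean 'stop' sign
print(classify(sign))              # -> stop

# Slap a couple of sticker patches on it...
sign[4:18, 4:30] = 0.0
sign[20:30, 2:28] = 0.0
print(classify(sign))              # -> speed_limit_45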



Wed Aug 16, 2017 8:45 am
Doesn't have much of a life

Joined: Wed Apr 29, 2009 9:33 am
Posts: 667
In other words, a complete non-story...

_________________
UltraSonic f***erPhonic ZombieShockin TrailerRockin BabyBoomin GaitorGroomin InterStellar LadyRaiders


Wed Aug 16, 2017 8:54 am