Autonomous cars, drones cheerfully obey prompt injection by road sign

theregister.com

216 points by breve 3 days ago


falcor84 - 2 days ago

I should probably confess that as someone who lives in an area with a lot of construction work, I'm also very vulnerable to "prompt injection" when there's a person standing on the middle of the road holding a sign telling me to change course.

Klaus23 - 2 days ago

They are analysing VLM here, but it's not as if any other neural network architecture wouldn't be vulnerable. We have seen this in classifier models that can be tricked by innocuous-looking objects, we have seen it in LLMs, and we will most likely see it in any end-to-end self-driving model.

If an end-to-end model is used and there is no second, more traditional safety self-driving stack, like the one Mercedes will use in their upcoming Level 2++ driving assistant, then the model can be manipulated essentially without limit. Even a more traditional stack can be vulnerable if not carefully designed. It is realistic to imagine that one printed page stuck on a lamppost could cause the car to reliably crash.

cucumber3732842 - 3 days ago

One year in my city they were installing 4-way stop signs everywhere based on some combination of "best practices" and "screeching Karens". Even the residents don't like them in a lot of places so over time people just turn the posts in the ground or remove them.

Every now and the I'll GPS somewhere and there will be a phatom stop sign in the route and I chuckle to myself because it means the Google car drove through when one of these signs was "fresh".

_diyar - 3 days ago

Are any real world self-driving models (Waymo, Tesla, any others I should know?) really using VLM?

mgraczyk - 2 days ago

The headline seems false, should we change it? It doesn't look like they showed any case where any autonomous car or drone obeyed prompt injections

randycupertino - 3 days ago

> In a new class of attack on AI systems, troublemakers can carry out these environmental indirect prompt injection attacks to hijack decision-making processes.

I have a coworker who brags about intentionally cutting off Waymos and robocars when he sees them on the road. He is "anti-clanker" and views it as civil disobedience to rise up against "machines taking over." Some mornings he comes in all hyped up talking about how he cut one off at a stop sign. It's weird.