GhostStripe attack haunts self-driving cars by making them ignore road signs
Six boffins, mostly hailing from Singapore-based universities, have shown it's possible to interfere with autonomous vehicles by exploiting their reliance on camera-based computer vision, causing them to fail to recognize road signs.
The technique, dubbed GhostStripe [PDF], is undetectable to the human eye, but could be deadly to Tesla and Baidu Apollo drivers, as it fools the CMOS camera sensors employed by both brands.
It basically involves using LEDs to shine rapidly changing patterns of light onto road signs, so that the frames captured by a car's rolling-shutter CMOS camera are distorted in a way its sign classifier cannot cope with, while a human observer sees nothing amiss.
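To see why an invisibly fast flicker shows up as stripes on a rolling-shutter sensor, consider the minimal Python sketch below. It is not the researchers' code; the sensor resolution, per-row readout time, and LED cycling period are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of rolling-shutter striping under a flickering LED.
# All timing and resolution values below are illustrative assumptions.
import numpy as np

ROWS, COLS = 480, 640          # hypothetical sensor resolution
ROW_READOUT_US = 30.0          # assumed per-row readout time (microseconds)
LED_PERIOD_US = 2000.0         # assumed LED colour-cycling period

# The LED cycles through tints far too fast for the human eye to resolve.
LED_COLOURS = np.array([[1.0, 0.6, 0.6],   # reddish tint
                        [0.6, 1.0, 0.6],   # greenish tint
                        [0.6, 0.6, 1.0]])  # bluish tint

def rolling_shutter_frame(sign):
    """Expose each sensor row at a slightly later instant.

    Because the LED colour changes between row readouts, the captured
    frame ends up covered in horizontal colour bands that a human never
    sees, but that shift the pixel values fed to a sign classifier.
    """
    frame = np.empty_like(sign)
    for row in range(ROWS):
        t = row * ROW_READOUT_US                      # this row's readout instant
        phase = (t % LED_PERIOD_US) / LED_PERIOD_US   # position in the LED cycle
        colour = LED_COLOURS[int(phase * len(LED_COLOURS))]
        frame[row] = sign[row] * colour               # tint this row only
    return frame

# A flat grey "road sign" makes the striping easy to see numerically.
sign = np.full((ROWS, COLS, 3), 0.8)
striped = rolling_shutter_frame(sign)
print(striped[0, 0], striped[40, 0])  # adjacent bands carry different tints
```

Running the sketch prints different colour values for rows only a few dozen lines apart, which is the whole trick: the attacker tunes the flicker so the perturbation lands on the camera's frames, not on the human eye.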
Read more at theregister.com