Recent research has uncovered bugs and vulnerabilities in the Autopilot system included in Tesla cars, which could put hundreds of drivers worldwide at risk.
A group of Israel-based researchers found a flaw in which the automatic detection of objects in front of the car can make mistakes, causing the vehicle to stop suddenly.
The research focused mainly on the safety of Tesla Autopilot, as some traffic accidents involving this feature have already been reported. The researchers' aim was to identify which objects the Autopilot system can detect using its multiple cameras; according to the researchers, this information could be used to compromise Autopilot's operation by exploiting blind spots and other weaknesses in its detection.
Although Tesla tries to include the latest technological advances in its cars, it should be noted that its systems are not foolproof. A couple of years ago, a group of experts at Ben-Gurion University published an article entitled “Phantom of the ADAS”, focusing on the security of Advanced Driver Assistance Systems (ADAS) and detailing several flaws they had found.
In this research, the experts managed to trick the Autopilot system using light projections that appeared for only fractions of a second, confusing the operation of the ADAS.
The Ben-Gurion team also discovered that displaying images mimicking road signs or even pedestrian silhouettes on billboards puts Tesla drivers and nearby pedestrians at risk. As if that weren’t enough, this attack variant could leave very few traces with which to identify an attacker.
He is a cybersecurity and malware researcher. He studied Computer Science at Miami and started working as a cybersecurity analyst in 2008. He is actively working as a cybersecurity investigator and has also worked for security companies such as Cisco. His everyday work includes researching new cybersecurity incidents, and he has deep knowledge of enterprise security implementation.