In early October, Shawn Hudson crashed his Tesla Model S with Autopilot engaged while driving at 80 miles per hour, hitting the rear of a Ford Fiesta that had stalled in his lane. Although his car sustained extensive damage, Hudson escaped the accident relatively unhurt.

Yet Hudson was unimpressed that his car had protected him through its superior crashworthiness. He hired a lawyer and is suing Tesla, claiming that the company manipulates and dupes its customers, and that the way it markets its vehicles makes it seem as though they need “minimal input and oversight.”

In response to the suit, Tesla issued a statement: “We don’t like hearing about any accidents in our cars, and we are hopeful that those involved in this incident are recovering. In this case, the car was incapable of transmitting log data to our servers, which has prevented us from reviewing the vehicle’s data from the accident. However, we have no reason to believe that Autopilot malfunctioned or operated other than as designed. When using Autopilot, it is the driver’s responsibility to remain attentive to their surroundings and in control of the vehicle at all times. Tesla has always been clear that Autopilot doesn’t make the car impervious to all accidents, and Tesla goes to great lengths to provide clear instructions about what Autopilot is and is not, including by offering driver instructions when owners test drive and take delivery of their car, before drivers enable Autopilot and every single time they use Autopilot, as well as through the Owner’s Manual and Release Notes for software updates.”

“Tesla’s sales representative reassured Hudson that all he needed to do as the driver of the vehicle is to occasionally place his hand on the steering wheel and that the vehicle would ‘do everything else,’” the lawsuit claims. His lawyer, Mike Morgan, says that if the vehicle Hudson hit had been any larger than a Ford Fiesta, he would not have survived the crash. Morgan was also very critical of the Tesla manual, saying: “You can engage it over 50 miles an hour, but if you engage it over 50 miles an hour, it’s got trouble finding stationary objects and stopped cars. To me, that’s a big problem. To me, that means you’re selling nothing.”

The problem with Tesla’s and other manufacturers’ autonomous features is that they have a hard time differentiating objects on the road: the systems struggle to tell a plastic bag from another vehicle, or even from a pedestrian. To avoid braking constantly for every harmless object in the vehicle’s path, self-driving systems are programmed to ignore inputs they classify as false positives and continue on their way.
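To make this failure mode concrete, here is a minimal, hypothetical sketch of such a filter. Everything in it (the RadarReturn class, the 1 m/s threshold, the sign convention) is invented for illustration and is not Tesla’s actual code; the point is that a stalled car and a piece of road clutter can produce the same near-zero ground speed, so a filter tuned to suppress false positives discards both.

```python
from dataclasses import dataclass

# Hypothetical illustration only: names and thresholds are invented
# for this sketch and do not reflect any manufacturer's real code.

@dataclass
class RadarReturn:
    distance_m: float          # distance to the detected object, meters
    relative_speed_mps: float  # object's speed relative to the car, m/s

EGO_SPEED_MPS = 35.8  # roughly 80 mph, as in Hudson's crash

def is_likely_false_positive(ret: RadarReturn, ego_speed_mps: float) -> bool:
    """Flag returns that appear stationary relative to the road.

    A stopped car and a plastic bag both close at roughly the ego
    vehicle's own speed, so a filter like this cannot tell them apart:
    it discards both to avoid phantom braking.
    """
    ground_speed = ego_speed_mps + ret.relative_speed_mps
    return abs(ground_speed) < 1.0  # near-zero ground speed looks like clutter

# A stalled Ford Fiesta 60 m ahead closes at the ego vehicle's full speed...
stalled_car = RadarReturn(distance_m=60.0, relative_speed_mps=-35.8)
# ...and so does a plastic bag lying on the road.
plastic_bag = RadarReturn(distance_m=60.0, relative_speed_mps=-35.8)

for obj, label in [(stalled_car, "stalled car"), (plastic_bag, "plastic bag")]:
    if is_likely_false_positive(obj, EGO_SPEED_MPS):
        print(f"Ignoring {label}: indistinguishable from road clutter")
```

Run on both inputs, the filter reaches the same verdict for the stalled Fiesta and the plastic bag, which is exactly the gap Morgan’s criticism points at.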

Morgan also argued that those who drive Teslas and other self-driving vehicles in Autopilot mode are essentially guinea pigs, testing the technology on the manufacturers’ behalf. To make matters worse, drivers of other cars on the road, such as the Ford Fiesta that Hudson’s Tesla crashed into, never entered even this pseudo-agreement to take part in the testing of autonomous features. As the lawsuit proceeds, it may set a precedent for how autonomous cars are held liable in the future.