Autonomous vehicles used to be the stuff of science fiction, but the modern world is rapidly approaching the day when truly autonomous, self-driving cars will be on the roads. Vehicle manufacturer Tesla; Waymo, the self-driving subsidiary spun out of Google’s self-driving car project; and even graphics processor manufacturer NVIDIA are all racing to expand on existing semi-autonomous technology and be first to market with a truly driverless vehicle. While the thought of being carted around by one’s own personal autonomous chauffeur is certainly appealing, the laws surrounding self-driving cars remain murky in a number of ways.
One of the most persistent myths about modern semi-autonomous cars is that they eliminate the need for a human behind the wheel. This is far from the truth: while modern vehicles may offer collision detection and avoidance or lane-keeping assist, the vehicles on the road today require an attentive human driver no matter how advanced they are.
In reality, what we tend to think of as a “self-driving vehicle” sits pretty far down the scale of vehicle autonomy. SAE International defines six levels of driving automation, from Level 0 (no automation) to Level 5 (full automation). The majority of vehicles on the road today with autonomous capabilities are only Level 2, meaning they are partially automated and drivers need to keep their hands on the wheel at all times, ready to take control. Levels 4 and 5 are more akin to what is portrayed in films, television, and books, with almost no interaction required of the human passenger.
This matters because, until full automation is possible, liability for an accident that causes damage or injury is likely to fall on the driver, regardless of how much control they had at the time. This was the case in the now-infamous Arizona incident in which a woman was struck and killed by an autonomous Uber vehicle. Uber was not found liable, yet the safety driver still faces the possibility of manslaughter charges for being behind the wheel at the time.
Autonomous vehicles aren’t just attractive to consumers, either. The Trump administration has pushed for an exemption from standard safety regulations for the testing of semi-trucks with autonomous capabilities, signaling federal interest in the technology. However, the move would be short-sighted: the U.S. government is not yet ready to deal with the cybersecurity threats that would come with fleets of autonomous freight vehicles.
Self-driving cars rely on a complex network of sensors, computer systems, and other electronics to achieve semi-autonomy on the road, and any of these components could be compromised by a cybercriminal looking to cause harm or extort money from passengers. While those developing autonomous and semi-autonomous cars are taking care to build these systems securely, vulnerabilities will always exist and could be exploited in costly and dangerous ways.
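To make the vulnerability class concrete: many in-vehicle networks historically carried sensor messages with no authentication at all, so any component an attacker reaches can inject forged readings. The sketch below is a hypothetical illustration (not any manufacturer's actual protocol) of how a shared-key message tag lets a receiver reject a spoofed sensor frame; the key name and frame format are invented for the example.

```python
import hmac
import hashlib

# Hypothetical shared key; real systems would need secure key provisioning,
# which this sketch deliberately leaves out of scope.
KEY = b"demo-shared-key"

def sign_frame(payload: bytes) -> bytes:
    """Append an HMAC-SHA256 tag (32 bytes) to a sensor payload."""
    tag = hmac.new(KEY, payload, hashlib.sha256).digest()
    return payload + tag

def verify_frame(frame: bytes) -> bool:
    """Recompute the tag over the payload and compare in constant time."""
    payload, tag = frame[:-32], frame[-32:]
    expected = hmac.new(KEY, payload, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)

frame = sign_frame(b"speed=42")
assert verify_frame(frame)            # genuine frame is accepted

# An attacker who rewrites the payload but cannot forge the tag is caught:
tampered = b"speed=99" + frame[8:]
assert not verify_frame(tampered)     # forgery is rejected
```

Without something like the tag check above, the receiver has no way to tell the forged `speed=99` frame from a genuine one, which is why unauthenticated vehicle buses are a recurring security concern.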
The government already struggles to deal with security threats of this magnitude, and folding autonomous freight into the country’s vital infrastructure would only raise the stakes. Cyber attacks against the federal government are already frequent, and without proper funding and cybersecurity experts, we are a long way off from seeing the bulk of shipping within the continental U.S. conducted by self-driving vehicles.
Regulation and Legislation
While the march toward fully autonomous vehicles is an exciting one, surprisingly little regulation is in place on testing self-driving cars. What few federal guidelines exist are only voluntary, and the bulk of the regulatory onus is left to individual states, which may have lax standards. The future may be filled with autonomous vehicles that take the human element out of the equation entirely, resulting in safer travel, but companies shouldn’t be allowed to get us to that reality without appropriate oversight.
When it comes down to it, there are effectively two camps on getting semi- and fully autonomous vehicles on the road. One side argues for more stringent safety measures, while the other seeks to relax regulations in an attempt to get these vehicles on the road faster. Unfortunately, this has produced a prolonged stalemate on passing any legislation regarding self-driving cars, and as of now, very few laws pertain to them.
New technology has always forced legislative action when it comes to automobiles, and self-driving cars shouldn’t be an exception. Liability for accidents and DUI charges in ridesharing vehicles has come under scrutiny, as have red-light and speed cameras, which can be faulty and accuse innocent drivers of traffic violations. These seem like relatively small potatoes compared to the legislative action that should be taken regarding self-driving vehicles.
Though the future isn’t here quite yet, it is well on its way. We would be wise to ensure that laws are in place that take some of the guesswork out of liability when it comes to self-driving vehicles, their safety and security, and how we go about testing future iterations.