by Jurry Bajwah, Staff Writer
The technology world is abuzz over the manufacture, distribution, and implementation of self-driving cars. While self-driving cars may be our future, there is much ground to cover with respect to the nuanced legal issues that will ensue. To start, “self-driving” is an umbrella term covering both autonomous and automated vehicles. An autonomous vehicle is self-aware and capable of making its own choices; all driving decisions are made by the car. An automated vehicle, such as the current Tesla Autopilot, is automated to perform essential driving functions under the supervision of a human driver.
This distinction is important when answering questions of liability. Tesla has a limitation-of-liability clause, which states: “we’re not responsible for what lawyers call ‘incidental, special or consequential damages.’ If we have a disagreement, the most we’ll reimburse you is one monthly subscription payment.” This puts the onus of driving and all driving-related decisions on the driver in the current ecosystem of automated vehicles. However, the clause leaves open a major liability question: where the driver acted reasonably in a given scenario, but the software malfunctioned or failed to return control of the vehicle to the driver and an accident resulted, where will the fault lie? The traditional product liability framework allows victims to seek compensation through insurance claims or lawsuits. However, most technology companies have a history of resolving disputes in arbitration, many requiring arbitration as the sole avenue for relief.
The Tesla Death Tracker, a website that tracks Tesla accidents, puts the number of confirmed Tesla Autopilot deaths from 2013 to 2022 at 15; driver-induced Tesla accidents total 320 for the same period. One such death, that of Joshua Brown, raises another important liability question centered on harm or death to the human driver. In Mr. Brown’s case, a trailer turned left in front of his vehicle; the vehicle failed to stop, veered off the road, and struck two fences and a power pole.
SAE International, formerly the Society of Automotive Engineers, is a standards-developing organization headquartered 20 miles north of Pittsburgh, Pennsylvania. SAE International has defined six levels of vehicle automation, running from zero to five. At levels zero to two, the human driver monitors the driving environment; at levels three to five, the automated driving system monitors the driving environment. SAE International’s standards can help courts answer questions arising from accidents involving the differing levels of automation on the market.
As long as self-driving vehicles come with a steering wheel, brakes, and a command center, the burden of precaution lies with the driver. Truly autonomous vehicles, however, will likely give no control to the driver: the vehicle will lack a steering wheel, and human input will be reduced to entering a destination or requesting a basic service. In the coming years, tort law and contract law will likely have to expand to encompass liability in an increasingly automated world.