[Photo caption: An instrument panel indicates Autopilot is active inside a Tesla Model S. (Christopher Goodney/Bloomberg)]

On a dark Friday morning in November, a police officer began tailing a Tesla Model S traveling 70 mph on California’s Route 101, between San Francisco International Airport and Palo Alto.
The car’s turn signal was blinking, but it kept passing exits. The officer pulled up alongside and saw the driver slumped forward, head down, and guessed the car was driving itself under what Tesla calls Autopilot. The officer’s lights and sirens, however, failed to rouse the driver.

Every Tesla is equipped with hardware the automaker says could someday enable its vehicles to drive themselves on entire trips, from parking space to parking space, with no driver input. For now, Tesla limits the system to guiding cars from on-ramp to off-ramp on highways. In this case, the car kept driving, safely, with a seemingly incapacitated driver, but it did not know how to obey police sirens and pull over.
With no way to commandeer the car, the police improvised: while a patrol car blocked traffic from behind, the officer following the Tesla pulled in front and gradually slowed until both cars came to a stop.
The incident encapsulates both the hopes and anxieties of a driverless future. The 45-year-old Tesla driver failed a field sobriety test, according to the police, and was charged with driving under the influence. The car, which appears to have navigated about 10 miles of nighttime highway driving without the aid of a human, may well have saved a drunk driver from harming himself or others. Neither Tesla nor the police, however, is ready for people to begin relying on the technology in this way.
Drivers, according to Tesla’s disclaimer, are […]