I’ve been a fan of science fiction since I learned to read. I grew up tearing through Heinlein, Asimov, Herbert, and many other authors, and for decades I’ve been looking forward to taking a ride in a fully autonomous self-driving vehicle. Now, it’s finally happened—probably 10 years later than I originally expected in my youthful ignorance, but I was giddy as I sat in the foyer of the Hard Rock casino, waiting for my test drive.
Prepping for this year’s CES, I received a meeting request from Yandex—a company I had never heard of previously—offering exactly what I wanted. There’s no way I’m going to turn that down. It doesn’t have much to do with PC gaming, of course, except for the fact that most of the deep learning systems used to train these vehicles are powered by PC hardware. Yes, there are Nvidia GPUs and Intel CPUs in the trunk of the car. I’ll talk about the hardware more in a bit, but first, it’s time for my ride.
It was pretty cool. I got to press the “Let’s ride” button on a tablet in the parking garage, and off we went. The car navigated through Vegas like a reasonably cautious and respectful driver. Even more impressive is that the safety failsafe (i.e., the human) sitting in the passenger seat never once had to reach for the steering wheel.
That’s quite different from the two Lyft self-driving rides (powered by Aptiv) I grabbed this past week. First, Lyft won’t let you take pictures or videos (frankly, I was surprised Yandex did). Lyft also isn’t willing to go into as much detail about the hardware, as it’s all ‘proprietary’ stuff. But the biggest difference is that Lyft has two people sitting up front on each ride, basically overseeing the safety of the vehicle. One sits behind the wheel and is apparently required (per the agreement Lyft/Aptiv has made) to take manual control on all private property, meaning as soon as you enter casino grounds, it’s a human driving.
I couldn’t say which is technically better, but the Lyft/Aptiv BMW is undoubtedly a nicer car than the Yandex Prius. Beyond that, the Yandex demo was a lot closer to being fully autonomous, even if it followed a predetermined route. There was plenty of traffic, including vehicles and pedestrians, and no one was behind the wheel, which is apparently legal in Vegas.
I also like that the Yandex screens show more of what the vehicle ‘sees’ as sensible graphics. There’s a large white line showing the car’s intended path of travel. Cars, trucks, vans, buses, semis, etc. all show as gray rectangles of different lengths, crosswalks and traffic lights are visible, and pedestrians show up as well (you can see some pedestrians that got awfully close to the car on the crosswalk at the 3:14 mark in the above video, for example).
Yandex wasn’t doing any freeway testing this week, but that’s apparently easier than city driving. The car reached a maximum speed of around 45 mph, it was good about signaling and not tailgating, and it didn’t flip the bird or yell at any other cars. In that sense, it was a far cry from the taxis, Ubers, and Lyfts I’m used to. There was no gunning the accelerator and trying to wedge into a too-small gap, hoping the vehicle behind us would hit the brakes. Also, the driver didn’t get lost, thanks to the predetermined route.
As to the hardware and software behind the demo, there’s not a ton to say about the latter. Yandex has logged 1.75 million miles of autonomous driving tests to date to train its deep learning network, including testing in rain and snow, mostly in Moscow and Tel Aviv. But after the demo, I was more interested in hearing about the hardware running the cars.
There’s a dual-socket Intel server in the trunk of the car, with plenty of cooling to keep it happy even in hotter climates—like 100-degree summers in Tel Aviv. The main computational elements consist of three RTX 2080 graphics cards, plus dual Xeon 6230 CPUs with 20 cores/40 threads each. Unlike games, where multi-GPU is becoming increasingly unsupported, the calculations behind deep learning networks are easily farmed out over multiple GPUs (SLI and NVLink are not required). That’s not to say the RTX 2080s are doing all the work. Yandex says the workload is pretty evenly split between the CPUs and GPUs—some things run better on the CPU, others on the GPU, and both are necessary.
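The reason multi-GPU “just works” here, unlike in games, is that inference jobs are independent of each other: each camera frame can be processed on its own, so frames can simply be dealt out across the GPUs with no SLI/NVLink bridge tying them together. Here’s a minimal conceptual sketch in Python; the device names and frame labels are made up for illustration, not Yandex’s actual pipeline:

```python
# Conceptual sketch: farming independent inference jobs across GPUs.
# Device names and frames are hypothetical placeholders.

def partition_round_robin(frames, devices):
    """Assign each frame to a device in round-robin order."""
    assignments = {d: [] for d in devices}
    for i, frame in enumerate(frames):
        assignments[devices[i % len(devices)]].append(frame)
    return assignments

devices = ["gpu0", "gpu1", "gpu2"]            # e.g., three RTX 2080s
frames = [f"frame-{i}" for i in range(10)]    # ten camera frames
work = partition_round_robin(frames, devices)

# Each GPU gets an independent slice. No SLI or NVLink is needed
# because no frame's result depends on another frame's result.
for dev, batch in work.items():
    print(dev, len(batch))
```

This is the same reason the CPU/GPU split Yandex describes is workable: independent tasks can land wherever they run fastest, without the tight frame-to-frame coupling that makes multi-GPU rendering in games so fragile.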
Besides the computational hardware, there are six 720p30 cameras providing a 360-degree view of the vehicle’s surroundings, six radar units, and four lidar sensors. There’s also a GNSS sensor for GPS services, but that’s actually not that critical—GPS isn’t accurate enough for autonomous vehicles. Instead, the software determines the position of the vehicle through a combination of GPS, lidar, and stored maps.
For the demonstration vehicles in Las Vegas, Yandex created a 3D map of sorts with information on all the roads, buildings, crosswalks, traffic lights, etc. I think of it as a customized version of Google Maps, but it’s probably a lot more than that. Anyway, the GPS can get a rough location of where the car is, and then the data from the lidar is used to get an extremely accurate position of the vehicle within the map. Basically, I’m told the software knows within an inch or two where the car is.
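The GPS-plus-lidar approach described above can be illustrated with a toy sketch: start from a rough GPS fix, then search nearby positions for the one whose predicted distances to known map landmarks best match the lidar measurements. Everything here—the landmark coordinates, the measurements, the search window—is invented for illustration; real localization uses far denser lidar data and smarter solvers than a grid search:

```python
import math

# Toy localization: refine a rough GPS fix against a stored landmark map.
# All coordinates and measurements are made up for illustration.

landmarks = [(0.0, 10.0), (10.0, 0.0), (10.0, 10.0)]  # stored map (meters)
true_pos = (3.0, 4.0)
# Lidar provides accurate ranges to each landmark:
measured = [math.dist(true_pos, lm) for lm in landmarks]

gps_fix = (3.8, 3.1)  # rough GPS estimate, off by roughly a meter

def mismatch(pos):
    """Sum of squared errors between predicted and measured ranges."""
    return sum((math.dist(pos, lm) - r) ** 2
               for lm, r in zip(landmarks, measured))

# Grid-search a 2-meter window around the GPS fix at 2 cm resolution.
step = 0.02
best = min(
    ((gps_fix[0] + i * step, gps_fix[1] + j * step)
     for i in range(-50, 51) for j in range(-50, 51)),
    key=mismatch,
)
print(best)  # lands within a couple of centimeters of the true position
```

The GPS fix narrows the search to a small neighborhood, and the lidar-to-map matching does the fine positioning—the same division of labor as the inch-or-two accuracy Yandex describes.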
More than anything, I’m glad I finally got a proper self-driving car demo. I took a ride around Las Vegas in an autonomous Prius hybrid, navigating through a decent amount of traffic, and everything went smooth as butter for nearly 20 minutes—we were definitely running at way more than 60 fps! Cross that one off the bucket list.
Now I just need the tech to reach the point where I can actually buy one and use it for whatever travel I want. Because unlike me, the software running a car doesn’t get bored or tired, or distracted by a phone call or a text. Sure, the software is still in development and could have a glitch, but I could offer similar qualifications for any human driver. Give the algorithms more training data, better location detection, and faster and better hardware, and the future is clear.
I don’t think human drivers are going away soon, but the writing is on the wall. Unlike the science fiction of my youth, autonomous vehicles are here. We may not have hoverboards, and I still use a keyboard and mouse to surf through cyberspace instead of a brain jack, but in the not so distant future I expect to be able to tell my car where to go and then kick back and relax. My bet is that we’re starting down a path that will ultimately lead to safer roads and vehicles.
Ten years from now, if the choice for a new car is between an autonomous car and one I drive myself, but the ‘manual’ car has a statistically higher chance of getting in a wreck, I know which one I’ll want. My family might think I’m crazy, but then I’ve never really enjoyed driving—not when I could be playing games instead.
Jarred doesn’t play games, he runs benchmarks. If you want to know about the inner workings of CPUs, GPUs, or SSDs, he’s your man. He subsists off a steady diet of crunchy silicon chips and may actually be a robot.