How Tesla’s Autopilot Works: Understanding Its Technology, Limitations, and Future of Self-Driving Cars

Even the most ardent Elon Musk stan will admit their lord and savior hasn’t exactly delivered on his promise of properly autonomous self-driving vehicles yet. After all, Tesla’s flamboyant CEO has repeatedly suggested we’d be there by now, and frankly, we aren’t. But the so-called autopilot feature available on all modern Teslas is quite a marvelous feat of engineering nonetheless.

Understanding Tesla’s Autopilot

What enables it to detect hazards up ahead? How does it then go about making sense of that information? Is it better at driving than human beings? Join us today as we pop the metaphorical hood and take a peek into how autopilot actually works. For an eye-opening introduction to just how sophisticated Tesla’s driving algorithm has become of late, take a look at this video released back in January 2020. Merging input from an array of smart sensory devices, the car’s onboard computer is clearly capable of identifying and tracking its fellow vehicles on the road in real time. Not only that, it’s able to differentiate between an impressive range of other potential hazards besides.

Everything from lane lines and painted arrows to crossings, stop signs, trash cans, the incline of the road up ahead, and even random puddles is noted and addressed, crucially, at least as quickly as a human being could spot it. So how does it do it? At the most basic level, visual feedback is fed into the system via Tesla’s eight onboard cameras.

The Role of Cameras and Sensors

Three of these are mounted on the windscreen, each slightly different from the others and tuned for a different range. The car’s main front-facing camera is calibrated for visual recognition up to 150 meters. There’s also a wide-angled camera that can see more broadly, up to a range of 60 meters, and a narrow-field camera which peers into the distance as far away as 250 meters. In addition, there are four more cameras mounted along the sides of the vehicle: two are slotted rearward, and another two are angled forward for merging and maneuvering into tight spots. Finally, there’s a rear camera which itself boasts a range of up to 50 meters, used both as a run-of-the-mill parking camera and as another source feeding crucial situational data back to the central computer.
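To put rough numbers on that layout, here is a minimal sketch, assuming illustrative camera names and side-camera ranges the article doesn't quote, of the suite as a simple lookup table that can tell you which cameras could plausibly see an object at a given distance.

```python
# Illustrative sketch only: approximate nominal ranges (in meters) of the
# eight cameras described above. Names and structure are assumptions,
# not Tesla's actual software.
CAMERA_RANGES_M = {
    "main_forward": 150,
    "wide_forward": 60,
    "narrow_forward": 250,
    "left_rearward": 100,   # side-camera figures are assumed for this example
    "right_rearward": 100,
    "left_forward": 80,
    "right_forward": 80,
    "rear": 50,
}

def cameras_covering(distance_m: float) -> list[str]:
    """Return the cameras whose nominal range reaches an object at distance_m."""
    return [name for name, rng in CAMERA_RANGES_M.items() if distance_m <= rng]

if __name__ == "__main__":
    print(cameras_covering(120))  # e.g. ['main_forward', 'narrow_forward']
```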

These cameras, whose visual fields overlap, provide the necessary redundancy, which is a cornerstone of Tesla’s safety-focused design philosophy. But they are only part of the picture. A front-facing radar that detects objects up to 160 meters away by bouncing radio waves off of them is a key component in the sensor array. It’s reported that Tesla is currently planning to integrate a radar with twice that range into its newer models, with slicker processing capacity courtesy of cutting-edge radar design by Israeli tech startup Arbe Robotics.
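The "bouncing radio waves" part boils down to a time-of-flight calculation: measure how long an echo takes to return and halve the round trip. Here is a deliberately simplified sketch of that principle; real automotive radar uses FMCW techniques and is far more involved.

```python
# Simplified time-of-flight ranging: distance = (speed of light * round-trip time) / 2.
# Real automotive radars use FMCW processing; this just illustrates the principle.
SPEED_OF_LIGHT_M_S = 299_792_458

def echo_distance_m(round_trip_s: float) -> float:
    """Distance to a target given the round-trip time of a radio pulse."""
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2

if __name__ == "__main__":
    # An echo returning after ~1.07 microseconds corresponds to roughly 160 m,
    # about the quoted range of the front-facing radar.
    print(round(echo_distance_m(1.07e-6)))  # ~160
```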

Tesla’s fondness for radar is actually a controversial topic within the fledgling autonomous car industry. Most other companies working to bring similar vehicles to market—think Ford, GM, or Waymo—prefer so-called lidar, which is similar to radar except it bounces light off of objects in order to ascertain their distance and form. At Tesla’s Autonomy Day last year, Elon Musk offered this scathing critique of lidar technology: “Lidar is a fool’s errand,” he informed a rapt crowd. “Anyone relying on lidar is doomed. They are expensive sensors that are unnecessary. It’s like having a whole bunch of expensive appendices. Like one appendix is bad; well, now you have a whole bunch of them. It’s ridiculous.”

Tesla’s Stance on Lidar

Although lidar’s prohibitively high cost was one reason why radar was preferred by Tesla early on, Musk doubled down on his loathing for the medium in October, even as the price of lidar started to fall. “Even if lidar was free, we wouldn’t put it on,” he thundered, “not least because lidar is notoriously unreliable in rainy or dusty conditions.”

Teslas are also fitted with 12 ultrasonic sensors, the small dots situated around the car, each of which provides essential short-range sensory input up to about eight meters through the magic of ultrasound. This gives the car what has been described as a protective cocoon around the vehicle, enabling it to detect when an object, a crash barrier say, or a dog, is getting too close for comfort.
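As a rough illustration of that cocoon idea, here is a minimal sketch, with invented sensor names and thresholds rather than anything from Tesla's firmware, of the basic check: has any of the twelve short-range readings dropped below a comfortable distance?

```python
# Illustrative sketch of the ultrasonic "protective cocoon" check.
# Sensor names, readings, and the threshold are assumptions for this example.
ULTRASONIC_MAX_RANGE_M = 8.0   # approximate reach quoted above
COMFORT_THRESHOLD_M = 0.5      # assumed "too close" distance

def objects_too_close(readings_m: dict[str, float]) -> list[str]:
    """Return the sensors reporting an object inside the comfort threshold."""
    return [
        sensor for sensor, distance in readings_m.items()
        if distance <= COMFORT_THRESHOLD_M
    ]

if __name__ == "__main__":
    readings = {"front_left": 3.2, "rear_centre": 0.4, "front_right": 7.9}
    print(objects_too_close(readings))  # ['rear_centre'] -> warn the driver
```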

The Fusion of Technologies

Working together in concert, this imaginative fusion of conventional cameras, sophisticated radar, and 360-degree ultrasound helps Teslas stay finely attuned to their surroundings. Add that to the car’s ultra-precise GPS tracking and world-class mapping systems, and you have a vehicle that’s substantially smarter than most human beings at assessing where it is and what’s going on out there on the mean streets.

Of course, when it comes to safe motoring, sensory input is only part of the story. So how is all that lovely data organized and processed? With his characteristic knack for modest understatement, Elon Musk has described the new processor at the beating heart of his iconic car as the best chip in the world. Tesla’s so-called full self-driving chip, shipped in all new models, is a 260 square millimeter chunk of prime Samsung silicon, boasting no fewer than six billion transistors. Each chip (there are two aboard, again ensuring that all-important redundancy) is capable of performing some 26 trillion operations a second.
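To give a feel for how dual-chip redundancy can work in principle, here is a generic sketch, not Tesla's actual architecture, in which each chip independently computes a driving plan and the two outputs are cross-checked before anything is acted on.

```python
# Generic sketch of redundant computation: two independent "chips" each
# produce a plan, and the system only acts when the plans agree.
# This illustrates the redundancy principle, not Tesla's real software.
from dataclasses import dataclass

@dataclass(frozen=True)
class Plan:
    steering_deg: float
    target_speed_kph: float

def plans_agree(a: Plan, b: Plan, steer_tol: float = 0.5, speed_tol: float = 1.0) -> bool:
    """Treat two independently computed plans as consistent if they differ
    by less than small tolerances (tolerances are assumptions)."""
    return (abs(a.steering_deg - b.steering_deg) <= steer_tol
            and abs(a.target_speed_kph - b.target_speed_kph) <= speed_tol)

def arbitrate(plan_a: Plan, plan_b: Plan) -> Plan | None:
    """Return a plan to execute only when both chips agree; otherwise flag a fault."""
    if plans_agree(plan_a, plan_b):
        return plan_a
    return None  # disagreement -> fall back / alert rather than act

if __name__ == "__main__":
    print(arbitrate(Plan(2.0, 80.0), Plan(2.1, 80.5)))   # agreement -> execute
    print(arbitrate(Plan(2.0, 80.0), Plan(10.0, 80.5)))  # mismatch -> None
```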

Data Processing and Learning

This means it can respond in real time to any of the multifarious hazards brought to its attention by that smart sensory array we talked about just now. There’s more: alongside the humdrum pre-programmed aspects of Tesla’s driving algorithm (speed limits, stop signs, and the like), the car’s silicon brain also has the ability to learn, and learn it most certainly does: not only from its own native experience fed by those sensors but from data harvested across the entire global fleet of Teslas and their own sensor arrays.

Every single Tesla on the road (and the company manufactured half a million of them in 2020 alone) collects detailed information on its environment and feeds it back to HQ for other motorists to benefit from, whether they know it or not. Despite the system’s obvious brilliance, it has upset some owners, who are reportedly suspicious of Tesla’s so-called shadow mode. Their beef is that their pricey new model, when in shadow mode, essentially pretends to be an autonomous vehicle: it makes, but never executes, a driving plan based on the data available to it, and reports back to HQ whenever its plan deviates from what the real flesh-and-blood driver actually did.
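Here is a toy sketch of how a shadow-mode comparison might look in principle; the field names and thresholds are invented for illustration, since Tesla hasn't published its implementation. The planner's proposed action is compared with what the human actually did, and only meaningful disagreements are logged for upload.

```python
# Toy sketch of a "shadow mode" style comparison: compute a plan, never act
# on it, and log only the cases where the plan disagrees with the human driver.
# Field names and thresholds are assumptions made for this example.
from dataclasses import dataclass

@dataclass
class Action:
    steering_deg: float
    braking: bool

def deviates(planned: Action, driver: Action, steer_tol_deg: float = 5.0) -> bool:
    """True when the shadow plan meaningfully disagrees with the driver."""
    return (planned.braking != driver.braking
            or abs(planned.steering_deg - driver.steering_deg) > steer_tol_deg)

def shadow_step(planned: Action, driver: Action, log: list[dict]) -> None:
    """Never execute the plan; just record disagreements for later analysis."""
    if deviates(planned, driver):
        log.append({"planned": planned, "driver": driver})

if __name__ == "__main__":
    log: list[dict] = []
    shadow_step(Action(0.0, braking=True), Action(1.0, braking=False), log)
    print(len(log))  # 1 disagreement queued for upload
```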

The Privacy Debate and Current Status

This is, of course, designed to refine and improve the algorithm and fulfill Elon Musk’s dream of deploying autonomy at scale. Still, it bugs people from a privacy point of view, and perhaps they have a point. So how autonomous actually are the latest Teslas? They’re shipped with all the hardware Elon Musk reckons is necessary for achieving the self-driving dream, but for now, the furthest towards that ambition these cars actually get is Tesla’s so-called autopilot mode.

The newest, bells-and-whistles version of Autopilot, which Tesla drivers can order over the air as an optional extra for approximately eight thousand dollars, offers dynamic traffic-aware cruise control. Customers also get automatic lane changing for their spend, which can respond either to sat-nav route plans or to an impulsive manual flick of the turn signal. Owners also get the ultimate bragging-rights feature: Smart Summon. However, all of these come with caveats.

Smart Summon, which in theory enables Tesla drivers to flick a button on the app and summon their shiny motor from its parking space to wherever they’re proudly standing, is only recommended for use on private driveways. And despite that slick traffic-aware cruise control system, which maintains a set speed until the vehicle ahead slows or stops, drivers are still legally required to keep their hands on the wheel at basically all times in order to take over should anything go awry. The car will complain, and ultimately stop altogether, if hands aren’t kept on the wheel.
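That nagging behavior amounts to an escalation timer. Here is a rough sketch of the general idea, with timings and actions invented for illustration rather than taken from Tesla's actual parameters.

```python
# Rough sketch of a hands-on-wheel escalation policy: the longer steering-wheel
# torque goes undetected, the more insistent the system becomes.
# The thresholds and actions below are illustrative assumptions only.
def escalation_action(seconds_hands_off: float) -> str:
    if seconds_hands_off < 15:
        return "drive_normally"
    if seconds_hands_off < 30:
        return "visual_warning"         # flash a reminder on the display
    if seconds_hands_off < 45:
        return "audible_warning"        # chime until torque is detected
    return "slow_to_stop_with_hazards"  # assume the driver is unresponsive

if __name__ == "__main__":
    for t in (5, 20, 40, 60):
        print(t, escalation_action(t))
```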

The Automation Spectrum

Not exactly autonomous. Autonomy in vehicles is currently rated on an internationally recognized one-to-five scale. At the bottom sits so-called level one automation, in which a single aspect of driving is automated, for instance traffic-aware cruise control, which has been around for quite a few years now. At the other end of the scale, level five automation promises the ultimate fantasy of full vehicular autonomy, meaning the driver can catch up on emails, watch the scenery, or simply enjoy a well-earned nap.
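For reference, that scale broadly maps onto the SAE's driving-automation levels. Here is a minimal sketch of the taxonomy, with paraphrased one-line descriptions rather than the formal SAE wording.

```python
# Paraphrased summary of the widely used driving-automation levels.
# Descriptions are informal one-liners, not the formal SAE J3016 wording.
from enum import IntEnum

class AutomationLevel(IntEnum):
    DRIVER_ASSISTANCE = 1       # one task automated, e.g. adaptive cruise control
    PARTIAL_AUTOMATION = 2      # steering + speed automated, driver supervises
    CONDITIONAL_AUTOMATION = 3  # car drives itself in limited conditions, driver on standby
    HIGH_AUTOMATION = 4         # no driver needed within a defined operating domain
    FULL_AUTOMATION = 5         # no driver needed anywhere, in any conditions

if __name__ == "__main__":
    print(AutomationLevel.PARTIAL_AUTOMATION)  # where Tesla Autopilot sits today
```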

Right now, Tesla is reluctantly marooned at level two, with only a couple of aspects of the driving experience automated, most notably speed control and lane changing. Modern Teslas are technically quite capable of, say, taking an exit ramp and moving between highways, but legal and regulatory edicts insist that a person still be in charge at the wheel at all times. This is quite right and proper, by the way, for now at least.

The Challenges Ahead

Tragic failures can and do happen, like the death of 38-year-old Apple employee Walter Huang, whose Tesla, in autopilot mode, slammed into a concrete barrier while he was reportedly playing on his phone. In March 2019, 50-year-old Jeremy Beren Banner’s Model 3, traveling at 68 miles per hour, plowed into a tractor trailer attempting to cross a Florida highway, shearing the roof clean off his car and sadly ending his life.

So despite its undeniable sophistication and the bullish predictions of Elon Musk, autopilot isn’t quite ready yet. Supporters will point out, rightly perhaps, that self-driving Teslas are on the whole better than human drivers. Certainly, they never get drunk or tired or stressed. Still, each and every tragic mishap in even the most vaguely autonomous vehicle sets back progress by months, if not years. It’s probably fair to trust that Elon Musk’s brainchild, an ingenious mixture of smart sensors, lightning-fast AI, and crowd-sourced machine learning, will get there in the end.
