
Why Don’t We Have Self-Driving Cars Yet?

More companies than ever are trying to bring self-driving cars to the masses. Yet a truly autonomous vehicle still doesn’t exist. And it’s not clear if, or when, our driverless future will arrive. Proponents like Elon Musk have touted aggressive timelines but missed their goals, and others in the industry have also missed projections. Well, our goal is to deploy these vehicles in 2019. So you’ll have the option to not drive. It’s not happening in 2020. It’s happening today.

We wanted to check in. Where exactly are we with self-driving cars? And when can we expect them to be part of our daily lives? The current state of driverless cars is very interesting because we’ve passed what people refer to as peak hype and we’ve entered what’s called the trough of disillusionment. Which is, even people within the industry are saying, gee, it turns out it’s a lot harder than we thought. We’re definitely not anywhere near as far along as a lot of people thought we would be three years ago. But I think over the last 18 to 24 months, there’s been a real injection of reality.

There was a sense maybe a year or two ago that our algorithms are so good, we’re ready to launch, we’re gonna launch driverless cars any minute. And then obviously there’s been these setbacks of people getting killed or accidents happening, and now we’re a lot more cautious. Several big players have begun to walk back their predictions on how soon we could see this technology. Even Waymo’s chief external officer admitted that the hype around its self-driving cars has become unmanageable. The technology has come a long way, but there’s still a lot of work to be done.

There’s the perception, which is, using the sensors to figure out what’s around the vehicle, in the environment around the vehicle. Prediction, figuring out what those road users are going to be doing next in the next few seconds. Turns out the perception and especially prediction are really, really hard problems to solve. Companies tackling self-driving today are taking two general approaches. Some are building a self-driving car from the ground up. Others are developing the brains that drive the car. An early leader was Google, which started its self-driving car project in 2009. Known as Waymo today, the company is developing hardware and software that can function as the brains in a self-driving car. Aurora is taking a similar approach.
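The perception-then-prediction split described above can be illustrated with a toy sketch. This is not any company’s actual stack; the class, function names, and the constant-velocity forecast are all simplifying assumptions for illustration.

```python
from dataclasses import dataclass

# Hypothetical, heavily simplified pipeline: perception turns sensor data into
# tracked road users; prediction forecasts where each one will be next.
# Real systems fuse lidar/camera/radar and use learned models instead.

@dataclass
class TrackedObject:
    kind: str        # e.g. "pedestrian", "cyclist", "car"
    position: tuple  # (x, y) in meters, relative to the ego vehicle
    velocity: tuple  # (vx, vy) in meters/second

def perceive(sensor_frame):
    """Perception: convert a (mocked) sensor frame into tracked objects."""
    return [TrackedObject(o["kind"], o["pos"], o["vel"]) for o in sensor_frame]

def predict(obj, horizon_s=3.0):
    """Prediction: naive constant-velocity forecast a few seconds ahead."""
    x, y = obj.position
    vx, vy = obj.velocity
    return (x + vx * horizon_s, y + vy * horizon_s)

frame = [{"kind": "cyclist", "pos": (10.0, 2.0), "vel": (-1.0, 0.0)}]
for obj in perceive(frame):
    print(obj.kind, "expected near", predict(obj))
    # prints: cyclist expected near (7.0, 2.0)
```

Even this toy version hints at why prediction is the hard part: a cyclist rarely holds constant velocity, which is why real systems replace that one-line forecast with learned behavior models.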

Founded in 2017 by early players from Uber, Tesla and Google’s self-driving initiatives, it’s already raised $620 million in funding from Amazon and other big-name investors. Aurora is testing vehicles on the road in Pittsburgh, Pennsylvania and out here in the Bay Area. We don’t yet let the public in our cars. Our cars are on the road, we have two of our test operators in there. The technology we’re building can operate from a compact electric car, to a minivan, to even a big, long-haul truck. Argo AI and Aptiv are examples of other companies taking a similar approach. Lyft is developing its own self-driving systems now too and offering self-driving rides on its app through partnerships in select areas. Self-driving is too big for just one company and one effort.

And if you look at our strategy, that is why we’re working with partners on the open platform, Aptiv and Waymo, and why we’re building the tech here. Companies like Tesla, Zoox and GM, with its Cruise division, are making their own vehicles, aiming for self-driving cars that can operate in all environments. This is the engineering challenge of our generation. We’ve raised seven and a quarter billion dollars of capital. We have deep integration with both General Motors and Honda, which we think is central when you’re building mission-critical safety systems and building those in a way that you can deploy them at very large scale. Cruise, which was acquired by General Motors in 2016, has been testing its fleet of vehicles in San Francisco with safety drivers on board.

To give you a sense for the magnitude of the difference between suburban driving and what we’re doing every day on the streets of San Francisco: our cars on average see more activity in one minute of San Francisco driving than they see in one hour of driving in Arizona. Zoox, led by the former chief strategy officer at Intel, is working on creating an all-in-one self-driving taxi system with plans to launch in 2020. Instead of retrofitting cars with sensors and computers and saying, hey, here’s a self-driving car, we think there’s an opportunity to create a new type of vehicle that from the very beginning was designed to move people around autonomously. Nissan and Tesla both have semi-autonomous systems on the roads today. Tesla’s has been available in beta on its vehicles since 2015 and drivers have been known to use the current system hands-free. Tesla’s promising full self-driving software is just around the corner.

It’s going to be tight, but it still does appear that we’ll be at least in limited, in early access release, of a feature-complete full self-driving feature this year. I think Tesla is actually a lot further back than they would like the world to believe they are because they are, in fact, so much more limited in terms of their hardware. Others are making self-driving shuttles that operate along designated routes only or focusing on trucks with long-haul highway routes.

And then there are companies like Ghost and Comma.ai working on aftermarket kits: essentially hardware that could be installed in older cars to bring them new self-driving capabilities one day. For all players in this space, the path ahead is filled with challenges. Chief among them, proving the technology is safe. Driverless systems have to meet a very high safety bar that has to be better than a human before they’re deployed at scale. There are no federally established standards or testing protocols for automated driving systems in the U.S. today, but there have been fatal crashes.

A woman named Elaine Herzberg was killed by an autonomous Uber with a safety driver who was paying no attention. This woman was crossing the street, walking her bicycle, should easily have been seen by the autonomous vehicle, was not, was run over. Nobody stepped on the brakes. In 2016, a Tesla fan named Joshua Brown died in a crash while using Autopilot hands-free in Florida. Other Autopilot-involved accidents are now under investigation.

Still, the industry is hopeful that autonomous vehicles will make the roads far safer than they are today. Really, the kind of zero-to-one moment for the industry will be when we can remove those safety drivers safely and the vehicle can operate without the presence of any human. Others, like Elon Musk, have said it’s almost irresponsible not to have these vehicles out there because they are safer and will be safer than human drivers.

Even if we could say that an autonomous vehicle was better than a human driver, it doesn’t mean that an autonomous vehicle is better than a human driver plus all of the advanced driver-assist systems we have. When looking at when the tech could actually be ready, one of the principal metrics touted by companies is the number of miles driven, but not all miles are created equal when testing automated systems. You could take an autonomous vehicle and go, put it on an oval track or just a straight road, and you could drive 100 million miles. But that’s not really gonna tell you much about how well the system actually functions because it’s not encountering the kinds of things that are actually challenging in a driving environment.

Testing self-driving vehicles out on public roads isn’t enough. They need to be exposed to every imaginable scenario, so companies rely on simulation. We can create situations that we’re basically never going to see or very rarely see. So, for example, we might want to simulate what happens as a bicycle comes through an intersection, runs a red light and crashes into the side of our car. Turns out that doesn’t happen very often in the real world, but we want to know that if that happens, our vehicles are going to do something safe. Basically allow the car to practice up in the cloud instead of on the road.
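The scenario-based simulation idea above can be sketched in a few lines: script the rare event (a cyclist running a red light), randomize its parameters, and run it many times to check that the planner always responds safely. The planner rule, the parameter ranges, and the function names here are invented for illustration, not drawn from any real simulator.

```python
import random

# Hypothetical sketch of scenario-based simulation: replay a rare event
# thousands of times with randomized parameters and count unsafe outcomes.

def brake_planner(gap_m, closing_speed_mps):
    """Toy planner: brake hard if time-to-collision drops below 2 seconds."""
    if closing_speed_mps <= 0:
        return "cruise"
    ttc = gap_m / closing_speed_mps
    return "hard_brake" if ttc < 2.0 else "cruise"

def run_red_light_scenario(rng):
    """One simulated episode: a cyclist crosses against the light.

    Returns True if the episode ends in a collision (cyclist reaches the
    car within 1 second while the planner failed to brake).
    """
    speed = rng.uniform(3.0, 8.0)   # cyclist closing speed, m/s
    gap = rng.uniform(5.0, 30.0)    # initial gap to the cyclist, meters
    action = brake_planner(gap, speed)
    return action != "hard_brake" and gap / speed < 1.0

rng = random.Random(0)  # seeded, so every run replays the same episodes
crashes = sum(run_red_light_scenario(rng) for _ in range(10_000))
print(f"collisions in 10,000 simulated episodes: {crashes}")
```

The point is the workflow, not the toy physics: because the event is scripted, the simulator can generate ten thousand red-light-running cyclists overnight, something that might take years to encounter on real roads.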

When you’re testing autonomous vehicles out on public roads, not only are the people riding in that car part of the experiment, but so is everybody else around you. And they didn’t consent to being part of an experiment. I remain concerned that humans will be used as test dummies. Instead of self-certification and deregulation, I want to see strong independent safety regulations from the agencies in front of us today. The self-certification approach did not work out well for the Boeing 737 Max 8, and now Boeing is paying the price. We should heed that lesson when it comes to finding out the best way to deploy autonomous vehicles. Lawmakers held hearings this month to figure out how to keep the public safe without holding back self-driving innovation. In September, the National Highway Traffic Safety Administration released new federal guidelines for automated driving systems.

But they’re only voluntary suggestions at this point. State legislation is farther along. As of October, 41 states have either enacted laws or signed executive orders regulating autonomous vehicles. With regulatory questions looming, it’s no surprise that self-driving companies are proceeding cautiously at first. What we’re going to be seeing in the next several years is more limited deployments in very specific areas where there’s confidence that the technology can work.

I think we’ll see limited deployments of self-driving vehicles in the next five years or so. You’ll see these moving goods and you’ll see them moving people, but you’ll see them specifically in fleet applications. Aurora says its systems could be integrated into any vehicle, from fleets of taxis to long-haul trucks. The cost of self-driving technology is another deciding factor for how it will be deployed. Most consumers are never going to own a vehicle that’s really autonomous because the technology is expensive and there’s a whole raft of issues around product liability and making sure that it’s properly maintained and sensors are calibrated.

That’s one reason ride-hailing companies Lyft and Uber are getting in the game. We have two autonomous initiatives. One is the open platform where we’re connecting Lyft passengers with our partner self-driving vehicles. And so this is Aptiv in Las Vegas and Waymo in Chandler, Arizona. And then also kind of the product experience for the tech that you see here, which is Level 5. As AV companies inch toward the mainstream, public perception and understanding of the tech have become another issue that could impact progress. Some in particular in the industry have done a disservice to the public in overhyping the technology before it’s really ready. It’s still not very clear to most people what we mean when we say driverless car. Waymo and General Motors’ Cruise Automation are very close to having what they refer to as Level 5 cars most of the time.

In other words, again, they can in theory function all by themselves. But so far, it seems that they function like a 15-year-old driver hoping to get a driver’s license. There’s a lot of people who think that you can buy autonomous vehicles today, especially when you can go out and buy a car, buy an option that’s called full self-driving and pay for that. You expect that it actually exists. And the fact is, it does not exist today. With an uncertain timeline and a history of missed targets, public confusion is no surprise.

Despite big developments, most companies have recognized we are still years away from having truly self-driving cars as part of our daily lives. One big question is, when is the car ready? You have to have a good sense of all of the scenarios and all of the situations that the vehicle will need to encounter. And that just takes time. We expect level four vehicles to be feasible in small quantities within the next five years. And what that means is you’ll probably see hundreds or maybe thousands of vehicles out either delivering packages or moving people through neighborhoods or maybe hauling goods on our freeways.

And now, even the experts hesitate to make promises on when true self-driving will get here. You always have to assume that the user is going to find a way to misuse the technology. Assume the worst and then design for that. I think it’s a mistake to be over-promoting the technology, over-hyping it when it’s still very much a work in progress.

This is something we need to do with society, with the community, and not at society. And we take that very seriously. We’re building mission-critical safety systems that are going to have a huge positive impact on people’s lives. And the tech adage of move fast and break things most assuredly does not apply to what we’re doing here.
