Tesla’s AI chief: Self-driving cars don’t need LiDAR
What is the technology stack you need to create fully autonomous vehicles? Companies and researchers are divided on the answer to that question. Approaches to autonomous driving range from just cameras and computer vision to a combination of computer vision and advanced sensors.
Tesla has been a vocal champion of the pure vision-based approach to autonomous driving, and at this year’s Conference on Computer Vision and Pattern Recognition (CVPR), its chief AI scientist Andrej Karpathy explained why.
Speaking at the CVPR 2021 Workshop on Autonomous Driving, Karpathy, who has led Tesla’s self-driving efforts for the past several years, detailed how the company is developing deep learning systems that need only video input to make sense of the car’s surroundings. He also explained why Tesla is in the best position to make vision-based self-driving cars a reality.
A general computer vision system
Deep neural networks are one of the main components of the self-driving technology stack. Neural networks analyze on-car camera feeds for roads, signs, cars, obstacles, and people.
But deep learning can also make mistakes in detecting objects in images. This is why most self-driving car companies, including Alphabet subsidiary Waymo, use lidar, a device that creates 3D maps of the car’s surroundings by emitting laser beams in all directions. Lidar provides additional information that can fill the gaps in the neural networks’ perception.
However, adding lidars to the self-driving stack comes with its own complications. “You have to pre-map the environment with the lidar, and then you have to create a high-definition map, and you have to insert all the lanes and how they connect and all the traffic lights,” Karpathy said. “And at test time, you are simply localizing to that map to drive around.”
It is extremely difficult to create a precise mapping of every location the self-driving car will be traveling. “It’s unscalable to collect, build, and maintain these high-definition lidar maps,” Karpathy said. “It would be extremely difficult to keep this infrastructure up to date.”
Tesla does not use lidars and high-definition maps in its self-driving stack. “Everything that happens, happens for the first time, in the car, based on the videos from the eight cameras that surround the car,” Karpathy said.
The self-driving technology must figure out where the lanes are, where the traffic lights are, what their status is, and which ones are relevant to the vehicle. And it must do all of this without any predefined information about the roads it is navigating.
Karpathy acknowledged that vision-based autonomous driving is technically more difficult because it requires neural networks that function incredibly well based on the video feeds only. “But once you actually get it to work, it’s a general vision system, and can principally be deployed anywhere on earth,” he said.
With a general vision system, you will no longer need any complementary gear on your car. And Tesla is already moving in this direction, Karpathy said. Previously, the company’s cars used a combination of radar and cameras for self-driving. But it has recently started shipping cars without radar.
“We deleted the radar and are driving on vision alone in these cars,” Karpathy said, adding that the reason is that Tesla’s deep learning system has reached the point where it is a hundred times better than the radar, and now the radar is starting to hold things back and is “starting to contribute noise.”
Supervised learning
The main argument against the pure computer vision approach is that it is uncertain whether neural networks can do range-finding and depth estimation without help from lidar depth maps.
“Obviously humans drive around with vision, so our neural net is able to process visual input to understand the depth and velocity of objects around us,” Karpathy said. “But the big question is can the synthetic neural networks do the same. And I think the answer to us internally, in the last few months that we’ve worked on this, is an unequivocal yes.”
Tesla’s engineers wanted to create a deep learning system that could perform object detection along with depth, velocity, and acceleration. They decided to treat the challenge as a supervised learning problem, in which a neural network learns to detect objects and their associated properties after training on annotated data.
To train their deep learning architecture, the Tesla team needed a massive dataset of millions of videos, carefully annotated with the objects they contain and their properties. Creating datasets for self-driving cars is especially tricky, and the engineers must make sure to include a diverse set of road settings and edge cases that don’t happen very often.
“When you have a large, clean, diverse dataset, and you train a large neural network on it, what I’ve seen in practice is… success is guaranteed,” Karpathy said.
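To make this concrete, here is a minimal, hypothetical sketch of such a multi-head supervised setup: a shared backbone with separate heads that predict an object’s class, bounding box, depth, and velocity from a camera frame. It is written in PyTorch purely as an illustration; the architecture, head sizes, and training data are placeholder assumptions, not Tesla’s actual network.

```python
# Illustrative only: a tiny multi-task detector trained with supervised labels.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiTaskDetector(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        # Small convolutional backbone standing in for a real feature extractor.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.cls_head = nn.Linear(64, num_classes)  # what the object is
        self.box_head = nn.Linear(64, 4)            # where it is (x, y, w, h)
        self.depth_head = nn.Linear(64, 1)          # how far away it is
        self.vel_head = nn.Linear(64, 2)            # how fast it is moving

    def forward(self, frames):
        feats = self.backbone(frames)
        return (self.cls_head(feats), self.box_head(feats),
                self.depth_head(feats), self.vel_head(feats))

# One supervised training step on a batch of annotated frames (dummy data here).
model = MultiTaskDetector()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

frames = torch.randn(8, 3, 128, 128)   # camera frames
labels = torch.randint(0, 10, (8,))    # annotated object classes
boxes = torch.rand(8, 4)               # annotated bounding boxes
depths = torch.rand(8, 1) * 80.0       # annotated depth in meters
vels = torch.randn(8, 2)               # annotated velocity

cls_out, box_out, depth_out, vel_out = model(frames)
loss = (F.cross_entropy(cls_out, labels)
        + F.l1_loss(box_out, boxes)
        + F.l1_loss(depth_out, depths)
        + F.l1_loss(vel_out, vels))
optimizer.zero_grad()
loss.backward()
optimizer.step()
```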
Auto-labeled dataset
With millions of camera-equipped cars sold across the world, Tesla is in a great position to collect the data required to train its vision-based deep learning models. The Tesla self-driving team accumulated 1.5 petabytes of data consisting of one million 10-second videos and 6 billion objects annotated with bounding boxes, depth, and velocity.
But labeling such a dataset is a great challenge. One approach is to have it annotated manually through data-labeling companies or online platforms such as Amazon Mechanical Turk. But this would require a massive manual effort, cost a fortune, and be a very slow process.
Instead, the Tesla team used an auto-labeling technique that involves a combination of neural networks, radar data, and human review. Since the dataset is annotated offline, the neural networks can run the videos back and forth, compare their predictions with the ground truth, and adjust their parameters. This contrasts with test-time inference, where everything happens in real time and the deep learning models have no such recourse.
Offline labeling also enabled the engineers to apply very powerful and compute-intensive object detection networks that can’t be deployed on cars and used in real-time, low-latency applications. And they used radar sensor data to further verify the neural network’s inferences. All of this improved the precision of the labeling network.
“If you’re offline, you have the benefit of hindsight, so you can do a much better job of calmly fusing [different sensor data],” Karpathy said. “And in addition, you can involve humans, and they can do cleaning, verification, editing, and so on.”
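As a toy illustration of that hindsight advantage (a numpy sketch under simplified assumptions, not Tesla’s actual labeling pipeline), the snippet below compares a causal filter that can only average past frames with an offline smoother that also looks at future frames when cleaning up a noisy per-frame depth estimate.

```python
# Illustrative only: offline labeling can look at future frames; online inference cannot.
import numpy as np

rng = np.random.default_rng(0)
true_depth = np.linspace(40.0, 10.0, 100)              # object approaching the car
noisy_depth = true_depth + rng.normal(0.0, 2.0, 100)   # per-frame network estimates

def online_estimate(measurements, window=9):
    # At test time only past frames exist: average the last `window` values.
    out = np.empty_like(measurements)
    for t in range(len(measurements)):
        out[t] = measurements[max(0, t - window + 1): t + 1].mean()
    return out

def offline_estimate(measurements, window=9):
    # Offline labeling can run the video back and forth: average a window
    # centered on each frame, using future frames too.
    half = window // 2
    out = np.empty_like(measurements)
    for t in range(len(measurements)):
        out[t] = measurements[max(0, t - half): t + half + 1].mean()
    return out

# The centered (offline) window typically tracks the moving object with less lag.
online_err = np.abs(online_estimate(noisy_depth) - true_depth).mean()
offline_err = np.abs(offline_estimate(noisy_depth) - true_depth).mean()
print(f"online mean error:  {online_err:.2f} m")
print(f"offline mean error: {offline_err:.2f} m")
```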
According to videos Karpathy showed at CVPR, the object detection network remains consistent through debris, dust, and snow clouds.
EVs overtake diesels as most popular lease cars in the UK
Demand for EVs in the UK’s leasing sector has for the first time surpassed that for diesel-powered cars, data from Leasing.com reveals.
The site’s data shows that demand for electrified vehicles – battery electric vehicles (BEVs), plug-in hybrids (PHEVs), and hybrid cars – has been growing steadily since the beginning of the year, putting EVs ahead of diesel in the second quarter of 2021.
BEV demand rose by 27% and hybrids by 57%, with PHEVs seeing the biggest increase at 78%. By contrast, petrol vehicles saw only 3% growth, and diesel vehicles dropped by 13%.
Leasing.com managing director Dave Timmis noted: “We’ve been planning for the UK market to shift towards electric drivetrains, but the rate at which they’ve overtaken diesel has been surprising.”
Timmis attributes this to EVs’ improved accessibility and cost-efficiency. There are now more affordable models on the market, like the Hyundai Ioniq and the Vauxhall Corsa E, compared to previous years, when relatively expensive models, such as the Tesla Model S, dominated, he explained.
This reflects an optimistic trend in the UK’s auto market: drivers are becoming more open to shifting away from diesel vehicles and switching to electrically powered cars.
According to the latest registration figures from the Society of Motor Manufacturers and Traders (SMMT), diesel car registrations have dropped by 18.6% year to date. Diesel vehicles now hold an 18% market share, while electrified vehicles combined have reached 21.7%.
Overall, it’s notable that diesel cars, which are usually the preferred leasing option due to their better fuel economy, no longer hold first place. EVs look like the next big player, which suggests that drivers are open to the new tech, especially when it comes without a long-term commitment.
HT – AM Online
100 solar EVs will boost Utrecht’s vehicle-to-grid charging system
Utrecht, the Netherlands’ fourth-largest city, is on its way to becoming the first region in the world with a bidirectional ecosystem. How? By using vehicle-to-grid charging (V2G), combined with car sharing.
Bidirectional V2G charging is an efficient and cheap way to store and use renewable energy. The idea is simple: electricity flows both ways, from the grid to the electric vehicle and back again.
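As a rough sketch of that two-way flow (illustrative numbers and logic only, not We Drive Solar’s actual control system), a single car’s battery could be scheduled like this:

```python
# Illustrative only: charge the car when renewables are plentiful, feed the grid when they are not.
BATTERY_KWH = 54.0   # usable capacity of one car (matches the Sion's quoted figure)
CHARGER_KW = 11.0    # maximum power in either direction

def v2g_step(soc_kwh, solar_kw, demand_kw):
    """One hour-long step. Returns (new state of charge, battery power in kW:
    positive = discharging to the grid, negative = charging from the surplus)."""
    surplus = solar_kw - demand_kw
    if surplus >= 0:
        # Charge from the surplus, capped by charger power and remaining capacity.
        charge = min(surplus, CHARGER_KW, BATTERY_KWH - soc_kwh)
        return soc_kwh + charge, -charge
    # Discharge to cover the shortfall, capped by charger power and charge left.
    discharge = min(-surplus, CHARGER_KW, soc_kwh)
    return soc_kwh - discharge, discharge

soc = 20.0  # kWh currently in the battery
for hour, (solar_kw, demand_kw) in enumerate([(8.0, 3.0), (6.0, 4.0), (0.0, 5.0), (0.0, 7.0)]):
    soc, flow_kw = v2g_step(soc, solar_kw, demand_kw)
    direction = "to the grid" if flow_kw > 0 else "into the battery"
    print(f"hour {hour}: battery at {soc:.1f} kWh, {abs(flow_kw):.1f} kW {direction}")
```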
Utrecht has already partnered with We Drive Solar and Hyundai, and will now get a fleet of solar electric vehicles (SEVs) made by Sono Motors.
Under the deal, the Munich-based startup will provide the city with 100 Sion solar cars to support the city’s bidirectional goals.
Thanks to their V2G capabilities, the Sion SEVs will plug into Utrecht’s 500 bidirectional chargers and feed energy back to the grid from their 54kWh batteries. Specifically, each solar car’s battery can deliver up to 11kW to other electric vehicles, or to homes and the grid, via its bidirectional on-board charger.
Combined, the 100 Sions will be able to provide 1.1 megawatts of peak power to the city of Utrecht. According to Sono Motors, this is equivalent to the output of a large photovoltaic power plant about the size of two football fields.
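That headline number follows directly from the per-car figures quoted above; a quick back-of-the-envelope check:

```python
# 100 cars, each able to feed up to 11 kW back through its onboard charger.
cars = 100
power_per_car_kw = 11
fleet_peak_kw = cars * power_per_car_kw
print(f"{fleet_peak_kw} kW = {fleet_peak_kw / 1000} MW peak")  # 1100 kW = 1.1 MW
```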
With this capacity, the solar car fleet will help reduce grid instabilities and the likelihood of blackouts.
“This is the perfect project for Sono Motors to further our vision of a world free from fossil fuels as it is a clear demonstration that electric vehicles can support the transition of the energy sector as a whole,” stated Jona Christians, CEO and co-founder of Sono Motors.
The solar Sions will join We Drive Solar’s car sharing system, which, by early 2022, will have up to 150 bidirectional shared Hyundai Ioniq 5s.