Advancements in electrification have pushed OEMs to rearchitect their cars from the ground up into software-defined vehicles. But autonomous driving presents a separate challenge and opportunity. While full autonomy remains several years out, automotive companies have started transforming their business models from hardware to software by selling services and features over the life of the vehicle. The key question remains: what technology will become the standard?
THE SOFTWARE OPPORTUNITY
The automotive industry is large, with roughly 100 million cars sold per year and an installed base of over 1 billion vehicles on the road. The industry has historically been characterized by fierce competition and limited differentiation. Software and autonomy are relatively new concepts, with less than 10% of vehicles sold today equipped with Level 2 autonomy or greater. But this is rapidly changing, and in the next decade we expect that most vehicles sold each year will be electric and software-defined, supporting L2+ or higher capability. Level 2 autonomy requires an attentive driver at all times; Level 3 allows the driver to take their hands off the wheel but requires them to remain alert and ready to take over; Level 4 is a fully autonomous vehicle.
Over the past year, it has become evident that fully autonomous driving is a much larger challenge than previously expected, despite several executives promising full autonomy just 1+ years out. We believe that full autonomy will require years of model training and iteration before it is ready for mass-market production.
WHO WILL STAND TO BENEFIT?
Automotive OEMs have diverged when it comes to developing software and autonomous systems. On one side is Tesla, which develops everything in-house; on the other is everybody else, relying on an ecosystem of providers.
While both approaches have merits, it’s been evident over the past year that the companies that rely on an ecosystem of providers have been able to innovate at a faster pace and reach or surpass Tesla when it comes to autonomy.
An autonomous system generally has three components: vehicle hardware, vehicle software, and an off-vehicle data center or supercomputer used to train the underlying models. The data can come from multiple sources such as cameras, radar, and LiDAR. While in the past it was considered cost-prohibitive to use multiple sources, especially LiDAR, recent Chinese product introductions (e.g. the NIO ET7 and XPeng P5) are all equipped with multiple cameras, radars, and LiDAR. Tesla’s autonomous driving system, by contrast, relies solely on cameras.
OVERVIEW OF TESLA’S AUTONOMOUS TECHNOLOGIES
While Tesla is generally considered to have the first-mover advantage, which served the company well in electric vehicles (EVs), autonomous driving is a greater challenge, and the first-mover position may prove to be a disadvantage. Being a first mover can lock a company into a specific technology that made sense at one point in time but leaves it unable to adapt to new technologies developed by the ecosystem.
In 2021, Tesla introduced its system on a chip (D1) and the development of its supercomputer, the Dojo. Its current autonomous platform consists of the following:
- Hardware: D1 SoC
- Software: Full Self-Driving (FSD), the equivalent of L2+ autonomy despite its name
- Data Center/Supercomputer: Dojo
Tesla’s edge is that the company collects a vast amount of data from its fleet in real time. The company has developed an advanced neural network running on ~10K GPUs across three data centers, using several differentiated methodologies, such as making predictions in 3D vector space rather than the 2D images most competitors use, and fusing camera sensor data before detection. In summary, when it comes to camera-based perception, Tesla is far ahead of its competitors.
Interestingly, while Tesla has been iterating on its camera-only approach, Chinese competitors have announced entry-level models fully equipped with multiple cameras, radars, and even LiDAR, which was previously considered cost-prohibitive. We believe that these platforms, the majority of which are built on Nvidia hardware and software, pose a significant threat to Tesla’s camera-only strategy.
OVERVIEW OF COMPETING TECHNOLOGIES – NVIDIA SETTING THE STANDARD
Most of the innovation over the past year is coming from companies that rely on an ecosystem of infrastructure providers. There are many providers, but Nvidia has emerged as a standard among most major automakers. The company is a leader in AI, which positions it well to build an end-to-end Autonomous Vehicles (AV) platform. Nvidia offers state-of-the-art SoC hardware, software, and products that serve as the backbone for model training. Its current autonomous platform consists of the following:
- Hardware (SoC): DRIVE Orin
- Hardware + Software: DRIVE Hyperion
- Data Center/Supercomputer: Eos
On the hardware side, Nvidia’s DRIVE Orin is currently used by roughly 25 of the 30 major vehicle manufacturers, including all major Chinese EV producers (NIO, Li Auto, XPeng, BYD, etc.).
On the software side, Nvidia introduced Hyperion, an end-to-end platform connecting cameras, sensors, radars, and LiDAR. Mercedes and Jaguar were the early adopters, with model introductions coming in 2024. Hyperion is open and can accelerate AV time to market by giving manufacturers the ability to leverage Nvidia’s own development work, along with ongoing upgrades.
Lastly, while DRIVE Orin serves as the brain and Hyperion as the nervous system inside the car, model training occurs outside of the car and is yet another opportunity for Nvidia. Most manufacturers are building their own data centers using Nvidia’s hardware (e.g. NIO is using Nvidia’s HGX with eight A100 Tensor Core GPUs). In the future, Nvidia will be able to enhance its customers’ capabilities with the Eos supercomputer, which the company plans to leverage for model training. Eos is expected to be over 4x faster than the world’s fastest supercomputer and 4x faster than Nvidia’s current Selene supercomputer.
In summary, fully autonomous driving is a complex problem and may require many companies to solve multiple challenges. We believe that the pace of innovation and the magnitude of the total investment required favor an open platform approach.
Disclosures: SPEAR Invest has an investment in Nvidia (NVDA), NIO Inc (NIO), and XPeng (XPEV). Views expressed here are for informational purposes only and are not investment recommendations. For more information visit our website spear-invest.com.
Ivana Delevska founded Spear in 2021 after spending 14 years investing in industrial technology companies. Ivana spent four years covering Multi-Industry companies, at Deutsche Bank as a Vice President (2017-2018) and at Gordon Haskett as a Director (2018-2021). Prior to that, she spent 10 years as a Senior Analyst working for long/short hedge fund platforms Tiger Management, Millennium Management, Citadel Asset Management, and Davidson Kempner. Ivana started her career at JP Morgan in the Mergers and Acquisitions Group. She graduated from the University of Chicago in 2006 with a BA in Economics.