Autonomous driving is regarded by the industry as the biggest challenge in next-generation vehicle development. At present, each carmaker has its own approach to autonomous driving scenarios, and these approaches are judged on environmental sensing capability, core computing power, network communication, and the vehicle's ability to interact with its environment and make decisions autonomously.
Figure | Tesla's Autopilot
To handle the computation required for driving in complex traffic environments, the choice of chip platform is critical for carmakers. There are already many ADAS solutions that can handle L2-level computation, but at L3 and above the options are truly limited.
Because laws and regulations are still being worked out, vehicles on the road may operate at no more than L3, and the driver must be ready to intervene at any time. In other words, autonomous driving so far only supplements the human driver rather than truly replacing them, but L3 is also the level of autonomy most likely to reach the mainstream market within the next two years.
However, although L3 is only one level above L2, the computational requirements differ greatly. Both involve partial automatic control of the vehicle, but L3 allows autonomous driving under limited conditions: the system must interpret most of the lanes and traffic around the vehicle and, at the same time, make a wide range of driving decisions in place of the driver, and that decision-making process requires enormous computing power behind it.
Figure | Mobileye
Up to L2, most solutions focused on visual computing, and the market moved in that direction. Mobileye, for example, mainly supplied vision-based solutions before its acquisition by Intel and captured most of that market. But as the auto industry moves toward higher levels of autonomy, Mobileye's lack of a decision-making solution has become the biggest bottleneck limiting its future development. In fact, this, rather than the several well-publicized accidents, is the real reason Tesla decided to abandon Mobileye.
Considering that L4 or even L5 fully autonomous vehicles may not be able to get on the road by 2020, or even by 2025 in some research institutes' estimates, due to regulatory and development constraints, L3 autonomous driving will be the target market for carmakers, IC design companies, and solution providers over the next few years. DT Jun believes the upcoming CES 2018 will be an important occasion to observe these companies' strategies.
According to reports, Intel, Qualcomm, NVIDIA, and TI will all announce the latest plans for their autonomous driving programs at CES 2018, with the focus placed on L3, the level that will arrive first.
NVIDIA will likely continue to push self-driving development through its emphasis on scalable design
One of the reasons Tesla later chose NVIDIA and abandoned Mobileye was NVIDIA's aggressive entry into the autonomous driving market, which put enormous pressure on Mobileye.
NVIDIA can be said to be the first company in the industry to propose a "decision-making" solution for autonomous driving. What the decision-making relies on, however, is not the GPU but the CPU. From the earliest Drive CX to the latest Xavier platform, NVIDIA has used ARM-architecture cores: the Drive PX2 uses six processor cores, two of which are NVIDIA's deeply customized Denver cores and four of which are standard Cortex-A57 cores, while Xavier uses eight deeply customized ARM cores.
With these cores, whose computing power NVIDIA has greatly strengthened, the driving software can turn the visual data processed on the GPU into decisions within a very short time.
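The division of labor described above can be sketched roughly as a two-stage pipeline: perception work that would map onto the GPU, followed by decision logic that would run on the ARM cores. The function names and the contents of each stage below are illustrative assumptions, not NVIDIA's actual software stack.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DetectedObject:
    label: str         # e.g. "vehicle", "pedestrian", "lane_marking"
    distance_m: float  # estimated distance ahead, in meters

def perceive(frame) -> List[DetectedObject]:
    """Perception stage: in a real system this is the GPU-heavy part
    (neural-network inference over camera frames). Stubbed out here."""
    return [DetectedObject("vehicle", 35.0), DetectedObject("lane_marking", 0.0)]

def decide(objects: List[DetectedObject], speed_kmh: float) -> str:
    """Decision stage: the planning logic that, per the article, runs on the
    ARM CPU cores. A toy following-distance rule stands in for it."""
    vehicles = [o for o in objects if o.label == "vehicle"]
    if vehicles and min(v.distance_m for v in vehicles) < speed_kmh / 2:
        return "brake"
    return "keep_lane"

# One tick of the pipeline: camera frame -> perception -> decision
action = decide(perceive(frame=None), speed_kmh=90.0)
print(action)  # "brake", because 35 m is closer than the 45 m threshold at 90 km/h
```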
Xavier is the autonomous driving platform NVIDIA announced at CES 2017. It not only significantly increases computing power, but with a better semiconductor process also keeps power consumption below 30 W, a substantial improvement over the 250 W of the previous-generation Drive PX2. Although NVIDIA may not lead on power consumption, Xavier is the most mature L4-capable platform available on the market, with competing vendors still stuck at L2 advanced ADAS. For a carmaker that wants to develop products at L4 and above and quickly put a testable program on the road, NVIDIA can be said to be the only option.
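A rough back-of-envelope on the power figures above illustrates the efficiency jump. The TOPS numbers are the commonly quoted announcement figures for the two platforms and should be treated as assumptions here rather than measured values.

```python
# Commonly quoted deep-learning inference figures (assumptions, not measurements):
# Drive PX2 (AutoChauffeur): ~24 TOPS at ~250 W
# Xavier (as announced at CES 2017): ~30 TOPS at ~30 W
platforms = {
    "Drive PX2": {"tops": 24, "watts": 250},
    "Xavier":    {"tops": 30, "watts": 30},
}

for name, p in platforms.items():
    print(f"{name}: {p['tops'] / p['watts']:.2f} TOPS per watt")

gain = (30 / 30) / (24 / 250)
print(f"Efficiency gain of Xavier over Drive PX2: ~{gain:.0f}x")  # roughly 10x
```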
Figure | NVIDIA Xavier
As for Pegasus, NVIDIA's platform for fully autonomous L5 driving: although its headline 500 W power consumption frightened many in the industry, DT Jun believes NVIDIA introduced it not for mass production but mainly for pre-development work. After all, the world is still at an early stage in formulating autonomous driving regulations, and the industry generally believes that L5 autonomous vehicles will have a hard time getting on the road before 2025.
Since mainstream autonomous driving over the next few years will be based on L3, is there any point in introducing the L4-capable Xavier platform?
In fact, the peripheral sensing hardware for autonomous driving is already quite mature. The crux of the matter lies in the legal liability questions around regulation and decision making, and in the development of the high-resolution map data that autonomous driving requires. With current hardware technology it is not particularly difficult to design a road-worthy L5 self-driving car; it is the software environment, the key part, that still has much room to improve.
In other words, a carmaker adopting NVIDIA's Xavier platform can launch a car whose hardware fully meets L4 requirements, and then upgrade the software step by step as regulations and the software environment mature, raising its autonomous driving capability from L3 to L4 and eventually even L5.
And because carmakers want their development platforms to offer strong consistency and long-term support, automotive semiconductor solutions are generally supported for at least five years and as long as ten. Although NVIDIA's solution is currently more expensive, a carmaker that designs an L4-or-above vehicle on it for production starting in 2018 only needs to keep upgrading on the same platform in the future. With a complete software environment and a high-end, scalable product positioning, not only can the high cost of the solution itself be effectively diluted, but the carmaker's market strategy and long-term R&D will also benefit.
This is the wishful thinking behind NVIDIA's move: while other vendors are still unable to ship an L3 solution, it is already staking out the L5 market. Given the auto industry's emphasis on consistency and stability, the likelihood of a carmaker switching platforms midway is quite low, which helps secure NVIDIA's share and margins in the future autonomous driving market. Of course, the current L5 solution's power draw is far too large to put straight into a production car, and the demands it places on vehicle power management would be hard to meet, but with advances in semiconductor process and chip design this problem should be solved in time.
Therefore, DT Jun believes that NVIDIA's autonomous driving hardware is already relatively mature, so CES 2018 may not bring much new information on the technical side; the focus should instead be on peripherals and progress on partnerships.
Mobileye will leverage its ADAS market advantage and Intel's resources to consolidate its autonomous driving program
There is not much news from Mobileye on autonomous driving solutions above L3; after all, its relevant products will not be available until 2018. But its ADAS products at L2 and below can be said to firmly serve as the eyes of the overwhelming majority of mainstream cars: more than 70% of ADAS solutions on the market today come from Mobileye.
Figure | Mobileye EyeQ3
Looking at the EyeQ3, the solution Mobileye currently sells: Mobileye positions it as an autonomous driving solution, and it was adopted in Tesla's first-generation Autopilot system, but in reality its decision-making performance is weak, as the architecture makes clear. It uses four MIPS 34K cores, a design introduced back in 2006, clocked at only about 500 MHz. Its decision-making performance reaches only about one percent of the Drive PX2's, barely enough for L2 autonomous driving; anything higher has to wait for the new EyeQ4.
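The "about one percent" figure is roughly consistent with the raw throughput numbers commonly quoted for the two chips; the figures below are assumptions taken from public marketing material, not measurements.

```python
# Commonly quoted throughput figures (assumptions):
# EyeQ3:     ~0.256 TOPS
# Drive PX2: ~24 TOPS (deep-learning inference)
eyeq3_tops = 0.256
drive_px2_tops = 24.0

ratio = eyeq3_tops / drive_px2_tops
print(f"EyeQ3 throughput is roughly {ratio:.1%} of Drive PX2")  # ~1.1%
```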
However, because Mobileye has long worked the ADAS market, its cooperation with vision-sensor partners is quite mature. A carmaker adopting Mobileye's solution basically does not need to worry about the visual recognition technology required for autonomous driving failing to fit: most components are available from Mobileye, and even with the forthcoming EyeQ4, re-integration and verification time can be kept to a minimum.
On the other hand, pure ADAS vehicles, that is, those at L2 and below, will remain the absolute mainstream of the market over the next few years, and higher-end L3 vehicles will be hard-pressed to match them in shipments. Most L2 vehicles in the coming years will continue to use Mobileye's EyeQ3, mainly because of its low cost and proven reliability in the market.
Figure | Mobileye EyeQ4 architecture
Of course, to secure the autonomous driving market above L3, Mobileye will officially launch the L3-oriented EyeQ4 solution in 2018. Its basic architecture is similar to the EyeQ3's, with both the MIPS CPU cores and the vector acceleration units moved to newer, better versions. Overall computational efficiency improves by nearly 10 times over the EyeQ3, while power consumption rises by only about 0.5 W. In other words, low power consumption remains the EyeQ4's most emphasized and most obvious advantage in applications, but even so its overall computing power still lags far behind NVIDIA's solutions.
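Taking the figures above (a roughly 10x compute improvement for only about 0.5 W more power) together with a commonly quoted EyeQ3 power budget of about 2.5 W (an assumption here, not stated in the article), the efficiency gain works out as follows.

```python
# Assumed baseline (commonly quoted, not confirmed in the article): EyeQ3 ~2.5 W
eyeq3 = {"relative_compute": 1.0, "watts": 2.5}
# From the article: ~10x the compute of EyeQ3, power up by only ~0.5 W
eyeq4 = {"relative_compute": 10.0, "watts": eyeq3["watts"] + 0.5}

gain = (eyeq4["relative_compute"] / eyeq4["watts"]) / \
       (eyeq3["relative_compute"] / eyeq3["watts"])
print(f"EyeQ4 perf/W improvement over EyeQ3: ~{gain:.1f}x")  # ~8.3x
```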
While the chip design itself has been finalized and there is not much room to change its computing power, Intel has complementary baseband, CPU, and FPGA solutions for computation and network connectivity. These could improve overall efficiency through an external, system-level approach, which means the EyeQ4 may rely on Intel's assistance to support higher levels of autonomous driving without significantly increasing system power consumption, and thereby blunt NVIDIA's attempt to dominate the L3-and-above market.
However, it is still unclear how exactly Intel will help the EyeQ4 expand into the market for autonomous vehicles above L3; this is believed to be the most crucial piece of Intel's autonomous driving strategy to watch for at CES 2018.
Qualcomm is aiming to enter the market with vehicle networking technology, and with NXP's BlueBox platform for autonomous driving
As everyone knows, Qualcomm has always led the industry in connectivity chips and has been the most active in developing future networking technology. Although recent shifts in customer relationships and disputes over patent fees have cast a shadow over the company's future, Qualcomm still holds the technical advantage. In autonomous driving, the visual recognition technology it acquired with NXP is also well established: carmakers and self-driving companies including Baidu and FAW have designed solutions based on NXP's technology.
However, a pure autonomous driving solution is not what Qualcomm is currently focused on; its interest lies in vehicle networking. Qualcomm hopes to accelerate the adoption of V2V (vehicle-to-vehicle) and V2X (vehicle-to-everything), so that whatever solution future self-driving cars use, their networking will ideally run on Qualcomm technology.
Qualcomm believes that autonomous driving based purely on on-board artificial intelligence has its limits: sensing changes in the surrounding environment through sensors only supports decisions about a small local area and cannot improve the efficiency of the entire transport network. Autonomy in the car alone is only half the solution; it cannot fundamentally relieve the bottlenecks of the overall transport system.
In addition, a car's self-driving software is written with human logic, so wherever people make mistakes, the autopilot may make similar ones. The visual pipeline also has to recognize lane markings and signs designed for human eyes rather than optimized for machines, which in practice makes an autonomous driving system a very heavy computational burden.
Therefore, by starting from the infrastructure, a vehicle can pull road and signal conditions matched to its location directly from the network, and its driving system can weigh that cloud-sourced information together with the near-field sensor data the vehicle collects itself. This allows an autonomous vehicle to choose better routes and make driving decisions over longer horizons, optimizing overall traffic efficiency more effectively than relying solely on the car's own driving intelligence.
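As a rough illustration of combining cloud and on-board information, the sketch below merges a hypothetical cloud traffic report with local sensor readings before choosing a route. All the structures and thresholds are assumptions made for illustration; this is only the shape of the fusion Qualcomm describes, not any real V2X API.

```python
from dataclasses import dataclass

@dataclass
class CloudReport:
    route: str
    congestion: float      # 0.0 (free-flowing) .. 1.0 (jammed), from V2X infrastructure
    roadwork_ahead: bool

@dataclass
class LocalSensing:
    obstacle_within_50m: bool  # from the car's own near-field sensors

def choose_route(reports: list[CloudReport], local: LocalSensing) -> str:
    """Pick the least congested route from cloud data, then let local sensing
    override with an immediate safety action if needed."""
    if local.obstacle_within_50m:
        return "slow_down_and_reassess"   # near-field data always wins
    usable = [r for r in reports if not r.roadwork_ahead]
    best = min(usable or reports, key=lambda r: r.congestion)
    return f"take_{best.route}"

reports = [CloudReport("route_a", 0.8, False), CloudReport("route_b", 0.2, False)]
print(choose_route(reports, LocalSensing(obstacle_within_50m=False)))  # take_route_b
```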
Of course, to realize the networked driving environment Qualcomm envisions, the infrastructure must be quite complete before its traffic information becomes meaningful, and below the L3 level of autonomy there is basically no need for V2X.
Since vehicle networking infrastructure cannot be rolled out in the short term, and the self-driving solution inherited from NXP has yet to show practical results, does Qualcomm simply have no role to play in autonomous driving?
Qualcomm has always emphasized long-term positioning. Although the major self-driving players are focused on local, on-board smart driving hardware and software, making Qualcomm look a little behind, these solutions will eventually require 100% connectivity. Looking at the networking options available today, Intel, MediaTek, Spreadtrum, and Qualcomm are the candidates: Intel will likely stick to its own platforms, MediaTek's and Spreadtrum's technology is relatively behind, and while Huawei and Apple may also develop related solutions, those are mainly for their own use. The most mature networking solution, then, is none other than Qualcomm's.
Of course, Qualcomm will also strengthen the ecosystem of the BlueBox platform it gained from NXP and use the Snapdragon platform's artificial intelligence processing to improve decision-making performance, while emphasizing a vehicle-network-centric, cloud-plus-terminal big-data computing capability as its competitive advantage. The goal is not just to make the car itself intelligent, but to make the city's traffic intelligent; I believe this is the real calculation in Qualcomm's mind as it faces future autonomous driving trends.