Huawei's 'Da Vinci Project' Exposed: Introducing AI into All Products, Developing New Chips

According to a report by the foreign outlet The Information, Huawei has internally formulated a 'Da Vinci Plan', which Huawei executives also refer to as the 'D Plan'. The plan is led by Xu Zhijun, Huawei's deputy chairman, who also chairs Huawei's IC design subsidiary HiSilicon.

The main contents are as follows: first, introduce AI into all Huawei products and services, including telecom base stations, cloud data centers, smartphones, and surveillance cameras; second, develop new AI chips for data centers so that applications such as speech recognition and image recognition can run in the cloud.

The Information speculates that, first, Huawei's AI chips may pose a threat to Nvidia, and second, Huawei's AI cloud ambitions may aggravate US government concerns.

Huawei's AI layout

Huawei's AI layout is actually nothing new.

Lei Feng.com has reported that at the Huawei DigiX2018 Global Partner and Developer Conference held on June 22, Yu Chengdong, CEO of Huawei's consumer business, elaborated on the business's new strategy: an all-scenario smart life ecosystem strategy.

Yu Chengdong emphasized that Huawei's all-scenario smart life ecosystem is an open one. Across this ecosystem, the Huawei HiAI artificial intelligence open platform will provide technology at three levels (chip, device, and cloud) and empower global partners.

On the chip side, Huawei will deliver a smooth operating experience through the HiAI mobile computing platform, audio-visual tools, game assistants, and so on. On the device side, Huawei provides intelligent sensing and interaction capabilities through smart hardware, the HiAI platform, AR/VR and other devices, building fully connected services and all-scenario applications. On the cloud side, Huawei Mobile Services (HMS), AppGallery and other services meet consumers' personalized needs.

In September 2017, Huawei released the Kirin 970, which it billed as the world's first mobile phone AI chip, adding a neural network processing unit (NPU) whose technology comes from Cambricon. Compared with the CPU, it delivers roughly 25 times the computing speed and 50 times the energy efficiency on AI workloads. The Huawei Mate 10, Mate 10 Pro, and Honor V10 are all equipped with the Kirin 970.

At present, Huawei still relies on Nvidia's GPUs to add AI capabilities to its servers.

Chinese chips enter the cloud AI chip market 'as a group'

Domestic competition is also intensifying. This year, Baidu and Cambricon have both entered the cloud AI chip market.

In May of this year, Cambricon released its first cloud AI chip, the Cambricon MLU100.

The MLU100 is understood to use Cambricon's latest MLUv01 architecture and TSMC's 16nm process, and can run in a balanced mode (1 GHz clock) or a high-performance mode (1.3 GHz clock). Its equivalent theoretical peak rate is 128 trillion fixed-point operations per second in balanced mode and 166.4 trillion fixed-point operations per second in high-performance mode, while typical board-level power consumption is 80 watts and peak power consumption does not exceed 110 watts.
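
As a quick sanity check, the two quoted peak figures are consistent with throughput scaling linearly with clock frequency. The short Python sketch below uses only the numbers quoted above to make the arithmetic explicit.

```python
# Sanity check: the MLU100's quoted peak figures are consistent with
# throughput scaling linearly with clock frequency.
balanced_tops = 128.0        # trillion fixed-point ops/s in balanced mode (1.0 GHz)
balanced_clock_ghz = 1.0
high_perf_clock_ghz = 1.3

predicted_high_perf_tops = balanced_tops * (high_perf_clock_ghz / balanced_clock_ghz)
print(predicted_high_perf_tops)  # 166.4, matching the quoted high-performance figure
```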

On July 4, at the 2018 Baidu AI Developer Conference, Baidu chairman and CEO Li Yanhong unveiled Baidu's self-developed AI chip 'Kunlun', which includes the training chip Kunlun 818-300 and the inference chip Kunlun 818-100.

Baidu claims that this is China's first cloud-scale, fully functional AI chip and, so far, the highest-performance AI chip in the industry. On paper, Kunlun uses Samsung's 14nm process and delivers 260 TOPS of performance with 512 GB/s of memory bandwidth; it consumes over 100 watts and consists of tens of thousands of small cores.
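
For a rough sense of how these announcements compare, the sketch below computes TOPS per watt from the figures quoted above. This is only indicative: the two vendors may count different operation types and precisions, and the Kunlun power figure ('over 100 watts') is approximate.

```python
# Rough energy-efficiency comparison using only the figures quoted above.
# Caveat: vendors may count different operation types/precisions, and the
# Kunlun power figure ("over 100 W") is approximate, so this is indicative only.
chips = {
    "Cambricon MLU100 (high-performance mode)": {"tops": 166.4, "watts": 110.0},  # peak power bound
    "Baidu Kunlun": {"tops": 260.0, "watts": 100.0},  # "over 100 watts"
}

for name, spec in chips.items():
    print(f"{name}: ~{spec['tops'] / spec['watts']:.2f} TOPS per watt")
```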

The cloud AI chip market landscape

In the current GPU market, Nvidia holds a share of around 70 percent and is the undisputed leader.

In 2016, Nvidia invested billions of dollars and deployed thousands of engineers to launch Pascal, its first GPU architecture optimized for deep learning. In 2017, it introduced a new GPU architecture, Volta, with five times the performance of Pascal, along with the inference accelerator TensorRT 3. As a programmable inference accelerator, TensorRT can accelerate both existing and future network architectures.

Lei Feng.com has published a detailed analysis of the AI chip market structure (see: 'Today's AI chip industry, which will be the head players' world'). At present, the cloud AI chip market is controlled by the traditional giants, in both hardware and software.

The figure below is the global AI chip ranking released by Compass Intelligence. Because AI chips are still rarely deployed in end devices, the top of the ranking can be taken as an approximation of the current market structure for cloud AI chips.

We can see that chip giant Nvidia firmly occupies the top of the AI chip ranking. Thanks to the popularity of the CUDA development platform, Nvidia's GPUs are the most widely used general-purpose AI hardware computing platform.

Apart from the few companies in the world with strong self-developed chips, anyone doing AI-related work essentially has to use Nvidia's chips. Nvidia's chips are everywhere, and virtually all major AI software frameworks now support CUDA acceleration, including Google's TensorFlow, Facebook's Caffe2, and the Amazon-backed MXNet.
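
To make concrete what 'supporting CUDA acceleration' means in practice, here is a minimal sketch using TensorFlow. It assumes a modern TensorFlow 2.x build with GPU support installed; the frameworks of the article's era exposed the same idea through slightly different APIs.

```python
import tensorflow as tf  # assumes a TensorFlow build with CUDA/GPU support installed

# List the CUDA-capable GPUs the framework can see.
gpus = tf.config.list_physical_devices("GPU")
print("Visible GPUs:", gpus)

# Place a small matrix multiply on the GPU if one is available, otherwise fall
# back to the CPU; on the GPU the framework dispatches to CUDA kernels.
device = "/GPU:0" if gpus else "/CPU:0"
with tf.device(device):
    a = tf.random.normal([1024, 1024])
    b = tf.random.normal([1024, 1024])
    c = tf.matmul(a, b)
print("Ran matmul on", device, "result shape:", c.shape)
```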

The focus of industry debate now is which processor architecture is best for AI chips: GPUs, FPGAs, DSPs, ASICs, or even more cutting-edge neuromorphic chips. GPUs currently dominate, but each of the other architectures has its own advantages.

Intel is placing bets on multiple fronts so as not to miss out on any processor architecture. Google's heavy investment in the TPU (which is, in essence, an ASIC) has brought a significant increase in hardware performance and appears to pose the biggest threat to the GPU, not only because of the efficiency advantages of a dedicated architecture but also because of the cost advantages of Google's business model.

In the cloud, the international giants have become the de facto ecosystem leaders, because cloud computing is a battlefield of giants and the major open-source AI frameworks are all released by these companies. In such an entrenched ecosystem, it will not be easy for Huawei to grab a share, or for domestic cloud AI chips to break into the top five, let alone challenge Nvidia.
