This article was produced by NetEase Smart Studio. The author is Professor Amnon Shashua, Senior Vice President of Intel Corporation and CEO and Chief Technology Officer of Mobileye, an Intel company.
The public expects self-driving cars to be held to a far stricter standard than human drivers. Just last week, Ms. Elaine Herzberg was struck and killed by an Uber vehicle operating in autonomous mode in Arizona, USA. In the wake of this tragedy, it is time to reflect on what sensing and decision-making mean for safety.
First, one of the challenges we face today is interpreting sensor data. As the video released by the police suggests, detecting and classifying objects, although the most basic building block of an autonomous driving system, is in fact a very challenging task. Yet this capability is at the heart of today's Advanced Driver Assistance Systems (ADAS), which include features such as automatic emergency braking (AEB) and lane keeping. Billions of miles of driving have verified that the high-precision sensing systems in ADAS are saving lives. Likewise, this technology is an essential element of fully autonomous driving, even before the greater challenges ahead are overcome.
To demonstrate the capabilities and subtleties of current ADAS technology, we ran Mobileye's software on a video of a TV monitor playing the police-released clip of the accident. Despite the poor conditions, much of the high dynamic range data of the original scene was likely lost, Mobileye's software still clearly detected Ms. Herzberg about one second before impact. The images below show three snapshots with the detected bounding boxes around the bicycle and Ms. Herzberg. The detections come from two independently operating sources: a pattern-recognition module (which generates the bounding boxes) and a 'free space' detection module (which produces a horizon map, where the red portions above the red line indicate an obstacle). A third module, using structure from motion (the technical term is 'plane + parallax'), distinguishes roads from objects. It verifies that the detected object is three-dimensional, but with low reliability, and is therefore labeled 'fcvValid: Low' at the top left of the screen. The low confidence arises because information normally available in production vehicles is missing here and the image quality is poor: the footage was, after all, re-filmed off a screen playing a dashcam recording, possibly with some unknown downsampling.
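As a rough illustration of how two such independent sources can corroborate each other, the sketch below checks whether a detector's bounding box overlaps the obstacle region of a free-space map. All names, data structures, and thresholds here are hypothetical simplifications for illustration, not Mobileye's implementation.

```python
def corroborate(bbox, free_space_row, min_overlap=0.5):
    """Cross-check a detection against an independent free-space estimate.

    bbox: (x_left, x_right) column span of a detected object's base.
    free_space_row: list of booleans, one per image column, where False
        means the free-space module flagged an obstacle at that column.
    Returns True if enough of the box's base lies on obstacle columns.
    """
    x0, x1 = bbox
    columns = free_space_row[x0:x1]
    if not columns:
        return False
    blocked = sum(1 for is_free in columns if not is_free)
    return blocked / len(columns) >= min_overlap

# Hypothetical example: free-space module flags columns 10..19 as obstacle.
row = [True] * 30
for i in range(10, 20):
    row[i] = False

print(corroborate((8, 18), row))  # True: 8 of 10 base columns are blocked
```

The point of the cross-check is that the two signals fail in different ways, so agreement between them is far stronger evidence than either alone.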
The images are taken from the video clip released by the police, played on a TV monitor. The overlays show the response of the Mobileye® ADAS system. The green and white bounding boxes are the outputs of the bicycle and pedestrian detection modules. The horizon map shows the boundary between the road and obstacles, which we call 'free space'.
The software used in this experiment is the same as that running in current ADAS-equipped vehicles, and it has been validated over billions of user-driven miles.
Now, the rise of artificial intelligence techniques such as deep neural networks has led many to believe that high-precision object detection systems can be developed easily, and that the decade-plus of experience held by computer vision experts has been devalued. This perception has drawn many newcomers into the field. While these new techniques are indeed useful, much of the established discipline cannot be ignored: identifying and covering hundreds of corner cases, annotating datasets spanning tens of millions of miles, and completing demanding pre-production validation tests across dozens of ADAS programs. Experience matters, especially in safety-critical fields.
The second observation concerns transparency. Everyone says 'we put safety first', but we believe that earning public trust requires more transparency than that. As I announced last October when Mobileye published the Responsibility-Sensitive Safety (RSS) model, decisions must conform to the common sense of human judgment. We have mathematically formalized common-sense notions such as 'dangerous situation' and 'proper response', and built a system that is mathematically guaranteed to comply with those definitions.
The third observation is redundancy. A truly safe sensing system needs a redundant design built on independent sources of information: cameras, radar, and lidar. Fusing these sources improves driving comfort, but it is not what safety requires. To demonstrate genuine redundancy, Mobileye has developed an independent, camera-only end-to-end system alongside a separate system built only on lidar and radar.
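The distinction between fusing sensors for comfort and duplicating them for safety can be sketched with a toy calculation. If each of two complete, standalone perception systems can independently trigger a safe response, their miss probabilities multiply; a single low-level-fused stack gets no such multiplication. The numbers below are assumptions for illustration, not measured figures.

```python
def joint_miss_probability(per_system_miss_rates):
    """Probability that every independent system misses the same object.

    Valid only if the systems share no common failure mode, which is
    exactly why each must be a complete, standalone perception stack
    rather than one branch of a low-level fusion pipeline.
    """
    p = 1.0
    for rate in per_system_miss_rates:
        p *= rate
    return p

# Two independent stacks, each at an assumed 1-in-1000 miss rate:
both_miss = joint_miss_probability([1e-3, 1e-3])   # on the order of 1e-6

# A single fused stack keeps the miss rate of one system:
fused_miss = 1e-3
```

This is why the camera-only system and the lidar/radar-only system are built end to end in isolation: independence is the property that makes the multiplication legitimate.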
If accidents like last week's happen again, users' already fragile trust will erode further, and the result may be reactive regulation that ultimately stifles this important work. As I said when introducing the Responsibility-Sensitive Safety model, I firmly believe the time has come for a meaningful discussion about a safety validation framework for fully autonomous vehicles. We invite automakers, technology companies in this field, regulators, and other interested parties to work together to resolve these important issues.