The public expects self-driving cars to be held to a more stringent standard than human drivers. Just last week, Ms. Elaine Herzberg was struck and killed by an Uber vehicle operating in autonomous mode in Arizona. In the wake of this tragedy, it is time to reflect on what sensing and decision-making mean for safety.
The first challenge we face is interpreting sensor information. The video released by the police suggests that detecting and classifying objects, the most basic building block of an autonomous driving system, is in fact a very challenging task. Yet this capability lies at the heart of today's Advanced Driver Assistance Systems (ADAS), which include features such as automatic emergency braking (AEB) and lane keeping. Billions of miles of driving have been enough to show that the high-accuracy sensing systems in ADAS are saving lives. By the same token, this technology is an essential building block for fully autonomous driving in the future, once even greater challenges are overcome.
To demonstrate the capabilities and subtleties of current ADAS technology, we ran Mobileye's software on a TV monitor playing the clip of the accident released by the police. The conditions were far from ideal: much of the high dynamic range of the original scene had been lost. Even so, Mobileye's software produced a clear detection roughly one second before impact. The images below show three snapshots with the bounding box detected around the bicycle and Ms. Herzberg. The detection comes from two independently operating sources: a pattern-recognition module, which generates the bounding box, and a 'free space' detection module, which produces the horizontal contour, with the red section indicating an obstacle rising above the line. A third module uses structure from motion (technically, 'plane + parallax') to distinguish road from objects, verifying that the detected object is three-dimensional; here its confidence was low, so it is labeled 'fcvValid: Low' at the top left of the screen. The low confidence stems from the fact that information normally available on a production vehicle is missing here and the image quality is poor; after all, these are images of a dashcam recording re-filmed from a monitor, possibly with unknown downsampling along the way.
The images, shown on a TV monitor, are taken from the video clips released by the police. The overlay shows the response of the Mobileye ADAS system. The green and white bounding boxes are the outputs of the bicycle and pedestrian detection modules. The horizontal contour marks the boundary between the road and obstacles, which we call 'free space'.
The software used in this experiment is the same as that running in current ADAS-equipped vehicles, validated over billions of miles of customer driving.
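The cross-validation between independent detection sources described above can be sketched in a few lines. This is a hypothetical illustration of the idea, not Mobileye's code: the names `Detection` and `confirmed_by_free_space` and the toy contour are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """A bounding box from a pattern-recognition module
    (image coordinates, y grows downward)."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float
    confidence: float

def confirmed_by_free_space(det, free_space_y_at):
    """Cross-check a bounding box against an independent free-space
    boundary: an obstacle standing on the road should interrupt the
    drivable area, i.e. the bottom of its box should reach down to
    (or past) the free-space contour at the same image column."""
    center_x = (det.x_min + det.x_max) / 2.0
    boundary_y = free_space_y_at(center_x)
    return det.y_max >= boundary_y

# A detection whose box bottom reaches the free-space contour is
# confirmed by the second, independent source.
bike = Detection(x_min=100, y_min=40, x_max=150, y_max=220, confidence=0.8)
flat_contour = lambda x: 210.0  # toy contour: drivable space ends at row 210
print(confirmed_by_free_space(bike, flat_contour))  # prints True
```

The point of requiring agreement between sources computed by different algorithms is that a false positive in one module is unlikely to be reproduced by the other.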
Today, advances in artificial intelligence techniques such as deep neural networks have led many to believe that high-accuracy object detection systems can be developed with ease, and that the value of computer vision experts with more than a decade of experience has greatly diminished. This has drawn many newcomers into the field. While these new techniques are indeed useful, a great deal of traditional craft cannot be ignored: identifying and covering hundreds of edge cases in testing, annotating datasets of tens of millions of images, and passing challenging pre-production validation tests across dozens of ADAS programs. Experience is critical, especially in domains where safety comes first.
The second observation from the incident concerns transparency. Everyone says 'we put safety first', but we believe that earning public trust requires more transparency than that. As I announced last October with Mobileye's Responsibility-Sensitive Safety (RSS) model, decisions must conform to the common sense of human judgment. We have given common-sense notions such as 'dangerous situation' and 'proper response' a mathematical formulation, and built a system that is mathematically guaranteed to comply with these definitions.
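As a concrete illustration of such a formalization, the publicly described RSS model defines a minimum safe longitudinal following distance from worst-case assumptions about both cars. The sketch below follows that published formula; the parameter values (response time, acceleration bounds) are illustrative assumptions, not Mobileye's calibrations.

```python
def rss_safe_longitudinal_distance(v_rear, v_front, rho=1.0,
                                   a_max_accel=3.0, a_min_brake=4.0,
                                   a_max_brake=8.0):
    """Minimum safe following distance (m) per the published RSS model.

    v_rear, v_front: speeds in m/s; rho: response time in s;
    accelerations in m/s^2 (values here are illustrative only).
    Worst case: during the response time the rear car accelerates at
    a_max_accel, then brakes at only a_min_brake, while the front car
    brakes as hard as a_max_brake.
    """
    v_rear_after = v_rear + rho * a_max_accel
    d = (v_rear * rho
         + 0.5 * a_max_accel * rho ** 2
         + v_rear_after ** 2 / (2 * a_min_brake)
         - v_front ** 2 / (2 * a_max_brake))
    return max(d, 0.0)

# Example: both cars travelling at 20 m/s (72 km/h)
print(rss_safe_longitudinal_distance(20.0, 20.0))  # prints 62.625
```

A distance below this bound is, by definition, a 'dangerous situation', and the 'proper response' is the braking profile assumed in the worst case above; this is what makes compliance mathematically checkable.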
The third observation is redundancy. A truly safe sensing system needs a redundant design built on independent sources of information: cameras, radar, and lidar. Fusing these sources together improves driving comfort, but it does not, by itself, improve safety. To demonstrate true redundancy, Mobileye has developed a standalone, camera-only end-to-end system alongside a separate system based only on lidar and radar.
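The safety argument for true redundancy is probabilistic: if two fully independent sensing systems each miss an event with some small probability, a joint miss requires both to fail at once. A back-of-the-envelope sketch, where the failure rates are made-up numbers for illustration:

```python
def joint_miss_probability(p_camera_system, p_lidar_radar_system):
    """Probability that BOTH independent end-to-end sensing systems
    miss the same object, assuming statistically independent failures."""
    return p_camera_system * p_lidar_radar_system

# If each standalone system misses 1 event in 10,000, the redundant
# pair misses roughly 1 in 100,000,000 -- a squared improvement that
# fusing raw sensor data into a single system does not provide.
p = joint_miss_probability(1e-4, 1e-4)
print(f"{p:.1e}")  # prints 1.0e-08
```

The independence assumption is of course the hard part in practice, which is why each system must be engineered end to end on its own sensors rather than sharing a fused input.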
If something like last week's accident happens again, the already fragile trust of the public will erode further, possibly triggering reactive regulation that ultimately stifles this important work. As I said when introducing the Responsibility-Sensitive Safety model, I firmly believe the time has come for a meaningful discussion of a safety validation framework for fully autonomous vehicles. We invite automakers, technology companies in this field, regulators, and other interested parties to work together to resolve these important issues.