Experts: Self-driving technology is still immature, and standards for the human driver's role remain a blank

Following recent fatal accidents involving Uber and Tesla cars operating autonomously, US car safety and technology experts said that self-driving cars should be required to meet standards for detecting potential hazards, and that better methods are needed to keep their human drivers ready to take control.

Automakers and technology companies rely on human drivers to intervene in autonomous driving when necessary. But in the two recent incidents, which involved vehicles using different technologies, neither driver took any action before the crash.

Driverless cars rely on lidar, which uses laser pulses to detect road hazards, as well as sensors such as radar and cameras. However, there are no standards for these systems: each company uses a different combination of sensors, and some vehicles may have blind spots.

A driver who has music playing, for example, may find it difficult to hear anything else.

Autonomous driving expert and investor Evangelos Simoudis said that under such circumstances, humans cannot take over a vehicle as quickly as expected.

In last month's Uber crash, the ride-hailing company was testing a fully self-driving system, intended for eventual commercial use, when the prototype struck and killed a woman crossing a road in Arizona. Footage taken from inside the car shows the safety driver behind the wheel looking down rather than watching the road; just before the video stops, the driver looks up at the road and appears shocked.

The Tesla incident last month involved a car that any consumer can buy: a Model X was driving in semi-autonomous mode when it crashed. Tesla said the driver had received earlier warnings to put his hands on the steering wheel.

Semi-automated cars such as Teslas use various technologies to help drivers stay in their lane or keep a set distance from the vehicle ahead. These systems rely on alerts, such as beeps or a vibrating steering wheel, to draw the driver's attention.

'Immaturity of technology'

Missy Cummings, a professor of mechanical engineering at Duke University, said that the recent Uber and Tesla accidents showed that 'the technology they use is not mature enough'.

Tesla says its technology has been statistically shown to save lives through better driving. In a response to Reuters on Tuesday, Tesla said that whenever the driver turns on Autopilot, the driver remains responsible for maintaining control of the car and must be ready to respond to 'audible and visual cues'.

An Uber spokesperson said: 'Safety is our primary concern at every step.'

The consumer group Advocates for Highway and Auto Safety says that a bill on self-driving cars, currently stalled in the US Senate, is an opportunity to improve safety, quite different from the bill's original intent, which was to allow driverless cars onto public roads without human controls. The group has proposed amendments to the bill, known as AV START, to set standards for these vehicles, for example requiring a 'vision test' to check what a self-driving vehicle's different sensors can actually see.

The group believes the AV START bill should also cover semi-automated systems like Tesla's Autopilot, a lower level of automation than the proposed legislation currently addresses.

Other groups have also made proposals on self-driving cars, including requiring even semi-automated systems to meet performance targets, greater transparency and data sharing from vehicle manufacturers and operators, stronger regulatory oversight, and closer supervision and engagement by human drivers.

Others want to focus on the human drivers. In November, Consumer Reports magazine called on automakers to use responsible labeling that 'helps consumers fully understand' the autonomous capabilities of their vehicles.

Jack Fisher, the magazine's head of auto testing, said that human drivers are not good at supervising automation and that the technology cannot respond to all types of emergencies.

'It's like being a passenger in a car driven by a child,' he said.

MIT is running tests with semi-automated vehicles, including Tesla, Volvo, Jaguar Land Rover and General Motors models. The goal is to understand how drivers use semi-automated technology (some watch the road but take their hands off the wheel, others do not) and which warnings catch their attention.

Bryan Reimer, a research scientist at the Massachusetts Institute of Technology, said: 'We know very little about drivers using any of these systems in real situations.'

Timothy Carone, an expert on autonomous systems and a professor at the University of Notre Dame's Mendoza College of Business, said that proponents of autonomous technology must 'find the right balance so that the technology is tested properly, but not hindered or stopped'.

'Because in the long run, it will save lives,' he said.
