Barbara De Salvo, chief scientist at CEA-Leti, pointed out that since the dawn of Artificial Intelligence (AI) in the mid-1950s, the field has seen many twists and turns, even though scientists and engineers have remained interested in the concept all along.
Now, however, it is time to look at technologies inspired by the human brain. "At the very least, we should recognize that there are other technology paradigms that can help us ask the right questions," De Salvo said in an interview with EE Times after delivering a keynote at this year's International Solid-State Circuits Conference (ISSCC). "Without new paradigms and algorithms, the industry will find it increasingly hard to meet ever-tighter power requirements."
While the industry has been pushing embedded AI platforms such as Movidius' Myriad 2, Mobileye's EyeQ5, and Nvidia's Xavier to meet the demand for edge devices that perform heavy analysis at low power, she pointed out: "We are a long way from what we need to achieve."
During her keynote, De Salvo compared the computational efficiency (GOPS/W) and raw performance (GOPS) of several AI chips.
She noted that a big gap remains between the industry's needs and existing solutions: no chip, whether commercially released, a prototype, or a research design from academia, can operate below 100 μW. Yet that is exactly what edge devices must achieve, because they have to run for years on energy harvesting or miniature batteries.
Why return to human brain research?
De Salvo noted that the human brain accounts for about 2% of body weight, yet consumes about 20% of the body's energy.
The human brain delivers roughly 10^11 GOPS at 20 W. De Salvo emphasized that, to date, no processor in the world comes close to the brain's combination of performance and power.
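The gap those figures imply can be made concrete with a back-of-the-envelope calculation. The brain numbers below are from De Salvo's keynote; the accelerator numbers are purely illustrative placeholders, not measurements of any real device:

```python
# Energy-efficiency comparison: human brain vs. a hypothetical AI accelerator.
# Brain figures (10^11 GOPS at 20 W) are from De Salvo's keynote;
# the chip figures are illustrative assumptions only.

BRAIN_GOPS = 1e11       # ~10^11 giga-operations per second
BRAIN_WATTS = 20.0      # ~20 W metabolic power

brain_efficiency = BRAIN_GOPS / BRAIN_WATTS        # GOPS per watt

# Hypothetical embedded accelerator: 2,000 GOPS at a 10 W budget.
chip_efficiency = 2000.0 / 10.0                    # GOPS per watt

print(f"Brain: {brain_efficiency:.1e} GOPS/W")
print(f"Chip:  {chip_efficiency:.1e} GOPS/W")
print(f"Gap:   ~{brain_efficiency / chip_efficiency:.1e}x")
```

Even granting the accelerator generous numbers, the brain remains millions of times more efficient, which is the gap De Salvo argues new paradigms must close.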
De Salvo explained that this level of energy efficiency is the product of a long course of evolution, which maximized brain function while minimizing energy use.
That is where the semiconductor industry can borrow from biology, she said. Traditional computing architectures struggle to meet power requirements mainly because "energy is consumed whenever the processor and memory communicate." In the brain, by contrast, synapses combine memory and computation in a single structure; this ingenious arrangement, she explained, provides the foundation for brain-inspired non-von Neumann computer architectures.
The essence of brain-inspired operation lies in mechanisms such as spike coding and spike-timing-dependent plasticity (STDP). Regarding how the state of a neuron is encoded, she noted that neurons used to be represented as analog or digital values; the most recent trend in neuromorphic computing, however, is to encode a neuron's value as a pulse, or spike. She explained: "Neurons have no clock and are purely event-driven."
The scientific community believes that spike coding and STDP hold promise. If the input/output signals are represented as spikes, the multiplication between an input signal and a synaptic weight reduces to simple gating at the synapse level. The goal is to cut power consumption by adopting spike- or event-based signaling in heterogeneous solutions.
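The two ideas above can be sketched in a few lines of code. This is a minimal illustration under assumed parameters (all function names and constants here are illustrative, not from any specific neuromorphic platform): with binary spikes, multiply-accumulate degenerates into gated addition, and a pair-based STDP rule adjusts a weight according to the relative timing of pre- and post-synaptic spikes.

```python
# Sketch of spike-gated accumulation and a pair-based STDP rule.
# All names and constants are illustrative assumptions.
import math

def gated_accumulate(spikes, weights):
    """With binary spikes, input * weight reduces to gating:
    the weight is added when a spike (1) arrives, skipped otherwise."""
    return sum(w for s, w in zip(spikes, weights) if s == 1)

def stdp_update(w, dt, a_plus=0.05, a_minus=0.055, tau=20.0):
    """dt = t_post - t_pre in ms. If the pre-synaptic spike precedes
    the post-synaptic one (dt > 0), the synapse is strengthened;
    otherwise it is weakened. Magnitude decays with |dt|."""
    if dt > 0:
        return w + a_plus * math.exp(-dt / tau)    # potentiation
    return w - a_minus * math.exp(dt / tau)        # depression

spikes = [1, 0, 1, 1]
weights = [0.2, 0.9, 0.1, 0.4]
print(gated_accumulate(spikes, weights))   # sums 0.2 + 0.1 + 0.4 (no multiplies)
print(stdp_update(0.5, dt=5.0))            # pre before post: weight grows
print(stdp_update(0.5, dt=-5.0))           # post before pre: weight shrinks
```

Note that `gated_accumulate` performs no multiplications at all, which is precisely the energy-saving property De Salvo describes: in spiking hardware the expensive multiply is replaced by a conditional add.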
Is this similar to Chronocam's event-driven image processing technology? She replied: "Yes, but Chronocam's solution targets the 'vision' of an artificial retina; our mission as an industry is to extend neuromorphic principles beyond vision to the entire computing domain."
De Salvo cited IBM's TrueNorth as a prime example of a neuromorphic CMOS chip. Another example is DynapSEL, a large-scale multi-core dynamic neuromorphic processor built in 28nm FD-SOI; DynapSEL is currently being developed under the NeuRAM3 European collaborative research program.
While TrueNorth processes neurons digitally, DynapSEL is analog. Neither chip, however, delivers the full power of a neuromorphic system, because in both cases the memory itself does not operate neuromorphically.
The scientific community is trying to move memory closer to the processing unit, overturning the traditional memory hierarchy by implementing "in-memory computing." So far, neuromorphic hardware has not reached its full potential, De Salvo said: "We need ultra-high-density 3D architectures in neuromorphic computing to achieve maximum connectivity and reconfigurability between neurons and synapses."
Insect brain
We now know that there are about 100 billion neurons in the human brain, while smart insects such as bees have about 950,000 neurons in a brain of roughly 1 mm³. De Salvo pointed out that among all insects, bees have the largest brains. She said: "We have fully mapped the neurons in the bee's brain, yet we still know little about the neurons in the human brain."
Scientists are keen to profile bee brains because "bees are very smart." Bees have many sensors, navigate by their own means, and know how to communicate within a group. To live together in a colony, a bee must know the functions of the other bees.
"Of course, the bee's brain is small, but its simplified form provides a good model for an AI system," De Salvo said. "It emphasizes the need to think at the system level, because the organism does not separate sensing from signal processing."
The latest progress in AI?
Ten years ago, no one could imagine the breakthrough that we are seeing now in AI.
However, according to De Salvo, the industry may be shifting its focus to AI applications too quickly because of AI's commercial promise. In fact, she said: "We still lack a full understanding of AI systems ... such as deep learning."
She explained that the hidden layers in deep learning, and the system complexity they produce, make it hard to say which attributes are most responsible for improved performance. "Compared with general intelligence, what has been achieved in learning, abstraction, and reasoning is still very limited."
Deep learning excels at classification, but "prediction remains a fundamental issue in neural computing," De Salvo said. "Recently, neural networks have failed at simple tasks that humans would never get wrong." Citing A. Nguyen et al., "Deep Neural Networks are Easily Fooled: High Confidence Predictions for Unrecognizable Images," she worried that such failures could eventually hinder the development of the market, and even "betray users' trust, raising serious ethical questions."
Compiled by Susan Hong