In fact, MIT researchers and their collaborators made similar findings in a computational chip architecture they proposed in 2016 that uses photons instead of electrons. The key insight is that the interaction of light with a lens itself performs a complicated computation: the Fourier transform. Using this principle, together with multiple beam-steering techniques, correlations can be computed to find the desired result of a search. The research team calls this chip architecture a programmable nanophotonic processor.
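The lens-as-Fourier-transform principle is easy to sketch numerically: in the far-field (Fraunhofer) approximation, the pattern a lens produces is, up to scaling, the 2-D Fourier transform of the input field. The following is a minimal numpy illustration of that physics, not the team's actual method:

```python
import numpy as np

# A lens optically performs a 2-D Fourier transform of the input field
# (far-field / Fraunhofer approximation). Here we emulate it numerically.
N = 64
x = np.linspace(-1, 1, N)
X, Y = np.meshgrid(x, x)

# Input field: a simple double-slit aperture (1 inside the slits, 0 outside)
aperture = ((np.abs(X - 0.3) < 0.05) | (np.abs(X + 0.3) < 0.05)).astype(float)

# What the lens computes "for free", at the speed of light:
far_field = np.fft.fftshift(np.fft.fft2(aperture))
intensity = np.abs(far_field) ** 2

# The double slit yields the familiar interference fringes,
# with the brightest (DC) component at the center after fftshift.
print(intensity.shape)  # (64, 64)
```

The point of the analogy: an operation that costs O(N log N) multiply-adds electronically happens passively in optics.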
In June 2017, an MIT research team presented a paper on programmable nanophotonic processors, published in Nature Photonics. The paper's first and corresponding author, Hangzhou-born Shen Yichen, is now co-founder and CEO of Lightelligence, and was selected by MIT Technology Review in 2017 as one of its 35 Chinese "Innovators Under 35."
Photonic computing has unique advantages for certain AI algorithms

Figure 丨 Shen Yichen, co-founder and CEO of Lightelligence, selected for the 2017 MIT Technology Review China "Innovators Under 35"
Under Shen Yichen's leadership, Lightelligence is working hard to develop optical chip technology, including chip design, core algorithms, transmission, and peripherals, to create a complete optical computing ecosystem. Because the technology Lightelligence is developing could completely change the computing ecosystem, it has received a great deal of attention: Baidu, which regards cloud computing as a core development project, as well as several executives in the US semiconductor industry, all became early investors in Lightelligence out of optimism about the future of photonic chips.
Shen Yichen told DT Jun that while he was working on nanophotonics research during his Ph.D. program at the Massachusetts Institute of Technology, AI applications took off rapidly around 2015. It is well known that besides data, hardware is also critically important for AI, and so the idea of using photons for computation was born.
But why did no one before 2015 think of using photonic effects to compute neural networks? Shen Yichen said this is because neural-network computation was not popular in the past, and traditional logic computation is not where photonic computation shines.
In fact, a photonic chip may be the most suitable future hardware architecture for AI computation, because the nature of light is inherently linear, and linear operations, which involve high-dimensional parallel computing, are the most important part of AI workloads. By contrast, although quantum computing has also drawn much attention recently in connection with AI, it remains a field better suited to decoding or search problems. It is, moreover, not yet mature for mass production, though its potential should not be underestimated.
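The claim that AI workloads are dominated by linear operations is easy to check: a fully connected layer's forward pass is a matrix-vector product plus an elementwise nonlinearity. In the sketch below (my illustration, not Lightelligence's design), the matmul is the part a photonic processor would take over, while the nonlinearity would stay in electronics:

```python
import numpy as np

rng = np.random.default_rng(0)

def dense_layer(x, W, b):
    # The linear part -- high-dimensional parallel multiply-accumulate,
    # the kind of operation photonic hardware excels at:
    z = W @ x + b
    # The nonlinear part (ReLU) -- light is poorly suited to this,
    # so it would remain in the electronic domain:
    return np.maximum(z, 0.0)

x = rng.standard_normal(512)          # input activations
W = rng.standard_normal((256, 512))   # weight matrix: 256*512 multiply-adds
b = np.zeros(256)

y = dense_layer(x, W, b)
print(y.shape)  # (256,)
```

For this single layer, the linear step performs 131,072 multiply-adds versus 256 elementwise comparisons, which is why accelerating the matmul dominates.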
From transmission to computation: will photonic chips become the ultimate computing solution?
Electricity and light, two completely different physical phenomena, finally came together after Intel introduced the first hybrid silicon laser built in a standard CMOS process in 2006. Over the years, ultra-high-bandwidth optical transmission architectures based on this technology have become a favorite of high-performance data centers, effectively reducing the system bottlenecks caused by transmitting large amounts of data.
In 2015, IBM researchers published a new experimental technique for photonic computing by integrating an array of silicon photonic devices into the same package as the CPU. The perennial problem with silicon photonics has been the chip's optical interface, but IBM's solution can be applied to systems-on-chip (SoCs), transferring light between chips with inexpensive edge connectors, or achieving chip-to-chip communication by simply joining CMOS chip edges together.
These photonic chips were developed mainly to solve traditional chip-to-chip or chip-to-memory interconnection problems. The invention of highly integrated, high-quality photonic chips replaced the huge, complex optical transmission architectures of the past while delivering higher speed and lower latency.

However, the idea of truly bringing photons into the field of computation, and even forming a "photonic chip," has only gradually taken shape in the past two years.
Although semiconductor chip technology, combined with new applications and algorithms, can do more and more, the chip architecture itself is still based on the same logic and is constrained by semiconductor technology: computing power, size, power consumption, and cost form a four-cornered trade-off that is difficult to balance.
The industry has therefore been actively looking for new computing technologies that can break this status quo. GPGPUs, neural-network chips, DSPs, and FPGAs were all put forward at different times, each good at application-specific computation. However, none of these chips solves the fundamental problem: the physical limitations inherent in the semiconductor structure itself.

Figure 丨 The principle of photonic synapses
The growing computing needs of AI keep pushing processing architectures forward. Intel, for example, will combine CPU and FPGA computing power in the future to cope with more complex application scenarios, while NVIDIA is significantly strengthening its latest generation of GPU solutions. In addition, many hope to introduce new architectures better suited to specific computations, such as NPUs, quantum computing, and the latest computing concept: architectures based on photonic circuits.
In fact, "light" has been used in computing environments for more than a decade, mainly to transfer data between different chips or storage devices. However, the related transmission technologies are too costly and must be paired with expensive peripherals to show their benefits. Optical transmission has therefore never spread to the consumer market, which is why most people have no clear understanding of it.
Computation, however, is a problem on an entirely different level.

Figure 丨 SMART Photonics photonic chip
To explain the photonic computing chip in the simplest terms: the chip contains countless optical switches, which play a role similar to that of logic gates in semiconductor chips. Information is processed by combining light of different wavelengths, phases, and intensities through arrays of mirrors, filters, and prisms.
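In the coherent-nanophotonics approach described in the MIT team's Nature Photonics paper, an arbitrary weight matrix is factored by singular value decomposition, M = U Σ V†: the unitaries U and V† map onto lossless meshes of beam splitters and phase shifters, and the diagonal Σ onto per-channel attenuation or amplification. A numerical sketch of that factorization (the optics themselves are of course not simulated here):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))  # an arbitrary real weight matrix

# SVD: M = U @ diag(s) @ Vh.
# U and Vh are unitary -> implementable as meshes of Mach-Zehnder
# interferometers (beam splitters + phase shifters);
# diag(s)  -> per-channel attenuation/amplification.
U, s, Vh = np.linalg.svd(M)

# Unitarity check: lossless optical meshes preserve total power.
assert np.allclose(U @ U.T.conj(), np.eye(4))
assert np.allclose(Vh @ Vh.T.conj(), np.eye(4))

# Reconstructing M confirms the three optical stages compute the same product:
M_optical = U @ np.diag(s) @ Vh
print(np.allclose(M_optical, M))  # True
```

The design choice this reflects: only phase shifter settings change when the weights change, so the same physical mesh is "programmable" for any matrix of that size.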
Silicon photonics, like microelectronics, is based on silicon semiconductor architectures. Thanks to the fast response and parallel nature of light, silicon has gained popularity as an optical communications medium that can transmit large amounts of data almost instantaneously, and is therefore commonly used in data-center servers. Because photonic transmission is stable and highly parallel, error-correction design is relatively simple, and very little energy is needed for transmission and conversion, a photonic computing architecture can in theory deliver performance at relatively low power. Furthermore, photonic chips could in theory be used in very small-scale applications, such as mobile devices.
Photonic chips can use currently mature semiconductor processes. Even though photonic chips are still in the experimental stage and rely only on old micron-scale process technology, they can already significantly exceed the computing power of existing semiconductor chips, so the room for future miniaturization is tremendous. As chip density increases, performance can grow substantially, and photonic chips may even have the opportunity to completely rewrite the limits of Moore's Law.
Following the CMOS process is photonics' biggest advantage, but the goal is not to replace traditional semiconductors

Figure 丨 CMOS
Shen Yichen also said that since the photonic chip is still basically based on the current CMOS manufacturing process, it has an advantage in cost and mass-production technology over the special processes used in quantum computing. Although photonic chips in the laboratory do not yet match traditional semiconductor chips in density, they are in a much better position than quantum chips.
The performance of a photonic chip depends on its architecture and algorithms, for example how many different wavelengths of light are combined at the same time, the bandwidth of the optical signal used in the chip, and the bottleneck of optoelectronic conversion. But from a purely physical point of view, achieving a hundredfold speedup over traditional semiconductor chips with an appropriate algorithm is not much of a problem.
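The wavelength count and signal bandwidth mentioned above combine multiplicatively, which is where such speedup claims come from. A back-of-envelope sketch with purely illustrative parameters (these are my assumptions, not Lightelligence's specifications):

```python
# Back-of-envelope throughput of a photonic matrix engine.
# All parameters below are illustrative assumptions, not real chip specs.
n_wavelengths = 16     # WDM channels computed in parallel
matrix_size = 64       # a 64x64 optical matrix-vector multiply per pass
mod_rate_hz = 10e9     # 10 GHz modulation rate per channel

# Each pass through the mesh performs matrix_size^2 multiply-accumulates,
# once per modulation cycle, on every wavelength simultaneously:
macs_per_second = n_wavelengths * matrix_size**2 * mod_rate_hz
print(f"{macs_per_second:.2e} MAC/s")  # 6.55e+14
```

Even with these modest numbers the figure lands in the hundreds of tera-MACs per second, before accounting for the optoelectronic-conversion bottleneck the article mentions.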
Of course, in theory photonic chips can be built at large scale and can also be made very small. But because light is not well suited to nonlinear computation, and because photonic chip integration and size are subject to certain constraints, completely replacing semiconductor chips remains very difficult.
From chips and algorithms to the surrounding ecosystem, development is under way
Shen Yichen also stressed that Lightelligence's photonic chip development has completed the laboratory stage, and corresponding designs are under way for algorithms, buses, and storage. Of course, the most important issue for a computing chip is its ecosystem, which requires more research institutions and companies to join in expanding the field of optical computing and build it jointly.
Because the main product is a chip, the core work lies in combining algorithm and hardware, along with the corresponding chip instruction set and compiler. Lightelligence's job is to make its chip usable from currently popular frameworks such as TensorFlow and Caffe.
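In practice, hooking into frameworks like TensorFlow means intercepting the matrix multiplications a layer performs and routing them to the accelerator. A toy sketch of the idea; the `PhotonicMatmul` class and its `offload` method are entirely hypothetical and simply emulate the hardware with numpy:

```python
import numpy as np

class PhotonicMatmul:
    """Hypothetical stand-in for a photonic accelerator backend.

    A real compiler would lower framework ops (e.g. a MatMul node in a
    TensorFlow graph) to driver calls; here we emulate the result.
    """
    def offload(self, W, x):
        # In hardware: encode x as optical amplitudes, pass it through a
        # mesh configured for W, and read the result with photodetectors.
        return W @ x

backend = PhotonicMatmul()

W = np.eye(3) * 2.0                # weights loaded onto the mesh
x = np.array([1.0, 2.0, 3.0])      # input activations
print(backend.offload(W, x))       # [2. 4. 6.]
```

The compiler's job, in this picture, is deciding which graph nodes are linear enough to route through `offload` and which must stay on the electronic side.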
In addition, because of the particularities of photonic computing in transmission and storage, Lightelligence is developing corresponding peripheral designs. Relying on current storage systems may speed up time to market, but it could also limit the performance of photonic computing. In the future, this part of the work will aim at designs optimized for photonic computing, so as to highlight its overall advantages.
Today, the Lightelligence team is working hard to improve the photonic computing ecosystem. It is of course not yet mature, but the industry has very high expectations for high-performance computing, and especially for better neural-network computing architectures. I believe that once its photonic computing architecture lands, it can greatly accelerate the transformation of the overall AI computing ecosystem.
Shen Yichen said that whether a chip targets a specific purpose or general-purpose computing power will determine the different processes chosen for developing its architecture. Lightelligence will first start with photonic chip applications whose technology or application scenarios are more mature, then gradually expand the range of applications, while also working on the front-end and back-end of photonic chip technology to better fit different future computing scenarios.
Shen also emphasized that many major engineering improvements are still needed on the road to general-purpose photonic computing, but compared with past attempts at photonic computation, this is perhaps the best time, and the closest we have ever been to achieving it.