Will the combination of artificial intelligence and workload consolidation occur in edge computing systems?

IoT devices are growing explosively: it is estimated that the number of IoT devices worldwide will reach 20.4 billion by 2020. At the same time, these devices are generating data at a speed beyond imagination. Take a smart camera as an example: as its resolution moves from 1080p to 4K, the data it collects in a single day can reach 200 GB. Similarly, a smart hospital, an autonomous vehicle, and a smart factory will generate more than 3 TB, 4 TB, and 1 PB of data per day respectively. Some predict that by 2020 the average Internet user will generate about 1.5 GB of data per day. The world is clearly facing a surging flood of data.

If all of this continuously generated data were transmitted to the cloud, cloud servers would face enormous storage pressure, which is why edge computing has been proposed. Edge computing is a way of processing data physically close to where it is generated. At the 2017 Edge Computing Industry Summit, Dr. Zhang Yu, Chief Technology Officer of Intel's China Internet of Things Division, shared his views on realizing intelligence at the edge of the network, one of the key links in driving data flow and an important trend in the future development of the Internet of Things. He said, “In the age of the Internet of Things, as digital transformation proceeds, we need more agile connections, more efficient data processing, and better data protection. Because edge computing can effectively reduce bandwidth requirements, provide timely responses, and protect data privacy, it will play a very important role in the future development of the Internet of Things.”
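As a minimal sketch of what “processing data close to where it is generated” can look like in practice (the function names, threshold, and motion-score stand-in below are hypothetical, not any specific Intel product), an edge device can keep raw frames local and upload only compact event summaries:

```python
# Minimal sketch of edge-side pre-processing: raw frames stay local and only
# small event summaries go upstream, cutting bandwidth and storage needs.
# (Hypothetical names and thresholds, for illustration only.)
import json
import random

def detect_motion(frame: bytes) -> float:
    """Stand-in for a real analytics step; returns a motion score in [0, 1]."""
    return random.random()

def process_at_edge(frames, motion_threshold: float = 0.8):
    """Yield compact JSON events instead of shipping every raw frame to the cloud."""
    for i, frame in enumerate(frames):
        score = detect_motion(frame)
        if score >= motion_threshold:           # only interesting frames become events
            yield json.dumps({"frame": i, "motion_score": round(score, 2)})

if __name__ == "__main__":
    fake_frames = [bytes(64) for _ in range(100)]   # placeholder for 4K frames
    events = list(process_at_edge(fake_frames))
    print(f"{len(fake_frames)} frames processed locally, {len(events)} events uploaded")
```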

Dr. Zhang Yu, Chief Technology Officer of Intel's China Internet of Things Division

Edge computing will not replace cloud computing; the two will complement each other

Since edge computing is so important, does that mean it can replace cloud computing? Zhang Yu emphasized, “Edge computing will not replace cloud computing; the two will complement each other. The data processed by edge computing is localized data, from which a global understanding cannot be formed. Forming that understanding still requires a cloud computing platform to integrate, at the back end, the data collected at the various edges.”

He cited intelligent transportation and Double Eleven as examples. Smart cameras can identify the people passing in front of them and, through various intelligent methods, recognize a vehicle's model, color, and license plate, but they cannot understand the vehicle's trajectory. To form the complete trajectory of a vehicle, the support of a cloud computing platform is still needed. Likewise, peak sales on Tmall during Double Eleven exceed 2.5 billion per second, and such a volume of computation also requires a large cloud computing platform behind it.
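As a toy illustration of why a global view still needs the cloud (the sightings, field names, and helper below are made up for illustration), a back-end service can join per-camera license-plate detections, each locally meaningful but incomplete on its own, into a full per-vehicle trajectory:

```python
# Toy illustration: each edge camera only sees isolated sightings; the cloud
# joins them by plate number and orders them in time to form a trajectory.
# (Made-up data and field names, for illustration only.)
from collections import defaultdict

# (plate, camera location, timestamp) tuples as edge cameras might report them
sightings = [
    ("A12345", "5th Ave & Main", 1009),
    ("A12345", "Harbor Bridge", 1321),
    ("B98765", "Airport Exit", 1100),
    ("A12345", "Airport Exit", 1555),
]

def build_trajectories(detections):
    by_plate = defaultdict(list)
    for plate, location, ts in detections:
        by_plate[plate].append((ts, location))
    # sort each vehicle's sightings in time to recover its route
    return {plate: [loc for _, loc in sorted(points)] for plate, points in by_plate.items()}

for plate, route in build_trajectories(sightings).items():
    print(plate, "->", " -> ".join(route))
```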

Dr. Zhang Yu believes that the development of the Internet of Things can be divided into three stages: interconnection, intelligence, and autonomy. An IoT system that has developed to the autonomous stage is an end-to-end system in which edge computing and cloud computing work together.

The combination of artificial intelligence and workload consolidation will occur in edge computing systems

Analyzing the data flow, you will find that much of the data that needed to be processed in the past was structured data that could be maintained and managed with Excel spreadsheets or simple relational databases. In the future, however, the Internet of Things will bring more and more unstructured data, and we will need artificial intelligence techniques to discover the intrinsic connections within it.

The recognition accuracy of artificial intelligence keeps improving

Before 2012, the accuracy of artificial intelligence in image recognition was lower than that of humans: in the error-rate chart, the dashed line represents human-level recognition and the curve represents the machine error rate. In 2012, the emergence of new artificial neural networks such as AlexNet brought artificial intelligence to a new level, and with the continued advance of these new techniques, machine image recognition began to surpass human performance.

Although artificial intelligence has achieved very large breakthroughs, it still faces many challenges. The biggest is that AI processing consumes a great deal of computing and storage resources. Take Baidu search as an example: completing a single search requires hundreds of billions of operations. Even in the inference stage, processing a typical 224×224 image with a network such as AlexNet or GoogLeNet takes more than one billion operations. Such a computational load requires very powerful compute chips, so the development of artificial intelligence actually places higher requirements on chips.
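To see roughly where the billion-operation figure comes from, the back-of-the-envelope sketch below counts multiply-accumulate operations for the convolution layers of an AlexNet-style network on a roughly 224×224 input (the layer shapes are approximate, and the exact count varies with padding and implementation details):

```python
# Rough multiply-accumulate (MAC) count for an AlexNet-style network
# (approximate layer shapes; exact figures vary with padding/implementation).

def conv_macs(out_h, out_w, c_out, k, c_in):
    """MACs for one convolution layer: every output value needs k*k*c_in MACs."""
    return out_h * out_w * c_out * k * k * c_in

layers = [
    # (out_h, out_w, c_out, kernel, c_in)
    (55, 55,  96, 11,   3),   # conv1 on a ~224x224 RGB input
    (27, 27, 256,  5,  96),   # conv2
    (13, 13, 384,  3, 256),   # conv3
    (13, 13, 384,  3, 384),   # conv4
    (13, 13, 256,  3, 384),   # conv5
]

total = sum(conv_macs(*layer) for layer in layers)
print(f"~{total / 1e9:.2f} billion MACs for the convolution layers alone")
```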

In chip development, the process node is a decisive factor. Intel is the birthplace of Moore's Law and continues to practice it. From 22 nm to 14 nm, and from 14 nm to 10 nm, the increase in transistor density at each step has actually exceeded a factor of two. Although Intel's process iteration time has lengthened, judging by the rate of density improvement it is still advancing at the pace of Moore's Law. Moore's Law continues to drive the progress of semiconductor technology while providing ever-growing computing power for new computing models such as artificial intelligence. The application of artificial intelligence therefore places higher requirements on edge computing and promotes the evolution of edge computing devices.

Dr. Zhang Yu emphasized that workload consolidation at the edge is an inevitable trend in the evolution of the Internet of Things. Workloads that used to be scattered across different devices will increasingly be consolidated, through virtualization and related technologies, onto a single high-performance computing platform to perform comprehensive, complex functions. Each functional subsystem shares the computing, storage, and network resources the device provides while retaining a degree of independence, so that subsystems do not interfere with one another; this simplifies the system architecture and reduces overall system cost. At the same time, workload consolidation provides the conditions for implementing edge computing and applying artificial intelligence: the consolidated device is both the aggregation point for edge data and the center of edge control, supplying edge intelligence with the data it needs to process and providing the entry point for control. Intel therefore believes that the combination of artificial intelligence and workload consolidation will occur in future edge computing systems.
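As a toy sketch of the consolidation idea (process-level isolation here merely stands in for the virtualization Zhang Yu describes, and the subsystem names are hypothetical), workloads that used to run on separate boxes can run as isolated units on one edge node that also acts as the shared aggregation and control point:

```python
# Toy illustration of workload consolidation on one edge node: subsystems that
# used to run on separate devices become isolated processes sharing one platform.
# (Process isolation stands in for virtualization; names and workloads are hypothetical.)
import multiprocessing as mp
import time

def vision_analytics(out_q):
    for frame_id in range(3):
        out_q.put(("vision", f"objects detected in frame {frame_id}"))
        time.sleep(0.1)

def plc_control(out_q):
    for cycle in range(3):
        out_q.put(("control", f"actuator command for cycle {cycle}"))
        time.sleep(0.1)

def main():
    q = mp.Queue()  # shared channel on the consolidated platform
    subsystems = [mp.Process(target=f, args=(q,)) for f in (vision_analytics, plc_control)]
    for p in subsystems:
        p.start()
    for _ in range(6):
        # the node is both the data aggregation point and the control center
        print(q.get())
    for p in subsystems:
        p.join()

if __name__ == "__main__":
    main()
```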

Leveraging hardware strengths to provide users with comprehensive, well-matched solutions

Dr. Zhang Yu pointed out that an Internet of Things system must be a collaborative system spanning the edge and the end, and that artificial intelligence will be widely used within it, not only at the front end but also at the back end. Different points in an IoT network require different computing resources, and once artificial intelligence is deployed on top of them, different hardware platforms and joint hardware/software optimization are needed. Intel provides an end-to-end, industry-leading, full-stack solution for artificial intelligence, including a broad range of hardware platforms such as Xeon processors, Xeon Phi processors, the Intel Nervana neural network processor, FPGAs, and networking and storage technologies, as well as a variety of software tools, libraries, and optimized open-source frameworks. It is worth mentioning that balancing power consumption against computing capability is a major challenge at the edge; with Movidius's leading performance per watt, Intel can provide the industry with low-power, high-performance edge computing solutions. For front-end cameras with strict power constraints, a low-power chip such as Movidius is more appropriate; for edge devices that connect multiple cameras, or for server-side service centers, an FPGA is more suitable.

Many chip design companies have now emerged around artificial intelligence applications. What are the future development trends for AI chips? According to Zhang Yu, “The problems that real-world systems need to solve differ, their positions within the system differ, and so do their hardware and computational requirements; users should select the hardware architecture best suited to their requirements. Many intelligent applications actually revolve around image processing. Even AlphaGo computes its next move by converting the board into a two-dimensional image as input and then analyzing it with neural networks, including the policy network and the value network, to conclude which position offers the best chance of winning. But whether that is representative of the future of artificial intelligence is hard to say. Of the many kinds of problems that the human brain, or a machine, has to analyze, some can be reduced to images and some cannot. If a problem can be reduced to an image, convolution can handle it; if it cannot, might there be another, more efficient structure? As the problems being handled become more complex and more varied, our understanding of them will become clearer, and in the future it may be possible to find an architecture well suited to a particular application.”

With both edge computing and cloud computing available, what developers care about most at design time is how to decide which workloads belong at the edge and which in the cloud. Intel believes there is a common technology that needs to be understood and mastered: making the computing architecture more easily defined by software, so that whatever the type of business, workloads can run flexibly in the cloud, at the edge, and even on end devices. Without software-defined flexibility, it would be very difficult to move workloads from the cloud to the front end. From a chip vendor's perspective, this is a consideration Intel must take into account.
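A minimal sketch of what “software-defined” placement could look like, assuming a toy policy based on latency budget and uplink bandwidth (the class, function, and workload names are illustrative, not an Intel API): the same workload runs unchanged wherever the policy puts it.

```python
# Minimal sketch of software-defined workload placement: the same workload
# runs unchanged at the edge or in the cloud, and a policy decides where.
# (Class and policy names are hypothetical illustrations.)
from dataclasses import dataclass
from typing import Callable

@dataclass
class Workload:
    name: str
    latency_budget_ms: float      # how quickly a response is needed
    data_rate_mbps: float         # how much raw data it would push upstream

def place(w: Workload, uplink_mbps: float) -> str:
    """Toy placement policy: latency-critical or bandwidth-heavy work stays at the edge."""
    if w.latency_budget_ms < 50 or w.data_rate_mbps > uplink_mbps:
        return "edge"
    return "cloud"

def run(w: Workload, handler: Callable[[], str], uplink_mbps: float = 100.0) -> str:
    site = place(w, uplink_mbps)
    return f"[{site}] {w.name}: {handler()}"

if __name__ == "__main__":
    detect = Workload("camera defect detection", latency_budget_ms=20, data_rate_mbps=400)
    report = Workload("daily yield report", latency_budget_ms=60_000, data_rate_mbps=1)
    print(run(detect, lambda: "flagged 3 defective parts"))
    print(run(report, lambda: "aggregated statistics uploaded"))
```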

Network optimization is the key to applying artificial intelligence at the edge

The theoretical foundations of artificial intelligence are not yet complete, which means that today's artificial intelligence networks contain a great deal of redundancy. To bring artificial intelligence to the edge, network optimization is a key technology. Intel's network optimization work falls into three areas: low-bit representation, pruning, and parameter quantization.

“Low bit” refers to the numeric precision of parameters. In traditional deep learning, parameters are usually expressed as 32-bit single-precision floating-point numbers, but in many application scenarios, such as security and machine vision, the actual accuracy requirements are not that high. Without affecting the final recognition rate, Intel converts parameters from 32-bit single precision to 16-bit half precision, and even down to 8-bit or 2-bit representations. As the number of bits decreases, both storage and computation shrink, so more complex operations can run on platforms with relatively limited computing power.
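A minimal sketch of the low-bit idea, using generic symmetric 8-bit quantization with NumPy (a textbook illustration, not Intel's toolchain): weights shrink from 32 bits to 8 bits each at the cost of a small, usually tolerable approximation error.

```python
# Minimal sketch of low-bit conversion: symmetric 8-bit quantization of a
# weight tensor with NumPy (a generic illustration, not Intel's toolchain).
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Map float32 weights to int8 plus a per-tensor scale factor."""
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(0, 0.05, size=(256, 256)).astype(np.float32)  # stand-in conv/fc weights

q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

print("storage: float32", w.nbytes, "bytes -> int8", q.nbytes, "bytes")
print("mean absolute error introduced:", float(np.abs(w - w_hat).mean()))
```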

Pruning is similar. If an artificial intelligence network is compared to a tree, each branch corresponds to a different detected feature. Different application scenarios rely on different features, and it is very likely that some of these features have no effect on the final detection result. Branches that contribute nothing can simply be cut off, so pruning can greatly reduce computation.
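A minimal sketch of pruning by weight magnitude with NumPy (a generic illustration of the idea, not Intel's specific method): the smallest-magnitude weights are zeroed so the corresponding computation can be skipped.

```python
# Minimal sketch of magnitude pruning: weights whose contribution is negligible
# are zeroed out so the corresponding computation can be skipped.
import numpy as np

def prune_by_magnitude(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero the fraction `sparsity` of weights with the smallest magnitude."""
    threshold = np.quantile(np.abs(weights), sparsity)
    return np.where(np.abs(weights) < threshold, 0.0, weights)

rng = np.random.default_rng(1)
w = rng.normal(0, 0.05, size=(512, 512)).astype(np.float32)

pruned = prune_by_magnitude(w, sparsity=0.8)
print("non-zero weights before:", int(np.count_nonzero(w)))
print("non-zero weights after: ", int(np.count_nonzero(pruned)))  # ~20% remain
```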

Parameter quantization means clustering the parameters according to their characteristics, so that each class of parameters can be represented by a relatively simple symbol or number, greatly reducing the storage that artificial intelligence requires. Around these optimization ideas, Intel continues to develop its artificial intelligence technology and hardware in tandem, so that the two reinforce each other well.
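A minimal sketch of quantization by clustering (weight sharing) with NumPy (again a generic illustration, not Intel's specific method): weights are grouped into a small codebook, and each weight is stored as a short index into it.

```python
# Minimal sketch of parameter quantization by clustering (weight sharing):
# weights are grouped into k clusters and each weight is stored as a small
# index into a shared codebook.
import numpy as np

def cluster_weights(weights: np.ndarray, k: int = 16, iters: int = 10):
    """Tiny 1-D k-means: returns per-weight cluster indices and the codebook."""
    flat = weights.ravel()
    centers = np.quantile(flat, np.linspace(0, 1, k))          # initial codebook
    for _ in range(iters):
        idx = np.argmin(np.abs(flat[:, None] - centers[None, :]), axis=1)
        for c in range(k):
            members = flat[idx == c]
            if members.size:
                centers[c] = members.mean()
    return idx.reshape(weights.shape), centers

rng = np.random.default_rng(2)
w = rng.normal(0, 0.05, size=(128, 128)).astype(np.float32)

idx, codebook = cluster_weights(w, k=16)
# 16 clusters -> each weight needs only a 4-bit index instead of 32 bits.
print("codebook entries:", codebook.size)
print("approximation error:", float(np.abs(w - codebook[idx]).mean()))
```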

Strengthen the ecosystem to better promote development

Edge computing is a large ecosystem, and no single company can provide every upstream and downstream link in the industry chain. Within this chain, Intel positions itself as a chip company providing the chip solutions needed for computing, communications, and storage. Zhang Yu said, “Our own strength alone is not enough to achieve the grand goals of the edge computing industry. We need many kinds of companies and institutions to participate, as can be seen from the Edge Computing Consortium (ECC), which also has the support of government agencies. Only by moving forward hand in hand can we fully realize the potential of artificial intelligence and enable people to use it to do things we could not do before.”

Edge computing requires chips, but hardware alone is not enough; software must work with it. Zhang Yu stated, “We also provide users with the chip-related low-level software and middleware components. Using the building blocks we provide helps partners develop their corresponding products better.” Intel is currently cooperating with Huawei and with the Shenyang Institute of Automation. Huawei has just released the AR550i, an Intel-based edge gateway product, with which Huawei plays an ODM and OEM role in the overall industry chain. At the Edge Computing Industry Alliance Summit, Intel teamed up with the Shenyang Institute of Automation to demonstrate an edge computing test bed for intelligent robots, verifying the effectiveness of a deep-learning-based machine vision solution in a real system.

Zhang Yu emphasized, “Our understanding of vertical industries is certainly not as deep as that of industry partners, so in this regard we are very happy to cooperate with them: we provide our solutions, they provide their applications, and the combined overall solution meets the requirements of a specific vertical industry and accelerates its adoption. In addition to ongoing research and development on edge computing nodes, Intel also has capabilities in cloud computing and in network communication infrastructure, so we can serve the needs of the entire industry from a more macro, holistic perspective.”