Intel and Baidu team up to develop the Nervana Neural Network Training Processor

At the Baidu AI Developer Conference, Naveen Rao, Intel's vice president and general manager of the Artificial Intelligence Products Division, announced that Intel is working with Baidu to develop the Intel® Nervana™ Neural Network Training Processor (NNP-T). The collaboration includes building a new custom accelerator aimed at speeding up the training of deep learning models.

Naveen Rao, Vice President of Intel Corporation and General Manager of Artificial Intelligence Products Division, delivered a speech

Naveen Rao said: "In the next few years, the complexity of AI models and the demand for large-scale deep learning compute will explode. Intel and Baidu are building on more than a decade of cooperation, focusing on jointly designing and developing new hardware and supporting software to keep advancing toward the new frontier of 'AI 2.0'."

AI is not a single workload, but a powerful capability that enhances the performance of all applications, whether they run on mobile phones or in large data centers. However, mobile phones, data centers, and everything in between have different performance and power requirements, so no single piece of AI hardware can meet them all. Intel offers a broad range of hardware choices for artificial intelligence and unlocks that hardware's full potential through software, helping customers run AI applications wherever their data lives, no matter how complex. The NNP-T is a new class of purpose-built deep learning system hardware that accelerates large-scale distributed training. Close cooperation with Baidu ensures that Intel's hardware development keeps pace with the latest customer demands for training hardware.

Since 2016, Intel has been optimizing Baidu's PaddlePaddle* deep learning framework for Intel® Xeon® Scalable processors. Now, by also optimizing the NNP-T for PaddlePaddle, the two companies can give data scientists more hardware choices.

At the same time, Intel is enhancing these AI solutions with additional technologies. For example, with the higher memory performance offered by Intel® Optane™ DC persistent memory, Baidu can deliver personalized mobile content to millions of users through its Feed Stream* service and provide an efficient customer experience through Baidu's AI recommendation engine.

In addition, given the importance of data security to users, Intel is working with Baidu to create MesaTEE*, a memory-safe Function-as-a-Service (FaaS) computing framework based on Intel® Software Guard Extensions (SGX) technology.