Intel introduces new plug-in cards, including an M.2 form factor model, that can be used to train artificial intelligence

Plug-in cards for AI training: Intel presents the Intel Nervana Neural Network Processor, including a version in M.2 form factor


The Intel Nervana Neural Network Processor in M.2 form factor.


Training AI requires suitable algorithms and processors for neural networks. Chip manufacturer Intel has now introduced new products that major customers use to train AI, and end users will notice the effects as well.

Artificial intelligence is one of the fields of the future and a hotly debated topic at the same time. Many IT companies are researching it and already use building blocks such as machine-learning algorithms in some of their products. Chip manufacturer Intel has now introduced new processors for AI applications.

Training artificial intelligence requires large numbers of training data sets. In addition, you need not only a permanent Internet connection to the cloud, but also powerful hardware in the end devices.

To address these needs, Intel has introduced two new neural network processors. The Intel Nervana Neural Network Processor for Training is designed to process even the most complex training data sets as efficiently as possible.
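To illustrate what kind of workload such a training processor handles, here is a minimal, generic sketch of a single training step in PyTorch. The network, layer sizes and data are invented for the example and are not tied to Intel's hardware or software stack; the point is only that training means running a forward pass, computing a loss and backpropagating gradients over huge numbers of such batches.

    # Minimal, generic training step in PyTorch (hypothetical sizes and random data).
    # Training hardware accelerates exactly this kind of workload at scale:
    # forward pass, loss computation and backpropagation, repeated over many batches.
    import torch
    import torch.nn as nn

    model = nn.Sequential(              # small example network, sizes are made up
        nn.Linear(784, 256),
        nn.ReLU(),
        nn.Linear(256, 10),
    )
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()

    inputs = torch.randn(64, 784)            # one random mini-batch of "images"
    targets = torch.randint(0, 10, (64,))    # random class labels

    optimizer.zero_grad()
    outputs = model(inputs)                  # forward pass
    loss = loss_fn(outputs, targets)         # how wrong the network currently is
    loss.backward()                          # backpropagation: gradient for every weight
    optimizer.step()                         # weight update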

The Chinese company Baidu, for example, relies on such processors; between eight and 32 of the plug-in cards can be installed in a single chassis. In addition, Intel presented a plug-in card in M.2 format for calculating inferences, called the Intel Nervana Neural Network Processor for Inference.

For inference, i.e. the forward pass, up to 50 TOPS are said to be available at a power consumption of only 12 watts. The compact design is meant to allow the card to be used in as many systems as possible. Both products are intended exclusively for use in data centers. A similar form factor may also be used in Intel's dedicated Xe graphics cards.
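Inference, by contrast, is only the forward pass through an already trained network, without gradients or weight updates. Again a generic PyTorch sketch with invented sizes, not code for Intel's products; the stated figures work out to roughly 50 TOPS divided by about 12 watts, i.e. on the order of 4 TOPS per watt.

    # Inference: only the forward pass through an already trained network,
    # with no gradient computation and no weight updates. Devices like the
    # M.2 inference card target this step; about 50 TOPS at roughly 12 watts
    # corresponds to around 4 TOPS per watt.
    import torch
    import torch.nn as nn

    model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
    model.eval()                          # switch to inference mode

    with torch.no_grad():                 # skip gradient bookkeeping entirely
        sample = torch.randn(1, 784)      # one made-up input vector
        scores = model(sample)            # forward pass only
        predicted_class = scores.argmax(dim=1).item()
        print(predicted_class)            # predicted class index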


Areas of application for users

Even though the new products are only used in data centers and by large customers, end users will still notice the effects. For example, a trained AI is supposed to help filter out hate messages in Xbox Live chat. Microsoft also relies on update distribution supported by a machine-learning algorithm.

The aim is to offer a Windows 10 feature update, as far as possible, only to users on whose PCs the update runs without errors. If the algorithm detects that a component has caused problems with the update for many users, the update is to be held back.

Google relies on AI on a large scale: images and objects are recognized thanks to algorithms. In the game streaming service Google Stadia, AI is expected to help achieve lower latencies than on local PCs within a very short time.

An appropriately trained computer recently caused a sensation in the game StarCraft II. The program was trained for 44 days and then sent into matches against human opponents who did not know they were playing against an AI. Some experts believe that artificial intelligence must be placed within strict limits that make clear for which purposes it may be used.


Facts about Intel's neural network processors:

Intel has introduced two new plug-in cards for computing AI algorithms, aimed especially at large customers and data centers. The Intel Nervana Neural Network Processor for Training is used to process training data sets.
Intel offers the Intel Nervana Neural Network Processor for Inference to compute inference, i.e. the forward pass, as efficiently as possible.
