Mining Chip Producer Bitmain is Developing ASIC Chips for AI

May 25, 2018 by Alexander Caruso

Bitmain, the largest cryptocurrency mining hardware producer in the world, is working on ASIC chips for AI applications, such as deep learning. ASIC chips developed for cryptocurrency mining are Bitmain’s core competency. Bernstein Research, a financial research firm, published a report in February which estimated Bitmain’s annual profits at about $3 billion, which rivals chip giant Nvidia Corp. Nvidia’s core offerings are GPUs for gaming and AI applications (deep learning), though it has also seen a significant boost to its top and bottom lines due to the cryptocurrency craze.

As a means of diversifying its product portfolio and hedging its business, Bitmain is looking to leverage its strength in ASIC chip development and enter the AI hardware market, which puts it in direct competition with chip giants such as Nvidia, AMD, and Intel.

ASIC (application-specific integrated circuit) chips are processors physically designed to complete a specific computational task, or a narrow set of tasks, with the highest efficiency possible. The transistors in the chip are laid out so that the logic they implement is tailored to the task at hand, completing it with the minimum possible number of operations.

Google was the first to develop an ASIC chip for deep learning. Its first foray into the AI ASIC hardware space was the release of the Tensor Processing Unit (TPU). The chip was designed specifically for use with TensorFlow, an open-source software library for neural network machine learning, also developed by Google. Google's TPUs improve the computational efficiency with which TensorFlow neural networks can train and run inference by using a technique called quantization. Google defines it as "an optimization technique that uses an 8-bit integer to approximate an arbitrary value between a preset minimum and a maximum value." Computing with quantized 8-bit integers rather than 32-bit or 16-bit floating-point operations on neural network weights greatly reduces the computational cost of deep learning.
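Google's description of quantization can be illustrated with a short sketch. The snippet below is plain Python written for illustration, not code from TensorFlow or the TPU; it simply shows how a float within a preset [min, max] range is approximated by an 8-bit integer and then recovered.

```python
# Illustrative sketch of quantization (not Google's TPU implementation):
# map a float in a preset [min_val, max_val] range onto an 8-bit integer.

def quantize(value, min_val, max_val):
    """Approximate a float in [min_val, max_val] with an integer in 0..255."""
    value = max(min_val, min(max_val, value))      # clamp to the preset range
    scale = (max_val - min_val) / 255.0            # width of one 8-bit step
    return int(round((value - min_val) / scale))

def dequantize(q, min_val, max_val):
    """Recover an approximate float from its 8-bit code."""
    scale = (max_val - min_val) / 255.0
    return min_val + q * scale

# Example: a weight of 0.37 with a preset range of [-1.0, 1.0]
q = quantize(0.37, -1.0, 1.0)        # -> 175
approx = dequantize(q, -1.0, 1.0)    # -> ~0.3725, close to the original
print(q, approx)
```

The approximation error is bounded by the step size (here 2.0 / 255), which is why a preset minimum and maximum are needed: the narrower the range, the more precise each 8-bit code becomes.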

Bitmain’s opportunity to challenge Google in this space comes as a byproduct of Chinese regulations: since Google only offers its TPU through its cloud service, which is banned in China, Bitmain can compete there. Bitmain released early versions of its Sophon BM1680 ASIC chip for AI in October. It is a component in a $600 machine learning accelerator card that the company offers. While limited in the scope of its functions, the Sophon chip is cheaper than GPUs and can outperform them in certain types of machine learning, such as convolutional neural networks (CNN), deep neural networks (DNN), and recurrent neural networks (RNN). The company plans to release two more chip generations in the next few months.

Bitmain CEO Jihan Wu believes the company's AI division could account for 40 percent of its total revenue within five years. That may be possible considering the explosive growth of machine learning in recent years. Bitmain's pivot to AI could prove to be a good strategic move if cryptocurrency prices continue to flounder.
