Artificial intelligence is awakening the chip industry’s animal spirits

Supercomputers usually fill entire rooms. But the one on the fifth floor of an office building in the centre of Bristol fits in an average-sized drawer. Its 16 processors punch more than 1,600 teraflops, a measure of computer performance.
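For scale, a teraflop is $10^{12}$ floating-point operations per second, so the machine’s stated figures work out as follows (simple unit arithmetic, nothing more):

```latex
% 1,600 teraflops across 16 processors, as stated above.
% 1 teraflop = 10^12 floating-point operations per second.
\[
  1{,}600~\text{teraflops} = 1.6\times 10^{15}~\text{flop/s},
  \qquad
  \frac{1{,}600~\text{teraflops}}{16~\text{processors}} = 100~\text{teraflops per processor on average}.
\]
```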

This puts the machine among the world’s 100 fastest, at least when running certain artificial-intelligence (AI) applications, such as recognising speech and images. The computer’s processors, developed by Graphcore, a startup, are tangible proof that AI has made chipmaking exciting again. After decades in which big firms such as America’s Intel and Britain’s ARM ruled the semiconductor industry, the insatiable demand for computing generated by AI has created an opening for newcomers. And it may even be big enough to allow some startups to establish themselves as big, independent firms.

New Street, a research firm, estimates that the market for AI chips could reach $30bn by 2022. That would exceed the $22bn of revenue that Intel is expected to earn this year from selling processors for server computers. It could swell further, argue the authors of a recent report by UBS, an investment bank. AI processors, they believe, will create their own demand; they allow firms to develop cleverer services and devices, which will collect even more data, generating a need for even brainier chips.

To understand what is going on, it helps to make a short detour into zoology. Broadly speaking, the world of processors is populated with two kinds of animal, explains Andrew Feldman, chief executive of Cerebras, an American competitor to Graphcore. One sort of chip resembles the hyena: a generalist designed to tackle all kinds of computing problems, much as hyenas eat all kinds of prey. The other is like the cheetah: a specialist that does one thing very well, such as hunting a certain kind of gazelle.

For much of computing history, hyenas called “central processing units” (CPUs) have dominated the chip savannah. Becoming ever more powerful according to Moore’s law, the rule that the performance of processors doubles every 18 months, they were able to gobble up computing tasks (“workloads”, in the jargon). This is largely why Intel, for instance, became the world’s biggest chipmaker in the early 1990s and stayed that way for decades.
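Read literally, that 18-month rule compounds like any doubling process (a worked illustration of the rule as stated, not a claim about actual chip roadmaps):

```latex
% Moore's law as paraphrased above: performance doubles every 18 months (1.5 years).
% P_0 is a baseline performance level; t is elapsed time in years.
\[
  P(t) = P_0 \cdot 2^{\,t/1.5},
  \qquad
  \frac{P(10)}{P_0} = 2^{10/1.5} \approx 102.
\]
```

A chip line tracking that pace would thus be roughly a hundred times faster after a decade, which is why CPUs could keep absorbing new workloads for so long.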

But in recent years the world of number-crunching has changed radically. Moore’s law has started to peter out because making ever-denser chips has hit physical limits. More importantly, it has become extremely cheap to amass huge amounts of data. Now more and more firms want to turn this asset into money with the help of AI, meaning distilling data to create offerings such as recognising faces, translating or predicting when machinery will break down.

Such trends have altered the chip-design habitat. First to benefit were “graphics processing units” (GPUs), a kind of hyena mainly made by Nvidia. Originally developed to speed up the graphics in video games, they are also good at digesting reams of data, which is a similar computational problem. But because they are insufficiently specialised, GPUs have been hitting the buffers, too. The demand for “compute”, as geeks call processing power, for the largest AI projects has been doubling every 3.5 months since 2012, according to OpenAI, a non-profit research organisation. “Hardware has become the bottleneck,” says Nigel Toon, the chief executive of Graphcore. […]
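A doubling time of 3.5 months is far steeper than Moore’s law; compounding the stated rate (back-of-the-envelope arithmetic on OpenAI’s figure, nothing more) gives:

```latex
% Compute demand doubling every 3.5 months; growth after m months is 2^(m/3.5).
\[
  \underbrace{2^{12/3.5}}_{\text{one year}} \approx 10.8,
  \qquad
  \underbrace{2^{72/3.5}}_{\text{six years, 2012--18}} \approx 1.6\times 10^{6}.
\]
```

By contrast, an 18-month doubling compounds to only about $2^{12/18} \approx 1.6\times$ a year, which is the gap between what Moore’s law supplies and what the largest AI projects now demand.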
