Cerebras Systems to weave together chips & enhance AI power efficiency

Semiconductor company Cerebras Systems has reportedly stated that it can connect nearly 200 of its artificial intelligence chips together to significantly reduce the power consumed by AI workloads.

Cerebras is one of many startups producing chips developed specifically for artificial intelligence, and it aims to challenge market leaders Google and Nvidia Corp. The firm has raised around USD 475 million in venture capital and has closed deals with pharmaceutical companies AstraZeneca Plc and GlaxoSmithKline Plc to use its chips to accelerate drug discovery.

Conventionally, thousands of computer chips are produced on a 30 cm silicon disc known as a wafer, which is then sliced up into individual chips. By contrast, Cerebras uses the whole wafer as a single chip, which can hold far more data at once.

However, AI researchers now work with AI models known as neural networks that are too big for any single chip to hold, which requires them to be distributed across many chips. The biggest existing neural networks are still only a fragment of a human brain's complexity, yet they use more energy than human brains, because the systems running them become less power-efficient as more chips are added.

According to the company, it can connect 192 of its chips to train bigger neural networks without any loss of power efficiency as chips are added. Put another way, Cerebras says it can double its computing capability while only doubling power draw, unlike existing systems that require more than twice the power to double their computing capability.
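The contrast between the two scaling behaviors can be sketched with a toy model. All the numbers and the overhead formula below are illustrative assumptions, not figures from Cerebras; the point is simply that linear power scaling keeps the compute-to-power ratio flat, while per-chip communication overhead makes it degrade:

```python
# Toy model (illustrative assumptions, not Cerebras data): compare a
# cluster whose power scales linearly with chip count against one where
# each added chip also adds interconnect/communication overhead.

def linear_power(chips, watts_per_chip=100.0):
    """Power draw when efficiency stays constant as chips are added."""
    return chips * watts_per_chip

def superlinear_power(chips, watts_per_chip=100.0, overhead=0.2):
    """Power draw when every added chip contributes extra overhead,
    modeled here as a simple per-chip penalty that grows with count."""
    return chips * watts_per_chip * (1 + overhead * (chips - 1))

# Doubling the cluster from 96 to 192 chips:
linear_ratio = linear_power(192) / linear_power(96)          # exactly 2.0
superlinear_ratio = superlinear_power(192) / superlinear_power(96)  # > 2.0
print(linear_ratio, superlinear_ratio)
```

In the linear case, doubling the chips doubles the power, so compute per watt is unchanged; in the superlinear case, the same doubling costs well over twice the power, which is the inefficiency Cerebras claims its design avoids.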

The Chief Executive of Cerebras, Andrew Feldman, reportedly said that existing AI systems require tens of megawatts of power, roughly the amount needed to run a small city, to train these neural networks over extended periods.

Source credits: