
AI Chip Startup Cerebras Reveals ‘World’s Fastest AI Supercomputer’

Source: crn.com

Artificial intelligence chip startup Cerebras Systems claims it has the “world’s fastest AI supercomputer,” thanks to its large Wafer Scale Engine processor that comes with 400,000 compute cores.

The Los Altos, Calif.-based startup introduced its CS-1 system at the Supercomputing conference in Denver last week. Cerebras has raised more than $200 million in funding from investors, most recently an $88 million Series D round in November 2018, according to Andrew Feldman, the startup’s founder and CEO, who was previously an executive at AMD.

The CS-1 system is based on the startup’s Wafer Scale Engine chip, which it says is “the only trillion transistor wafer scale processor in existence,” measuring 56.7 times larger and containing 78 times more compute cores than the largest GPU.

With the Wafer Scale Engine’s 400,000 compute cores and 18 GB of on-chip memory, the startup said the CS-1 delivers its compute performance in less space and with less power than any other system, occupying one-third of a standard data center rack while replacing the need for hundreds or thousands of GPUs.
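
As a rough sanity check on those figures, the back-of-envelope arithmetic below uses only the numbers quoted above; the ~815 mm² GPU die size is an assumption for comparison, not a figure from the article.

```python
# Back-of-envelope arithmetic using the vendor's quoted figures
# (claims from the article, not independent measurements).
cores = 400_000
on_chip_memory_bytes = 18 * 1024**3          # 18 GB of on-chip memory

memory_per_core_kb = on_chip_memory_bytes / cores / 1024
print(f"~{memory_per_core_kb:.0f} KB of on-chip memory per core")   # ~47 KB

# "56.7 times larger" than the largest GPU: assuming a roughly 815 mm^2
# GPU die (an assumption for illustration, not stated in the article).
largest_gpu_die_mm2 = 815
print(f"implied wafer-scale die area: ~{56.7 * largest_gpu_die_mm2:,.0f} mm^2")
```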

“The CS-1 is the industry’s fastest AI computer, and because it is easy to install, quick to bring up and integrates with existing AI models in TensorFlow and PyTorch, it delivers value the day it is deployed,” Feldman said in a recent statement. “Depending on workload, the CS-1 delivers hundreds or thousands of times the performance of legacy alternatives at one-tenth the power draw and one-tenth the space per unit compute.”
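
The article does not describe the Cerebras software stack beyond naming the two frameworks, so the sketch below is simply a minimal, standard PyTorch model and training step of the kind the quote says runs unchanged; it contains no Cerebras-specific calls, and all names in it are illustrative.

```python
import torch
import torch.nn as nn

# A minimal convolutional classifier written in plain PyTorch; per the
# article's claim, models like this are what the CS-1 is meant to run
# without leaving the framework. Nothing here is vendor-specific.
class SmallClassifier(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

model = SmallClassifier()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# One ordinary training step on random data; on a conventional GPU cluster
# the same loop would have to be sharded across many devices, which is the
# overhead the CS-1 pitch is aimed at avoiding.
images = torch.randn(8, 3, 32, 32)
labels = torch.randint(0, 10, (8,))
optimizer.zero_grad()
loss = loss_fn(model(images), labels)
loss.backward()
optimizer.step()
print(f"loss: {loss.item():.4f}")
```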

Cerebras is targeting deep learning workloads, both for training and inference. The startup said the large size of its Wafer Scale Engine allows it to “process information more quickly” than other AI accelerators such as GPUs, reducing training work from months to minutes. Inference, meanwhile, “is thousands of times faster,” with the Wafer Scale Engine capable of classifying a single image in a matter of microseconds, where a microsecond is one one-thousandth of a millisecond.
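
A quick unit check on that latency claim; the 1-microsecond value below is a placeholder for illustration, since the article does not give an exact figure.

```python
# Illustrative unit conversion only: the article quotes single-image
# latency "in microseconds" without an exact number, so 1 us is a placeholder.
latency_us = 1
latency_ms = latency_us / 1_000                     # 1 us = 0.001 ms
implied_images_per_second = 1_000_000 / latency_us  # per sequential stream
print(f"{latency_us} us = {latency_ms} ms -> ~{implied_images_per_second:,.0f} images/s")
```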

Among Cerebras’ dozens of customers are the U.S. Department of Energy’s Argonne National Laboratory and Lawrence Livermore National Laboratory. Argonne, in particular, is using the CS-1 to accelerate neural networks for cancer studies, research into the properties of black holes and work on treating traumatic brain injury.

For now, the startup has no plans to sell its chips or systems through channel partners, according to a Cerebras spokesperson.

Marc Fertik, vice president of technology solutions at Elk Grove Village, Ill.-based Ace Computers, No. 261 on CRN’s 2019 Solution Provider 500 list, said with the CS-1 likely costing a “fortune,” it wouldn’t make sense for Cerebras to work with resellers until it starts selling less expensive systems at higher volumes.

“As soon as you move down in price point to increase your volume, you need to build your channel, because that’s when your business development and sales force can’t do it alone,” he said.

Even then, he said, prebuilt systems aren’t for everyone in the channel, citing Nvidia’s GPU-accelerated DGX deep learning system as an example. While some partners have built practices around the platform, others, like Ace Computers, would rather focus on building their own GPU-accelerated systems using parts from server vendors such as Supermicro, according to Fertik.

“We have military guys that don’t care about the extra software support, but they care about the hardware cost,” he said. “We have successfully convinced them and sold them multiple times something that is 30 percent cheaper.”
