
Chip developer Cerebras bolsters AI-powered workload capabilities with $250M

Cerebras Systems, the California-based company that has built a “brain-scale” chip to power AI models with 120 trillion parameters, said today it has raised $250 million in funding at a valuation of over $4 billion. Cerebras claims its technology dramatically cuts the time today’s AI workloads take to run while consuming a fraction of the power and space, and that its innovations will support the multi-trillion-parameter AI models of the future.

In a press release, the company stated that this additional capital will enable it to further expand its business globally and deploy its industry-leading CS-2 system to new customers, while continuing to bolster its leadership in AI compute.

Cerebras’ cofounder and CEO Andrew Feldman noted that the new funding will allow Cerebras to extend its leadership to new regions. Feldman believes this will aid the company’s mission to democratize AI and usher in what it calls “the next era of high-performance AI compute” — an era in which the company claims its technology will help solve some of today’s most urgent societal challenges, from drug discovery to climate change and beyond.

Redefining AI-powered possibilities

“Cerebras Systems is redefining what is possible with AI and has demonstrated best in class performance in accelerating the pace of innovation across pharma and life sciences, scientific research, and several other fields,” said Rick Gerson, cofounder, chairman, and chief investment officer at Falcon Edge Capital and Alpha Wave.

“We are proud to partner with Andrew and the Cerebras team to support their mission of bringing high-performance AI compute to new markets and regions around the world,” he added.

[Image: the Cerebras Wafer Scale Engine]

Cerebras’ CS-2 system, powered by the Wafer Scale Engine (WSE-2) — the largest chip ever made and the fastest AI processor to date — is purpose-built for AI work. Feldman told VentureBeat in an interview that in April of this year, the company more than doubled the capacity of the chip, bringing it up to 2.6 trillion transistors, 850,000 AI-optimized cores, 40GB of on-chip memory, 20PB/s of memory bandwidth, and 220Pb/s of fabric bandwidth. He noted that for AI work, big chips process information more quickly and produce answers in less time.

With only 54 billion transistors, the largest graphics processing unit pales in comparison to the WSE-2, which has roughly 2.55 trillion more. At 56 times the chip size, with 123 times more AI-optimized cores, 1,000 times more high-performance on-chip memory, 12,733 times more memory bandwidth, and 45,833 times more fabric bandwidth than its closest GPU competitor, the WSE-2 makes the CS-2 the fastest AI system in the industry, according to the company. Cerebras says its software is easy to deploy and lets customers use existing models, tools, and flows without modification, as well as write new ML models in standard open source frameworks.
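The article does not describe the Cerebras software stack’s API, only the claim that models written in standard open-source frameworks run without modification. As a minimal sketch of what that claim refers to — a stock PyTorch model, with the Cerebras-specific compile-and-run step omitted since it is not documented here — such a model might look like the following; every call is plain PyTorch, and nothing below is a Cerebras API.

```python
# Minimal sketch (assumption): a standard PyTorch model of the kind the article
# says can be brought to the CS-2 "without modification." The Cerebras-specific
# compilation/run step is intentionally omitted; only stock PyTorch is used.
import torch
import torch.nn as nn


class TinyClassifier(nn.Module):
    """A small feed-forward network written in a standard open-source framework."""

    def __init__(self, in_features: int = 784, hidden: int = 256, classes: int = 10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, hidden),
            nn.ReLU(),
            nn.Linear(hidden, classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


model = TinyClassifier()
dummy = torch.randn(8, 784)   # batch of 8 flattened 28x28 inputs
print(model(dummy).shape)     # torch.Size([8, 10])
```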

New customers

Cerebras says its CS-2 system is delivering a massive leap forward for customers across pharma and life sciences, oil and gas, defense, supercomputing centers, national labs, and other industries. The company announced new customers including Argonne National Laboratory; Lawrence Livermore National Laboratory; the Pittsburgh Supercomputing Center (PSC), home of the groundbreaking Neocortex AI supercomputer; EPCC, the supercomputing center at the University of Edinburgh; Tokyo Electron Devices; GlaxoSmithKline; and AstraZeneca.


The Series F investment round was led by Alpha Wave Ventures, a global growth-stage partnership between Falcon Edge and Chimera, along with Abu Dhabi Growth (ADG).

Alpha Wave Ventures and ADG join a group of world-class strategic investors including Altimeter Capital, Benchmark Capital, Coatue Management, Eclipse Ventures, Moore Strategic Ventures, and VY Capital. Cerebras has now expanded beyond the U.S., with new offices in Tokyo, Japan, and Toronto, Canada. On the back of this funding, the company says it will continue its engineering work, grow its engineering team, and recruit talent around the world going into 2022.



Author: Kolawole Samuel Adebayo
Source: VentureBeat

