September 30, 2022

Sunnyvale, Calif., United States & Bangalore, Karnataka, India:

Cerebras Systems, the pioneer in accelerating artificial intelligence (AI) compute, today announced its continued global expansion with the opening of a new India office in Bangalore, India. Led by industry veteran Lakshmi Ramachandran, the new engineering office will focus on accelerating R&D efforts and supporting local customers. With a target of more than sixty engineers by year end, and more than twenty currently employed, Cerebras is looking to rapidly build its presence in Bangalore.

“India in general, and Bangalore in particular, is extremely well-positioned to be a hotbed for AI innovation. It has world-leading universities, pioneering research institutions and a large domestic enterprise market,” said Andrew Feldman, CEO and Co-Founder of Cerebras Systems. “Cerebras is committed to being a leader in this market. Under Lakshmi’s leadership, we are rapidly hiring top-notch engineering talent for Cerebras Systems India, as well as supporting sophisticated regional customers who want to do the most challenging AI work more quickly and easily.”

As part of the India expansion, Cerebras appointed Ramachandran as Head of Engineering and India Site Lead at Cerebras India. Based in Bangalore, Ramachandran brings more than 24 years of technical and leadership experience in software and engineering. Prior to joining Cerebras, she was with Intel in various engineering and leadership roles. Most recently, she was Senior Director in Intel’s Data Center and AI group, responsible for delivering key capabilities of deep learning software for AI accelerators. She has extensive experience in scaling business operations and building technical engineering teams in India.

“I’m honored to be part of Cerebras Systems’ mission to change the future of AI compute, and to work with the extraordinary team that made wafer-scale compute a reality,” said Ramachandran. “We have already begun to build a world-class team of top AI talent in India, and we are excited to be building core components here that are critical to the success of Cerebras’ mission. We look forward to adding more technology and engineering talent as we support the many customer opportunities we have across the country.”

The Cerebras CS-2 is the fastest AI system in existence, powered by the largest processor ever built: the Cerebras Wafer-Scale Engine 2 (WSE-2), which is 56 times larger than the nearest competitor. As a result, the CS-2 delivers more AI-optimized compute cores, more fast memory, and more fabric bandwidth than any other deep learning processor in existence. It was purpose-built to accelerate deep learning workloads, reducing time to answer by orders of magnitude.

With a CS-2, Cerebras recently set a world record for the largest AI model trained on a single device. This is important because in natural language processing (NLP), larger models trained on large datasets have been shown to be more accurate. But traditionally, only the largest and most sophisticated technology companies had the resources and expertise to train massive models across hundreds or thousands of graphics processing units (GPUs). By enabling the ability to train GPT-3 models on a single CS-2, Cerebras is enabling the entire AI ecosystem to set up and train large models in a fraction of the time.

Customers around the world are already leveraging the Cerebras CS-2 to accelerate their AI research across drug discovery, clean energy exploration, cancer treatment research and more. For example, pharmaceutical leader GSK is now able to train complex epigenomic models on a previously prohibitively large dataset, made possible for the first time with Cerebras. AstraZeneca is iterating and experimenting in real time by running queries on hundreds of thousands of abstracts and research papers, a process that previously took over two weeks on a GPU cluster and is now being completed in just over two days with Cerebras. And Argonne National Laboratory is using a CS-2 to understand how the virus that causes COVID-19 works. They can run simulations on a single CS-2 that would require 110-120 GPUs.

The recent international expansion into India comes on the heels of Cerebras’ global growth across Japan and Canada in the past two years. With customers in North America, Asia, Europe and the Middle East, Cerebras is delivering industry-leading AI solutions to a growing roster of customers in the enterprise, government, and high performance computing (HPC) segments, including GlaxoSmithKline, AstraZeneca, TotalEnergies, nference, Argonne National Laboratory, Lawrence Livermore National Laboratory, Pittsburgh Supercomputing Center, Leibniz Supercomputing Centre, National Center for Supercomputing Applications, Edinburgh Parallel Computing Centre (EPCC), National Energy Technology Laboratory, and Tokyo Electron Devices.

For more information on Cerebras India, please visit www.cerebras.net.

About Cerebras Systems

Cerebras Systems is a team of pioneering computer architects, computer scientists, deep learning researchers, and engineers of all types. We have come together to build a new class of computer system, designed for the singular purpose of accelerating AI and changing the future of AI work forever. Our flagship product, the CS-2 system, is powered by the world’s largest processor, the 850,000-core Cerebras WSE-2, which enables customers to accelerate their deep learning work by orders of magnitude over graphics processing units.