The World's Largest AI Supercomputer: 36 ExaFLOPs
Video: http://youtube.com/watch?v=oVHkXEzKzxM
This is big news. It's easy to put NVIDIA at the center of AI hardware, but Cerebras Systems just made a massive sale, going above and beyond all the VC funding they've raised to date. Each of Cerebras' Wafer Scale Engine systems costs between $1-2 million, and a tier-2 cloud provider from the Middle East just purchased nine supercomputers with 64 chips each, or 576 in total. All nine of these supercomputers can be networked together to create a behemoth 36 ExaFLOP (FP16) supercomputer just for machine learning. It's somewhat insane. (A quick back-of-envelope check of these figures is included at the end of this description.)
-----------------------
Need POTATO merch? There's a chip for that!
• http://merch.techtechpotato.com
• http://more-moore.com : Sign up to the More Than Moore Newsletter
• / techtechpotato : Patreon gets you access to the TTP Discord server!
• Follow Ian on Twitter at / iancutress
• Follow TechTechPotato on Twitter at / techtechpotato
If you're in the market for something from Amazon, please use the following links. TTP may receive a commission if you purchase anything through these links.
• Amazon USA : https://geni.us/AmazonUS-TTP
• Amazon UK : https://geni.us/AmazonUK-TTP
• Amazon CAN : https://geni.us/AmazonCAN-TTP
• Amazon GER : https://geni.us/AmazonDE-TTP
• Amazon Other : https://geni.us/TTPAmazonOther
Ending music:
• An Jone - Night Run Away
-----------------------
Welcome to the TechTechPotato (c) Dr. Ian Cutress
Ramblings about things related to Technology from an analyst for More Than Moore
#cerebras #machinelearning #galaxycondor
------------
More Than Moore, as with other research and analyst firms, provides or has provided paid research, analysis, advising, or consulting to many high-tech companies in the industry, which may include advertising on TTP. The companies that fall under this banner include AMD, Armari, Facebook, IBM, Infineon, Intel, Lattice Semi, Linode, MediaTek, NordPass, ProteanTecs, Qualcomm, SiFive, Tenstorrent.
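For reference, here is a quick back-of-envelope check of the figures quoted above, written as a minimal Python sketch. The system count, chips per system, and total FP16 compute are taken from the description; the per-system and per-chip numbers are derived here and are not stated in the video.

```python
# Sanity-check the quoted figures: 9 systems x 64 chips, 36 ExaFLOPs FP16 total.
num_systems = 9                  # supercomputers purchased (from the description)
chips_per_system = 64            # Wafer Scale Engine chips per system (from the description)
total_fp16_exaflops = 36         # combined FP16 compute when all nine are networked

total_chips = num_systems * chips_per_system                     # 576 chips
exaflops_per_system = total_fp16_exaflops / num_systems          # 4.0 ExaFLOPs per system (derived)
petaflops_per_chip = total_fp16_exaflops * 1000 / total_chips    # ~62.5 PetaFLOPs per chip (derived)

print(total_chips, exaflops_per_system, round(petaflops_per_chip, 1))
```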
#############################
