Leading full stack of AI supercomputing solutions unveiled at GTC 2024
SINGAPORE – Media OutReach Newswire – 21 March 2024 – ASUS today announced its participation in the NVIDIA GTC global AI conference, where it will showcase its solutions at booth #730. On show will be the apex of ASUS GPU server innovation, ESC NM1-E1 and ESC NM2-E1, powered by the NVIDIA MGX modular reference architecture, accelerating AI supercomputing to new heights.
To help meet the increasing demands of generative AI, ASUS uses the latest technologies from NVIDIA, including the B200 Tensor Core GPU, the GB200 Grace Blackwell Superchip, and the H200 NVL, to deliver optimized AI server solutions that boost AI adoption across a wide range of industries.
To better support enterprises in establishing their own generative AI environments, ASUS offers an extensive lineup of servers, from entry-level to high-end GPU server solutions, plus a comprehensive range of liquid-cooled rack solutions, to meet diverse workloads. Additionally, by leveraging its MLPerf expertise, the ASUS team is pursuing excellence by optimizing hardware and software for large-language-model (LLM) training and inference, and by seamlessly integrating complete AI solutions to meet the demanding landscape of AI supercomputing.
Tailored AI solutions with the all-new ASUS NVIDIA MGX-powered server
The latest ASUS NVIDIA MGX-powered 2U servers, ESC NM1-E1 and ESC NM2-E1, showcase the NVIDIA GH200 Grace Hopper Superchip, which offers high performance and efficiency. The NVIDIA Grace CPU features Arm® Neoverse V2 cores with Scalable Vector Extension 2 (SVE2) and is powered by NVIDIA NVLink-C2C technology. Integrating NVIDIA BlueField-3 DPUs and ConnectX-7 network adapters, ASUS MGX-powered servers deliver blazing data throughput of 400Gb/s, ideal for enterprise AI development and deployment. Coupled with NVIDIA AI Enterprise, an end-to-end, cloud-native software platform for building and deploying enterprise-grade AI applications, the MGX-powered ESC NM1-E1 provides unparalleled flexibility and scalability for AI-driven data centers, HPC, data analytics and NVIDIA Omniverse applications.
Advanced liquid-cooling technology
The surge in AI applications has heightened the demand for advanced server-cooling technology. ASUS direct-to-chip (D2C) cooling offers a fast, simple option that distinguishes itself from the competition. D2C can be rapidly deployed, lowering data-center power-usage-effectiveness (PUE) ratios. The ASUS ESC N8-E11 and RS720QN-E11-RS24U servers support manifolds and cold plates, enabling diverse cooling solutions. Additionally, ASUS servers accommodate a rear-door heat exchanger compliant with standard rack-server designs, eliminating the need to replace entire racks: only the rear door is required to enable liquid cooling in the rack. By collaborating closely with industry-leading cooling-solution providers, ASUS delivers comprehensive enterprise-grade cooling solutions and is committed to minimizing data-center PUE, carbon emissions and power consumption to aid in the design and construction of greener data centers.
Assured AI software solutions
With its world-leading expertise in AI supercomputing, ASUS provides optimized server design and rack integration for data-intensive workloads. At GTC, ASUS will showcase the ESC4000A-E12 to demonstrate a no-code AI platform with an integrated software stack, enabling businesses to accelerate AI development across LLM pre-training, fine-tuning and inference, reducing risk and time-to-market without starting from scratch. Additionally, ASUS provides a comprehensive solution supporting LLMs of different sizes, from 7B and 33B to over 180B parameters, with customized software, facilitating seamless server data dispatching. By optimizing the allocation of GPU resources for fine-tuning, the software stack ensures that AI applications and workloads run without wasting resources, helping to maximize efficiency and return on investment (ROI). Moreover, the software-hardware synergy delivered by ASUS gives businesses the flexibility to choose the AI capabilities that best fit their needs, allowing them to push ROI still further.
This innovative software approach optimizes the allocation of dedicated GPU resources for AI training and inference, boosting system performance. The integrated software-hardware synergy caters to diverse AI training needs, empowering businesses of all sizes, including SMBs, to leverage advanced AI capabilities with ease and efficiency.
To address the evolving requirements of enterprise IoT applications, ASUS, renowned for its robust computing capabilities, is collaborating with industrial partners, software specialists and domain-focused integrators. These collaborations aim to provide turnkey server support for complete solutions, including full installation and testing for modern data-center, AI and HPC applications.
AVAILABILITY & PRICING
ASUS servers are available worldwide. Please visit for more ASUS data-center solutions, or contact your local ASUS representative for further information. Hashtag: #ASUS #AI #servers
The issuer is solely responsible for the content of this announcement.
About ASUS
ASUS is a global technology leader that provides the world's most innovative and intuitive devices, components and solutions to deliver incredible experiences that enhance the lives of people everywhere. With its team of 5,000 in-house R&D experts, ASUS is world-renowned for continuously reimagining today's technologies for tomorrow, garners more than 11 awards every day for quality, innovation and design, and is ranked among Fortune's World's Most Admired Companies.