
ASUS Unveils Game-Changing Liquid-Cooled AI Infrastructure Powered by NVIDIA Vera Rubin Platform

Business

2026-03-17 13:06 Last Updated At:13:25

Delivering trusted AI with total flexibility, from rack-scale AI factories to edge and enterprise deployment

TAIPEI, March 17, 2026 /PRNewswire/ -- ASUS today unveiled its fully liquid-cooled AI infrastructure at NVIDIA GTC 2026 (Booth #421), delivering a comprehensive, end-to-end solution powered by the NVIDIA Vera Rubin platform. Under the theme Trusted AI, Total Flexibility, this customizable framework, spanning rack-scale AI Factories, desktop AI supercomputing, Edge AI, and Enterprise AI solutions, enables enterprises and cloud providers to build high-performance, energy-efficient, large-scale AI clusters with dramatically reduced PUE and TCO.
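For context on the PUE metric cited above: power usage effectiveness is the ratio of total facility power to IT equipment power, so values closer to 1.0 mean less energy spent on cooling and power delivery overhead. The sketch below is a generic illustration with invented numbers, not ASUS or NVIDIA figures.

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.

    1.0 is the theoretical ideal (zero cooling/power-delivery overhead).
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT equipment power must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical comparison: a 1,000 kW IT load where air cooling adds
# 500 kW of overhead and liquid cooling adds only 150 kW.
air_cooled = pue(1500.0, 1000.0)     # 1.5
liquid_cooled = pue(1150.0, 1000.0)  # 1.15
print(f"air: {air_cooled:.2f}, liquid: {liquid_cooled:.2f}")
```

Liquid cooling lowers PUE chiefly by removing heat directly at the chip, cutting the fan and chiller energy counted in the numerator.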

As a provider of NVIDIA GB300 NVL72 and NVIDIA HGX B300 systems, ASUS leads its portfolio with the flagship ASUS AI POD built on the NVIDIA Vera Rubin platform, a liquid-cooled, rack-scale powerhouse designed for massive AI workloads. Through strategic partnerships with leading cooling and component providers, ASUS offers diverse cooling modalities, tailored thermal solutions, and redundancy to meet any enterprise requirement. Proven by global client successes, ASUS provides expert consultation, a broad portfolio of AI and storage solutions, seamless infrastructure deployment, application integration, and ongoing services, combining scalability and sustainability to drive business value and intelligence.

From infrastructure to implementation: The ASUS AI Factory in action

At the forefront is the flagship XA VR721-E3 built on NVIDIA Vera Rubin NVL72, a 100% liquid-cooled rack-scale system. It offers a TDP of up to 227kW (MaxP) or 187kW (MaxQ), delivers up to 10X higher performance per watt, and is purpose-built for trillion-parameter models, providing massive AI performance for large-scale AI factories. Partnering with Vertiv, a global leader in critical digital infrastructure, Schneider Electric, and other leading providers, ASUS delivers a full-stack power and cooling infrastructure designed for zero-throttle performance, from standard deployments to advanced liquid cooling, with redundancy tailored to each customer's specific needs.

Addressing rigorous data-center demands, ASUS also introduces its latest server series built on NVIDIA HGX Rubin NVL8 systems, featuring eight NVIDIA Rubin GPUs connected via sixth-generation NVIDIA NVLink with integrated 800G bandwidth per GPU. To facilitate a seamless and cost-effective transition to liquid cooling, ASUS offers two distinct solutions: the XA NR1I-E12L, an innovative hybrid-cooled option; and the XA NR1I-E12LR, a 100% liquid-cooled system. The hybrid-cooled XA NR1I-E12L specifically combines direct-to-chip (D2C) liquid cooling for the NVIDIA HGX Rubin NVL8 baseboard with air cooling for the dual Intel® Xeon® 6 processors.

The portfolio is further strengthened by high-performance scalable servers that ensure a solution for every demanding AI workload: the XA NB3I-E12, built on NVIDIA HGX B300 systems; the ESC8000A-E13X, based on NVIDIA MGX and integrated with NVIDIA ConnectX-8 SuperNICs for extreme GPU-to-GPU connectivity; and the ESC8000A-E13P, accelerated by NVIDIA RTX PRO 4500 Blackwell Server Edition or NVIDIA RTX PRO 6000 Blackwell Server Edition GPUs, delivering breakthrough performance for demanding data processing, AI, video, and visual computing workloads in a power-efficient design.

The tangible impact of the complete ASUS AI Factory concept is already demonstrated through several successful customer deployments, in which ASUS ESC8000 series servers powered a production-line digital twin built on NVIDIA Omniverse libraries and integrated with NVIDIA's customizable multi-camera tracking workflow. This enabled remote simulation, significantly reduced deployment risks, and managed the entire process for seamless, low-disruption deployment that maximizes value from day one.

To support these powerful systems and democratize AI development, ASUS has also established a robust data ecosystem by partnering with NVIDIA-Certified storage providers, including IBM, DDN, WEKA, and VAST Data, to deliver scalable, resilient solutions for memory-intensive AI. A full spectrum of storage solutions, spanning block storage (VS320D-RS12), JBOD (VS320D-RS12J), object storage (OJ340A-RS60), and software-defined systems, ensures flexibility from edge to cloud, and from enterprise applications to AI and HPC workloads.

Realizing physical AI: Full-stack edge AI supercomputing from development to deployment

As the domain expert in full-stack edge AI, ASUS establishes a complete ecosystem for physical AI, delivering the critical compute power required from initial development to final deployment. The journey begins at the developer's desk with ASUS ExpertCenter Pro ET900N G3, a deskside supercomputer powered by the NVIDIA Grace Blackwell Ultra platform. Featuring NVIDIA NVLink-C2C interconnects and 748GB of coherent unified memory, it handles the heavy lifting of training massive models. Alongside it, the ultrasmall ASUS Ascent GX10 offers agile petaflop-scale performance powered by NVIDIA Grace Blackwell Superchip, ideal for rapid model iteration and scalable edge setups.

This development prowess seamlessly transitions to the PE3000N, a ruggedized inference engine powered by NVIDIA Jetson Thor. Delivering a staggering 2,070 TFLOPS, the PE3000N provides the real-time compute needed for sensor fusion and autonomous navigation. Together, these systems form a unified workflow where open models such as NVIDIA Cosmos and vision AI libraries from Metropolis can effectively perceive, reason, and act in the physical world.

Secure and scalable agentic AI development

To further enhance these capabilities, the ASUS Ascent GX10 and ASUS ExpertCenter Pro ET900N G3 empower agentic AI development with NVIDIA NemoClaw. This integration establishes an agent-ready platform for developers to build safe, long-running autonomous agents locally. By leveraging isolated sandbox environments, governed access control, and private on-device inference, it ensures safe and scalable agent workflows for the most demanding enterprise AI applications.

Enterprise AI: ASUS AI Hub with real-time business intelligence

To accelerate enterprise AI, ASUS presents the ASUS AI Hub, a turnkey on-premises AI platform optimized with ESC8000-series servers and powered by open-source LLMs such as NVIDIA Nemotron and Gemma. The platform enables enterprises to build custom AI assistants, implement RAG-enhanced document intelligence, and maintain full data sovereignty for security and compliance.
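As a generic sketch of the retrieval-augmented generation (RAG) pattern referenced above (this is not the ASUS AI Hub implementation; the corpus, scoring scheme, and prompt template are all invented for illustration), RAG retrieves the documents most relevant to a query and prepends them to the LLM prompt so answers are grounded in enterprise data:

```python
from collections import Counter

def score(query: str, doc: str) -> int:
    """Toy relevance score: word overlap between query and document."""
    q, d = Counter(query.lower().split()), Counter(doc.lower().split())
    return sum((q & d).values())  # size of the multiset intersection

def build_rag_prompt(query: str, corpus: list[str], top_k: int = 2) -> str:
    """Retrieve the top_k most relevant docs and prepend them as context."""
    ranked = sorted(corpus, key=lambda doc: score(query, doc), reverse=True)
    context = "\n".join(ranked[:top_k])
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

# Hypothetical mini-corpus standing in for indexed enterprise documents.
corpus = [
    "Q3 gross margin improved due to component cost reductions.",
    "The cafeteria menu rotates weekly.",
    "Factory operations in Taipei run three shifts per day.",
]
print(build_rag_prompt("What drove the gross margin change?", corpus))
```

A production system would replace the word-overlap score with vector embeddings and an approximate-nearest-neighbor index, but the retrieve-then-prompt flow is the same.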

Proven internally across more than 10,000 employees, with peak loads exceeding 600 requests per hour, >80% OCR accuracy, and >30% efficiency gains, the platform features domain-specific modules for diverse applications, including the newly developed ASUS agentic internal business-intelligence platform. This allows senior leaders to instantly access critical insights on costs, sales, gross margins, factory operations, and other key metrics through simple natural-language Q&A, transforming complex data into immediate, actionable executive decision-making power.

ASUS and NVIDIA are also working together on NVIDIA NemoClaw, an open-source stack that simplifies running OpenClaw always-on assistants more safely with a single command. It installs the NVIDIA OpenShell runtime, a secure environment for running autonomous agents, along with open-source models such as NVIDIA Nemotron.

Green computing and sustainability at the core

Sustainability is a foundational pillar of the ASUS design philosophy, with green computing innovations integrated across both hardware and software to minimize TCO and environmental impact. On the hardware level, ASUS servers feature Thermal Radar 2.0, which uses up to 56 sensors to intelligently optimize fan performance, cutting power consumption by up to 36% and saving approximately $29,000 annually in a 1,000-node cluster. This commitment extends to software with the ASUS Control Center (ACC) Data Center Edition, a unified management platform that enhances security and includes automated carbon emissions tracking, providing enterprises with the tools needed to achieve their critical ESG goals.
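As a rough plausibility check of the savings figure above (the electricity price is an assumed $0.10/kWh; the release does not state the rate ASUS used), the claimed $29,000 annual saving across a 1,000-node cluster works out to roughly 33 W of continuous fan power saved per node:

```python
# Back-of-envelope check of the quoted Thermal Radar 2.0 savings,
# under an ASSUMED electricity price (not stated in the release).
annual_savings_usd = 29_000
price_per_kwh = 0.10       # assumption: $0.10/kWh
nodes = 1_000
hours_per_year = 8_760

saved_kwh = annual_savings_usd / price_per_kwh            # 290,000 kWh/year
saved_watts_per_node = saved_kwh / nodes / hours_per_year * 1_000
print(f"implied continuous saving per node: {saved_watts_per_node:.1f} W")
```

At that assumed rate, a ~33 W per-node fan saving is consistent with the stated "up to 36%" reduction for servers whose fans draw on the order of 90 W.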

ASUS, the AI supercomputing domain expert, provides comprehensive solutions and services, from consultation and deployment to user training and seamless integration via OpenAI-compatible APIs. As enterprises navigate the AI era, ASUS flexibly offers a cost-effective, secure, powerful and sustainable pathway to innovation and intelligent management.
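OpenAI-compatible APIs, mentioned above as the integration path, follow the standard `/v1/chat/completions` request shape. The sketch below builds such a request against a hypothetical endpoint URL and model name (no ASUS-specific endpoint or model is published in this release) without actually sending it:

```python
import json

def chat_completion_request(base_url: str, model: str, user_message: str) -> dict:
    """Build a request for an OpenAI-compatible /v1/chat/completions endpoint."""
    return {
        "url": f"{base_url}/v1/chat/completions",
        "headers": {
            "Content-Type": "application/json",
            "Authorization": "Bearer YOUR_API_KEY",  # placeholder credential
        },
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": user_message}],
        }),
    }

# Hypothetical on-premises endpoint and model name, for illustration only.
req = chat_completion_request("http://ai-hub.local:8000", "nemotron",
                              "Summarize Q3 factory throughput.")
print(req["url"])
# Sending it would look like:
#   import urllib.request
#   urllib.request.urlopen(urllib.request.Request(
#       req["url"], data=req["body"].encode(), headers=req["headers"]))
```

Because the request shape matches the OpenAI API, existing client SDKs and tooling can point at an on-premises deployment by changing only the base URL.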

AVAILABILITY & PRICING

ASUS servers are available worldwide. However, the availability of certain other ASUS products is subject to local regulatory requirements. For specific product availability and offerings in your region, please contact your local ASUS representative.

** This press release is distributed by PR Newswire through automated distribution system, for which the client assumes full responsibility. **

News Summary

  • Relativity expands its Relativity Academic program with modules around its aiR suite of legal AI solutions, large language models and generative AI to prepare students for modern legal work.
  • The program, now in its 14th year, reaches over 3,000 students annually and aims to equip future legal professionals with in-demand AI skills as the industry prioritizes tech-enabled talent.
  • Relativity Academic provides free, hands-on experience with real-world legal technology to thousands of students worldwide.

CHICAGO, May 13, 2026 /PRNewswire/ -- Relativity, a legal data intelligence company, today announced it has expanded Relativity Academic, its program that provides education to law school and paralegal students, with access to the generative AI-powered solutions in its aiR suite. Beginning in June, the Relativity Academic program curriculum will include modules covering large language models, generative AI, and how to use aiR for Review, aiR for Case Strategy, and aiR for Privilege.

"This incorporation of aiR solutions into the Relativity Academic curriculum goes beyond our proven dedication to expanding access to technology. This move is emblematic of our trust and investment in the next generation of legal talent," said Phil Saunders, CEO of Relativity. "The future of the legal profession is an AI-ready one and we want to further support the legal leaders of tomorrow with the knowledge and skill sets they need to enter the workforce with a bang."

According to the 2025 State of the Legal Industry report from SurePoint Technologies, law firms are increasingly recruiting legal professionals with AI expertise. Further, from 2024 to 2025, the report found that lateral hiring within the specialty of AI grew 68% across all attorney types in Am Law 200 firms, and associate lateral hiring increased by 106%. Relativity recognizes the importance of technology and AI proficiency in the hiring process and its academic program aims to arm students with the type of firsthand experience necessary to differentiate themselves amongst their peers.

"The Relativity Academic Program has been a boon to e-discovery education for over a decade, giving students not just conceptual grounding but genuine hands-on experience with the tools they will use in practice. Now, true to form, Relativity has stepped up to meet the watershed moment that generative AI represents for the profession," said William F. Hamilton, Master Legal Skills Professor at University of Florida Levin College of Law. "Adding aiR into the Academic Program does something essential: it moves students beyond awareness of generative AI into actual engagement with it, learning to interact with the technology, evaluate its outputs, and exercise the judgment that defines good lawyering. That capacity to think critically and judge wisely is the skill that will carry our students, and our profession, into the future."

Since it was established in 2012, the Relativity Academic program has partnered with more than 115 universities, law schools, paralegal and data science programs across the U.S., Canada, the U.K., Ireland, Australia, and select Asian and European countries, equipping over 3,000 students in the past year with practical experience using the same technology relied on by corporations, law firms, and government agencies.

Relativity Academic provides law schools, as well as paralegal and data science programs, with a hands-on technology component for their courses and access to a workspace in the AI platform for legal data intelligence, RelativityOne, free of charge. Participating students gain experience with AI-powered legal technology solutions, giving them a valuable foundation prior to graduating and entering the workforce.

Additionally, Relativity Academic delivers training for faculty, hands-on resources, localized curriculum and a community of Relativity professionals. Through the program, now in its 14th year, instructors can design assignments that mirror real-world scenarios, giving students direct experience with document review, issue tagging, workflow management, and case organization.

Those interested in exploring opportunities to integrate AI into their curricula may reach out to academic@relativity.com or visit www.relativity.com/resources/academic to learn more.

About Relativity
Relativity is a leading legal data intelligence company that builds technology to help users organize data, discover the truth, and act on it. Its extensible, AI-powered cloud platform, RelativityOne, transforms complex data into actionable insights at massive scale for litigation, investigations, regulatory inquiries, data breach responses, and other legal use cases. The world's largest law firms and corporations, government agencies, and a robust network of channel partners rely on Relativity's legal AI software to securely surface and manage the most relevant and impactful information in their matters. The company also expands access to technology by providing its platform at no cost to academic institutions through its Relativity Academic program and to organizations supporting pro bono legal work through its Justice for Change initiative.

** This press release is distributed by PR Newswire through automated distribution system, for which the client assumes full responsibility. **

Relativity Equips Future Legal Talent with AI Through Its Relativity Academic Program
