AMD Advancing AI/ISC 2025: Compal Optimizes AI Workloads with AMD Instinct MI355X
SG720-2A/OG720-2A AI server with AMD Instinct MI355X, featuring advanced liquid cooling and high-density GPU design
This is a Press Release edited by StorageNewsletter.com on June 16, 2025, at 2:02 pm.

As AI computing accelerates toward higher density and greater energy efficiency, Compal Electronics, Inc. unveiled its latest high-performance server platform, the SG720-2A/OG720-2A, at both AMD Advancing AI 2025 in the U.S. and the International Supercomputing Conference (ISC) 2025 in Europe.
It features AMD Instinct MI355X GPUs and offers both single-phase and 2-phase liquid cooling configurations, showcasing Compal’s leadership in thermal innovation and system integration. Tailored for next-gen GenAI and large language model (LLM) training, the SG720-2A/OG720-2A delivers flexibility and scalability for modern data center operations, drawing significant attention across the industry.

With GenAI and LLMs driving increasingly intensive compute demands, enterprises are placing greater emphasis on infrastructure that offers both performance and adaptability. The SG720-2A/OG720-2A answers this demand by combining high-density GPU integration with flexible liquid cooling options, making it an ideal platform for next-gen AI training and inference workloads.
Key Technical Highlights:
- Support for up to 8 AMD Instinct MI350 Series GPUs (including MI350X/MI355X): Enables scalable, high-density training for LLMs and GenAI applications.
- Dual cooling architecture – Air and Liquid Cooling: Optimized for high thermal density workloads and diverse deployment scenarios, enhancing thermal efficiency and infrastructure flexibility. The 2-phase option, co-developed with ZutaCore, Inc., leverages the ZutaCore HyperCool 2-Phase DLC (direct liquid cooling) solution, delivering stable and exceptional thermal performance even in extreme computing environments.
- Advanced architecture and memory configuration: Built on the AMD CDNA 4 architecture with 288GB of HBM3E memory and 8TB/s of memory bandwidth, supporting FP6 and FP4 data formats optimized for AI and HPC applications.
- High-speed interconnect performance: Equipped with PCIe Gen5 and AMD Infinity Fabric for multi-GPU orchestration and high-throughput communication, reducing latency and boosting AI inference efficiency.
- Comprehensive support for mainstream open-source AI stacks: Fully compatible with ROCm, PyTorch, TensorFlow, and more – enabling developers to streamline AI model integration and accelerate time-to-market (a minimal verification sketch follows this list).
- Rack compatibility and modular design: Supports EIA 19″ and ORv3 21″ rack standards with modular architecture for simplified upgrades and maintenance in diverse data center environments.
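As a rough illustration of the open-source stack support noted above, the short sketch below assumes a ROCm build of PyTorch, which exposes AMD Instinct accelerators through the familiar torch.cuda API, and simply checks that the GPUs, their memory capacity, and the peer links between them are visible to the framework. It is an illustrative example, not vendor-supplied code.

import torch

# ROCm builds of PyTorch report the HIP runtime version here; it is None on CUDA/CPU-only builds.
print("HIP runtime:", torch.version.hip)
print("Accelerators visible:", torch.cuda.device_count())

for idx in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(idx)
    # total_memory is reported in bytes; MI355X-class GPUs should show roughly 288GB of HBM3E.
    print(f"GPU {idx}: {props.name}, {props.total_memory / 1e9:.0f} GB")

# Peer-to-peer access between GPUs rides the in-chassis xGMI/Infinity Fabric links.
if torch.cuda.device_count() > 1:
    print("Peer access 0 -> 1:", torch.cuda.can_device_access_peer(0, 1))

# Quick sanity check: run a small half-precision matmul on the first accelerator.
if torch.cuda.is_available():
    x = torch.randn(4096, 4096, device="cuda:0", dtype=torch.float16)
    y = x @ x
    torch.cuda.synchronize()
    print("Matmul OK:", tuple(y.shape))

On a properly configured system, this script needs no AMD-specific changes, which is the practical point of the ROCm/PyTorch compatibility highlighted above.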
The company has maintained a long-standing, strategic collaboration with AMD across multiple server platform generations. From high-density GPU design and liquid cooling deployment to open ecosystem integration, both companies continue to co-develop solutions that drive greater efficiency and sustainability in data center operations.
“The future of AI and HPC is not just about speed; it’s about intelligent integration and sustainable deployment. Each server we build aims to address real-world technical and operational challenges, not just push hardware specs. The SG720-2A/OG720-2A is a true collaboration with AMD that empowers customers with a stable, high-performance, and scalable compute foundation,” said Alan Chang, VP, infrastructure solutions business group, Compal.
The series made its debut at AMD Advancing AI 2025 and was concurrently showcased at ISC 2025 in Europe. Through this dual-platform exposure, Compal is further expanding its global visibility and partnership network across the AI and HPC domains, demonstrating a strong commitment to next-gen intelligent computing and international strategic development.