
South Korean firm unveils faster AI data centre architecture with CXL-over-Xlink

Jul 17, 2025 | Hi-network.com

South Korean company Panmnesia has introduced a new architecture for AI data centres aimed at improving speed and efficiency.

Instead of using only PCIe or RDMA-based systems, its CXL-over-Xlink approach combines Compute Express Link (CXL) with fast accelerator links such as UALink and NVLink.

The company claims the design can deliver up to 5.3 times faster AI training and cut inference latency sixfold. Because CPUs and GPUs can access large shared memory pools over the CXL fabric, AI workloads are no longer restricted by the fixed memory capacity inside each GPU.
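As a rough illustration of that idea, the sketch below (Python, with made-up names and capacities rather than anything from Panmnesia's actual stack) shows an allocator spilling from per-GPU HBM into a shared, fabric-attached pool instead of failing when local memory runs out.

```python
# Hypothetical sketch: an allocator that falls back from per-GPU HBM to a
# shared, fabric-attached memory pool when a tensor exceeds local capacity.
# All names and figures are illustrative assumptions, not Panmnesia's API.

HBM_CAPACITY_GB = 80          # example per-GPU high-bandwidth memory
CXL_POOL_CAPACITY_GB = 4096   # example fabric-attached pool shared by all devices


class CxlPool:
    """Toy model of a memory pool shared across CPUs and GPUs over the fabric."""

    def __init__(self, capacity_gb: float) -> None:
        self.capacity_gb = capacity_gb
        self.used_gb = 0.0

    def allocate(self, size_gb: float) -> str:
        if self.used_gb + size_gb > self.capacity_gb:
            raise MemoryError("CXL pool exhausted")
        self.used_gb += size_gb
        return f"cxl-pool[{size_gb} GB]"


def place_tensor(size_gb: float, hbm_free_gb: float, pool: CxlPool) -> str:
    """Prefer local HBM; spill to the shared pool rather than failing."""
    if size_gb <= hbm_free_gb:
        return f"gpu-hbm[{size_gb} GB]"
    return pool.allocate(size_gb)


pool = CxlPool(CXL_POOL_CAPACITY_GB)
print(place_tensor(24, hbm_free_gb=40, pool=pool))   # fits in local HBM
print(place_tensor(200, hbm_free_gb=40, pool=pool))  # spills to the CXL pool
```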

The architecture enables data centres to scale compute and memory independently, adapting to changing workload demands without overprovisioning hardware; a back-of-the-envelope example follows below.
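The sketch below (all figures invented for illustration) shows what scaling memory without adding GPUs means in practice: the GPU count is sized from the compute requirement alone, and whatever memory the workload still needs comes from the pool.

```python
# Illustrative only: with disaggregated memory, compute and memory become
# separate dials, so extra GPUs are not bought just for their attached HBM.
# All numbers below are made-up examples.

gpus_needed_for_flops = 16        # sized from the compute requirement alone
hbm_per_gpu_gb = 80
model_plus_cache_gb = 2300        # total memory the workload needs

local_hbm_gb = gpus_needed_for_flops * hbm_per_gpu_gb          # 1280 GB
cxl_expander_gb = max(0, model_plus_cache_gb - local_hbm_gb)   # 1020 GB

print(f"Local HBM: {local_hbm_gb} GB; pooled CXL memory needed: {cxl_expander_gb} GB")
# Without pooling, the same workload would force ~29 GPUs (2300 / 80) purely
# to reach the memory footprint, regardless of how much compute is required.
```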

Panmnesia's system also reduces communication overhead by carrying CXL traffic over accelerator-optimised links, helping maintain high throughput with sub-100ns latency.

The architecture incorporates a hierarchical memory model blending local high-bandwidth memory with pooled CXL memory, alongside scalable CXL 3.1 switches that connect hundreds of devices efficiently without bottlenecks.
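A toy version of such a two-tier policy might look like the sketch below; the access threshold and tier names are assumptions made for illustration, not details disclosed by the company.

```python
# Sketch of a two-tier placement policy in the spirit of the hierarchical
# model described above: frequently accessed ("hot") pages stay in local
# high-bandwidth memory, colder pages are demoted to pooled CXL memory.
# The threshold and tier names are illustrative assumptions.

from dataclasses import dataclass

HOT_ACCESS_THRESHOLD = 100  # accesses per sampling window; illustrative cut-off


@dataclass
class Page:
    page_id: int
    accesses_per_window: int


def choose_tier(page: Page) -> str:
    """Place hot pages in local HBM, cold pages in the shared CXL pool."""
    return "local-hbm" if page.accesses_per_window >= HOT_ACCESS_THRESHOLD else "cxl-pool"


pages = [Page(0, 950), Page(1, 12), Page(2, 300)]
for p in pages:
    print(p.page_id, "->", choose_tier(p))
```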
