Leo CXL® Smart Memory Controllers

Purpose-built memory expansion, sharing, and pooling for AI and cloud platforms

Memory Solutions for the AI Era

  • Accelerates AI and cloud infrastructure with memory expansion, sharing, and pooling for enhanced performance
  • Eliminates bandwidth/capacity bottlenecks, reduces total cost of ownership, and optimizes memory utilization
  • Ensures end-to-end data integrity and protection with best-in-class, industry-standard security features
  • Delivers server-grade customizable RAS, advanced diagnostics, and fleet management via the COSMOS suite

Leo Highlights

Accelerating AI with CXL Memory

Increase Memory Capacity

With multiple DDR5 channels, up to 5600 MT/s

Seamless Interoperability Across the Ecosystem

Stress-tested with all major xPUs and memory vendors

Enhanced Diagnostics & Telemetry

Advanced capabilities through in-band and out-of-band management

Why use Leo CXL Smart Memory Controllers?

Astera Labs’ Leo CXL Smart Memory Controller is the industry’s first purpose-built solution that supports both memory expansion and memory pooling to solve performance bottlenecks and capacity constraints in cloud servers.
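
Conceptually, a CXL memory expander such as Leo presents its DDR5 as ordinary host-addressable memory; on Linux, a CXL Type 3 device is typically exposed as a CPU-less NUMA node that applications can target directly. The sketch below illustrates that general flow with libnuma, assuming the expander appears as NUMA node 1 (check numactl --hardware on your system); it is an illustration of CXL memory expansion in general, not Leo-specific software.

```c
/* Minimal sketch: allocating from a CXL-attached memory node with libnuma.
 * Assumes the CXL-attached DDR5 shows up to Linux as a CPU-less NUMA node
 * (node 1 here is an assumption; verify with `numactl --hardware`).
 * Build: gcc cxl_alloc.c -lnuma
 */
#include <numa.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    if (numa_available() < 0) {
        fprintf(stderr, "NUMA is not available on this system\n");
        return EXIT_FAILURE;
    }

    int cxl_node = 1;            /* assumed node id of the CXL expander */
    size_t size = 1UL << 30;     /* 1 GiB */

    /* Allocate pages backed by the CXL-attached memory node. */
    void *buf = numa_alloc_onnode(size, cxl_node);
    if (!buf) {
        perror("numa_alloc_onnode");
        return EXIT_FAILURE;
    }

    memset(buf, 0, size);        /* touch pages so they are faulted in */
    printf("1 GiB allocated on NUMA node %d\n", cxl_node);

    numa_free(buf, size);
    return EXIT_SUCCESS;
}
```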

Purpose-Built for Cloud

Comprehensive portfolio of purpose-built SoCs and hardware solutions for cloud-scale deployment targeting workloads such as AI and ML

Customizable RAS & Security

Server-grade customizable RAS, end-to-end security features, and software tools to integrate with fleet-management services

Low-Latency DDR5

Flexible and scalable memory interface with a low-latency data path supporting JEDEC DDR5

Seamless Interoperability

Seamless interoperability with all major CPU, GPU, and memory vendors, making it easy to manage, debug, and deploy at scale

E-Series

  • Memory Expansion

P-Series

  • Memory Expansion
  • Memory Pooling
  • Memory Sharing

Leo A-Series Hardware Solutions

Leo A-Series CXL Smart Memory Hardware solutions offer all the benefits of Leo Controllers and enable quick plug-and-play deployment with faster time-to-market for system OEMs and data centers.

  • PCIe x16 CEM add-in card form factor
  • Up to 4x DDR5 RDIMMs supporting up to 2TB
  • On-board debug connectors for fleet management on cloud servers
  • Temperature and health monitoring of Leo controller and memory (see the monitoring sketch after this list)
  • RDIMM fault isolation and error correction
  • High volume production-qualified solutions with robust supply chain
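
As a rough illustration of the temperature and health monitoring called out above, the sketch below reads a sensor through the standard Linux hwmon sysfs interface. Whether the Leo A-Series card registers an hwmon device, and the hwmon0/temp1 path used here, are assumptions; in deployment this telemetry is typically collected through in-band or out-of-band (BMC) management and surfaced to fleet-management services.

```c
/* Minimal sketch: polling a temperature sensor via the Linux hwmon sysfs
 * interface. The specific sensor path is an assumption; adapt it to the
 * device enumeration on your platform.
 */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    const char *path = "/sys/class/hwmon/hwmon0/temp1_input"; /* assumed sensor path */
    FILE *f = fopen(path, "r");
    if (!f) {
        perror("fopen");
        return EXIT_FAILURE;
    }

    long millideg = 0;
    if (fscanf(f, "%ld", &millideg) != 1) {
        fclose(f);
        fprintf(stderr, "unexpected sensor format\n");
        return EXIT_FAILURE;
    }
    fclose(f);

    double celsius = millideg / 1000.0;   /* hwmon reports millidegrees C */
    printf("controller temperature: %.1f C\n", celsius);

    if (celsius > 85.0)                   /* example alert threshold */
        fprintf(stderr, "warning: temperature above threshold\n");

    return EXIT_SUCCESS;
}
```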

Use Cases

Memory Disaggregation

Expand, pool, and share memory between multiple servers to increase memory bandwidth and capacity while providing the option to reclaim stranded or under-utilized…

Related Resources

PCIe® Retimers vs. Redrivers: Ensuring Signal Integrity for AI Infrastructure

PCIe® technology serves as the backbone of data center communication in AI infrastructure. The latest PCIe standards deliver unprecedented data transfer rates—32 GT/s with PCIe 5.0 and 64 GT/s with PCIe 6.0. However, as speeds increase, so do the challenges in maintaining clean, reliable signals over longer distances or complex paths. Higher data rates make signals more susceptible to…

Extending Our Connectivity Leadership: Industry’s First End-to-End PCIe® over Optics Demo

The Generative AI revolution is reshaping all industries and redefining what’s possible in every aspect of our lives. Behind the scenes, the rapid pace of innovation is creating significant challenges for data center infrastructure, including: exploding demand for AI processing resources that must be interconnected across the data center due to the need for Large Language Models to…

The Long and Short of AI: Building Scalable Data Centers in the PCIe® 6.x Era

By Abhishek Wadhwa, Senior Field Applications Engineer. The rise of artificial intelligence (AI) and Generative AI is transforming how we interact with technology. From healthcare to business efficiency and groundbreaking research, AI and Generative AI are making waves. These AI marvels rely on vast amounts of hardware and infrastructure to function. As such, data centers are undergoing…

Astera Labs Opens New R&D Hub in Bengaluru to Drive AI and Cloud Innovation

Dr. Shivananda Koteshwar to lead the company’s India operations and spearhead development of innovative connectivity solutions for AI and cloud infrastructure. SANTA CLARA, CA, U.S. – September 11, 2024 – Astera Labs, Inc. (Nasdaq: ALAB), a global leader in semiconductor-based connectivity solutions for AI and cloud infrastructure, today announced the official opening of its…

Astera Labs to Participate in the Deutsche Bank 2024 Technology Conference

SANTA CLARA, Calif.–Aug. 21, 2024— Astera Labs (Nasdaq: ALAB), a global leader in semiconductor-based connectivity solutions for AI and cloud infrastructure, today announced that it will participate in the Deutsche Bank 2024 Technology Conference on Aug. 29, 2024. Astera Labs’ presentation is scheduled for 12:30 pm PT. A webcast of the session will be made available on Astera Labs’…

Astera Labs’ AI Inferencing Demo with Leo CXL® Smart Memory Controllers Wins FMS Best of Show Award

Joint demo delivered 40% faster time-to-insights and 40% lower CPU utilization. Astera Labs, alongside its ecosystem partners, Supermicro and MemVerge, has won the Future of Memory and Storage (FMS) 2024 Most Innovative Technology Award. At FMS 2024, we jointly demonstrated how AI inferencing can gain significant benefits using CXL®-attached memory. Astera Labs is a second-time winner,…

Astera Labs Announces Financial Results for the Second Quarter of Fiscal Year 2024

Record quarterly revenue of $76.9 million, up 18% QoQ and up 619% YoY. Multiple secular trends, design wins across diverse AI platform architectures, and increasing average dollar content position the Company to outpace industry growth. SANTA CLARA, CA, U.S. – August 6, 2024 – Astera Labs, Inc. (Nasdaq: ALAB), a global leader in semiconductor-based connectivity solutions for cloud…

PCIe® 6.x Technology Demo with Aries 6

See a live demo of PCIe® 6.x technology with Aries 6 – the third generation of our Aries PCIe/CXL® Smart DSP Retimer family.

AI Inferencing Demo with CXL®-Attached Memory: FMS 2024

This award-winning demo between Astera Labs, Supermicro, and MemVerge shows how AI inferencing can gain significant benefits using CXL® attached memory.

End-to-End PCIe® over Optics for GPU Clusters: First Look Demo

For this demo, we have assembled two common configurations for reach extension: head node to GPU clusters and head node to remote, disaggregated memory systems. This system is running PCI Express® at full rate over single mode fiber for an aggregate bandwidth of 128 GB/s and a reach of 20 meters in this demonstration. This reach can be easily extended to 50 meters or more based on application…
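
As a sanity check on the 128 GB/s aggregate figure, the short calculation below assumes a 16-lane PCIe 5.0 link at 32 GT/s with 128b/130b encoding (the demo does not state the generation or lane count, so these are assumptions). That works out to roughly 63 GB/s per direction, or about 126 GB/s of aggregate bidirectional bandwidth, commonly rounded to 64/128 GB/s.

```c
/* Back-of-the-envelope check of the 128 GB/s aggregate figure, assuming a
 * 16-lane PCIe 5.0 link at 32 GT/s with 128b/130b encoding (generation and
 * lane count are assumptions, not stated in the demo description).
 */
#include <stdio.h>

int main(void)
{
    double gt_per_s  = 32.0;             /* raw transfer rate per lane (GT/s) */
    double encoding  = 128.0 / 130.0;    /* 128b/130b line-code efficiency    */
    int    lanes     = 16;

    double lane_gbps = gt_per_s * encoding;      /* ~31.5 Gb/s per lane        */
    double dir_gbs   = lane_gbps * lanes / 8.0;  /* ~63 GB/s per direction     */
    double aggregate = dir_gbs * 2.0;            /* both directions combined   */

    printf("per direction: ~%.0f GB/s, aggregate: ~%.0f GB/s\n", dir_gbs, aggregate);
    return 0;
}
```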
