Huawei Innovative Data Infrastructure Forum 2024

OceanStor A800 High-Performance AI Storage

OceanStor A800 handles small-file loading of training sets and high-bandwidth resumable training after breakpoints. It also provides a leading intrinsic vector knowledge repository that supports large-model inference and applications at over 250,000 queries per second (QPS), delivering accelerated vector retrieval and millisecond-level inference response.

Benefits

Ultra-high performance

The groundbreaking architecture separating the control and data planes lets data flow directly to disks, reducing CPU usage and delivering 24 million IOPS and 500 GB/s of bandwidth per controller enclosure. Training-set loading is 8x faster than the next-best product in the industry, and training resumes from a breakpoint 4x faster. Near-storage computing accelerates data preprocessing, and storage clusters can scale to support up to 4,096 computing cards.
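
To illustrate the breakpoint-resume pattern this benefit refers to, here is a minimal, generic sketch of periodic checkpointing to shared storage and restarting from the latest checkpoint. The mount path, file layout, and function names are assumptions for illustration only and are not an OceanStor-specific API; faster checkpoint reads and writes directly shorten the pause at save time and the restart time after an interruption.

```python
import glob
import os

import torch

# Assumed mount point of the shared AI storage; purely illustrative.
CKPT_DIR = "/mnt/ai-storage/checkpoints"


def save_checkpoint(model, optimizer, step):
    """Write a training checkpoint; high write bandwidth keeps the pause short."""
    os.makedirs(CKPT_DIR, exist_ok=True)
    path = os.path.join(CKPT_DIR, f"step_{step:08d}.pt")
    torch.save(
        {
            "step": step,
            "model": model.state_dict(),
            "optimizer": optimizer.state_dict(),
        },
        path,
    )


def resume_latest(model, optimizer):
    """Reload the most recent checkpoint after a breakpoint; fast reads shorten restart time."""
    ckpts = sorted(glob.glob(os.path.join(CKPT_DIR, "step_*.pt")))
    if not ckpts:
        return 0  # no checkpoint yet: start training from step 0
    state = torch.load(ckpts[-1], map_location="cpu")
    model.load_state_dict(state["model"])
    optimizer.load_state_dict(state["optimizer"])
    return state["step"] + 1  # continue from the step after the saved one
```

In practice, the checkpoint interval is a trade-off: the faster each checkpoint can be written and read back, the more frequently a job can checkpoint and the less work is lost when training resumes after a breakpoint.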

EB-level scalability

A fully symmetric architecture supports scale-out to 512 controllers, delivering EB-level, on-demand capacity expansion. This lets large AI models evolve smoothly toward ultra-large clusters with 10,000 or even 100,000 GPUs, multi-modal workloads, and trillions of parameters.

Superb inference

An industry-leading intrinsic vector knowledge repository helps curb AI hallucinations and boosts vector retrieval to more than 250,000 QPS, delivering millisecond-level inference response.
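
The common pattern behind such a vector knowledge repository is retrieval-augmented inference: embed the user's question, find the nearest stored vectors, and feed the matching text to the model as grounding context so it answers from stored knowledge rather than guessing. The sketch below uses a plain in-memory NumPy cosine-similarity search as a stand-in for the repository; the data, dimensions, and function names are assumptions for illustration and not the product's API.

```python
import numpy as np

# Toy in-memory index standing in for the vector knowledge repository (illustrative only).
doc_texts = ["Product manual excerpt A", "FAQ entry B", "Design note C"]
doc_vectors = np.random.rand(len(doc_texts), 768).astype(np.float32)  # pretend embeddings
doc_vectors /= np.linalg.norm(doc_vectors, axis=1, keepdims=True)     # normalize once


def retrieve(query_vector: np.ndarray, top_k: int = 2) -> list[str]:
    """Return the texts whose embeddings are most similar to the query (cosine similarity)."""
    q = query_vector / np.linalg.norm(query_vector)
    scores = doc_vectors @ q                    # cosine similarity: all vectors are unit length
    best = np.argsort(scores)[::-1][:top_k]     # indices of the top-k most similar documents
    return [doc_texts[i] for i in best]


# At inference time, the retrieved passages are prepended to the prompt so the
# model is grounded in stored knowledge.
query = np.random.rand(768).astype(np.float32)  # stand-in for an embedded user question
context = retrieve(query)
prompt = "Answer using only this context:\n" + "\n".join(context) + "\nQuestion: ..."
```

The retrieval step is where the quoted 250,000 QPS and millisecond-level latency matter: every inference request issues at least one nearest-neighbor lookup, so retrieval throughput bounds end-to-end inference throughput.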
