AMD · Authorized partner · Datacenter GPUs

AMD Instinct MI300X

SKU: MI300X

192 GB HBM3 accelerator for LLM training and large-context inference.

MSRP indication
$27,000
Quote pricing may differ; request a quote below.
View RFQ

Specifications

Memory: 192 GB HBM3 @ 5.3 TB/s
Compute: FP8/FP16/BF16 mixed-precision
Form factor: OAM module
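As a rough sizing sketch of what the 192 GB capacity means in practice (illustrative only: the function name and the 70B-parameter example model are assumptions, and KV cache, activations, and framework overhead are ignored, so real usage is higher):

```python
def weight_memory_gb(params_billion: float, bytes_per_param: int) -> float:
    """Approximate GPU memory (decimal GB) needed for model weights alone.

    bytes_per_param: 2 for FP16/BF16, 1 for FP8.
    This is a lower bound; KV cache and runtime overhead are not counted.
    """
    return params_billion * 1e9 * bytes_per_param / 1e9

# A hypothetical 70B-parameter model:
fp16_gb = weight_memory_gb(70, 2)  # 140.0 GB -- fits within 192 GB
fp8_gb = weight_memory_gb(70, 1)   # 70.0 GB, leaving headroom for long-context KV cache
```

The same accelerator can therefore hold such a model's weights in FP16 with room to spare, or in FP8 with substantial headroom for large-context inference.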
Sourced from AMD (authorized Qubit partner).
Need it bundled with deployment? Mention it in your RFQ; we handle install and support too.

More in Datacenter GPUs

NVIDIA · Datacenter GPUs

NVIDIA L40S 48GB

Universal GPU for AI inference, training, and graphics workloads.

MSRP $8,190
Details