
ZAYA1-8B: Zyphra's Reasoning MoE Trained Entirely on AMD MI300X That Punches Far Above Its Weight Class - May 8, 2026
Published: May 8, 2026
Duration: 12:36
Zyphra dropped ZAYA1-8B this week, a sub-billion-active-parameter mixture-of-experts reasoning model pretrained end-to-end on AMD Instinct MI300X GPUs that matches DeepSeek R1 on competition mathematics and approaches Claude 4.5 Sonnet under their novel Markovian RSA test-time compute scheme. Chris and Laura unpack the architecture innovations (Compressed Convolutional Attention, an MLP-based router, learned residual scaling), the 14-trillion-token AMD-only training run, and what an Apache 2.0 frontier reasoning model on non-NVIDIA silicon...
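
For listeners curious what an "MLP-based router" means in practice: classic MoE layers pick experts with a single linear projection, while an MLP router adds a small hidden layer before the expert logits. Below is a minimal, generic sketch of that idea; it is not Zyphra's implementation, and the hidden width, expert count, and top-k values are illustrative assumptions.

```python
# Minimal sketch of an MLP-based MoE router (illustrative only; not
# Zyphra's code -- hidden size, expert count, and top-k are assumptions).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MLPRouter(nn.Module):
    """Routes each token to its top-k experts via a small MLP,
    rather than the single linear projection of classic MoE routers."""
    def __init__(self, d_model: int, n_experts: int, d_router: int = 256, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.mlp = nn.Sequential(
            nn.Linear(d_model, d_router),
            nn.GELU(),
            nn.Linear(d_router, n_experts),
        )

    def forward(self, x: torch.Tensor):
        # x: (batch, seq, d_model) -> per-token logits over experts
        logits = self.mlp(x)
        weights, experts = torch.topk(logits, self.top_k, dim=-1)
        # Normalize the selected experts' weights so they sum to 1 per token
        weights = F.softmax(weights, dim=-1)
        return weights, experts  # gating weights and chosen expert indices

# Usage: route a dummy batch of token embeddings
router = MLPRouter(d_model=1024, n_experts=8)
w, idx = router(torch.randn(2, 16, 1024))
print(w.shape, idx.shape)  # torch.Size([2, 16, 2]) torch.Size([2, 16, 2])
```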