K2-Think: How a 32-Billion-Parameter Model Outperforms Giants in Math Olympiads

7 hours ago 高效码农

A conversation starter: "Can a model small enough to fit on four gaming GPUs beat the latest 120-billion-parameter heavyweights at high-school math competitions?" The Mohamed bin Zayed University of Artificial Intelligence (MBZUAI) just proved the answer is "yes." Below is a fully transparent walk-through of their K2-Think recipe (data, code, training budget, safety filters and all), rewritten for junior-college graduates and busy engineers who simply want facts, numbers, and reproducible steps.

1. Thirty-second summary

- Base model: Qwen2.5-32B (completely open weights)
- Post-training data: one open-source set, 92k problems with automatically checkable answers
- Training stages: long-chain supervised fine-tuning → verifiable-reward RL → simple test-time …