OpenPangu Ultra-MoE-718B-V1.1: A Practical Guide to This Massive Mixture-of-Experts Language Model

What Is OpenPangu Ultra-MoE-718B-V1.1, and How Can It Fit into Your AI Projects?

OpenPangu Ultra-MoE-718B-V1.1 is a large-scale mixture-of-experts language model trained on Ascend NPU hardware, with 718 billion total parameters but only about 39 billion activated per token. This setup gives it two operating modes: quick thinking for fast responses and deep thinking for tackling tough problems. Compared to the earlier V1.0 release, V1.1 improves tool-calling skills for agent workloads, cuts the rate of hallucinations (those pesky made-up facts) substantially, and delivers stronger overall performance across the …
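To make the two modes concrete, here is a minimal sketch of prompting the model with Hugging Face transformers. This is not taken from the openPangu documentation: the checkpoint path is a placeholder for a local download, and the enable_thinking chat-template flag is an assumption borrowed from other hybrid-reasoning models, so check the official model card for the actual switch before relying on it.

from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_PATH = "path/to/openPangu-Ultra-MoE-718B-V1.1"  # placeholder local checkpoint path

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_PATH,
    trust_remote_code=True,
    device_map="auto",   # shard across whatever accelerators are available
    torch_dtype="auto",
)

def ask(question: str, deep_thinking: bool) -> str:
    """Build a chat prompt and generate, toggling the reasoning mode."""
    messages = [{"role": "user", "content": question}]
    prompt = tokenizer.apply_chat_template(
        messages,
        tokenize=False,
        add_generation_prompt=True,
        enable_thinking=deep_thinking,  # assumed flag; the real switch may differ
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=1024)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
    )

# Quick thinking for a simple lookup, deep thinking for a multi-step problem.
print(ask("What is the capital of France?", deep_thinking=False))
print(ask("Prove that the sum of two odd integers is even.", deep_thinking=True))

In practice a model of this size would normally sit behind a serving stack (for example vLLM on Ascend hardware) rather than a single-process transformers call, but the prompt-building and mode-toggle pattern stays the same.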