Z.ai open-sources GLM-5 and GLM-5.1: 744B MoE under MIT license, tops SWE-bench Pro

2026-04-13 07:04

Z.ai (formerly Zhipu AI) released GLM-5 and its post-training variant GLM-5.1 as open source under the MIT license on April 7–8, 2026. The base model is a 744B-parameter mixture-of-experts with 40B active parameters and DeepSeek Sparse Attention, trained on 28.5 trillion tokens; it scores 77.8% on SWE-bench Verified and 92.7% on AIME 2026. GLM-5.1, the agentic post-training upgrade, reached the top of the SWE-bench Pro leaderboard at 58.4%, edging past GPT-5.4 (57.7%) and Claude Opus 4.6 (57.3%). The weights are available on Hugging Face with no usage restrictions.