Tencent open-sources Hy3-preview, a 295B MoE model built in under three months

2026-04-24 13:05

Tencent released Hy3-preview on April 23, a 295B-parameter Mixture-of-Experts model with 21B active parameters, a 256K-token context window, and scores of 74.4% on SWE-bench Verified and 54.4% on Terminal-Bench 2.0. The model was trained from a cold start in roughly three months following a leadership rebuild of Tencent's Hy team, and weights are available on GitHub, Hugging Face, and ModelScope under an open license. Tencent framed the release around product integration rather than benchmark positioning: the model already powers Yuanbao, CodeBuddy, and Tencent Docs.
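The gap between 295B total and 21B active parameters comes from Mixture-of-Experts routing: each token activates only a few expert subnetworks, so most parameters sit idle on any given forward pass. The sketch below illustrates a generic top-k gating scheme and the active-parameter arithmetic; the expert count, k, and router scores are hypothetical, since Tencent has not detailed Hy3's routing configuration here.

```python
import math

def top_k_route(logits, k):
    """Select the k highest-scoring experts and renormalize their gate
    weights with a softmax over just the selected logits. This is a
    common MoE gating scheme, not Hy3's documented router."""
    top = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:k]
    exps = [math.exp(logits[i]) for i in top]
    z = sum(exps)
    return {i: e / z for i, e in zip(top, exps)}

# Hypothetical router scores for 8 experts; with k=2, only two
# experts' parameters are used for this token.
gates = top_k_route([0.1, 2.0, -1.0, 0.5, 1.5, 0.0, -0.5, 0.3], k=2)
print(gates)

# Reported Hy3-preview figures: 21B of 295B parameters active per token.
print(f"active fraction: {21 / 295:.1%}")  # ~7.1%
```

Sparse activation is why a 295B-parameter model can serve tokens at roughly the compute cost of a 21B dense model, at the price of holding all expert weights in memory.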