
Tencent Hunyuan Open-Sources Translation Model 1.5: Runs on 1GB Phone Memory, Outperforms Commercial APIs
Tencent Hunyuan has open-sourced Translation Model 1.5 in 1.8B and 7B variants. The smaller version runs in 1GB of phone memory for offline real-time translation and is faster than major commercial APIs.
On December 30, 2025, Tencent Hunyuan announced the open-sourcing of Hunyuan Translation Model 1.5 (HY-MT1.5), available in 1.8B and 7B variants. The models support mutual translation among 33 languages plus 5 ethnic and regional dialect conversions, and are now live on the Hunyuan website, GitHub, and Hugging Face for developers to use directly.
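For developers who want to try the release, a minimal loading sketch with the Hugging Face transformers library is shown below. The repository ID, prompt wording, and generation settings are illustrative assumptions and may differ from the official model card.

```python
# Minimal sketch: loading a Hunyuan translation checkpoint from Hugging Face.
# The repository ID and prompt format below are assumptions for illustration;
# consult the official model card for the exact names and template.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "tencent/HY-MT1.5-1.8B"  # hypothetical repo ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

# A plain instruction-style prompt; the real template may differ.
prompt = "Translate the following text from Chinese to English:\n\n这款模型可以在手机上离线运行。"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```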
Key highlights:
- HY-MT1.5-1.8B targets edge deployment: the quantized version runs smoothly in 1GB of memory and enables offline real-time translation, processing 50 tokens in an average of 0.18 seconds (faster than major commercial APIs) while delivering translation quality on par with most commercial tools, making it well suited to instant messaging and smart customer service; a rough throughput calculation follows this list.
- HY-MT1.5-7B upgrades the WMT25 30-language champion model, improving accuracy and reducing issues such as leftover annotations and mixed-language output in professional scenarios.
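As a quick sanity check, the quoted edge latency works out to roughly 280 tokens per second. The snippet below simply restates the article's own numbers; it is not an independent benchmark.

```python
# Back-of-the-envelope check of the quoted edge latency (not a benchmark):
# 50 tokens processed in 0.18 seconds, as stated in the announcement.
tokens = 50
seconds = 0.18
print(f"Throughput: {tokens / seconds:.0f} tokens/s")          # ~278 tokens/s
print(f"Per-token latency: {seconds / tokens * 1000:.1f} ms")  # ~3.6 ms/token
```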
Both models support edge-cloud hybrid deployment and three practical features: custom terminology libraries for fields like medicine and law, context understanding for coherent long-text translation, and format-preserving translation for webpages and documents. Built with large-model distillation, they deliver strong performance at small sizes; they are already deployed in Tencent Meeting and Enterprise WeChat, and are compatible with Arm, Qualcomm, Muxi, and other hardware platforms.
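The terminology-library feature implies a workflow in which a domain glossary constrains the translation. The sketch below shows one plausible way to inject such a glossary into the prompt; the announcement does not specify HY-MT1.5's actual terminology-intervention interface, so treat the template as an assumption.

```python
# Illustrative only: one plausible way to pass a custom terminology glossary
# to a translation model via the prompt. The real HY-MT1.5 interface for
# terminology intervention may use a different, dedicated format.
glossary = {
    "心肌梗死": "myocardial infarction",   # medical term
    "不可抗力": "force majeure",           # legal term
}

def build_prompt(text: str, src: str, tgt: str, terms: dict[str, str]) -> str:
    term_lines = "\n".join(f"- {s} -> {t}" for s, t in terms.items())
    return (
        f"Translate the following text from {src} to {tgt}.\n"
        f"Use these term translations exactly:\n{term_lines}\n\n"
        f"Text: {text}"
    )

prompt = build_prompt("患者出现心肌梗死症状。", "Chinese", "English", glossary)
print(prompt)
```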
Source: IT Home




