Kai-Fu Lee’s 01.AI, Now Valued at Over $1 Billion, Releases Large-scale Model Yi-34B

01.AI, the AI 2.0 company established under the leadership of Sinovation Ventures Chairman Dr. Kai-Fu Lee, officially open-sourced and released its first pre-trained large model, Yi-34B, on November 6th.

Dr. Kai-Fu Lee, the founder and CEO of 01.AI, stated: “01.AI is firmly committed to becoming a leader in the global market. From hiring our first employee and writing our first line of code to designing our first prototype, we have held onto our original intention and determination to become the ‘World’s No. 1’. We have assembled a team with great potential that can compete with top companies such as OpenAI and Google. After nearly half a year of steady accumulation and development, backed by globally competitive research and engineering capabilities, we have achieved results that are highly competitive at an international level. It can be said that Yi-34B has lived up to expectations and made an impressive debut.”

Kai-Fu Lee stated in a media interview that, following a recent round of financing, 01.AI’s valuation has exceeded US$1 billion.

The latest evaluation results show that Yi-34B ranks first among pre-trained base large language models, outperforming leading open-source models including Meta’s Llama 2 on several key benchmarks. It is also the only domestically produced model to have topped the Hugging Face global open-source model leaderboard to date.

Hugging Face is the world’s most popular open-source community for large models and datasets, often described as the GitHub of the large model field, and its leaderboard is widely regarded as an authoritative measure of large models’ English-language capabilities.

As a domestically produced high-quality large model, Yi-34B focuses on a better understanding of Chinese. On the three major Chinese-language benchmarks CMMLU, C-Eval, and Gaokao, Yi-34B also holds an advantage over the GPT-4 baseline, underscoring its strength in the Chinese-language world and its fit for domestic market demands.

The newly open-sourced Yi-34B also offers the longest context window in the world, supporting up to 200K tokens and able to take in extremely long inputs of roughly 400,000 Chinese characters. By comparison, OpenAI’s GPT-4 has a 32K context window and can process about 25,000 words.

For language models, the context window (the maximum amount of text a model can attend to in a single pass) is one of the key measures of a large model’s overall capability. It is crucial for understanding and generating text tied to a specific context: models with longer windows can draw on a richer body of reference material and produce more coherent, accurate output.
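
As a concrete illustration of what the window means in practice, the sketch below counts the tokens in a long document and checks whether it fits inside a 200K-token window before sending it to the model. This is a minimal sketch, assuming the Hugging Face transformers library; the repository id “01-ai/Yi-34B-200K” used here is an assumption for illustration, and the 200K limit is the figure cited above.

```python
# Minimal sketch: check a long document against the 200K-token context
# window cited above. The repo id "01-ai/Yi-34B-200K" is an assumption.
from transformers import AutoTokenizer

MAX_CONTEXT_TOKENS = 200_000  # 200K-token window reported for Yi-34B

tokenizer = AutoTokenizer.from_pretrained(
    "01-ai/Yi-34B-200K",
    trust_remote_code=True,  # some checkpoints ship a custom tokenizer
)

def fits_in_context(text: str) -> bool:
    """Return True if the tokenized text fits in one context window."""
    return len(tokenizer.encode(text)) <= MAX_CONTEXT_TOKENS

# Example: a book-length Chinese manuscript of roughly 400,000 characters
long_document = "人工智能" * 100_000
print(fits_in_context(long_document))
```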

At present, the Yi series models are officially available on three major global open-source platforms: Hugging Face, ModelScope, and GitHub. They are also open for commercial applications, giving developers more and higher-quality options when working with LLMs.
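
For developers, getting started follows the usual Hugging Face workflow. The snippet below is a minimal sketch rather than an official recipe: it assumes the repository id “01-ai/Yi-34B”, the transformers and accelerate packages, and enough GPU memory to host a 34B-parameter model.

```python
# Minimal sketch of loading the open-sourced Yi-34B from Hugging Face and
# generating a short completion. The repo id, dtype, and generation settings
# are illustrative assumptions, not 01.AI's documented defaults.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "01-ai/Yi-34B"  # assumed Hugging Face repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # shard the 34B parameters across GPUs (needs accelerate)
    torch_dtype="auto",  # load weights in the checkpoint's native precision
)

prompt = "大语言模型的上下文窗口是指"  # "A large model's context window refers to"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```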