Chinanews.com, Beijing, December 12 (Reporter Xia Bin) -- Since the beginning of this year, the most prominent trend in the global science and technology field has been a new round of competition over artificial intelligence large models. Following the release of ChatGPT by OpenAI in the United States, China's science and technology community has followed suit, and the total number of large models in China has at one point approached 200.

Generative AI technology is gradually transforming and reshaping human capabilities across society, the economy, culture, and other fields. With the start of China's "100-model war," large models have shown astonishing creative ability in areas such as writing articles, holding dialogues, making plans, and writing code, and they are evolving rapidly by the day.

Industry experts believe that the industrial dividends brought by this technology have only just begun to emerge, and that the deep integration of AIGC with the digital economy and the real economy will create further disruptive social and economic value, which is expected to open a new round of technological and industrial change that will shape the coming decades.

Some analysts believe that this evolution and transformation requires consolidating the breadth and reach of the large model technology base: building a safe and trustworthy artificial intelligence ecosystem on massive intelligent computing, coupling underlying AI technology with basic software, accelerating the deployment of applications, going deep into vertical fields, and building a growth flywheel of skill models and industry models on top of high-quality general-purpose large models.

The Artificial Intelligence Computing Conference (AICC 2023), held recently in Beijing, brought together large model experts from Zhipu AI, the IDEA Research Institute, Baichuan Intelligence, Recurrent AI, NetEase Fuxi, the Institute of Automation of the Chinese Academy of Sciences, and other institutions to share work on large model topics such as multimodality, ultra-long context, and open source datasets, and to exchange views on the technical routes of large models, the engineering challenges of training, and the construction of an open source ecosystem.

Caption: The Artificial Intelligence Computing Conference (AICC 2023) was recently held in Beijing. Photo: Courtesy of the organizer

At the same time, the conference set up a dedicated "Helping 1,500 Models" area within its on-site AI innovation technology exhibition, gathering the industry's leading foundation models and industry models and demonstrating the appeal of large model technology through real-time interactive systems.

Inspur Information also recently officially released Yuan 2.0, a hundred-billion-parameter-scale open source large model, and announced the Yuan large model co-training plan, calling on developers to train large models together and build a prosperous open source ecosystem. As a hundred-billion-parameter-scale open source model, Yuan 2.0 has advanced in programming, reasoning, logic, and other capabilities, providing richer and more comprehensive assistance for large model entrepreneurs and developers, as well as a more open space for technological innovation.

It should be noted that, in the face of the coming "AI era," "going it alone" is no longer suited to scientific and technological innovation in the age of big science. The "100-model war" has entered its second half: after a period of unchecked growth, an "era of many models" is arriving, the industry landscape is being reshuffled and reorganized, and survival of the fittest is gradually taking hold.

It can be said that, after a great deal of technological exploration and market trial and error, an industry consensus has gradually become clear: a prosperous open source model ecosystem is an important way to attract and cultivate users, and a new business paradigm that keeps today's AI large model companies from reinventing the wheel.

On the one hand, large models require enormous and sustained investment, a problem that the enterprise (B-end) market also faces.

On the other hand, only by pooling the power of technological innovation around strong open source model performance, and by feeding that innovation back into iterative upgrades of large model data, tools, and applications, can a solid foundation and fertile soil for growth be provided for developers, research institutions, and technology enterprises worldwide, stimulating boundless innovation.

Liu Jun, senior vice president of Inspur Information, believes that to cope with the development and challenges of generative AI, innovation should proceed at four levels: the computing power system, AI Infra, algorithm models, and the industrial ecosystem. The capabilities of foundation large models should be effectively improved through innovations in large model computing efficiency and model algorithms, and the deployment of "100 models across 100 industries" should be promoted through a healthy industry ecosystem.

Wu Shaohua, director of artificial intelligence software research and development at Inspur Information, told a Chinanews reporter that open source general-purpose models are the "foundation" of the most advanced productivity of the future, and the stability and thickness of that foundation determine the height of the building. How to use the same amount of computing power and higher-quality data to achieve a lower loss, and how to more effectively match computing power to the emergence of intelligence, are the core keys to promoting the healthy development of the industry. (ENDS)