Is enormous size central to success?

Huge artificial intelligence (AI) systems from America and China are impressing specialists everywhere with their ability to understand and use language across a wide variety of tasks, sometimes at a level that approaches human performance.

Economically, this is a billion-dollar business; technologically, it is presumably a breakthrough toward AI systems that are no longer outstanding only at a single, highly specialized skill but actually possess more general abilities.

Alexander Armbruster

Responsible editor for Wirtschaft Online.

Researchers and entrepreneurs in Germany want to counter this with something comparable. They have joined forces in a project called OpenGPT-X, an alliance that aims to build competitive AI systems and make them easily accessible to small and medium-sized German companies. The federal government has now selected the project as part of its Gaia-X initiative and is funding it.

"We want to develop at least one generally available general language model for German that can compete internationally," says Joachim Köhler, the responsible AI specialist at the Fraunhofer Institute for Intelligent Analysis and Information Systems in Sankt Augustin, who coordinates the consortium. In addition to the research company, these include the AI ​​companies Aleph Alpha, Alexander Thamm and ControlExpert, Forschungszentrum Jülich, TU Dresden, the German Research Center for Artificial Intelligence (DFKI), Internet service provider 1 & 1 IONOS, West German Broadcasting and the Federal Association of AI. Fraunhofer researcher Köhler emphasizes that his own claim goes well beyond the academic: “There is great interest from German industry, for example from the automotive sector.One of the topics is robust question-and-answer systems. "

The American and Chinese competitors currently setting the standard are as well known as they are resourceful. A huge neural network called GPT-3 caused a stir. It was developed by researchers at the Californian AI company OpenAI, which Elon Musk once co-founded. The abbreviation stands for "Generative Pretrained Transformer 3"; the components "generative", "pretrained" and "transformer" each say something about how the developers built and trained this model.
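
GPT-3 itself is accessible only through OpenAI's commercial API, but its openly released predecessor GPT-2 illustrates what a "generative pretrained transformer" does: given a text prompt, it generates a plausible continuation. A minimal sketch using the open-source Hugging Face transformers library:

from transformers import pipeline

# Load the openly available GPT-2, a smaller predecessor of GPT-3, and
# let it continue a prompt. With sampling enabled, output varies per run.
generator = pipeline("text-generation", model="gpt2")
out = generator("Large language models are", max_length=30, do_sample=True)
print(out[0]["generated_text"])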

The central difference from its predecessor is that it was trained on far more data and is many times larger: GPT-3 has 175 billion parameters, in a sense the smallest tuning knobs that help determine what it learns and how well it performs. No neural network of that size had existed before.
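
A back-of-the-envelope calculation shows where such parameter counts come from: every weight and bias in every layer of the network is one trainable number. The dimensions below are invented for illustration and are not GPT-3's real configuration.

# Rough parameter count for a transformer, with invented dimensions
# chosen for illustration, not GPT-3's actual configuration.
d_model, d_ff, vocab = 1024, 4096, 50000

embedding = vocab * d_model                        # token embedding matrix
attention = 4 * (d_model * d_model + d_model)      # Q, K, V, output projections
feedforward = 2 * d_model * d_ff + d_ff + d_model  # two linear layers with biases

per_layer = attention + feedforward
print(f"one transformer layer: {per_layer:,} parameters")
print(f"48 such layers plus embedding: {48 * per_layer + embedding:,} parameters")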

However, it is no longer the only or the largest of its kind. Google presented its language model "Switch Transformer", with 1.6 trillion parameters, at the beginning of this year, and the Beijing Academy of Artificial Intelligence (BAAI) recently unveiled its AI system WuDao 2.0, which at 1.75 trillion parameters is said to be ten times the size of GPT-3 and was trained not only on language data but also on extensive image material. Leading researchers are amazed at what the new giant AI networks can do, even if they often point out that these systems are still far removed from the versatile human brain.

"This is really an amazing invention that works well. But GPT-3 still has room for improvement. I find GPT-3 particularly exciting in terms of what GPT-7 could be, or what another algorithm could be in a few years' time," said Stanford professor and now independent AI entrepreneur Andrew Ng in a recent interview with the F.A.Z. He once helped set up Google's AI research group and for a time also headed the relevant department of the Chinese IT group Baidu.

OpenAI is reportedly already working on the successor to the current system. And the fact that the originally non-profit project has, under its chief executive Sam Altman and following a billion-dollar investment by Microsoft, turned into a commercial provider gives competitors in Germany and Europe another reason to hurry.