Some who hear the name Gottfried Wilhelm Leibniz may think of an outstanding lawyer, historian, philosopher and diplomat.

Many also know him as one of the most important mathematicians of all time.

In fact, this universal genius, the 375th anniversary of whose birth falls this year, achieved even more: he laid the practical and theoretical foundations of all modern computers.

But first things first: The construction of automatons began in antiquity.

The Antikythera mechanism (a kind of astronomical calculating machine built from gears) was created more than 2,000 years ago.

Probably the first programmable machine in the world (the automatic puppet theater of Heron of Alexandria) dates from the 1st century.

Its source of energy was a falling weight pulling a cord wound around the pins of a rotating cylinder.

Complex command sequences that controlled doors and puppets for several minutes were encoded in the intricate windings of the cord.

Binary arithmetic

But many aspects of modern computer science can actually be traced back to Leibniz.

In 1673 he invented the so-called stepped reckoner, the first machine that could perform all four basic arithmetic operations.

This went beyond Wilhelm Schickard's gear-based calculating machine of 1623 and Blaise Pascal's superior Pascaline of 1642. It matters more than it might appear at first glance, because anyone who has mastered the basic arithmetic operations can carry out any numerical calculation. A quarter of a millennium later, the Austrian mathematician Kurt Gödel even encoded arbitrary formal systems and computational processes using nothing but basic arithmetic operations.
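
As a rough illustration (not Gödel's original construction), the following Python sketch packs a sequence of symbol numbers into a single integer as a product of prime powers and unpacks it again using nothing but division and counting; the helper names are invented for this example.

```python
def nth_prime(n):
    """Return the n-th prime (1-indexed) by simple trial division."""
    count, candidate = 0, 1
    while count < n:
        candidate += 1
        if all(candidate % d for d in range(2, int(candidate ** 0.5) + 1)):
            count += 1
    return candidate

def encode(symbols):
    """Pack a list of positive integers into one number as a product of prime powers."""
    goedel_number = 1
    for i, s in enumerate(symbols, start=1):
        goedel_number *= nth_prime(i) ** s
    return goedel_number

def decode(goedel_number):
    """Recover the original list using nothing but division and counting."""
    symbols, i = [], 1
    while goedel_number > 1:
        p, exponent = nth_prime(i), 0
        while goedel_number % p == 0:
            goedel_number //= p
            exponent += 1
        symbols.append(exponent)
        i += 1
    return symbols

print(encode([2, 1, 3]))  # 2**2 * 3**1 * 5**3 = 1500
print(decode(1500))       # [2, 1, 3]
```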

The stepped reckoner was also the first calculating machine with internal memory. The so-called Leibniz wheel stored the multiplicand of a multiplication, and during the calculation the machine kept count of how many additions had already been carried out for the multiplication at hand. Such internal variables are of course indispensable in computer science today.
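
A minimal sketch of that idea in Python, concerned only with the logic and not with the actual gearwork: the multiplicand sits in an internal register while a counter tracks how many additions have been performed so far.

```python
def multiply_by_repeated_addition(multiplicand, multiplier):
    """Multiply two non-negative integers by repeated addition,
    keeping the multiplicand in an internal register."""
    accumulator = 0        # result register
    stored = multiplicand  # stored operand, like the setting of the Leibniz wheel
    count = 0              # internal counter of completed additions
    while count < multiplier:
        accumulator += stored
        count += 1
    return accumulator

print(multiply_by_repeated_addition(17, 6))  # 102
```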

Inspired by the binary symbols of the ancient Chinese I Ching, Leibniz also documented binary arithmetic, which is used in practically all modern calculating machines. A calculator that works with only 0 and 1 as elementary symbols is easier to build than a decimal calculator that uses all the digits from 0 to 9. It should be mentioned, however, that number representations in the binary system are much older and go back to ancient Egypt; the algorithmic side of binary computation is comparatively recent.
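
A small illustration of the point, with a hypothetical helper name: the same number written with only the two symbols 0 and 1 instead of the ten decimal digits.

```python
def to_binary(n):
    """Represent a non-negative integer using only the symbols 0 and 1."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))  # remainder is the next binary digit
        n //= 2
    return "".join(reversed(digits))

print(to_binary(1679))         # '11010001111' -- same value, only two symbols
print(int("11010001111", 2))   # 1679, back in decimal notation
```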

In 1679, Leibniz actually described the basic principles of binary computers.

He represented binary numbers using marbles controlled by punch cards.

He thereby essentially described the working principle of electronic computers, in which, however, gravity and the movement of marbles are replaced by electrical circuits.
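
A simplified sketch of that elementary step (not a model of Leibniz's actual mechanism): two binary numbers are added column by column, and in each column a sum bit stays put while a carry of 0 or 1 rolls over to the next column, much as a marble drops into the next channel or a signal propagates in a circuit.

```python
def add_binary(a, b):
    """Add two binary strings column by column with a carry of 0 or 1."""
    width = max(len(a), len(b))
    a, b = a.zfill(width), b.zfill(width)
    result, carry = [], 0
    # Walk from the least significant column to the most significant one.
    for bit_a, bit_b in zip(reversed(a), reversed(b)):
        total = int(bit_a) + int(bit_b) + carry
        result.append(str(total % 2))  # bit that stays in this column
        carry = total // 2             # bit that rolls into the next column
    if carry:
        result.append("1")
    return "".join(reversed(result))

print(add_binary("1011", "110"))  # '10001'  (11 + 6 = 17)
```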

Universal language

All of this still does not cover his achievements as a thought leader of computer science. In 1686 Leibniz created his formal algebra of thought, which from a deductive point of view is equivalent to the much later Boolean algebra of 1847. The truth values 0 and 1 are combined by elementary operations such as “and” and “or” into sometimes very complex expressions. This in turn laid the foundation for the first formal language (1879) by Gottlob Frege and thus ultimately for theoretical computer science as well. Bertrand Russell wrote that Leibniz advanced formal logic “in a way that has not been seen since Aristotle”.
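
For illustration, a short Python sketch (with an invented helper name) that tabulates such an expression over all combinations of the truth values 0 and 1, i.e. prints its truth table.

```python
from itertools import product

def truth_table(expression, variables):
    """Evaluate a Boolean expression for every combination of truth values."""
    for values in product([0, 1], repeat=len(variables)):
        env = dict(zip(variables, values))
        # int() keeps the result in the two symbols 0 and 1
        print(values, int(eval(expression, {}, env)))

# (a and b) or (not c) -- a small compound expression over 0 and 1
truth_table("(a and b) or (not c)", ["a", "b", "c"])
```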

But what actually drove Leibniz? Throughout his life he pursued the ambitious project of settling all conceivable questions by calculation. Inspired by the 13th-century scholar Ramon Llull, he developed highly influential ideas on a universal language and a general calculus of reasoning (Characteristica Universalis & Calculus Ratiocinator). The cybernetics pioneer Norbert Wiener once said: "In fact, the general idea of a calculating machine is nothing more than a mechanization of Leibniz's Calculus Ratiocinator."