In 1968, Moore, a doctor of chemistry, founded NM Electronics together with physicist Robert Noyce, nicknamed the "Mayor of Silicon Valley".

A few months later, the two men bought the Intel name for $15,000.

Gordon Moore was the company's CEO from 1979 to 1987.

In 1971, Intel released the first microprocessor: a programmable processor containing several thousand transistors, the equivalent of a computer on a chip. It was a revolution.

Intel is now the largest semiconductor manufacturer in the United States and the third largest in the world by revenue, behind South Korea's Samsung and Taiwan's TSMC.

In 1965, while working for another company, Fairchild Semiconductor, Gordon Moore predicted in an article published by Electronics magazine that the density of transistors on integrated circuits would double every year.

He revised his projection in 1975, just as empirically, settling on a doubling every two years. Another microchip pioneer, Carver Mead, dubbed this prediction Moore's Law.

The evolution of microprocessor capabilities has followed Moore's Law for decades, multiplying the performance of electronics and computing while lowering costs.
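To illustrate the compounding behind that trend, the short Python sketch below projects an idealized doubling every two years, starting from the roughly 2,300 transistors of Intel's first 1971 microprocessor; the starting figure and the projection are illustrative assumptions, not AFP data.

```python
# Rough illustration of Moore's Law compounding: a doubling every two years.
# The 1971 starting count (~2,300 transistors, Intel's first microprocessor)
# is an assumption used for illustration; the output is the idealized law,
# not measured industry data.

START_YEAR = 1971
START_TRANSISTORS = 2_300

def projected_transistors(year: int) -> int:
    """Idealized Moore's Law projection: double the count every two years."""
    doublings = (year - START_YEAR) / 2
    return round(START_TRANSISTORS * 2 ** doublings)

for year in range(1971, 2031, 10):
    print(f"{year}: ~{projected_transistors(year):,} transistors")
```

Fifty years of such doublings compound to a factor of about 2^25, roughly 30 million, which is why the improvements in performance and cost span so many orders of magnitude.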

According to several estimates, the cost of a transistor has fallen by a factor of several hundred million since the early 1960s.

This progress put computers and electronics within everyone's reach, first with personal computers, then with a succession of devices, up to and including the mobile phone.

"The world has lost a giant in Gordon Moore, one of Silicon Valley's leaders and a true visionary, who paved the way for the technological revolution," Apple CEO Tim Cook tweeted.

Experts predict that Moore's Law will soon cease to apply because of physical limits on how densely transistors can be packed onto a microprocessor.

© 2023 AFP