Nvidia officially, albeit briefly, joined the trillion-dollar club during trading on the US stock exchange last Tuesday, May 30. The members of this club are the world's most valuable companies, those whose market value clearly exceeds the trillion-dollar barrier, and it includes only five: Apple, Microsoft, Saudi Aramco, Alphabet (Google), and Amazon. Nvidia now sits in sixth place just behind them, with a market capitalization of just under $1 trillion, at about $982 billion at the time of writing (1).

This came after the company's shares surged on the announcement of its results for the first quarter of the current fiscal year, in which it recorded profits higher than market expectations. More importantly, the company projected that its sales would rise to $11 billion during the second quarter of the year, while previous estimates had been significantly lower, at only $7.15 billion (2).

In October of last year, Nvidia's market value stood at around $300 billion, and by the end of the year it had reached $364 billion (3). This means the company's market capitalization has grown by about 172% since the end of 2022 and by 230% since last October, outperforming every other company in the S&P 500, a stock index that tracks the 500 largest American companies (4).

Putting Nvidia's numbers in context against those giant companies makes clear the magnitude of what it has achieved: the journey from breaking the $500 billion barrier for the first time in the third quarter of 2021 to reaching a trillion dollars took Nvidia only about 500 days, roughly a third of the time it took Apple to make the same journey (5).

All of this is impressive, of course, but some legitimate questions may now come to mind: What is Nvidia, anyway? What does it produce? And more importantly, how did it reach such a huge market value and join the club of the world's biggest companies in such a short time?

NVIDIA's Journey

NVIDIA was founded in 1993. (AFP)

Unlike the other giants of the trillion-dollar club, such as Apple, Google, and Microsoft, the name "Nvidia" is little known outside circles interested in video games and technology in general. Yet there is a good chance you are now using a computer that contains an Nvidia chip, because we are talking about the most valuable chipmaker in the world at the moment, one that specializes in producing graphics processing units (GPUs), better known as the "graphics cards" inside personal computers.

Nvidia was founded in 1993 at a Denny's restaurant in San Jose, California, where engineer Jensen Huang met his friends Chris Malachowsky and Curtis Priem (6). There, the three discussed their vision for the future of GPUs and for uses that might go beyond games and graphics rendering, because they saw the potential to apply them to computing workloads that demand high performance, to scientific research requiring simulation and complex calculations, and to other tasks that call for powerful processing.

Over the past two decades, Nvidia's graphics cards helped transform the video game industry by delivering tremendous in-game graphics processing power. And after proving its worth as the leader in graphics processing for games, the company began exploring how to push graphics processing toward the broader uses its founders had envisioned from the start.

Jensen Huang, founder and CEO of Nvidia. (Reuters)

Little by little, the uses of graphics cards diversified and demand for them grew, especially during 2020, when the COVID-19 pandemic kept most people isolated at home and turning to video games as a way to escape and pass the time. In the same period, cryptocurrency prices climbed to record highs, and with them demand for Nvidia's cards, pushing the company's market value to the highest level in its history until then, surpassing the $820 billion barrier by November 2021 (7).

But as life returned to normal, and amid the collapse of cryptocurrency prices, successive economic crises, and the rout of technology stocks during 2022, Nvidia's shares began to slide, until the company had lost more than half its value and fallen below $300 billion in October of last year. By the end of the following month, however, something surprising happened that changed the entire market!

At the end of November of that year, OpenAI launched the first public version of its new chatbot, ChatGPT (8), the famous bot built on generative artificial intelligence. From that moment, a fierce battle began in this field among the biggest technology companies, as we have all followed over the past months. All well and good, but what does any of this have to do with Nvidia?

The AI War

What distinguishes graphics processors (GPUs), compared to central processing units (CPUs), is that they excel at performing many calculations at the same time. (Reuters)

The large language models behind chatbots like GPT-4 need powerful cloud services, which in turn rely on giant machines, housed in data centers, to train on all this massive data. Nvidia's GPUs are the ideal solution for accelerating the complex computations that take place inside that data center hardware.

AI model developers quickly realized that Nvidia's graphics processors were enormously useful and powerful for the complex operations underpinning modern AI systems. What distinguishes GPUs from CPUs is that they excel at performing many calculations at the same time. Cryptocurrency miners had made the same discovery: GPUs proved powerful and efficient at accelerating mining, a process that requires very complex calculations to unlock coins such as Bitcoin and Ethereum.
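To see that difference concretely, here is a minimal CUDA sketch of our own (an illustration, not drawn from any of the sources above): where a CPU would square a million numbers one at a time in a loop, the GPU version launches one lightweight thread per element, so the multiplications run side by side.

```cuda
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// GPU kernel: each thread squares exactly one element of the array,
// so the million multiplications can proceed side by side instead of
// one after another as they would in a CPU loop.
__global__ void squareKernel(float* data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] = data[i] * data[i];
}

int main() {
    const int n = 1 << 20;  // about one million elements
    const size_t bytes = n * sizeof(float);

    float* host = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) host[i] = (float)i;

    // The serial CPU equivalent would be:
    //   for (int i = 0; i < n; ++i) host[i] *= host[i];

    float* device;
    cudaMalloc(&device, bytes);
    cudaMemcpy(device, host, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    const int threadsPerBlock = 256;
    const int blocks = (n + threadsPerBlock - 1) / threadsPerBlock;
    squareKernel<<<blocks, threadsPerBlock>>>(device, n);
    cudaDeviceSynchronize();

    cudaMemcpy(host, device, bytes, cudaMemcpyDeviceToHost);
    printf("host[3] = %.1f\n", host[3]);  // expect 9.0

    cudaFree(device);
    free(host);
    return 0;
}
```

This one-thread-per-element pattern is precisely why GPUs suit both mining and neural network training, where the bulk of the work is millions of independent arithmetic operations.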

OpenAI, for example, develops its language models on Microsoft's cloud services, so Microsoft needs Nvidia processors in large quantities, running into thousands of units, to provide that enormous processing power, whether for OpenAI or for any other company that wants to build its own large language models.

With giant companies such as Microsoft and Google now locked in the artificial intelligence race, the biggest beneficiary is Nvidia itself, so much so that Jensen Huang, the company's founder and CEO, described the current stage as the "iPhone moment" of computing, likening the field's growth to the exponential rise of smartphones after Apple launched the first iPhone in 2007 (9).

Naturally, the surge in demand for graphics processing units has driven Nvidia's shares up over the past months; the price of some of these advanced units has reached more than $33,000, and in some markets they can cost even more amid the heavy demand (10). What excites investors now is that after months of talk, arguments, and debate about AI, money is finally pouring in, so they know where to put theirs, at least into an already strong company like Nvidia.

More exciting still, Nvidia's good times have not yet peaked: analysts see the AI boom as holding a greater and more sustainable promise for Nvidia's future than cryptocurrencies ever did, and no real competitor can currently match what it offers to meet the complex processing requirements of generative AI systems.

Well, the question now is: how did Nvidia manage to reach such a commanding position and dominate the market for graphics processors, in the field of artificial intelligence in particular?

The story of dominance

Over the years Nvidia spent between video games and cryptocurrencies, the company was quietly and steadily becoming, first and foremost, an artificial intelligence company. Its entry into artificial intelligence, or machine learning, began with a "cat" in 2010, or at least so the legend goes (11).

Not a cat in the literal sense, of course. The story goes that Bill Dally, Nvidia's chief scientist, was having breakfast with his former Stanford University colleague, computer scientist Andrew Ng, who was working on a Google project that was trying to find cats on the internet.

The project aimed to build a neural network that could learn on its own by watching tens of millions of YouTube videos, teaching itself to recognize human faces and bodies and the shapes of cats. The problem lay in the learning process itself: for it to happen accurately, it required running a supercomputer with thousands of CPUs, the processors we know from ordinary computers. Dally wagered at the time that Nvidia could do the same job with just a few graphics processing units.

As he had predicted, Dally succeeded in his mission, training the model with just 12 GPUs and proving to his friend Andrew that those units were faster and more efficient at teaching his model to identify cats. That was when Nvidia realized what it could achieve in the world of artificial intelligence and machine learning, and decided to direct its efforts toward the field.


But things are rarely that simple. The more grounded story is that Nvidia began breaking into this field back in 2006, when it launched CUDA, short for "Compute Unified Device Architecture," a platform and programming model that became one of the revolutionary milestones in the company's history (12).

Simply put, CUDA is a platform that lets programmers harness the parallel processing power of graphics cards for general-purpose computing, opening them up to tasks far beyond graphics and games. To picture it, imagine a very large Excel spreadsheet with thousands of rows and columns, where each cell needs a complex calculation. Ordinarily, your computer's processor would work through these calculations one by one, which takes a long time.

With CUDA, you can enlist your machine's graphics card to help: divide the spreadsheet into smaller parts and assign each part to a portion of the card, so that many of those calculations run simultaneously. This speeds things up because the graphics card has thousands of cores (CUDA cores) that can work together on different parts of the problem at the same time.
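Carrying the spreadsheet analogy into code, the hypothetical sketch below launches a two-dimensional grid of threads so that each thread computes one "cell" independently; the formula inside the kernel is just a stand-in for whatever per-cell calculation the sheet might need.

```cuda
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

#define ROWS 1024
#define COLS 1024

// Each thread computes a single "cell" of the sheet; the grid of
// thread blocks tiles the entire ROWS x COLS table at once.
__global__ void computeCells(float* sheet) {
    int col = blockIdx.x * blockDim.x + threadIdx.x;
    int row = blockIdx.y * blockDim.y + threadIdx.y;
    if (row < ROWS && col < COLS) {
        // Stand-in for the "complex per-cell calculation".
        sheet[row * COLS + col] = sqrtf((float)(row * row + col * col));
    }
}

int main() {
    const size_t bytes = ROWS * COLS * sizeof(float);
    float* device;
    cudaMalloc(&device, bytes);

    // 16x16 threads per block; enough blocks to cover every cell.
    dim3 threadsPerBlock(16, 16);
    dim3 blocks((COLS + 15) / 16, (ROWS + 15) / 16);
    computeCells<<<blocks, threadsPerBlock>>>(device);
    cudaDeviceSynchronize();

    float* host = (float*)malloc(bytes);
    cudaMemcpy(host, device, bytes, cudaMemcpyDeviceToHost);
    printf("cell(3,4) = %.1f\n", host[3 * COLS + 4]);  // expect 5.0

    cudaFree(device);
    free(host);
    return 0;
}
```

Run on a card with thousands of CUDA cores, the million-plus cells are processed in waves of thousands at a time rather than one after another.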

The same idea applies to cryptocurrency mining, machine learning models, and other heavy processing; even a company like Netflix leverages GPUs in the AI models that filter the content on your homepage (13).

All of these complex things, which we have tried to simplify as much as we can, were happening in the background, perhaps without us knowing they were happening at all. What thrust Nvidia and its role in artificial intelligence into the spotlight recently was the launch of OpenAI's chatbot ChatGPT, which made the whole world suddenly notice that Nvidia sits at the heart of this complex process, and see this giant company for what it is, or at least that is what investors saw and reacted to so clearly.

________________________________

Sources:

  1. Nvidia crosses into $1 trillion market cap before giving back gains
  2. NVIDIA Announces Financial Results for First Quarter Fiscal 2024
  3. Market capitalization of NVIDIA
  4. Nvidia briefly joins $1 trillion valuation club
  5. Nvidia crashes $1 trillion party, perhaps not for long
  6. How Nvidia's CEO Cooked Up America's Biggest Semiconductor Company
  7. Market capitalization of NVIDIA
  8. ChatGPT chatbot: The AI Revolution Emerges from the Lab to Public Life
  9. Nvidia Q1 2024 Earnings Call Transcript
  10. The AI Boom Runs on Chips, but It Can't Get Enough
  11. NVIDIA and the battle for the future of AI chips
  12. CUDA Zone
  13. How Netflix Uses AI to Find Your Next Binge-Worthy Show