AI computing power has grown 680 million-fold in 70 years: three historical eras behind the exponential explosion of AI technology

**Source:** Xinzhiyuan

A single chart shows that AI computing power has grown roughly 670 million-fold in more than 70 years. In the future, AI will surpass human capabilities in every respect. What is really exciting is that the AI industry has only just entered the embryonic stage that precedes its explosive growth.

Electronic computers were born in the 1940s, and within ten years of their emergence, the first AI application in human history appeared.

More than 70 years later, AI models can now not only write poems, but also generate images based on text prompts, and even help humans discover unknown protein structures.

So, what has driven the exponential growth of AI technology in such a short period of time?

A long chart from Our World in Data traces the history of AI development, using the amount of computation required to train AI models as its scale.

High-resolution version of the chart:

The data in the chart come from a paper published by researchers at MIT and other universities.

Paper address:

In addition to the paper, another research team has built an interactive table from the paper's data; the chart can be zoomed in and out to view detailed figures.

Table address:

The chart's authors estimate the training compute of each model mainly from operation counts and GPU time (a rough sketch of the GPU-time approach is given after the list below). Whether a model counts as a milestone system is decided by three criteria:

Significance: the system had a notable historical impact, substantially improved on the state of the art (SOTA), or has been cited more than 1,000 times.

Relevance: only papers that contain experimental results and key machine learning components, and whose goal is to advance the existing SOTA, are included.

Uniqueness: if another paper describing the same system is more influential, this one is excluded from the dataset.
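To make the GPU-time estimate concrete, here is a minimal sketch of that style of calculation. The function name and all numbers below are illustrative assumptions for this article, not values taken from the paper:

```python
# Rough sketch of a "GPU time" compute estimate:
# total FLOPs ≈ number of GPUs × peak FLOP/s per GPU × utilization × training seconds.
# All values are hypothetical; the paper's own per-model estimates differ.

def estimate_training_flops(num_gpus: int,
                            peak_flops_per_gpu: float,
                            utilization: float,
                            training_days: float) -> float:
    """Estimate total training compute from hardware specs and wall-clock time."""
    seconds = training_days * 24 * 3600
    return num_gpus * peak_flops_per_gpu * utilization * seconds

# Hypothetical run: 1,000 GPUs at 100 TFLOP/s peak, 30% utilization, 30 days.
print(f"{estimate_training_flops(1_000, 1e14, 0.3, 30):.2e} FLOPs")  # ≈ 7.78e+22 FLOPs
```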

### Three eras of AI development

In the 1950s, American mathematician Claude Shannon trained a robotic mouse named Theseus to navigate a maze and remember its paths—the first example of artificial learning.

Theseus was built on just 40 floating-point operations (FLOPs). FLOPs count the basic arithmetic operations a computer performs, and the total number of FLOPs used in training is a common measure of how much computation went into an AI system: the higher the number, the greater the computing power involved and the more capable the system tends to be.

Computing power, available training data, and algorithms are the three major drivers of AI progress. In the early decades of AI development, the computation required for training grew roughly in line with Moore's Law, doubling about every 20 months.

In 2012, AlexNet, an image-recognition system, marked the beginning of the deep learning era. From then on, the doubling time shortened dramatically to about six months as researchers invested more heavily in computation and processors.

With the emergence of AlphaGo in 2015, a computer program that defeated professional human Go players, researchers identified a third era: the era of large-scale AI models whose computational demands exceed those of all previous AI systems.
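As a back-of-the-envelope illustration of what these two doubling times imply (the ten-year window is simply an example chosen here):

```python
# How much training compute multiplies over a given period at a given doubling time.
def growth_factor(years: float, doubling_months: float) -> float:
    return 2 ** (years * 12 / doubling_months)

print(f"20-month doubling over 10 years: ~{growth_factor(10, 20):,.0f}x")  # ~64x
print(f"6-month doubling over 10 years:  ~{growth_factor(10, 6):,.0f}x")   # ~1,048,576x
```

The same decade yields roughly a 64-fold increase under the pre-2012 trend, but about a million-fold increase under the deep-learning-era trend, which is why the eras look so different on a logarithmic chart.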

### Future progress of AI technology

Looking back over the last decade, computing power has grown so fast it’s almost mind-boggling.

For example, the computing power used to train Minerva, an AI that can solve complex mathematical problems, was almost 6 million times that used to train AlexNet a decade ago.
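That ratio can be sanity-checked with approximate training-compute figures. The two values below are rough order-of-magnitude estimates assumed here for illustration (as commonly attributed to the public Epoch dataset), not numbers quoted in this article:

```python
# Approximate training compute, assumed values (order-of-magnitude estimates).
alexnet_flops = 4.7e17   # AlexNet, 2012
minerva_flops = 2.7e24   # Minerva (540B), 2022

print(f"Minerva / AlexNet ≈ {minerva_flops / alexnet_flops:.1e}")  # ≈ 5.7e+06, i.e. nearly 6 million times
```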

This growth in computation, coupled with vast available datasets and better algorithms, has allowed AI to make enormous progress in an extremely short period. Today, AI not only reaches human performance levels but surpasses humans in many fields.

### AI capabilities will continue to surpass humans in all aspects

As the chart above makes clear, AI has already surpassed human performance in many areas and will soon do so in others.

The figure below shows the year in which AI reached or exceeded human-level performance on capabilities commonly used in daily work and life.

### AI technology still has ample room to develop

It is hard to say whether the growth in computation will maintain its current pace. Large-scale models require more and more computing power to train, and if the supply of computing power cannot keep growing, progress in AI technology may slow down.

Likewise, exhausting the data currently available for training AI models could also hinder the development and deployment of new models.

However, in 2023 a large amount of capital poured into the AI industry, especially into generative AI represented by large language models, and more breakthroughs may well be on the way. All three of the drivers of AI progress mentioned above look set to be further optimized and developed.

In the first half of 2023, funding for startups in the AI industry reached US$14 billion, more than the total funding raised over the previous four years.

Of the more than 360 generative AI companies, the great majority (78%) are still at a very early stage of development, and 27% have not yet raised any funding at all. More than half are early-stage or earlier projects, indicating that the entire generative AI industry is still very young.

Because developing large language models is capital-intensive, the generative AI infrastructure category has received over 70% of funding since Q3 2022 while accounting for only about 10% of all generative AI deals. Much of that funding reflects investor interest in emerging infrastructure such as foundation models and APIs, MLOps (machine learning operations), and vector database technology.

References:
