AI computing power has increased 680-million-fold in 70 years, and three historical eras have witnessed the exponential explosion of AI technology
**Source:** Xinzhiyuan
Electronic computers were born in the 1940s, and within ten years of their emergence the first AI application in human history appeared.
More than 70 years later, AI models can not only write poems but also generate images from text prompts, and even help humans discover previously unknown protein structures.
So, what has driven the exponential growth of AI technology in such a short period of time?
A long chart from Our World in Data traces the history of AI development, using the computing power consumed to train AI models as its yardstick.
The data in the figure come from a paper published by researchers at MIT and other universities.
Beyond the paper itself, another research team has built an interactive visualization of the same data; its icons can be zoomed in and out to reveal the detailed figures.
The chart's authors estimate the training compute of each model mainly in two ways: by counting the number of operations, or from GPU time (a rough sketch of both approaches follows the list below). Whether a model qualifies as a milestone is decided by three criteria:
Significance: the system had notable historical impact, substantially improved on the state of the art (SOTA), or has been cited more than 1,000 times.
Relevance: only papers that contain experimental results and key machine-learning components, and whose goal is to advance the existing SOTA, are included.
Uniqueness: if another paper describing the same system is more influential, the less influential paper is excluded from the dataset.
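As a rough illustration of the two estimation approaches, here is a minimal Python sketch. All function names and numbers are hypothetical placeholders, not values or methods from the paper; the 6-FLOPs-per-parameter-per-token rule is a common approximation for dense transformer training, not necessarily the one the authors used.

```python
# Minimal sketch of two common ways to estimate training compute.
# All values below are hypothetical placeholders, not figures from the paper.

def flops_from_operations(params: float, tokens: float) -> float:
    """Operation counting: ~6 FLOPs per parameter per training token,
    a widely used approximation for dense transformer training."""
    return 6 * params * tokens

def flops_from_gpu_time(num_gpus: int, hours: float,
                        peak_flops_per_gpu: float,
                        utilization: float = 0.3) -> float:
    """Hardware accounting: GPU count x wall-clock time x peak throughput,
    scaled by an assumed utilization factor (real runs rarely hit peak)."""
    return num_gpus * hours * 3600 * peak_flops_per_gpu * utilization

# Hypothetical 1B-parameter model trained on 20B tokens:
print(f"{flops_from_operations(1e9, 2e10):.1e} FLOPs")    # 1.2e+20

# Hypothetical run: 64 GPUs, 100 hours, 1e14 peak FLOP/s, 30% utilization:
print(f"{flops_from_gpu_time(64, 100, 1e14):.1e} FLOPs")  # 6.9e+20
```

The utilization factor matters: the hardware-based estimate would overstate compute severalfold if peak throughput were assumed throughout training.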
Three eras of AI development
In the 1950s, the American mathematician Claude Shannon trained a robotic mouse named Theseus to navigate a maze and remember its path; this was the first example of artificial learning.
Theseus was built on just 40 floating-point operations (FLOPs). FLOPs are commonly used as a measure of the computation performed: the higher the count, the more computing power consumed and, broadly, the more powerful the system.
Computing power, available training data, and algorithms are the three major drivers of AI progress. In the early decades of AI development, the computing power required grew in line with Moore's Law, doubling roughly every 20 months. Around 2010, deep learning opened a second era, in which the doubling time shrank to roughly six months.
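To make these paces concrete, here is a quick back-of-the-envelope sketch; the only inputs are the 20-month and 6-month doubling times mentioned above.

```python
def growth_factor(years: float, doubling_months: float) -> float:
    """Total compute growth over `years` at a fixed doubling time."""
    return 2 ** (years * 12 / doubling_months)

# Pre-deep-learning pace: doubling roughly every 20 months.
print(f"{growth_factor(10, 20):.0f}x per decade")   # ~64x

# Deep-learning-era pace: doubling roughly every 6 months.
print(f"{growth_factor(10, 6):.2e}x per decade")    # ~1.05e+06x
```

The contrast shows how a decade at the newer pace can yield a roughly million-fold increase, as the Minerva example below illustrates.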
With the emergence of AlphaGo in 2015, a computer program that defeated professional human Go players, researchers identified a third era: the era of large-scale AI models, whose computational demands exceed those of all previous AI systems.
Future Progress of AI Technology
Looking back over the last decade, the growth in computing power is almost mind-boggling.
For example, the computing power used to train Minerva, an AI that can solve complex mathematical problems, was almost 6 million times that used to train AlexNet a decade ago.
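Working backwards from these two figures alone (a sketch; no other data assumed), the implied doubling time comes out to about five months, consistent with the deep-learning-era pace noted above.

```python
import math

# Figures quoted above: Minerva used ~6 million times AlexNet's training
# compute, with roughly a decade between the two models.
growth, years = 6e6, 10
doublings = math.log2(growth)                 # ~22.5 doublings
months_per_doubling = years * 12 / doublings
print(f"{months_per_doubling:.1f} months per doubling")  # ~5.3
```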
AI capabilities will continue to surpass humans in all aspects
The figure below shows the year in which AI reached or exceeded human-level performance on common capabilities used in daily work and life.
It is hard to say whether this growth in computing can maintain the same pace. Large-scale models require ever more computing power to train, and if the supply of computing power cannot keep growing, progress in AI technology may slow.
Likewise, exhausting the data currently available for training AI models could hinder the development and deployment of new models.
However, in 2023 a large amount of capital poured into the AI industry, especially into generative AI represented by large language models, and more breakthroughs may be on the way. All three elements driving AI progress seem likely to be further optimized and developed in the future.
In the first half of 2023 alone, AI startups raised US$14 billion in funding, more than the total raised over the previous four years.