
Just as there are widely understood empirical laws of nature - for example, what goes up must come down, or every action has an equal and opposite reaction - the field of AI was long defined by a single idea: that more compute, more training data and more parameters make a better AI model.
However, AI has since grown to need three distinct laws that describe how applying compute resources in different ways impacts model performance. Together, these AI scaling laws - pretraining scaling, post-training scaling and test-time scaling, also called long thinking - reflect how the field has evolved with techniques to use additional compute in a wide variety of increasingly complex AI use cases.
The recent rise of test-time scaling - applying more compute at inference time to improve accuracy - has enabled AI reasoning models, a new class of large language models (LLMs) that perform multiple inference passes to work through complex problems, while describing the steps required to solve a task. Test-time scaling requires intensive amounts of computational resources to support AI reasoning, which will drive further demand for accelerated computing.
What Is Pretraining Scaling?

Pretraining scaling is the original law of AI development. It demonstrated that by increasing training dataset size, model parameter count and computational resources, developers could expect predictable improvements in model intelligence and accuracy.
These three elements - data, model size and compute - are interrelated. Per the pretraining scaling law, outlined in this research paper, when larger models are fed with more data, the overall performance of the models improves. To make this feasible, developers must scale up their compute - creating the need for powerful accelerated computing resources to run those larger training workloads.
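The relationship can be sketched as a power law in parameters and tokens. The sketch below uses the Chinchilla-style form loss(N, D) = E + A/N^alpha + B/D^beta with the constants fitted by Hoffmann et al. (2022); treat the constants as illustrative of the pretraining scaling law, not a recipe for any particular model.

```python
# Illustrative Chinchilla-style pretraining scaling law:
#   loss(N, D) = E + A / N**alpha + B / D**beta
# Constants are the published fitted values from Hoffmann et al. (2022),
# used here purely for illustration.

def pretraining_loss(n_params: float, n_tokens: float) -> float:
    """Predicted training loss for a model with n_params parameters
    trained on n_tokens tokens, under the fitted power law."""
    E, A, B = 1.69, 406.4, 410.7
    alpha, beta = 0.34, 0.28
    return E + A / n_params**alpha + B / n_tokens**beta

# Scaling both model size and data lowers the predicted loss.
small = pretraining_loss(1e9, 20e9)     # 1B params, 20B tokens
large = pretraining_loss(70e9, 1.4e12)  # 70B params, 1.4T tokens
assert large < small
```

The key qualitative point is that either term alone eventually dominates: growing only parameters or only data hits diminishing returns, which is why data, model size and compute must scale together.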
This principle of pretraining scaling led to large models that achieved groundbreaking capabilities. It also spurred major innovations in model architecture, including the rise of billion- and trillion-parameter transformer models, mixture of experts models and new distributed training techniques - all demanding significant compute.
And the relevance of the pretraining scaling law continues - as humans continue to produce growing amounts of multimodal data, this trove of text, images, audio, video and sensor information will be used to train powerful future AI models.
[Figure: Pretraining scaling is the foundational principle of AI development, linking the size of models, datasets and compute to AI gains. Mixture of experts is a popular model architecture for AI training.]

What Is Post-Training Scaling?

Pretraining a large foundation model isn't for everyone - it takes significant investment, skilled experts and datasets. But once an organization pretrains and releases a model, it lowers the barrier to AI adoption by enabling others to use the pretrained model as a foundation to adapt for their own applications.
This post-training process drives additional cumulative demand for accelerated computing across enterprises and the broader developer community. Popular open-source models can have hundreds or thousands of derivative models, trained across numerous domains.
Developing this ecosystem of derivative models for a variety of use cases could take around 30x more compute than pretraining the original foundation model.
Post-training techniques can further improve a model's specificity and relevance for an organization's desired use case. While pretraining is like sending an AI model to school to learn foundational skills, post-training enhances the model with skills applicable to its intended job. An LLM, for example, could be post-trained to tackle a task like sentiment analysis or translation - or understand the jargon of a specific domain, like healthcare or law.
The post-training scaling law posits that a pretrained model's performance can further improve - in computational efficiency, accuracy or domain specificity - using techniques including fine-tuning, pruning, quantization, distillation, reinforcement learning and synthetic data augmentation.
Fine-tuning uses additional training data to tailor an AI model for specific domains and applications. This can be done using an organization's internal datasets, or with pairs of sample model inputs and outputs.
Distillation requires a pair of AI models: a large, complex teacher model and a lightweight student model. In the most common distillation technique, called offline distillation, the student model learns to mimic the outputs of a pretrained teacher model.
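The offline-distillation objective can be made concrete with a short sketch: the student is trained to minimize the KL divergence between its temperature-softened output distribution and the teacher's. The toy logits and temperature value below are illustrative, not drawn from any real model.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; a higher temperature softens the
    distribution, exposing more of the teacher's 'dark knowledge'."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence from teacher to student over softened distributions;
    minimizing this trains the student to mimic the teacher's outputs."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

teacher = [2.0, 1.0, 0.1]
# A student whose logits already match the teacher's incurs zero loss;
# a mismatched student incurs a positive loss to be minimized.
assert abs(distillation_loss(teacher, teacher)) < 1e-12
assert distillation_loss(teacher, [0.0, 0.0, 0.0]) > 0.0
```

In practice this loss is backpropagated through the student only; the teacher's weights stay frozen, which is what makes the technique attractive for producing lightweight deployable models.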
Reinforcement learning, or RL, is a machine learning technique that uses a reward model to train an agent to make decisions that align with a specific use case. The agent aims to make decisions that maximize cumulative rewards over time as it interacts with an environment - for example, a chatbot LLM that is positively reinforced by thumbs up reactions from users. This technique is known as reinforcement learning from human feedback (RLHF). Another, newer technique, reinforcement learning from AI feedback (RLAIF), instead uses feedback from AI models to guide the learning process, streamlining post-training efforts.
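The reward-maximizing loop described above can be illustrated with a minimal epsilon-greedy sketch: an agent chooses among candidate response styles, observes a simulated thumbs-up or thumbs-down, and updates its reward estimates. The thumbs-up rates and the simulated user are made up for illustration; real RLHF trains a policy model against a learned reward model rather than a bandit.

```python
import random

def run_feedback_loop(thumbs_up_rates, steps=2000, epsilon=0.1, seed=0):
    """Epsilon-greedy loop: pick a response style, observe a simulated
    thumbs-up (reward 1) or thumbs-down (reward 0), and update a running
    estimate of each style's cumulative reward."""
    rng = random.Random(seed)
    counts = [0] * len(thumbs_up_rates)
    estimates = [0.0] * len(thumbs_up_rates)
    for _ in range(steps):
        if rng.random() < epsilon:  # explore a random style
            arm = rng.randrange(len(thumbs_up_rates))
        else:                       # exploit the best estimate so far
            arm = max(range(len(estimates)), key=estimates.__getitem__)
        reward = 1.0 if rng.random() < thumbs_up_rates[arm] else 0.0
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
    return estimates

# Style 2 earns thumbs-up most often, so the agent learns to favor it.
estimates = run_feedback_loop([0.2, 0.4, 0.9])
```

The same explore-update-exploit structure underlies RLHF and RLAIF; only the source of the reward signal differs (human reactions versus an AI judge).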
Best-of-n sampling generates multiple outputs from a language model and selects the one with the highest reward score based on a reward model. It's often used to improve an AI's outputs without modifying model parameters, offering an alternative to fine-tuning with reinforcement learning.
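The mechanism is simple enough to sketch directly: sample n candidates, score each with the reward model, return the highest scorer; no model parameters change. The generator and reward model below are stand-ins invented for illustration (a toy reward that prefers longer answers).

```python
import random

def best_of_n(prompt, generate, reward_model, n=8, seed=0):
    """Sample n candidate outputs and return the one the reward model
    scores highest; the underlying model is never modified."""
    rng = random.Random(seed)
    candidates = [generate(prompt, rng) for _ in range(n)]
    return max(candidates, key=reward_model)

# Hypothetical stand-ins: the "model" emits a random candidate string,
# and the "reward model" simply prefers longer responses.
def toy_generate(prompt, rng):
    return prompt + " " + rng.choice(
        ["ok", "good answer", "a very detailed answer"])

def toy_reward(text):
    return len(text)

best = best_of_n("Q:", toy_generate, toy_reward, n=8)
```

Because selection happens purely at inference time, best-of-n trades extra compute per query for quality, which is also why it is often cited as a simple form of test-time scaling.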
Search methods explore a range of potential decision paths before selecting a final output. This post-training technique can iteratively improve the model's responses.