Microsoft CTO Kevin Scott on How Far Scaling Laws Will Extend | Training Data

Published: 09 July 2024
on channel: Sequoia Capital
Views: 40,310 | Likes: 881

The current LLM era is the result of scaling the size of models in successive waves (and the compute to train them). It is also the result of better-than-Moore's-Law price-to-performance gains in each new generation of Nvidia GPUs. The largest platform companies continue to invest in scaling as the prime driver of AI innovation.

Are they right, or will marginal returns level off soon, leaving hyperscalers with too much hardware and too few customer use cases? To find out, we talk to Microsoft CTO Kevin Scott, who has led the company's AI strategy for the past seven years. Scott describes himself as a "short-term pessimist, long-term optimist," and he sees the scaling trend as durable for the industry and critical to establishing Microsoft's AI platform.

Scott believes there will be a shift across the compute ecosystem from training to inference as the frontier models continue to improve, serving wider and more reliable use cases. He also discusses the coming business models for training data, and even what ad units might look like for autonomous agents.

Hosted by: Pat Grady and Bill Coughran, Sequoia Capital

00:00 - Introduction
01:20 - Kevin's backstory
06:56 - The role of PhDs in AI engineering
09:56 - Microsoft's AI strategy
12:40 - Highlights and lowlights
16:28 - Accelerating investments
18:38 - The OpenAI partnership
22:46 - Soon inference will dwarf training
27:56 - Will the demand/supply balance change?
30:51 - Business models for data
36:54 - The value function
39:58 - Copilots
44:47 - The 98/2 rule
49:34 - Solving zero-sum games
57:13 - Lightning round
