Progress in computing is often described with numbers alone: megahertz, gigabytes, teraflops. But the real meaning of progress isn’t found in statistics — it’s found in how the change felt to users, and in which bottleneck happened to give way at the time.
From Vacuum Tubes to Silicon
The 1960s were a decade of upheaval, even if ordinary people didn’t notice it yet. The transistor replaced the vacuum tube, integrated circuits emerged, and in 1965 Gordon Moore put forward an observation that later became a “law”: the number of transistors on a chip doubles at a regular interval, roughly every year in his original estimate and about every two years in his later revision. By today’s standards the machines were painfully slow, but the relative pace of improvement was enormous. Each year brought something that had previously been impossible.
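To make the doubling concrete, here is a minimal sketch of how a count compounds when it doubles every two years; the 1971 baseline of roughly 2,300 transistors (about the Intel 4004) and the fixed two-year period are illustrative assumptions for the example, not a claim about any specific product line.

```python
# Illustrative sketch of Moore's-Law-style growth: a count that doubles
# every two years. The 1971 baseline of ~2,300 transistors (roughly the
# Intel 4004) and the exact period are assumptions for this example.
start_year, start_count = 1971, 2_300
doubling_period_years = 2

for year in range(start_year, start_year + 21, 4):
    doublings = (year - start_year) / doubling_period_years
    print(f"{year}: ~{start_count * 2 ** doublings:,.0f} transistors")
```

Even from a tiny baseline, ten doublings already means a roughly thousandfold increase, which is why the relative pace of that era felt so dramatic.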
The Microprocessor Changes Everything
The 1970s aren’t the most colorful chapter in popular history, but they were essential. Intel’s 4004 in 1971 was the first commercial microprocessor, followed by the 8008 and 8080, and eventually machines like the Altair 8800 — one of the early turning points for hobbyists and the home-computing wave. Just as important was memory development such as Intel’s 1103 DRAM: without cheap RAM, the microprocessor might have remained an industrial curiosity. The 1970s laid the foundation on which the next decade built a revolution.
The Short but Bright Era of Home Micros
In the early 1980s, computers arrived in living rooms. The Commodore 64, the MSX machines, and later in the decade the Atari ST and Amiga 500 put real computing power within reach of ordinary people. At the same time, the IBM PC and the first Macintosh set their own trajectories — and PC compatibles gradually became the standard that eventually took over the market. The era of home micros was brief, but its cultural impact was massive: an entire generation grew up with computers, and that generation later built much of what came next.
Home micros also made BASIC almost a kitchen-table language. Many of today’s software veterans wrote their first lines of code in BASIC during those years.
The Pentium Wars and the Megahertz Race
From the mid-1990s onward, progress was perhaps at its most visible to everyday users. Competition between Intel and AMD pushed clock speeds upward at an almost exponential pace. You could buy a computer, and a year and a half later the same money would buy roughly twice the performance. Comparisons were easy: megahertz told the whole story. This was the era when Moore’s Law felt real in daily life — not just on circuit diagrams.
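As a back-of-the-envelope check on how quickly that compounds, here is a small sketch assuming an idealized doubling every 18 months; real product cycles only approximated this, and the 1.0 baseline is arbitrary.

```python
# Back-of-the-envelope: performance relative to an arbitrary 1.0 baseline,
# assuming an idealized doubling every 18 months (real products only
# approximated this).
doubling_period_months = 18

for months in (18, 36, 54, 72):
    factor = 2 ** (months / doubling_period_months)
    print(f"after {months} months: ~{factor:.0f}x the baseline")
```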
Hitting the Wall
In the mid-2000s, rising clock speeds ran into the limits of physics. Dennard scaling broke down, and heat dissipation became an unavoidable problem. The response was multi-core processors: if a single core couldn’t be made much faster, you put several in parallel. For users this was less obvious, because taking advantage of parallelism required software to be redesigned. To many everyday users, the years from roughly 2005 to 2015 felt like CPU progress had stalled — even though it continued, just in a different direction.
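As a minimal sketch of what that redesign looks like in code, the example below runs the same CPU-bound jobs serially and then across a process pool; the work function and the pool size are illustrative assumptions, not a description of any particular software of the era.

```python
# Minimal sketch: the same CPU-bound work done serially and in parallel.
# The work function and the pool size of 4 are illustrative assumptions.
from concurrent.futures import ProcessPoolExecutor

def busy_work(n: int) -> int:
    # Deliberately CPU-bound: sum of squares up to n.
    return sum(i * i for i in range(n))

def run_serial(jobs):
    # One core does everything, job after job.
    return [busy_work(n) for n in jobs]

def run_parallel(jobs, workers=4):
    # The jobs must first be split into independent pieces; only then
    # can several cores work on them at once.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(busy_work, jobs))

if __name__ == "__main__":
    jobs = [1_000_000] * 8
    assert run_serial(jobs) == run_parallel(jobs)
```

A faster clock speeds up run_serial for free; extra cores only help once the program is reorganized along the lines of run_parallel, which is why the shift was so much less visible to users.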
At the same time, the user experience changed dramatically in another way. The spread of SSDs in the 2010s was, for many people, as dramatic as the Pentium Wars had been on the CPU side. A machine that used to boot in a minute suddenly woke up in ten seconds. The processor didn’t get faster — the bottleneck simply moved. Meanwhile, part of “performance” shifted into the network: more and more everyday software lived in the browser, backed by data centers.
A New Acceleration from the Side
The rise of GPU computing began as early as CUDA in 2007, and the deep-learning breakthrough of AlexNet in 2012 gave it a clear direction. But the real explosion happened in the 2020s with the revolution of large language models. Suddenly compute was scarce again — but this time the story wasn’t about general-purpose CPUs, it was about specialized parallel hardware.
That’s the paradox of the present: a five-year-old computer can feel almost as good as a new one for everyday tasks, yet for AI workloads the difference can be enormous. The center of gravity has moved to where the demand is strongest.
When Was Progress the Fastest?
There’s no single answer, because comparing different eras is like comparing apples to oranges. The 1960s were the most dramatic in relative terms, the 1990s were the most visible, and the 2020s are the most specialized. The metric changes with time: first transistors, then megahertz, then cores — and now TOPS figures.
And above all, the experience is personal. For one person the biggest leap was moving from cassette tape to floppy disk, for another it was the “instant wake-up” of SSDs, and for a third it was the first time a language model answered in a way that actually made sense. How “intense” progress feels depends on who measures it — and from where.