Monday, November 10, 2025

Is Moore’s Law Obsolete? Rethinking Moore’s Law in the Age of ICT Maturity


For half a century, Moore’s Law was the gospel of the computing world. First articulated in 1965 by Gordon Moore, then at Fairchild Semiconductor and later a co-founder of Intel, it predicted that the number of transistors on a chip would double at a regular cadence (roughly every two years, in Moore’s 1975 revision), bringing faster performance and lower costs. This simple observation became a doctrine, driving the evolution of the global ICT industry—from mainframes to smartphones, cloud computing to artificial intelligence. It was both an engineering benchmark and an economic engine, inspiring generations of technologists to chase exponential progress.

Yet in 2025, the very foundation of that belief stands on uncertain ground. The exponential era that Moore described is no longer a law of nature but a historical phase—a golden age that is reaching its thermodynamic, economic, and strategic limits. The question confronting the ICT industry today is not whether innovation continues, but whether it still follows Moore’s curve. Increasingly, the answer appears to be no.

The first cracks in Moore’s Law appeared long before the phrase “AI acceleration” became part of corporate vocabulary. By the mid-2000s, transistor miniaturization began to encounter physical barriers: heat dissipation, quantum tunneling, and power density constraints. Dennard scaling, the companion principle that had kept power density roughly constant as transistors shrank, broke down once supply voltages could no longer be reduced. Frequency gains plateaued, and chipmakers turned to parallelism—more cores, more threads—to sustain growth.
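
A back-of-the-envelope sketch makes the breakdown concrete. Dynamic switching power follows roughly P = C·V²·f; while Dennard scaling held, shrinking transistors cut both capacitance and supply voltage, so power density stayed flat. The Python snippet below uses purely hypothetical numbers, chosen only to show the direction of the trend, to illustrate what changes once voltage stops scaling.

```python
# Illustrative only: every value below is hypothetical, picked to show the trend.

def dynamic_power(capacitance_f, voltage_v, frequency_hz):
    """Classic switching-power approximation: P = C * V^2 * f."""
    return capacitance_f * voltage_v ** 2 * frequency_hz

# Dennard era: a shrink cuts capacitance and supply voltage together,
# so power per transistor falls fast enough to offset doubled density.
old_node = dynamic_power(1.0e-15, 1.2, 2.0e9)
dennard_shrink = dynamic_power(0.7e-15, 0.85, 2.6e9)

# Post-Dennard: capacitance still falls, but voltage is stuck near ~1 V,
# so each faster transistor burns nearly as much power as before.
post_dennard = dynamic_power(0.7e-15, 1.0, 3.5e9)

print(f"Dennard-era shrink: {dennard_shrink / old_node:.2f}x power per transistor")
print(f"Post-Dennard node:  {post_dennard / old_node:.2f}x power per transistor")
```

With per-transistor power no longer falling, the only way to stay inside a fixed thermal budget was to add parallel units instead of raising clock speeds.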

In the short term, this strategy worked. Multicore CPUs and GPUs extended the curve by distributing workloads across multiple processing units. However, this shift masked an underlying truth: performance gains were no longer “free.” Power consumption, heat management, and fabrication costs all rose steeply. The transistor count kept doubling, but each doubling came at exponentially greater cost and complexity.
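
The limits of that workaround can be expressed in one line of arithmetic. Amdahl’s law, a standard result not cited in this article but useful as a reference point, caps the speedup from adding cores by whatever fraction of the workload remains serial; the sketch below assumes a 95 percent parallel fraction purely for illustration.

```python
# Amdahl's law: maximum speedup when only part of the work can be parallelized.

def amdahl_speedup(parallel_fraction, cores):
    """Upper bound on speedup when `parallel_fraction` of the work can use `cores`."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

for cores in (2, 8, 64, 1024):
    # Even with 95% of the workload parallelized, returns diminish quickly.
    print(f"{cores:>5} cores -> {amdahl_speedup(0.95, cores):5.1f}x speedup")
```

Under that assumption, even a thousand cores deliver less than a twentyfold speedup, which is why “more cores” could extend the curve but never restore free, across-the-board gains.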

[Figure: Cost per Transistor (2000–2025)]

By 2020, the pattern had become unmistakable. TSMC’s five-nanometer process required $10 billion in capital investment; the current three-nanometer generation costs closer to $20 billion per fabrication facility. Intel’s roadmap toward “angstrom-class” chips, whose node names are counted in tenths of a nanometer even though the physical features remain larger, demands precision so extreme that quantum effects threaten to destabilize basic circuit function. Moore’s Law, once driven by engineering ingenuity, is now constrained by the physics of matter itself.

And yet, paradoxically, the demand for computational power has never been higher. Artificial intelligence, data analytics, immersive computing, and 5G infrastructure have created an insatiable appetite for processing capability. By some estimates drawing on International Energy Agency figures, the ICT sector already accounts for close to 10 percent of global electricity use, and the growth trajectory continues upward.

So if transistors can no longer shrink indefinitely, how does the ICT industry continue to grow? The answer lies not in the repetition of Moore’s formula but in its reinvention.

In the post-Moore era, the locus of innovation has shifted from hardware scaling to architectural optimization, system design, and specialization. Instead of universal performance gains, we now see targeted performance—hardware tailored to specific workloads.

The rise of AI accelerators exemplifies this transformation. Nvidia’s GPU architecture, once a tool for graphics rendering, became the backbone of deep learning after engineers discovered that neural networks map naturally to massively parallel computation. Similarly, Google’s Tensor Processing Unit (TPU) is a purpose-built chip optimized for machine learning workloads, first inference and, in later generations, training as well. These designs achieve efficiency not through smaller transistors but through domain-specific architecture—proof that the future of performance lies in specialization, not scale.
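
The reason the mapping is so natural is that most of a neural network’s arithmetic is dense linear algebra. The minimal NumPy sketch below, with arbitrary layer sizes chosen only for illustration, shows a fully connected layer reduced to a single matrix multiply, exactly the kind of kernel a GPU or TPU spreads across thousands of parallel units.

```python
import numpy as np

batch, d_in, d_out = 32, 1024, 4096
x = np.random.randn(batch, d_in).astype(np.float32)   # input activations
w = np.random.randn(d_in, d_out).astype(np.float32)   # layer weights
b = np.zeros(d_out, dtype=np.float32)                  # bias

# This matmul is the core kernel an accelerator executes across its
# parallel lanes; on a CPU it runs largely sequentially.
y = np.maximum(x @ w + b, 0.0)  # linear transform + ReLU

print(y.shape)  # (32, 4096)
print(f"{2 * batch * d_in * d_out / 1e6:.0f} MFLOPs for one layer")
```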

A 2024 study from Nature Electronics noted that domain-optimized chips can outperform general-purpose CPUs by 10–100× for dedicated tasks while consuming less energy per computation. This model—known as “More than Moore”—represents the ICT industry’s strategic pivot: growth through intelligence, not raw density.

Even as architectural innovation flourishes, the economic foundation of Moore’s Law is eroding. Historically, transistor doubling also meant cost reduction: more performance per dollar. That link has now broken. Semiconductor economics have entered what researchers at MIT’s Microsystems Technology Laboratories call the “cost inversion era,” where each new process node costs more per transistor than the last.
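
A toy model illustrates the inversion. Cost per transistor is roughly the wafer cost divided by the number of good dice per wafer and the transistors on each die; if wafer prices and yield losses grow faster than density, the ratio stops improving. The node names below are real, but every number is hypothetical and chosen only to show the direction of the trend.

```python
# Toy cost-per-transistor model. All figures are hypothetical placeholders.
nodes = [
    # (name, wafer_cost_usd, transistors_per_mm2_in_millions, yield_fraction)
    ("28 nm", 3_000, 15, 0.90),
    ("7 nm", 9_000, 95, 0.80),
    ("3 nm", 20_000, 220, 0.65),
]

DIE_AREA_MM2 = 100        # fixed die size for comparison
WAFER_AREA_MM2 = 70_000   # roughly the usable area of a 300 mm wafer

for name, wafer_cost, density, yield_fraction in nodes:
    good_dice = (WAFER_AREA_MM2 / DIE_AREA_MM2) * yield_fraction
    transistors_per_die = density * 1e6 * DIE_AREA_MM2
    cost_per_billion = wafer_cost / good_dice / transistors_per_die * 1e9
    print(f"{name}: ${cost_per_billion:,.2f} per billion transistors")
```

With these assumed inputs the cost per transistor falls from 28 nm to 7 nm and then turns back upward at 3 nm, which is the inversion in miniature.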

This inversion is visible across the supply chain. Manufacturing advanced chips requires extreme ultraviolet (EUV) lithography machines costing over $150 million each—produced solely by ASML in the Netherlands. R&D budgets are escalating beyond what any single company can sustain, prompting consolidation across the semiconductor ecosystem. Intel, TSMC, and Samsung now dominate global foundry capacity, while design houses like Nvidia and AMD depend heavily on their fabrication capabilities.

These economic pressures are reshaping the competitive landscape of ICT. Rather than pursuing blanket transistor scaling, firms are investing in heterogeneous integration—stacking chips in 3D configurations, combining different process nodes, and embedding specialized accelerators alongside traditional cores. The era of single-node dominance is ending; the future is modular, adaptive, and collaborative.

The slowing of Moore’s Law also challenges assumptions in software development. For decades, programmers relied on the “free lunch” of hardware improvement—writing inefficient code with the confidence that faster chips would compensate. That luxury has evaporated.

Modern software must now co-evolve with hardware. This reality has revived interest in low-level optimization, parallel programming, and algorithmic efficiency. Companies like OpenAI and DeepMind have pioneered software-level compression techniques, such as quantization, pruning, and distillation, that dramatically reduce compute requirements with little loss in model quality.
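
Quantization is the simplest of these techniques to show. The generic sketch below, which is not any particular lab’s pipeline, stores weights as 8-bit integers plus a single scale factor, cutting memory and bandwidth roughly fourfold relative to 32-bit floats at the cost of a small rounding error.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric per-tensor quantization to int8 with one scale factor."""
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights from the int8 representation."""
    return q.astype(np.float32) * scale

w = np.random.randn(512, 512).astype(np.float32)   # stand-in for layer weights
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

print(f"storage: {w.nbytes / q.nbytes:.0f}x smaller")
print(f"mean absolute rounding error: {np.abs(w - w_hat).mean():.4f}")
```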

Academic research mirrors this shift. A 2024 IEEE Computer Society study found that code optimization and algorithmic improvements now yield performance gains comparable to a full generation of hardware scaling. The implication for ICT is profound: innovation increasingly depends on collaboration between hardware engineers, software developers, and system architects—a co-design paradigm rather than a linear pipeline.
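
A toy comparison shows why algorithmic gains can be worth a hardware generation. The same question, whether any two values in a list sum to a target, is answered below with an O(n²) scan and an O(n) hash-set pass; the workload and timings are illustrative, not drawn from the cited study.

```python
import random
import time

def has_pair_quadratic(values, target):
    """Brute force: check every pair, O(n^2)."""
    n = len(values)
    for i in range(n):
        for j in range(i + 1, n):
            if values[i] + values[j] == target:
                return True
    return False

def has_pair_linear(values, target):
    """Single pass with a hash set, O(n)."""
    seen = set()
    for v in values:
        if target - v in seen:
            return True
        seen.add(v)
    return False

values = [random.randint(0, 10**9) for _ in range(5_000)]
target = -1  # unreachable, so both versions must do their full work

for fn in (has_pair_quadratic, has_pair_linear):
    start = time.perf_counter()
    fn(values, target)
    print(f"{fn.__name__}: {time.perf_counter() - start:.3f} s")
```

The faster version wins by orders of magnitude on the same silicon, a gain no process shrink was needed to deliver.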

But the cultural dimension of Moore’s Law may be the hardest to relinquish. The ICT industry has long equated progress with speed, miniaturization, and exponential growth. Moore’s curve became a psychological baseline, shaping investor expectations, R&D cycles, and consumer demand. Even as physical scaling slows, companies continue to announce “Moore-like” progress through marketing proxies—flops per watt, AI throughput, or transistor-equivalent performance.

Yet this faith in exponentialism carries risk. It breeds what some analysts describe as innovation illusion—the belief that all technological challenges are temporary, solvable through the next leap in compute power. This mindset obscures systemic issues in ICT: software bloat, data inefficiency, network congestion, and the limits of human comprehension in increasingly complex systems.

The academic debate has shifted accordingly. In The Future of Computing Beyond Moore’s Law (Philosophical Transactions of the Royal Society A, 2020), John Shalf argued that computing’s next chapter will depend less on transistor scaling and more on architectural diversity—neuromorphic computing, quantum processors, and photonics. Each of these represents a break from the single trajectory of transistor scaling, embracing heterogeneity and domain-specific efficiency.

Quantum computing provides a case in point. Though still experimental, it reflects the post-Moore ethos: performance redefined through entirely new paradigms. Rather than stacking more transistors, quantum processors exploit superposition and entanglement to explore problem spaces that classical machines cannot search efficiently. IBM, Google, and Rigetti each project that limited commercial quantum systems will become viable within five years—not as replacements for classical computers, but as accelerators for specialized tasks such as cryptography and optimization.
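
The underlying mathematics can at least be sketched classically. In the state-vector picture, a qubit is a two-component complex vector, and a Hadamard gate places it in an equal superposition of 0 and 1; the snippet below simulates that single step and is an illustration of the concept, not a demonstration of quantum advantage.

```python
import numpy as np

ket0 = np.array([1.0, 0.0], dtype=complex)                 # the |0> state
hadamard = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

state = hadamard @ ket0                                     # (|0> + |1>) / sqrt(2)
probabilities = np.abs(state) ** 2                          # measurement probabilities

print(state)          # [0.707+0j, 0.707+0j]
print(probabilities)  # [0.5, 0.5] -- equal chance of measuring 0 or 1
```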

Similarly, neuromorphic computing, modeled on biological neural networks, represents another departure. Intel’s Loihi 2 and IBM’s TrueNorth chips process information through event-driven spikes rather than clock-based cycles, achieving orders-of-magnitude gains in energy efficiency on sensory and inference workloads. These architectures may not follow Moore’s curve, but they embody its spirit: extracting more from less through structural ingenuity.
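
The contrast with clock-driven designs is easiest to see in a leaky integrate-and-fire neuron, the textbook abstraction behind spiking hardware. The sketch below is generic and does not use the Loihi 2 or TrueNorth programming interfaces; the neuron accumulates input, leaks charge over time, and emits a spike only when a threshold is crossed.

```python
def lif_neuron(input_current, leak=0.9, threshold=1.0):
    """Return the time steps at which a leaky integrate-and-fire neuron spikes."""
    potential, spikes = 0.0, []
    for t, current in enumerate(input_current):
        potential = potential * leak + current   # integrate input, with leakage
        if potential >= threshold:               # fire only when threshold is crossed
            spikes.append(t)
            potential = 0.0                      # reset after the spike
    return spikes

# Sparse input: mostly silence, with two short bursts of activity.
inputs = [0.0] * 20
inputs[3:6] = [0.5, 0.5, 0.5]
inputs[14:16] = [0.8, 0.8]

print(lif_neuron(inputs))   # spike times cluster around the input bursts
```

Silence costs almost nothing in this model; work happens only when events arrive, which is where the efficiency on sparse sensory data comes from.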

The deeper question, then, is whether Moore’s Law was ever truly a law—or simply a narrative of ambition. From a historical perspective, it was descriptive, not prescriptive. It captured a period of exponential convergence between science, manufacturing, and market demand. That convergence may have run its course, but its legacy endures in the mindset it fostered: relentless optimization, competitive iteration, and faith in the compound returns of innovation.

[Figure: Shift in ICT Innovation Drivers (2010–2025)]

Today, ICT faces a new epoch that demands different metrics. Instead of counting transistors, the industry measures value in computational efficiency, throughput, latency, and total cost of ownership. The metric of success has shifted from “smaller and faster” to “smarter and more efficient.”

This does not mean that progress is slowing. On the contrary, ICT innovation is accelerating—just not along the same axis. Software-driven scaling, domain-specific hardware, and cloud-native architectures are extending the digital frontier far beyond what Moore could have imagined. The spirit of exponential improvement survives, but it manifests through integration rather than miniaturization.

Moore’s Law is not wrong—it is fulfilled. Its predictive power has faded, but its philosophical influence persists. The ICT industry that it inspired now faces the challenge of redefining progress in multidimensional terms: performance, economics, and adaptability.

To move forward, technologists must replace the myth of endless doubling with a new design ethic grounded in diversity and intelligence. ICT’s future will be shaped less by transistor count than by system coherence—how seamlessly hardware, software, and networks evolve together.

The age of exponential growth may be ending, but the age of exponential creativity is only beginning.


Key Takeaways

  • Moore’s Law is no longer a reliable predictor of ICT progress; transistor scaling faces physical, economic, and architectural limits.
  • ICT growth is shifting toward specialization, co-design, and system-level optimization rather than transistor miniaturization.
  • Semiconductor economics have entered a cost inversion era, making traditional scaling economically unsustainable.
  • New paradigms—AI accelerators, quantum computing, and neuromorphic systems—represent the next wave of ICT advancement.
  • The industry must redefine progress in terms of efficiency, adaptability, and intelligent design, not transistor density.

Sources

  • Moore, Gordon — Cramming More Components onto Integrated Circuits
  • Shalf, John — The Future of Computing Beyond Moore’s Law
  • MIT Microsystems Technology Laboratories — The Cost Inversion Era of Semiconductor Economics
  • International Energy Agency — Electricity Use and ICT Growth 2024
  • Nature Electronics — Domain-Specific Architectures in the Post-Moore Era
  • Yazdanbakhsh, Amir — Beyond Moore’s Law: Harnessing the Redshift of Generative AI
  • IEEE Computer Society — Algorithmic Efficiency and Software-Hardware Co-Design
  • ASML — EUV Lithography Systems Overview
  • Intel Labs — Loihi 2 Neuromorphic Research Platform
  • Google Research — The TPU Architecture and the Future of AI Hardware

