Computing power is no longer the AI bottleneck – it is energy production


For much of the 20th century, artificial intelligence (AI) struggled not because researchers lacked ambition, but because the hardware available to run it simply wasn’t powerful enough. Early AI systems hit hard limits on processing speed and memory, contributing to repeated “AI winters” as progress stalled and funding dried up.

That problem is largely gone. Today, AI models are trained on specialized chips in vast data centers and can be scaled up in weeks rather than years. Compute, once the main bottleneck, is now essentially something that can be bought with enough money. Year after year, companies such as Nvidia and AMD mass-produce ever more powerful graphics processing units (GPUs) – chips originally designed for gaming and visualization, but also well suited to the parallel calculations that AI workloads demand.
