
Elon Musk announced on X this week that xAI’s “Colossus 2” supercomputer is now operational, calling it the world’s first gigawatt-scale AI training cluster, with plans to scale to 1.5 gigawatts by April. This single training cluster now draws more power than San Francisco does at peak demand.
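For a rough sense of scale (a back-of-envelope sketch, not a figure from the announcement; San Francisco’s peak load of roughly 0.9 GW is my own approximation):

$$
1\ \text{GW} \;>\; \approx 0.9\ \text{GW (SF peak)}, \qquad 1\ \text{GW} \times 8{,}760\ \text{h/yr} \approx 8.8\ \text{TWh/yr}
$$

Run continuously, a one-gigawatt cluster would consume on the order of nine terawatt-hours a year, before counting the planned expansion to 1.5 gigawatts.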
There is a particular cadence to announcements like this. They arrive wrapped in the language of inevitability, scale, and achievement. Bigger numbers are offered as evidence of progress. Power becomes proof. The gesture is not just technological but symbolic, and it signals that the future belongs to those who can command energy, land, water, labor, and attention on a planetary scale (same as it ever was).
What is striking is not simply the amount of electricity involved, though that should give us pause. A gigawatt is not an abstraction. It is rivers dammed, grids expanded, landscapes reorganized, communities displaced or reoriented. It is heat that must be carried away, water that must circulate, minerals that must be extracted. AI training does not float in the cloud. It sits somewhere. It draws from somewhere. It leaves traces.
The deeper issue, though, is how casually this scale is presented as self-justifying.
We are being trained, culturally, to equate intelligence with throughput. To assume that cognition improves in direct proportion to energy consumption. To believe that understanding emerges automatically from scale. This is an old story. Industrial modernity told it with coal and steel. The mid-twentieth century told it with nuclear reactors. Now we tell it with data centers.
But intelligence has never been merely a matter of power input.
From a phenomenological perspective, intelligence is relational before it is computational. It arises from situated attention, from responsiveness to a world that pushes back, from limits as much as from capacities. Scale can amplify, but it can also flatten. When systems grow beyond the horizon of lived accountability, they begin to shape the world without being shaped by it in return.
That asymmetry matters.
There is also a theological question here, though it is rarely named as such. Gigawatt-scale AI is not simply a tool. It becomes an ordering force, reorganizing priorities and imaginaries. It subtly redefines what counts as worth knowing and who gets to decide. In that sense, these systems function liturgically. They train us in what to notice, what to ignore, and what to sacrifice for the sake of speed and dominance.
None of this requires demonizing technology or indulging in nostalgia. The question is not whether AI will exist or even whether it will be powerful. The question is what kind of power we are habituating ourselves to accept as normal.
An ecology of attention cannot be built on unlimited extraction. A future worth inhabiting cannot be sustained by systems that require a city’s worth of electricity simply to refine probabilistic text generation. At some point, the metric of success has to shift from scale to care, from domination to discernment, from raw output to relational fit.
Gigawatts tell us what we can do.
They do not tell us what we should become.
That remains a human question. And increasingly, an ecological one.
Here’s the full paper in PDF, or you can read it on Academia.edu: