In a recent press conference, Fed Chair Jerome Powell acknowledged that the current AI-driven data center boom is contributing to inflation in the short term. He explained that the massive physical infrastructure required to build these data centers is placing significant demand pressure on goods and services, pushing prices up. While acknowledging the potential for future productivity gains from AI, Powell suggested that the demand-side buildout is currently outpacing any disinflationary benefits, potentially raising the neutral interest rate rather than lowering it in the near term. The empirical question remains whether demand will grow faster than supply, leaving the ultimate impact of AI on inflation and interest rates uncertain for now.


It’s a sentiment many of us have been feeling, and now, even the head of the Federal Reserve, Jerome Powell, is acknowledging it. He’s saying that those ever-expanding data centers, the ones powering the artificial intelligence boom, are indeed contributing to the rising cost of living. In short, you’re right to blame them for making your bills more expensive, as Powell himself stated they are “probably pushing inflation up.”

This admission comes as a bit of a relief for many who’ve felt like they were being gaslit, watching technology touted for its future productivity gains while current realities pointed to rising costs. Powell clarified that while AI promises long-term productivity, the immediate impact of building the infrastructure to support it is inflationary. Think about it: these data centers require vast amounts of electricity, land, and resources to construct and operate, all of which strain existing supplies and drive up prices for those very same goods and services.

Powell elaborated on this, explaining that in the short term, AI is actually raising the neutral interest rate rather than lowering it. This is because the demand side – the massive physical buildout needed for AI – is currently outpacing any significant productivity payoff. It’s a bit like building the fanciest kitchen in the world but not yet having the chef or the ingredients to cook a meal; the investment is massive, but the immediate benefit in terms of output (or lower prices, in this case) isn’t there yet.

This phenomenon is particularly noticeable in the cost of essential components. We’ve seen significant price hikes in computer hardware like RAM and SSDs, directly linked to the AI and data center craze; some reports suggest RAM prices have multiplied several times over in just a year. And it’s not just about raw materials — there’s also hoarding. Companies are reportedly buying up hardware, even GPUs, and storing them in warehouses before they have the infrastructure to use them, further exacerbating supply shortages and price increases.

The physical footprint of this AI boom is becoming increasingly apparent in communities. In rural areas, for instance, massive data centers are springing up, often with significant implications for local resources. Concerns about water usage are particularly acute, especially in regions already facing drought conditions. While residents may be struggling with water availability for their own needs, data centers, often operating under lucrative contracts, can secure vast amounts of water for cooling and operation.

This disparity highlights a broader issue of resource allocation. We’re seeing immense financial and logistical efforts poured into building data centers, which many argue offer little direct benefit to the average citizen’s daily life. Instead, their primary function is data storage, raising further questions about privacy and the implications of accumulating vast amounts of personal information. This focus on data centers, while simultaneously facing hurdles in building affordable housing or addressing other community needs, strikes many as a misaligned set of priorities.

The increased demand for electricity to power these data centers also raises questions about our energy infrastructure. While some advocate for the transition to electric vehicles, which likewise require significant energy, the sheer scale of data center power consumption is often overlooked or downplayed. The result is a paradox: the grid is apparently capable of absorbing massive data centers, yet individual shifts in energy consumption patterns, like EV charging, are treated as a strain it cannot bear.

It’s understandable why people feel frustrated. The promise of technological progress has historically come with the expectation of reduced burdens and costs, but often, the reality is that increased productivity simply leads to higher expectations for output from workers. This AI revolution appears to be following a similar pattern, where the benefits, if they materialize, seem poised to accrue to a select few, while the costs are distributed more broadly.

Powell’s acknowledgment is a crucial step in validating these concerns. It shifts the conversation away from abstract promises of future gains and grounds it in the tangible economic realities many are experiencing today. While the long-term impact of AI on productivity and inflation remains to be seen, it’s clear that the current build-out phase is creating headwinds, and it’s about time that was officially recognized.