A Failing Power Grid: The Aftermath of Innovation

Recently, during a conversation about NVIDIA and AI, something came up that I had not previously considered: the impact of AI on the power grid. In a place like California, which has already struggled to simultaneously cool houses and charge electric vehicles, AI poses a real threat to the grid. I’m all for innovation, but the necessary and critical areas affected by such innovations (infrastructure, for example) are frequently an afterthought at best. This is not specific to AI; it applies to innovation across the board. During the conversation, my thoughts kept coming back to consultation and creative, outside-the-box thinking. Innovation is new; you’re forging ahead on a path not previously taken. You might not be able to think through every possible outcome or consequence, but it’s important to bring in individuals who have the expertise to think through as many of those potential consequences as possible.

All too often, concerns are quickly pushed aside in the name of innovation and advancement. How often are we 5, 10, or 15 years down the road from unleashing a certain advancement, seeing the potentially avoidable consequences, and left wondering why no one considered or acted on this sooner?

The demand for AI is great, but that demand also strains the power grid. To put the energy consumption, and the resulting load on the grid, into perspective: according to James Vincent at The Verge, training a large language model like GPT-3 is estimated to use just under 1,300 megawatt-hours (MWh) of electricity, roughly the amount of power consumed by 130 US homes in a year. For further context, Vincent goes on to write that streaming an hour of Netflix requires only around 0.8 kWh (0.0008 MWh) of electricity, meaning you’d have to binge 1,625,000 hours to consume the same amount of power it takes to train GPT-3. And that figure is likely trending upward, and doesn’t account for the ongoing electricity required to generate responses and images.
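These comparisons are easy to verify with back-of-the-envelope arithmetic. A minimal sketch, using the estimates cited above from The Verge (the 10,000 kWh-per-home figure is my assumption, implied by the 130-homes comparison, not a number from the article):

```python
# Back-of-the-envelope check of the cited figures.
# All inputs are published estimates or stated assumptions, not measurements.

GPT3_TRAINING_MWH = 1_300       # estimated energy to train GPT-3 (The Verge)
NETFLIX_HOUR_KWH = 0.8          # estimated energy to stream one hour of Netflix
US_HOME_ANNUAL_KWH = 10_000     # assumed average annual US household usage

training_kwh = GPT3_TRAINING_MWH * 1_000            # convert MWh to kWh
netflix_hours = training_kwh / NETFLIX_HOUR_KWH     # equivalent streaming hours
homes_per_year = training_kwh / US_HOME_ANNUAL_KWH  # equivalent households

print(f"{netflix_hours:,.0f} hours of streaming")  # 1,625,000 hours
print(f"~{homes_per_year:.0f} homes for a year")   # ~130 homes
```

The arithmetic checks out against the numbers in the article: 1,300 MWh is 1.3 million kWh, which is 1,625,000 hours of streaming at 0.8 kWh per hour.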

If the power grid was already facing issues, as in California, did no one consider the increased load from the high demand for AI-powered technologies? Both Bloomberg and the Washington Post have recently published articles detailing the seriousness of these consequences. AI is great, but not if we’re expending ever more energy at an even greater cost to the environment.

I write this not to frown on AI (of course, it has many benefits) but to highlight how important a fully thought-out, collective discussion is when rolling out new technologies. Innovation requires creative, outside-the-box thinking. It’s crucial to think through what other technology a new one might impact. What needs to be in place for this new technology to function? Are your company’s computers set up to handle the innovation? This doesn’t stop at AI and large language models (LLMs); it goes for innovations across the board, from rolling out new medical software to developing a new product capability. Proper innovation demands a certain level of patience and restraint before forging ahead. It’s important to approach problems and new ideas from diverse perspectives, consulting with experts to uncover any gaps or stones left unturned. At Ideba, we think through creative strategies to solve problems facing SaaS companies every day. I would love to hear your thoughts on AI and the concerns you weigh when assessing new technologies.

Kristen Higgins – Research Manager