Data has become a critical asset in today's digital world, often referred to as the new gold. It has evolved to the point where it requires its own real estate portfolio, scattered across nations and oceans alike. Each facility requires a tremendous amount of resources to ensure its information stays preserved and accessible for everyday computing needs. This column serves as a reminder to innovators everywhere that water- or liquid-based heating and cooling remains the simplest and most elegant path to a carbon-neutral future.
The past few years have marked an unexpected shift in energy consumption, particularly among tech companies focused on artificial general intelligence (AGI) and quantum computing technology. Traditional computers store their data in what are referred to as "bits," while quantum computers work in "qubits," or quantum bits, which can be vastly more complex in their solution responses and require considerably more energy than the traditional bit.
In 2023, we saw a notable trend in which companies like Microsoft, Google, and Meta significantly increased their energy consumption, despite committing to politically charged international sustainability initiatives. In hindsight, the reason makes perfect sense. With the incredible improvements made in the AI space these last few years, we can only expect that technology companies will need to get more involved in energy solution planning if we all want to continue this progressive work.
According to the International Energy Agency, "Electricity consumption from data centers, artificial intelligence (AI) and the cryptocurrency sector could double by 2026." That is just a few short years away, which often is not long enough to obtain a construction permit or be granted regulatory approval to build. The forecast went on to say, "Data centers are significant drivers of growth in electricity demand in many regions. After globally consuming an estimated 460 terawatt hours (TWh) in 2022," data centers' "total electricity consumption could reach more than 1,000 TWh in 2026." That is a tremendous amount of electricity demand, and with an already taxed grid, these data centers will likely continue to be built "off grid," with a future option to create a micro-grid or district system, generating energy locally for their neighbors and partners.
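To put those figures in perspective, here is a rough back-of-the-envelope sketch using only the IEA numbers quoted above (an illustrative calculation, not an official forecast method):

```python
# Rough growth check using the IEA figures quoted above - an illustrative
# back-of-the-envelope sketch, not an official IEA calculation.

consumption_2022_twh = 460    # estimated global data center consumption, 2022
consumption_2026_twh = 1_000  # projected "more than 1,000 TWh" in 2026
years = 2026 - 2022

growth_factor = consumption_2026_twh / consumption_2022_twh
annual_growth = growth_factor ** (1 / years) - 1

print(f"Overall growth: {growth_factor:.1f}x over {years} years")
print(f"Implied average annual growth: {annual_growth:.1%}")
# Overall growth: 2.2x over 4 years
# Implied average annual growth: 21.4%
```

In other words, demand growing by roughly a fifth every year, on a load that never gets to shut off.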
Introducing the "Data Energy Hub," an application best suited to growing cities and neighborhoods looking to achieve long-term sustainability. The hubs would allow above- or below-ground local storage and stewardship of data, along with the thermal energy availability of the facility. That availability challenge is rooted in the massive computational power AI and quantum computing will continue to require, especially as large language models (LLMs) grow and supercomputers come online. Perhaps the data energy hub could be paired with energy-independent fabrication plants, which are equally challenging to design, build and manage. I believe we will see a snowballing effect of technological needs ahead of us indefinitely, but we can use these challenges as opportunities to be more energy conscious, giving new purpose to the modern computer age - built for the people, by the people.
Consider a data energy hub project in a growing town, storing local data for a dedicated rural area below a park or farmland, preserving space and providing waste heat to its neighbors through thermal energy network piping systems. This model could help cities everywhere gain access to low-cost home heating and cooling. It would be a gift from technology, and perhaps helpful in removing some of the apprehension around the subject of data collection. A data center could be a central source and sink for a city or town's thermal needs when connected to a larger network. With greater processing demand comes the need to dissipate more heat generated by these complex machines. Cooling costs, both in terms of energy and water use, have not gotten any cheaper, forcing owners and operators to rethink their approach to facility design and management.
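How much heat might a hub like that actually share? The sketch below is purely illustrative - the IT load, heat recovery fraction and per-home heating demand are hypothetical placeholder values, not figures from any real project - but it gives a sense of the scale:

```python
# Illustrative sizing sketch for a hypothetical data energy hub feeding a
# thermal energy network. Every input below is a placeholder assumption.

it_load_kw = 5_000            # assumed IT load of the hypothetical facility (5 MW)
heat_recovery_fraction = 0.7  # assumed share of that heat captured as useful heat
avg_home_heat_demand_kw = 5   # assumed average heating demand per connected home

recoverable_heat_kw = it_load_kw * heat_recovery_fraction
homes_served = recoverable_heat_kw / avg_home_heat_demand_kw

print(f"Recoverable heat: {recoverable_heat_kw:,.0f} kW")
print(f"Homes that heat could serve, on average: {homes_served:,.0f}")
# Recoverable heat: 3,500 kW
# Homes that heat could serve, on average: 700
```

Even with conservative assumptions, the waste heat of a single modest facility starts to look like a neighborhood-scale utility.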
Air vs. Water
Let's talk about how cooling technologies are evolving, particularly with the rise of liquid cooling. Traditional data centers have relied on air-cooled systems for decades. These systems work by conditioning the environment to maintain optimal temperatures for high-rack-density data storage. Air is circulated through large HVAC systems to keep data server rooms cool and operational, often supported by computer room air conditioners (CRAC units). These units are designed to handle computer-generated heat removal, and they come with several plumbing infrastructure needs, such as floor and funnel drains, trap primers, and backflow-protected make-up water, to name a few. While air cooling remains the typical design for many data facilities in the US, its efficiency naturally diminishes as the heat load increases. The higher the density of servers in a space, the more concentrated heat they will produce, requiring more energy to keep temperatures within acceptable limits.
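A simple sensible-heat sketch shows why. The rack loads and the 20°F temperature rise below are illustrative assumptions, and 1.08 is the familiar rule-of-thumb constant for air (BTU/hr = 1.08 x CFM x ΔT°F):

```python
# Sensible-heat sketch: airflow needed to carry away rack heat with air cooling.
# Rack loads and the 20 F temperature rise are illustrative assumptions;
# 1.08 is the rule-of-thumb constant for air (BTU/hr = 1.08 * CFM * delta_T_F).

BTU_PER_KW_HR = 3_412  # 1 kW of heat is roughly 3,412 BTU/hr

def required_cfm(rack_load_kw: float, delta_t_f: float = 20.0) -> float:
    """Approximate airflow (CFM) needed to remove a rack's heat load."""
    heat_btu_hr = rack_load_kw * BTU_PER_KW_HR
    return heat_btu_hr / (1.08 * delta_t_f)

for load_kw in (10, 40):  # a legacy rack vs. a dense AI rack (assumed values)
    print(f"{load_kw} kW rack -> about {required_cfm(load_kw):,.0f} CFM of air")
# 10 kW rack -> about 1,580 CFM of air
# 40 kW rack -> about 6,319 CFM of air
```

Quadruple the rack load and the airflow quadruples with it - and so do the fans, the ductwork and the energy needed to move all that air.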
I found this to be true while working on a project on Long Island, where my team helped turn an existing laboratory into a 25,000-sq-ft data center - at the time, one of the largest data centers in the area. Being a young project executive, I was excited to learn more about what goes on behind the scenes of modern-day computing and how plumbing aided in its function. On a traditional air-cooled data center project, the plumber's work isn't the most technical, as detailed earlier. With the increased adoption of liquid-cooled computing technology and water-sourced utilities such as thermal energy networks (TENs), however, the plumber may one day be the most critical part of a data center project, creating an optimal space for the machines of tomorrow.
A wise man once said, “The plumber protects the data of the nation” – Anonymous Plumber
In contrast to air cooling, liquid cooling has landed on the scene as a highly efficient solution for dealing with the massive heat generated by today's data centers. Liquid cooling works by directly transferring heat from components like CPUs and GPUs to a liquid coolant, which then carries the heat away far more effectively than air. This is particularly important for high-density computing facilities, where available rack space can push traditional air cooling to its limits. Liquid cooling systems are already popular in consumer electronics - like my Alienware desktop computer made by DELL, which uses liquid cooling to manage the operating temperature of its processors. Those processors can generate significant heat during intense computing tasks, and liquid cooling allows for more stable performance. When scaled up to the data center level, liquid cooling not only increases efficiency but also conserves electricity. Liquid-cooled technology as it exists today can handle far greater internal heat loads, making it ideal for energy-hungry AI tasks.
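This is where the plumber's math comes in. A minimal sketch, using the common rule-of-thumb constant for water (BTU/hr = 500 x GPM x ΔT°F) and an assumed 20°F coolant temperature rise, shows how little flow it takes to move the same heat that demanded thousands of CFM of air:

```python
# Water-side sketch: coolant flow needed to remove the same rack heat loads.
# The 20 F coolant temperature rise is an assumption; 500 is the common
# rule-of-thumb constant for water (BTU/hr = 500 * GPM * delta_T_F).

BTU_PER_KW_HR = 3_412

def required_gpm(rack_load_kw: float, delta_t_f: float = 20.0) -> float:
    """Approximate water flow (GPM) needed to remove a rack's heat load."""
    heat_btu_hr = rack_load_kw * BTU_PER_KW_HR
    return heat_btu_hr / (500 * delta_t_f)

for load_kw in (10, 40):
    print(f"{load_kw} kW rack -> about {required_gpm(load_kw):.1f} GPM of water")
# 10 kW rack -> about 3.4 GPM of water
# 40 kW rack -> about 13.6 GPM of water
```

Heat that needed thousands of cubic feet of air per minute can ride out of the rack on a pipe-sized flow of water - which is exactly why GPMs belong in the same sentence as CPUs and GPUs.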
CPUs, GPUs and GPMs?
At the heart of every computer lies the central processing unit (CPU). Often described as the brain of the computer, the CPU handles the basic commands and instructions that allow the system to function, all carried out on silicon chips. Those chips are composed of millions - and now billions - of tiny transistors, each akin to a valve with its on/off, one/zero functionality. The CPU is responsible for executing a variety of tasks on the chip, from basic functions to running operating systems and advanced software.
The graphics processing unit (GPU), on the other hand, has become increasingly important in today's generative pre-trained transformer (GPT) driven web experiences. For future applications such as OpenAI's text-to-video tool, "Sora," the processing power required is expected to be significantly higher. Originally the GPU was intended only for displaying graphic images and video, but it is now routinely tasked with energy-intensive computational work. As a result, GPUs often contribute significantly to the overall emissions of data centers and cryptocurrency mines, generating significant heat energy worthy of capture.
The relationship between CPU and GPU can be likened to that of an apprentice and a journeyman: the CPU (the apprentice) may initiate tasks, but the GPU (the journeyman) is the one carrying out the skilled work, monitoring output and managing to get the job done.
Modular and Submerged Data Solutions
Modular storage is certainly one of the most interesting innovations in data center design, and it has provided intriguing flexibility. Instead of building massive, permanent structures, companies are turning to container- or pod-type applications for data storage. These modular data centers can be set up quickly, scaled continuously, and even moved from one location to another if necessary. The portability and flexibility of these systems make them a viable option for businesses that need adaptable data storage solutions or want to reduce their overall footprint in a particular geographical area, whether permanent or temporary. These container-style data centers are also easier to cool. Pods can be fabricated with liquid cooling systems, allowing for more efficient heat capture, storage and sharing by interconnection. This modular approach to data storage may be especially beneficial for operators in regions with widely fluctuating temperatures or limited access to renewable energy sources.
Underwater data centers take modular storage down a few levels. Microsoft has been developing and experimenting with underwater data centers for nearly a decade now. These data centers are housed in pill-shaped vessels and submerged to the ocean floor, where they leverage the natural cooling properties of the ocean to maintain optimal temperatures. The concept is simple but powerful: the ambient ocean water acts as a natural coolant, drawing heat from the data center through the cold walls of the vessel and thereby reducing the need for energy-intensive cooling systems. For me, however, this innovation raises environmental questions that I would like to see researched further. Could these underwater data centers eventually contribute to oceanic litter, or could they be engineered to more permanently fit the environment and benefit marine life, like an artificial reef? As this technology matures, research must be pursued to ensure that the benefit of innovation always far outweighs the risks.
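To get a feel for how those "cold walls" do the work, here is a simplified heat-rejection sketch based on Q = U x A x ΔT. The vessel dimensions, overall heat transfer coefficient and temperatures are all hypothetical assumptions chosen for illustration, not figures from Microsoft's actual design:

```python
# Simplified heat-rejection sketch for a submerged vessel: Q = U * A * delta_T.
# Vessel size, U-value and temperatures are hypothetical assumptions, not
# figures from any real deployment.

import math

vessel_length_m = 12.0   # assumed cylindrical vessel length
vessel_diameter_m = 3.0  # assumed vessel diameter
u_value_w_m2k = 300.0    # assumed overall heat transfer coefficient, wall to seawater
internal_temp_c = 35.0   # assumed temperature on the warm side of the wall
ocean_temp_c = 10.0      # assumed deep seawater temperature

wall_area_m2 = math.pi * vessel_diameter_m * vessel_length_m  # side wall only
heat_rejected_kw = u_value_w_m2k * wall_area_m2 * (internal_temp_c - ocean_temp_c) / 1_000

print(f"Wall area: {wall_area_m2:,.0f} m^2")
print(f"Passive heat rejection: about {heat_rejected_kw:,.0f} kW")
# Wall area: 113 m^2
# Passive heat rejection: about 848 kW
```

Under those assumptions, the ocean itself quietly absorbs what would otherwise require a sizable chilled-water plant.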
Partnerships Drive Progress
The future of mission-critical systems is being reshaped by new technologies and innovative designs, such as thermal energy networks and geothermal advancements. It is, however, forcing the tech industry to confront its growing infrastructure needs, prompting continued partnership between the tech and construction industries. I believe this point in time provides an opportunity to pool our resources and benefit communities everywhere, especially those who contribute the data being collected for future use.
While liquid cooling and piping systems can help make data centers more efficient and circular, the rise of AI and quantum computing means that tech companies must carefully balance sustainable energy with technological progress. With submersible data centers and futuristic "Data Energy Hubs," the technology and building industries are at a pivotal moment. By adopting smart, water-based energy solutions and studying the measurable long-term environmental impacts, we get closer to being in harmony with our planet and the natural pace of advancement.
So, I challenge engineers and industry enthusiasts to think 20 or 30 years ahead, toward the infrastructure needs of the day after tomorrow. For labor leaders and educators, think about how the skilled trade workers of the future can gear up and be a part of the technological revolution and energy transformation. We must remain steadfast so that all of mankind's engineered solutions are built to scale for generations to come, using sustainability as the benchmark of progress.