Massive data centers for generative AI are bad for the Earth. How about launching them into orbit?
Data centers are being built at a frantic pace all over the world, driven by the AI boom. These facilities consume staggering amounts of electricity. By 2028, AI servers alone may use as much energy as 22 percent of US households.
Of course that demand will raise energy prices for everyone, and we’ll need more power plants, which means more global warming.

Then there’s the water problem. High-density AI chips run so hot that air cooling isn’t enough, so new facilities are turning to water cooling. The technique of choice is water evaporation. It’s more effective and energy-efficient than recirculating water, but a large data center using this method consumes millions of gallons of water a day, draining local water supplies.

So it’s no surprise that more and more towns are pushing back on data center projects in their area. But if everyone goes NIMBY, it gets sort of NOMPY—like “not on my planet, you bastards.” What to do? People aren’t going to stop using AI. That’s why some folks are saying we should build data centers in space.

Just think: You could get 24/7 energy from solar panels—it’s always sunny in space—and the thermal stuff wouldn’t be an issue because it’s so cold out there. You could do the heavy processing in orbiting data centers and beam the results back to Earth, just like satellite internet. That’s the claim, anyway.

Could this really work? Or is it about as practical as colonizing Mars? I asked Google’s AI Overview, and it said, “Yes, data centers can be built in space.” But of course it would say that. I think we’ll have to go full renegade and dial up some old-fashioned human intelligence on this.

Power Up

One of the really big ideas in science is called conservation of energy. It says that for any “system,” the total energy going into the system equals the change in energy of that system plus the energy going out of the system:

E_in = ΔE_system + E_out

Or, rearranging, the change in the amount of energy in a system equals the difference between the energy going in and the energy going out:

ΔE_system = E_in − E_out

What this says is that energy can’t be created or destroyed, only transformed from one form to another—like solar panels convert light energy to electric energy.
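That bookkeeping can be sketched in a couple of lines of code. The solar panel numbers below (1,000 joules of sunlight per second on a 1-square-meter panel, 20 percent efficiency) are illustrative assumptions, not figures from any particular panel:

```python
def energy_change(energy_in, energy_out):
    """Conservation of energy: a system's stored energy changes by
    the energy coming in minus the energy going out."""
    return energy_in - energy_out

# One second of full sun on a 1 m^2 panel delivers roughly 1,000 joules.
sunlight_in = 1000.0   # joules in
electric_out = 200.0   # a ~20%-efficient panel turns this much into electricity
heat_out = 800.0       # the rest leaves as heat

# In steady state, everything that comes in goes back out:
print(energy_change(sunlight_in, electric_out + heat_out))  # 0.0
```

If the outputs added up to less than the input, the leftover energy would have to pile up inside the system, which for a panel (or a computer) means getting hotter.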
Energy is measured in joules, but it's often easier to talk about power instead. Power is the change in energy per unit of time, so it is measured in joules per second, also known as watts. In terms of power, conservation of energy says the power going into a system equals the power going out plus the rate at which the system's internal energy changes.

For example, say the “system” is a desktop PC with a 300-watt power supply. That means the maximum power input is 300 watts. What about the energy changes in the system? Well, it gets hot, so there’s an increase in thermal energy. But it soon reaches a stable operating temperature. There are really no other energy changes in the computer, so all 300 watts of power coming in must equal the power going out.

Where does that 300 watts of output go? Well, your PC has a fan that moves air across the processor and GPU. The hot components interact with the air and heat it up. The fan then moves this air out, transferring heat from the computer to your room. Yes, your PC is basically a 300-watt space heater that also plays video games.

Two Kinds of Heat Transfer

If two objects are at different temperatures, thermal energy moves from the warmer one to the colder one. So that hot computer transfers energy to the cooler air. Because the CPU and the air molecules are in contact, we call this heat conduction. It works fast. That’s why 70-degree pool water feels so cold: You’re immersed in it, so it rapidly sucks a lot of heat energy out of your body.

But there’s another way heat can be transferred. If the objects aren't touching but have a direct line of sight, there can be a radiation interaction. This is what happens in an electric oven with no airflow. The heating element doesn't touch your pizza, but it’s so hot that it radiates infrared light, which heats up your food.

Computers in Space

Now, what if you put your gaming PC in low Earth orbit? How would we siphon off the waste heat generated? Those fans inside won’t help.
They can't move air over the processors if there's no air. The only option is a radiation interaction with the surroundings, and radiation is not as efficient as conduction. This is where people often go wrong in thinking about computing off-planet. Actually, space isn’t even “cold.” Temperature is a property of matter—it measures molecular motion—and space is pretty much a vacuum. With no molecules to vibrate, it has no intrinsic temperature. And with radiation as the only means of heat transfer, objects in space actually cool down slowly.

We can calculate the rate of thermal radiation for an object using the Stefan-Boltzmann law. It looks like this:

P = εσAT⁴

Here ε is the emissivity of the object—how effective it is as a radiator, σ is the Stefan-Boltzmann constant, A is the surface area, and T is the temperature in kelvin. Since we have temperature to the fourth power, you can see that hotter things radiate much more power than cooler things.

OK, say you want to play Red Dead Redemption in space. Your computer is gonna get hot—maybe 200 F. To keep it simple, let's say this is a cube-shaped PC with a total surface area of 1 square meter, and it's a perfect radiator. The thermal radiation power would then be around 1,000 watts. Of course your computer is not a perfect radiator, but it looks like you’d be fine. As long as the output is greater than the input, it’ll cool down.

Now say you want to run some modest AI stuff. That’s a bigger job, so let’s scale up our cubical computer with edges twice as long as before. That would make the volume eight times larger, so we could have eight times as many processors, and we’d need eight times as much power input—2,400 watts. However, the surface area is only four times larger, so the radiative power would be about 4,000 watts. You still have more output than input, but the gap is narrowing.

Size Matters

You can see where this goes. If you keep scaling it up, the volume grows faster than the surface area.
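A quick script can confirm those numbers. This is a sketch under the same favorable assumptions as above: a perfect radiator (ε = 1) held at 200 F.

```python
# Check the radiated power figures with the Stefan-Boltzmann law: P = epsilon * sigma * A * T^4
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 * K^4)

def radiated_power(area_m2, temp_k, emissivity=1.0):
    """Power radiated by an object of the given surface area and temperature."""
    return emissivity * SIGMA * area_m2 * temp_k**4

T = (200 - 32) * 5 / 9 + 273.15  # 200 F converted to kelvin, about 366.5 K

# The 1-square-meter gaming PC:
print(round(radiated_power(1.0, T)))  # 1023 -- around 1,000 watts, as claimed

# Double the cube's edges: 8x the volume and power draw (2,400 W in),
# but only 4x the surface area for radiating it away:
print(round(radiated_power(4.0, T)))  # 4091 -- about 4,000 watts out
```

Output still beats input at this size, but the input grew by a factor of 8 while the output only grew by a factor of 4, which is exactly the squeeze the scaling argument predicts.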
So the larger your space computer, the harder it is to cool. If you were picturing an orbiting Walmart-size structure, like the data centers on Earth, that's just not going to happen. It would melt.

Of course, you could add external radiation panels. The International Space Station has these. How big would they have to be? Well, say your data center runs on 1 megawatt. Then you’d need a radiating area of at least 980 square meters. This is getting out of hand. Oh, and these radiators aren't like solar panels, connected by wires. They need systems to conduct heat away from the processors out to the panels. The ISS pumps ammonia through a network of pipes for this. That means even more material, which makes it that much more expensive to hoist into orbit.

So let’s take stock. Even though we set this up with favorable assumptions, it’s not looking very good. We’re not even taking into account the fact that solar radiation will heat up the computer as well, which will require even more cooling. Or that intense solar radiation will likely damage the electronics over time. And how do you make repairs?

One thing is clear: Because cooling is inefficient in space, your “data center” would have to be a swarm of small satellites with better area-to-volume ratios, not a few large ones. That’s what most proponents, like Google’s Project Suncatcher, are now suggesting. Elon Musk’s SpaceX has already requested FCC permission to launch a million small AI satellites into orbit.

Hmm. Low Earth orbit is already congested with 10,000 active satellites and some 10,000 metric tons of space junk. The risk of collisions, maybe even catastrophic Kessler cascades, is already real. And we’re going to add a hundred times as many satellites? All I can say is, “Look out below.”
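For anyone who wants to check the radiator math above, here's the calculation. It again assumes a perfect radiator held at 200 F, so the real required area would be even larger:

```python
# Radiator area needed to dump 1 megawatt of heat in orbit.
# Sketch assuming a perfect radiator (emissivity = 1) at 200 F;
# real radiators are less efficient, so this is a lower bound.
SIGMA = 5.67e-8                  # Stefan-Boltzmann constant, W/(m^2 * K^4)
T = (200 - 32) * 5 / 9 + 273.15  # 200 F in kelvin

watts_per_m2 = SIGMA * T**4       # ~1,023 W radiated per square meter
area_needed = 1e6 / watts_per_m2  # square meters to shed 1 megawatt

print(round(area_needed))  # 978 -- consistent with "at least 980 square meters"
```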
Tags: Dot Physics, Energy, Data Centers, Artificial Intelligence, Climate Change, Environment