
The Cloud Is Not Green

There is a catch.


The cloud requires immense amounts of energy to provide services to consumers, so data-center operators build where local power is cheap. That is why the National Security Agency (NSA) constructed its massive $1.5 billion facility in Utah.

But there is a catch: This data center lives in a desert. Its location receives just over one inch of rainfall a month, and temperatures routinely reach 100 degrees Fahrenheit or higher in summer. Yet here sits a facility that its planners say will consume 6,500 tons of water daily, mainly for cooling. The only way this could possibly make sense is if the savings in power make up for the use of a scarce resource, and apparently the NSA thinks they do: Officials believe the site will consume more than $40 million worth of electricity every year, and the deal to bring the data center to Utah included access to cheap water.

Companies like Google and Facebook operate some of the world’s largest data centers, and they recognize the same thing the NSA does: Cooling the densely packed computers inside these cloud facilities, stacked one atop another, rack upon rack, row upon row, requires a tremendous amount of energy.

Energy does not follow Moore’s law: unlike computing hardware, it does not get exponentially cheaper every year. Energy costs have remained mostly flat for the past decade, which means they now dominate data-center operating budgets. These costs put a cap on how much the cloud can provide to consumers. It’s a desperate situation for cloud providers, and they know it calls for desperate measures. This is why Facebook has gone to the extreme of building a data center north of the Arctic Circle, and why Google is experimenting with data centers cooled by seawater.
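To see why flat energy prices come to dominate, consider a rough, illustrative calculation (the starting figures below are assumptions chosen only to show the trend, not measurements): if hardware costs halve roughly every two years while electricity prices stay flat, energy's share of the total bill climbs steadily.

```python
# Illustrative only: assumed starting costs in arbitrary units.
hardware_cost = 100.0  # annualized hardware cost per rack
energy_cost = 50.0     # annual electricity cost per rack (stays flat)

for year in range(0, 11, 2):
    share = energy_cost / (hardware_cost + energy_cost)
    print(f"year {year:2d}: energy is {share:.0%} of total cost")
    hardware_cost /= 2  # Moore's-law-style halving every two years
```

With these assumed numbers, energy grows from a third of the bill to over 90 percent within a decade, even though its price never rises.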

While these efforts are admirable in their inventiveness, they amount to a game of Whac-A-Mole. Remote data centers built in frigid locations increase construction, transportation, and staffing costs. Data centers built near seawater are more vulnerable to flooding and storm damage, and Google had to invest more than $600 million upfront to bring its seawater-cooled data center to life.

The fact that companies are willing to go to these lengths is an admission that cloud energy costs are a big problem. Recent studies confirm this: one overview of cloud electricity usage found that if data centers were their own nation, they would be the world's fifth-largest consumer of electricity; only the U.S., China, Russia and Japan use more power annually.

Why do data centers consume so much power?

When you store data in the cloud, that data has to travel, sometimes over thousands of miles, through many hops of electrically powered networking equipment. Computing equipment also generates heat, and when it is packed together tightly, it generates a lot of heat. To deal with it, chilled, conditioned air is forced through the system 24 hours a day. The fans in these boxes run at such high speeds, and are so loud, that many data centers are classified as occupational noise hazards requiring earplugs to enter.
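The industry measures this overhead with a metric called Power Usage Effectiveness (PUE): total facility power divided by the power that actually reaches the computing equipment. A minimal sketch, with hypothetical wattages:

```python
def pue(it_kw: float, cooling_kw: float, other_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.

    A PUE of 1.0 would mean zero overhead; real facilities are always higher.
    """
    return (it_kw + cooling_kw + other_kw) / it_kw

# Hypothetical facility: 1 MW of computing load plus cooling and
# power-distribution overhead. Every watt of computing drags nearly
# another watt of overhead behind it.
print(pue(it_kw=1000, cooling_kw=700, other_kw=200))  # 1.9
```

Cooling is typically the largest slice of that overhead, which is why operators chase cold climates and cheap water.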

Because a single data center facility might need more power than a small town, power utilities often have to build new substations or other infrastructure to support them. Utilities can’t afford to do that unless the data center promises to use a set level of power per month over a number of years, so data centers often enter into contracts with stiff financial penalties if their power usage doesn’t match predictions. In more than one instance, data centers have literally burned excess power just to avoid those fines. And data centers can’t count on local utilities providing power with the uptime the cloud demands, so they must also purchase and run on-premise diesel backup generators. These generators can’t simply sit idle; they must be routinely fired up, burning fuel, to remain operational when needed.
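A minimal sketch of how such a minimum-take clause creates that perverse incentive (all contract terms below are hypothetical, invented for illustration):

```python
# Hypothetical take-or-pay contract terms, invented for illustration.
MIN_TAKE_MWH = 10_000       # contracted minimum consumption per month
ENERGY_RATE = 40.0          # $/MWh for power actually consumed
SHORTFALL_PENALTY = 60.0    # assumed $/MWh fine on any shortfall

def monthly_bill(consumed_mwh: float) -> float:
    """Pay for what you use, plus a penalty on any shortfall below the minimum."""
    shortfall = max(0.0, MIN_TAKE_MWH - consumed_mwh)
    return consumed_mwh * ENERGY_RATE + shortfall * SHORTFALL_PENALTY

print(monthly_bill(8_000))   # 440,000: under-consuming triggers the fine
print(monthly_bill(10_000))  # 400,000: consuming the full minimum costs less
```

When the penalty rate on a shortfall exceeds the energy rate, as in this hypothetical, consuming (or burning) power up to the minimum is literally the cheaper option.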

In a world in which we are increasingly tied to our mobile devices, we are also increasingly contributing to cloud power consumption. A recent report estimated that the total annual energy consumed by a single mobile phone used to watch one hour of video per week exceeds that of a typical kitchen refrigerator.
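That result only makes sense if you count the whole delivery chain: the handset, the wireless network, and the data center behind it. Here is a back-of-envelope sketch of how such an estimate is constructed; every parameter is an illustrative assumption, not a figure from the report:

```python
# Every figure below is an illustrative assumption, not the report's data.
PHONE_KWH_PER_YEAR = 5.0      # charging the handset itself
CHAIN_KWH_PER_GB = 18.0       # assumed network + data-center energy per GB
GB_PER_HOUR_OF_VIDEO = 0.37   # assumed streaming bitrate
HOURS_PER_YEAR = 52           # one hour of video per week

total_kwh = (PHONE_KWH_PER_YEAR
             + HOURS_PER_YEAR * GB_PER_HOUR_OF_VIDEO * CHAIN_KWH_PER_GB)
print(f"~{total_kwh:.0f} kWh/year")  # ~351 kWh with these assumptions
```

Note where the energy sits: with these assumptions, charging the handset itself is a rounding error, and nearly everything is upstream in the network and the cloud.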

What is the answer?

As an industry, we should minimize data center energy costs any way we can, continuing to find innovative ways to increase efficiency inside data centers. But many of today’s solutions address symptoms rather than root causes, and it is becoming increasingly obvious that the highly centralized nature of data centers is the dominant contributor to energy inefficiency. Fundamentally, there is no reason the cloud needs to be architected this way. No law of nature prevents us from building less-centralized systems, where devices and computing resources live at or near the edge of the network instead of in a handful of centralized locations. Such an architecture sheds much of the cooling problem entirely and can drastically reduce the power overhead associated with data centers.

I’m not the first to suggest this. The early Internet itself heralded decentralization, its pioneers eschewing centralized solutions as a core design principle. Others have suggested making data centers smaller, or even moving racks of data center equipment into homes or businesses. My own company, Space Monkey, hopes to make data centers obsolete for cloud storage.

The cloud isn’t green, and its negative environmental impact grows each day. To fix that, we need to go beyond today’s highly centralized data centers. We need to decentralize the cloud. This not only makes sense from a sustainability perspective; the efficiency gained by putting resources closer to users also translates into tangible end-user benefits in cost, speed, capacity and safety.

Alen Peacock is co-founder of Space Monkey. Follow him @spacemonkey.

This article originally appeared on Recode.net.