“Data is the new oil” may have outlasted its usefulness as a metaphor, but one aspect still rings true: Both industries have a serious environmental footprint. According to the Department of Energy, data centers account for about 2 percent of all electricity use in the US.
That means the cloud—which powers every Netflix binge, PUBG match, and email—has a lining made not of silver, but of carbon. For individuals, the bits in question don’t amount to much. The digital footprints of businesses, however, can be large enough to ding the environment. For them, finding the greenest way to store their data would help cut down on their emissions. But how does a high-minded plutocrat go about that? The answers are not always obvious.
The top three cloud providers—Amazon Web Services, Google Cloud, and Microsoft Azure—account for approximately two-thirds of all rentable computing services, so WIRED has compiled a guide to help you understand how they decarbonize your data.
What Makes a Cloud Green?
Some companies still store their data in blinking black boxes in a hallway closet. Others have such massive computing needs that they’ve built their own data centers. For everyone in between, there are basically three options: pay Amazon, Microsoft, or Google for the privilege of stuffing your data into one of their mind-bogglingly large “hyperscale” server farms.
To assess the relative greenness of different clouds, Jonathan Koomey, an expert on the topic, highlights three metrics: the efficiency of a data center’s infrastructure (lights, cooling, and so on), the efficiency of its servers, and the source of its electricity.
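The standard industry measure for the first of those metrics, infrastructure efficiency, is power usage effectiveness, or PUE: the total energy a facility draws divided by the energy consumed by the IT equipment alone. (The article doesn’t name PUE; it’s offered here as context, and the numbers below are purely illustrative.)

```python
# Power usage effectiveness (PUE): total facility energy divided by
# the energy used by the IT equipment itself. A PUE of 1.0 would mean
# zero overhead for cooling, lighting, and power distribution;
# hyperscale operators typically report values near 1.1.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

# Illustrative figures: 1,100,000 kWh drawn by the whole building,
# 1,000,000 kWh of that consumed by the servers themselves.
print(pue(1_100_000, 1_000_000))  # 1.1
```

A lower PUE means less of the electricity bill goes to overhead, which is one reason moving data into a hyperscale cloud usually cuts energy use.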
Each of the Big Three cloud providers has ironed out inefficiencies in the hardware and software running in their data centers. They run virtual machines on their servers to limit downtime, install custom cooling systems, automate wherever possible, and so on. This ruthless pursuit of efficiency has helped the data center industry keep its energy needs fairly constant over the past decade. It also means that when companies move their data from in-house servers to the cloud, they will almost certainly end up reducing their energy consumption.
It won’t stay that way forever, warns Dale Sartor, a staff scientist at the Lawrence Berkeley National Laboratory who studies energy efficiency. Someday we’ll hit a tipping point, when most organizations have already moved their data centers offsite. Then the energy demands of the cloud will start to rise. “I don’t think anybody envisions a reduction in the growth of our appetite for computation,” Sartor says. “So the chances we’re going to see an explosion in energy use sometime in the next couple of decades is pretty high.”
That’s why a critical measure of a data center’s greenness is the source of its energy. The Big Three have all pledged to completely decarbonize their data centers, but none has entirely ditched fossil fuels yet.
To clean up their carbon footprints, these companies lean heavily on a tool known as a renewable energy credit, which is basically a token representing a utility’s green energy generation. RECs are how companies like Google and Microsoft can claim their data centers are powered 100 percent by renewables while still being connected to grids that use fossil fuels. In reality, only a fraction of each company’s energy comes directly from solar or wind installations; the rest comes from RECs.
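The REC accounting described above boils down to simple arithmetic: a “100 percent renewable” claim holds if direct renewable generation plus purchased credits covers total consumption, regardless of what the local grid actually burned. A minimal sketch, with made-up numbers:

```python
# Hypothetical sketch of annual REC-based accounting.
# All figures are invented for illustration.

def claim_is_100_percent_renewable(total_mwh: float,
                                   direct_renewable_mwh: float,
                                   recs_mwh: float) -> bool:
    """The claim holds if direct renewable generation plus purchased
    RECs covers total annual consumption, even though the electrons
    actually consumed may have come from fossil-fuel plants."""
    return direct_renewable_mwh + recs_mwh >= total_mwh

total = 1_000_000   # MWh consumed over the year
direct = 400_000    # MWh drawn directly from wind/solar deals
recs = 600_000      # MWh of renewable energy credits purchased

print(claim_is_100_percent_renewable(total, direct, recs))  # True
```

Note what the arithmetic hides: the claim says nothing about *when* or *where* the renewable energy was generated relative to the data center’s actual demand.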
Calculating the greenness of a cloud is rife with nuanced distinctions. In the report card below, we’ve highlighted some of the most important factors to consider if you’re looking to decarbonize your data.
Google Cloud

What they say:
Of the Big Three, Google has the smallest share of the market, but it has arguably done the most to decarbonize its data. In 2017, the company announced that it had achieved 100 percent renewable energy across all of its operations, including its data centers. It claims that all data processed by Google Cloud has “zero net carbon emissions.”
How they do it:
Google says it is the “largest corporate buyer of renewable energy in the world,” and in September it increased its renewable energy portfolio by 40 percent through power purchase agreements with utilities around the world. These types of deals are designed to fund the construction of new renewable energy projects in exchange for access to their energy once they’re online. The idea, which Google helped pioneer, is to expand renewable resources on the grid.
In addition to renewable energy plays, Google uses machine learning to continuously optimize its data centers. An algorithm trained on historical weather data, for example, knows how to tweak a data center’s cooling system in response to the environment, says Joe Kava, Google’s vice president of data centers. The system samples weather conditions every five minutes, so if there’s a sudden drop in temperature, the facility knows to devote less energy to cooling the servers.
What’s the catch?
In 2018, Google started an oil and gas division with the explicit aim of attracting the fossil fuel industry. The company promised that its machine-learning tools combined with its cloud service could help those companies better act on their data—in other words, helping them extract oil and gas from existing reserves faster and more efficiently.
It’s also worth noting that in parts of the world with little to no renewable energy installations, Google’s data centers still rely on fossil fuels. To atone, the company purchases RECs.
What to expect:
Google Cloud does its carbon accounting on an annual, global basis: at the end of each year, the company tallies up its energy use and its renewable energy purchases and makes sure they’re equal. Kava says Google ultimately wants its data centers to be powered by 100 percent renewable electricity on an hourly basis. That goal is far more ambitious: the sun isn’t always shining and the wind isn’t always blowing, but the internet firehose never shuts off. Meeting it will require not only more renewable installations but also a host of new technologies, such as long-term energy storage.
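The difference between annual and hourly matching can be sketched in a few lines. Under annual accounting, totals only need to balance over the year; under hourly accounting, renewable supply must cover demand in every single hour. The hour-by-hour figures below are invented to show why a solar-heavy portfolio can pass the first test and fail the second:

```python
# Annual vs. hourly renewable matching, contrasted on a toy day.

def annual_match(demand, supply):
    # Annual accounting: totals only need to balance over the period.
    return sum(supply) >= sum(demand)

def hourly_match(demand, supply):
    # Hourly accounting: renewable supply must cover every single hour.
    return all(s >= d for d, s in zip(demand, supply))

# Toy day: constant data-center demand, solar-heavy supply (MWh/hour).
demand = [10] * 24
supply = [0] * 6 + [20] * 12 + [0] * 6  # daytime surplus, nighttime gap

print(annual_match(demand, supply))  # True: 240 >= 240 over the day
print(hourly_match(demand, supply))  # False: nighttime hours uncovered
```

Closing that nighttime gap is exactly where more renewable capacity and long-term storage come in.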
Making better use of the renewables that do exist will help stretch them further. Earlier this year, the data center team partnered with Google’s AI unit, DeepMind, to build a machine-learning model that predicts wind farm output up to 36 hours in advance. Utilities could use this information to better plan around wind variability and, as a result, increase the amount of wind energy available on the grid.