Yes, but what about the 'carbon emissions' of PlayStation® climatology computers?

From the department of ‘press releases we never quite bothered to finish reading’ comes this from Yale’s Journal of Industrial Ecology:

Here, “extreme weather effects from climate change” are modeled, such as this fire tornado, which we all know is caused by climate change and nothing else.

Console Games and Climate Change – Researchers Reveal Carbon Emissions of PlayStation®3 Game Distribution

It’s not always true that digital distribution of media will have lower carbon emissions than distribution by physical means, at least when file sizes are large.

That’s the conclusion of a study published in Yale’s Journal of Industrial Ecology that looked at the carbon footprint of games for consoles such as PlayStation®3. Researchers found that Blu-ray Discs delivered via retail stores caused lower greenhouse gas emissions than game files downloaded over broadband Internet. For their analysis, the investigators estimated total carbon equivalent emissions for an 8.8-gigabyte game because data for 2010 indicated that to be the average game size. The bulk of emissions resulted from game play, followed by production and distribution.
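For a rough sense of scale, the download-side numbers can be sketched in a few lines. The 0.46–1.46 kWh/GB transfer range is the figure a commenter below attributes to the paper; the grid carbon intensity of 0.5 kg CO2e/kWh is an illustrative assumption, not a value from the study:

```python
# Rough sanity check: energy (and CO2e) to download one 8.8 GB game.
# The 0.46-1.46 kWh/GB range is attributed to the study by a commenter;
# the 0.5 kg CO2e/kWh grid intensity is an assumed round number.

GAME_SIZE_GB = 8.8            # 2010 average game size, per the study
KWH_PER_GB_LOW, KWH_PER_GB_HIGH = 0.46, 1.46
GRID_KG_CO2E_PER_KWH = 0.5    # assumed average grid carbon intensity

for kwh_per_gb in (KWH_PER_GB_LOW, KWH_PER_GB_HIGH):
    energy_kwh = GAME_SIZE_GB * kwh_per_gb
    co2e_kg = energy_kwh * GRID_KG_CO2E_PER_KWH
    print(f"{kwh_per_gb} kWh/GB -> {energy_kwh:.1f} kWh, ~{co2e_kg:.1f} kg CO2e")
```

Even under these crude assumptions the download energy spans a factor of three, which is why the disc-versus-download comparison is so sensitive to the transfer-energy factor chosen.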

The Internet will become more efficient with time, but game file sizes are likely to continue to increase, making predictions about the relationship between online services and climate change a matter for further research.

Full Citation: Mayers, K., Koomey, J., Hall, R., Bauer, M., France, C. and Webb, A. (2014), The Carbon Footprint of Games Distribution. Journal of Industrial Ecology. doi: 10.1111/jiec.12181


Report here: JIEC Games Distribution PDF


NCAR’s dirty little secret:

WUWT readers of course have heard about the Met Office and their giant new supercomputer called “deep black” that they use for climate simulation and short term forecasts.

Not to be outdone, the National Center for Atmospheric Research (NCAR) in Boulder, CO has commissioned a new supercomputer project of their own: the NCAR-Wyoming Supercomputing Center (NWSC), shown in the artist’s rendering below.

Measuring 108,000 square feet in total with 15,000-20,000 square feet of raised floor, it will be built for 8 megawatts of power, with 4-5 megawatts for computing and 3-4 for cooling.
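Taken at face value, those figures imply a power usage effectiveness (PUE, total facility power divided by computing power) between 1.6 and 2.0. A trivial sketch, using only the numbers quoted above:

```python
# PUE (power usage effectiveness) = total facility power / IT power.
# NWSC figures quoted above: 8 MW total, 4-5 MW of that for computing.

TOTAL_MW = 8.0

for it_mw in (4.0, 5.0):
    pue = TOTAL_MW / it_mw
    print(f"{it_mw} MW computing -> PUE {pue:.1f}")
```

For comparison, a PUE of 2.0 means a full watt of cooling and overhead for every watt of computation.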

(it runs on coal fired electricity from Wyoming, BTW)


September 2, 2014 4:51 pm

Why NCAR’s new climate supercomputer games won’t accomplish diddly-squat:
New paper explains why a new approach to climate modeling is necessary

September 2, 2014 5:08 pm

Will they be able to play Super Mario Brothers?

Jeff L
September 2, 2014 5:10 pm

Wow… they couldn’t think of anything better to research? I am guessing taxpayers are footing the bill via grant money. Sad.

September 2, 2014 5:19 pm

Where is the study of carbon emissions from the education and make-work employment of fake scientists engaged in inconsequential research?

Reply to  jimash1
September 2, 2014 5:34 pm

AND 10,000 or so attendees flying to exotic locations for useless climate conferences?

September 2, 2014 6:00 pm

The new computers will probably spend most of their time minting bitcoins for crooked tech staff.
It’s the perfect crime. It’s almost impossible to make money mining bitcoins these days, if you have to pay for the electricity.
But it’s very possible to make millions of dollars, if someone else is footing the bills.
The crime is difficult to detect – no money goes missing, the only evidence is a small blip on the organisation’s electricity bill.

Leon Brozyna
September 2, 2014 6:08 pm

Yet another “study” in a never-ending string of such studies that, when stripped of all their gobbledygook, amount to nothing less than a call for us all to sacrifice, to give up, surrender, to live a life of austerity … all for the benefit of the elite.

September 2, 2014 6:37 pm

Who cares?

September 2, 2014 6:40 pm

Since this is akin to measuring the weight of mosquito nuts versus lice nuts while bull elephants are wandering around the same room, the first thought that came to my mind was in regard to what marketing department was trying to influence consumer choices in what manner. Skimming through to the acknowledgments, we indeed find that the lead author:
Kieren Mayers is presently Head of Environment and Technology Compliance at Sony Computer Entertainment Europe. The article is written in his private capacity as a researcher at INSEAD. The research does not necessarily reflect the views of Sony, and no official endorsement should be inferred.
LOL. No vested interest there, Kieren?

Gary Hladik
Reply to  davidmhoffer
September 2, 2014 8:41 pm

I skimmed the paper, couldn’t find the relative size of game distribution CO2 emissions to, say, general transportation. I suspect it’s even less than, um, lice to elephants.
Given the lack of data, number of assumptions, and uncertainties in estimates, I’m not surprised that this study fails to replicate others. I could make up a conclusion just as valid as the one they worked so hard to get. The authors themselves admit that their result may or may not be valid for the near future.
Hope they at least had as much fun as the other gamers. 🙂

Reply to  davidmhoffer
September 2, 2014 9:57 pm

In other words, “don’t download the games, buy the Blu-ray Discs” so that Sony gets a share of the pie. LOL

Reply to  davidmhoffer
September 3, 2014 6:26 am

They should sack him.

September 2, 2014 6:51 pm

“Here, “extreme weather effects from climate change” are modeled, such as this fire tornado…”
And in the foreground game hero “Weepy” McKribben, lone survivor in a post AGW fantasy world, does desperate battle with his last “ice cube” attack as his Sanity Points dwindle…

Reply to  Konrad
September 3, 2014 5:21 am

“Weepy” McKribben,

Make that “Weepy Willy McKibben”–it fits because his first name is Bill.
“Weepy Willy” was an epithet in common circulation about 60+ years ago–it deserves reviving.

September 2, 2014 6:55 pm

Sept 2, 2024. NY NY, MexiUSA. The UN confirmed today that the chief cause of climate change has been determined: it is caused by running climate models on massive carbon-spewing HAL computers. Climate models produced by these computers show that, unless the plug is pulled immediately on these carbon hurlers, the Earth will soon enter a climate death spiral.
However, Chief Scientist Morkle Mangg, representing the government-funded Consensus R Us Consortium of Concerned Scientists (CRUCCS), disputed the results. He said a cautious, go-slow approach was appropriate: “Really, we don’t know what in HAL is going on”, he said. “This is no time for alarmism. A bit of skepticism is in order here, a bit of epistemological humility. Look, climate is not some rinky-dink greenhouse glass closed system that can be known by tree rings; it is unbelievably complex; there are unknown knowns, known unknowns, known knowns, unknown unknowns — none of us really has a clue what’s going on. What we are sure of is this: this is no time to pull the plug on anything — especially climate science funding. Of that we are absolutely certain.”

F. Ross
September 2, 2014 7:55 pm

Same olde GIGO but much faster, doncha’ know.

Dave Wendt
September 2, 2014 8:26 pm

Supercomputers are another product we pioneered in which we are now very much second-rate.
Tianhe-2 (MilkyWay-2) – TH-IVB-FEP Cluster, Intel Xeon E5-2692 12C 2.200GHz, TH Express-2, Intel Xeon Phi 31S1P
Site: National Super Computer Center in Guangzhou
Manufacturer: NUDT
Cores: 3,120,000
Linpack Performance (Rmax): 33,862.7 TFlop/s
Theoretical Peak (Rpeak): 54,902.4 TFlop/s
Nmax: 9,960,000
Power: 17,808.00 kW
Memory: 1,024,000 GB
Processor: Intel Xeon E5-2692v2 12C 2.2GHz
Interconnect: TH Express-2
Operating System: Kylin Linux
Compiler: icc
Math Library: Intel MKL-11.0.0
MPI: MPICH2 with a customized GLEX channel
China’s Tianhe-2 Supercomputer Retains Top Spot on 43rd Edition of the TOP500 List
June 22, 2014, 9:09 a.m.
LEIPZIG, Germany; BERKELEY, Calif.; and KNOXVILLE, Tenn.—For the third consecutive time, Tianhe-2, a supercomputer developed by China’s National University of Defense Technology, has retained its position as the world’s No. 1 system with a performance of 33.86 petaflop/s (quadrillions of calculations per second) on the Linpack benchmark, according to the 43rd edition of the twice-yearly TOP500 list of the world’s most powerful supercomputers.

September 2, 2014 9:20 pm

So, how do they figure out the related carbon emissions from the plastic, burning, packaging, mailing, transportation, and transferring of the disc files to the consumer’s computer? All of that HAS to be considered.
In calculating the energy budget of biofuels, there are usually a lot of energy-invested items that are conveniently overlooked. For corn ethanol, they entirely skip over the energy cost of manufacturing the equipment dedicated to that industry, both on the farm and in the factory, and focus only on the costs of using the equipment. When all is properly totaled, such biofuels are a clear loss and only a feel-good waste of money, land, equipment, and human time, as well as taxpayer funds in the subsidies.

September 2, 2014 9:30 pm

Yea, but can it run Crysis 3?

Mac the Knife
September 2, 2014 9:50 pm

If you’re worried about your ‘carbon foot prints’, just remember to wipe your feet on the doormat before you come in the door.

September 2, 2014 10:28 pm

Still…. GIGO
Are you sure you have established all the variables ?

September 2, 2014 10:32 pm

They will only get anything sensible as output from the NCAR super-computer if they put something sensible in to it. This immediately precludes the use of anything based on CAGW theory. At present a five day weather forecast looks ambitious.

September 2, 2014 10:45 pm

And what about all the CO2 generated by the Krebs cycle digesting food?
Mick G

September 2, 2014 11:22 pm

it’s emissions from China that will break the back of CAGW:
some reality at last by Edward P. Lazear, chairman of the President’s Council of Economic Advisers (2006-09) and head of the White House committee on the economics of climate change (2007-08), & professor at Stanford University’s Graduate School of Business and a Hoover Institution fellow:
2 Sept: WSJ: Edward P. Lazear: The Climate Change Agenda Needs to Adapt to Reality
Limiting carbon emissions won’t work. Better to begin adjusting to a warmer world.
Were China to continue at this pace for 27 years until it reaches today’s U.S. GDP per capita, it would emit 99 gigatons of carbon in 2041 alone, or three times the world’s current emissions…
Unless an economical low-carbon source of power generation becomes available, it is unrealistic to expect that countries, especially developing ones, will accede to any demand to produce power in a higher-cost manner merely to emit less carbon…
Proponents of strong anti-carbon measures seem to believe that even considering an alternative to mitigation will weaken the public’s willingness to bear the costs of mitigation.
Carbon math makes clear that without major effort and a good bit of luck, we are unlikely to control the growth of emissions enough to meet the standards that many climate scientists suggest are necessary. It is time to end the delusions and start thinking realistically about what can and will be done.

September 3, 2014 2:57 am

how many millions of peripherals got turfed when they wouldn’t run with Mr “I’m so clean/green/caring” Gates’ cruddy Vista program?
then how many billions of perfectly good PCs got ditched when 7 and 8 made it impossible to run on anything but a super-duper new and expensive PC?
but dorkboy reckons he can save us by geo-engineering a planet he couldn’t care less about, if actions are louder than words…

The Iconoclast
September 3, 2014 5:41 am

Based on browsing it, they actually worked pretty hard on the paper. The thing I’m most suspicious of is the claim that it takes 0.46 to 1.46 kWh to transfer 1 GB. I think that is wicked high. A single 1.75″ tall rackmount server simply serving files can sustain 10 Gbit/s and no way is it using more than a couple hundred watts. That’s like 20,000 GB/kWh. They assumed 8 GB/kWh for the server.
There are tons of reasons why e-distribution is better, and in the long run the energy required to transmit is going down even though, as they say, games will get bigger.
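That back-of-envelope arithmetic checks out; a quick sketch (the 10 Gbit/s throughput and ~200 W draw are the commenter’s round numbers, not measurements):

```python
# Reproducing the commenter's estimate: a file server sustaining
# 10 Gbit/s while drawing about 200 W.

GBIT_PER_S = 10
WATTS = 200

gb_per_hour = GBIT_PER_S / 8 * 3600   # gigabytes served per hour
kwh_per_hour = WATTS / 1000           # energy consumed per hour
gb_per_kwh = gb_per_hour / kwh_per_hour
print(f"~{gb_per_kwh:,.0f} GB/kWh")   # vs. the study's assumed 8 GB/kWh
```

That works out to roughly 22,500 GB/kWh for the server alone, more than three orders of magnitude above the study’s assumed 8 GB/kWh, though the study’s figure presumably folds in network infrastructure well beyond a single server.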

Rob Potter
Reply to  The Iconoclast
September 3, 2014 6:44 am

More to the point, what energy is used by the server when it is NOT distributing the game file? Probably close to the same amount so there would actually be no saving from not downloading the game, but there would be a saving from not making a new Blu-Ray disc to replace the one that is still on the shop shelf.
This is another argument like the “biofuels reduce CO2 emissions” argument in that everyone starts with an opinion and then adjusts where they draw the lines on what bits to include or exclude in the energy calculations.
