There’s a runway joke in here somewhere, but it seems that this is a pitch for a new satellite program.
From the National Physical Laboratory
Uncertain climate models impair long-term climate strategies
New calibration satellite required to make accurate predictions, say scientists
A new paper published in Philosophical Transactions of the Royal Society A explains weaknesses in our understanding of climate change and how we can fix them. These issues mean predictions vary wildly about how quickly temperatures will rise. This has serious implications for long-term political and economic planning. The paper's lead author is Dr Nigel Fox of the National Physical Laboratory, the UK's National Measurement Institute.
The Earth’s climate is undoubtedly changing, but how fast and what the implications will be are unclear. Our most reliable models rely on data acquired through a range of complex measurements. Most of the important measurements – such as ice cover, cloud cover, sea levels and temperature, chlorophyll (oceans and land) and the radiation balance (incoming to outgoing energy) – must be taken from space and, for constraining and testing the forecast models, must be made over long timescales. This presents two major problems.
Firstly, we have to detect small changes in the levels of radiation or reflection from a background fluctuating as a result of natural variability. This requires measurements to be made on decadal timescales – beyond the life of any one mission, and thus demands not only high accuracy but also high confidence that measurements will be made in a consistent manner.
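The link between measurement uncertainty and detection time can be sketched in a few lines. The numbers below are illustrative assumptions, not figures from the paper: for annually spaced samples with independent noise of standard deviation sigma, the standard error of an ordinary-least-squares slope shrinks as sqrt(12/(n^3 - n)), so a noisier record needs many more years before a small trend stands clear of the scatter.

```python
import math

def years_to_detect(trend_per_yr, sigma, z=2.0):
    """Smallest number of annual samples n for which the trend exceeds
    z standard errors of an ordinary-least-squares slope estimate.
    For unit-spaced samples, SE(slope) = sigma * sqrt(12 / (n**3 - n))."""
    n = 3
    while trend_per_yr < z * sigma * math.sqrt(12.0 / (n**3 - n)):
        n += 1
    return n

# Illustrative numbers (assumptions): a 0.02 K/yr trend seen through
# effective noise of 0.1 K versus 0.3 K per annual sample.
print(years_to_detect(0.02, 0.1))  # -> 11  (better-constrained record)
print(years_to_detect(0.02, 0.3))  # -> 23  (noisier record detects later)
```

Real climate records are autocorrelated, so actual detection times are longer than this white-noise sketch suggests; the point is only how steeply the waiting time grows with measurement uncertainty.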
Secondly, although the space industry adheres to high levels of quality assurance during manufacture, satellites – particularly optical ones – usually lose their calibration during launch, and it drifts further over time. Similar ground-based instruments would be regularly calibrated, traceable to a primary standard, to ensure confidence in the measurements. This is much harder in space.
The result is varying model forecasts. Estimates of global temperature increases by 2100 range from ~2–10 °C. Which of these is correct is important for making major decisions about mitigating and adapting to climate change: for instance, how quickly serious and life-threatening droughts are likely to appear, and in which parts of the world; or if and when we need to spend enormous sums on a new Thames barrier. The change forecast by all the models is very similar for many decades, only deviating significantly towards the latter half of this century.
Dr Nigel Fox, head of Earth Observation and Climate at NPL, says: “Nowhere are we measuring with uncertainties anywhere close to what we need to understand climate change and allow us to constrain and test the models. Our current best measurement capabilities would require >30 yrs before we have any possibility of identifying which model matches observations and is most likely to be correct in its forecast of consequential potentially devastating impacts. The uncertainties needed to reduce this are more challenging than anything else we have to deal with in any other industrial application, by close to an order of magnitude. It is the duty of the science community to reduce this unacceptably large uncertainty by finding and delivering the necessary information, with the highest possible confidence, in the shortest possible time.”
The solution put forward by the paper is the TRUTHS (Traceable Radiometry Underpinning Terrestrial- and Helio- Studies) mission, a concept conceived and designed at NPL. This would see a satellite launched into orbit with the ability not only to make very high accuracy measurements itself (a factor of ten improvement) but also to calibrate and upgrade the performance of other Earth Observation (EO) satellites in space. In essence it becomes “NPL in Space”.
The TRUTHS satellite makes spectrally resolved measurements of incoming solar radiation and the radiation reflected from the ground, with a footprint similar in size to half a rugby field. The unprecedented accuracy allows benchmark measurements to be made of key climate indicators such as the amount of cloud, albedo (the Earth’s reflectance) and solar radiation, at a level which will allow differences between climate models to be detected in a decade (a third of the time needed by existing instruments). Its data will also enable improvements in our knowledge of climate and environmental processes such as aerosols, land cover change, pollution and the sequestration of carbon in forests.
Not only will it provide its own comprehensive, climate-critical data sets, it can also facilitate an upgrade in performance of much of the world’s Earth observing system as a whole, both satellite and ground data sets. By performing reference calibrations of other in-flight sensors through near-simultaneous observations of the same target, it can transfer its calibration accuracy to them. Similarly, its ability to make high accuracy corrections for atmospheric transmittance allows it to calibrate ground networks measuring changes at the surface (e.g. flux towers and forests) and other reference targets currently used by satellites, such as the snowfields of Antarctica, deserts, oceans and the Moon. In this way it can even back-correct the calibration of sensors in flight today.
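At its core, the reference-calibration step amounts to deriving a gain correction from a matched pair of observations. The sketch below is a deliberately simplified toy model with hypothetical numbers: real cross-calibration must also account for spectral band differences, viewing geometry and the atmospheric path between the two sensors.

```python
def transfer_gain(reference_radiance, sensor_radiance):
    """Multiplicative gain correction for a drifting sensor, derived from a
    near-simultaneous observation of the same target by a reference
    instrument (toy model; ignores band and geometry mismatches)."""
    return reference_radiance / sensor_radiance

# Hypothetical numbers: the drifted sensor reads 5% low on a target that
# the reference instrument measures at 100.0 radiance units.
k = transfer_gain(100.0, 95.0)
print(round(k * 95.0, 1))  # -> 100.0, the corrected sensor recovers the reference
```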
TRUTHS will be the first satellite to have high accuracy traceability to SI units established in orbit. Its own measurements, and in particular the calibration of other sensors, will not only aid our understanding of climate change but also facilitate the establishment and growth of commercial climate and environmental services. One of the barriers to this market’s growth is customer confidence in the results and long-term reliability of service. TRUTHS would enable a fully interoperable global network of satellites and data with robust, trustable guarantees of quality and performance.
The novelty of TRUTHS lies in its on-board calibration system. The instruments on the TRUTHS satellite will be calibrated directly against an on-board primary standard – an instrument called a CSAR (Cryogenic Solar Absolute Radiometer). This compares the heating effect of optical radiation with that of electrical power – transferring all the difficulties associated with existing space-based optical measurements (drift, contamination, etc.) to more stable electrical SI units. In effect, this reproduces in orbit the traceability chain normally carried out on the ground.
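The electrical-substitution principle behind the CSAR can be captured in a couple of lines. This is a simplified sketch with made-up numbers: a real cryogenic radiometer servos the heater power in a feedback loop and corrects for cavity absorptance and lead heating, none of which is modelled here.

```python
def substituted_power(delta_t, cavity_gain):
    """Optical power (W) inferred by electrical substitution: the electrical
    heater power that reproduces the same cavity temperature rise delta_t (K),
    given the cavity's responsivity cavity_gain (K per W)."""
    return delta_t / cavity_gain

# Hypothetical cavity: 5 K/W responsivity; the beam warms it by 0.005 K, so
# the optical power is read off as an SI electrical quantity.
print(substituted_power(0.005, 5.0))  # -> 0.001 (W, i.e. 1 mW)
```

Because the heater power is known from voltage and resistance, the optical watt is tied to the electrical SI units, which are far easier to keep stable in orbit than an optical standard.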
This would make climate measurements ten times more accurate and give us models on which we could base important decisions about the future.
The project, which would be led by NPL, is being considered by different organisations. The European Space Agency has recommended looking into ways to take it forward, possibly as a collaboration with other space agencies. NASA is also keen to collaborate formally.
Nigel concludes: “Taking this forward would be an excellent investment for the UK, or any other country which supports it. This is not only an effective way to address the problem of understanding climate change, but also an excellent opportunity for business. It would grow expertise in Earth Observation and showcase the UK’s leading space expertise – an industry which is growing by 10 per cent a year. It would also provide a platform to underpin some of the carbon trading which will be a big international business in the near future.”
The full reference for the paper is:
Phil. Trans. R. Soc. A (2011) 369, 4028-4063
doi:10.1098/rsta.2011.0246
The URL after publication will be: http://rsta.royalsocietypublishing.org/lookup/doi/10.1098/rsta.2011.0246
Nigel Fox delivered a lecture on this subject as part of NPL’s Celebrating Science lecture series, which can be viewed here: http://www.youtube.com/watch?v=BalCag7fQdE&feature=player_detailpage
More details can also be found at http://www.npl.co.uk/TRUTHS
About the National Physical Laboratory
The National Physical Laboratory (NPL) is the UK’s National Measurement Institute and one of the UK’s leading science facilities and research centres. It is a world-leading centre of excellence in developing and applying the most accurate standards, science and technology available.
NPL occupies a unique position as the UK’s National Measurement Institute and sits at the intersection between scientific discovery and real world application. Its expertise and original research have underpinned quality of life, innovation and competitiveness for UK citizens and business for more than a century.
Maybe the Team will soon find themselves facing a few inconvenient TRUTHS.
Thank you.
People like DA Stainforth, in the UK, were calling attention to the inadequacy of the models way back in 2007.
The models are basically poor, and the climate scientists who work with them know that, but they are extremely reluctant to admit it in public because bang goes their funding.
Doesn’t precise measurement of earthshine provide an accurate measurement of albedo?
Note that this project has two observatories, so there are measurements from each hemisphere.
http://www.bbso.njit.edu/Research/EarthShine/
If any of you were CEO of a capitalistic company competing for market share, how many of you would go to your board of directors and thank them for the huge amounts of money spent on something with such dubious value that it needs to be essentially replaced?
I thought not.
But such is the way of climate models.
Friends:
NPL is truly a world centre of excellence in mensuration, and Nigel Fox makes good points concerning the need for accurate, precise and reliable climate data acquisition. His comments on the models are also worthy.
More than a decade ago a group of scientists from around the world was assembled to make presentations on climate change at the US Congress. There were three panels each of three persons (one of whom chaired the panel) and who each gave a presentation. The presentations from each panel were followed by a Q&A session. Fred Singer chaired the first panel on climate data, I chaired the second on climate model performance, and David Wojick chaired the third on policy implications.
The first question to the panel on climate models was voiced in an aggressive manner and was:
“There seems to be a dichotomy. The first session said the climate data is wrong, and this session says the models are wrong. Where do we go from here?”
Gerd Rainer-Weber rose to reply but – as chairman of the panel – I signalled him to desist, then I looked the questioner in the eyes and replied:
“Sir, there is no dichotomy. The climate data are right or they are not. If the climate data are right then the models cannot emulate past climate. Alternatively, the climate data are not right. In either case, we cannot assess the models’ performance and, therefore, they are not useful as predictive tools. So, I agree your question, Sir, where do we go from here?”
The questioner studied his shoes and said nothing.
More than a decade has passed since then, but I would give the same answer to that question today.
Richard
The TRUTH is out there!
(Hey, somebody had to say it.)
🙂
“It would also provide a platform to underpin some of the carbon trading which will be a big international business in the near future.”
————————————–
The question is “How will they search differently and come up with a higher number for CO2?”
Did the NPL question or did they investigate Dr Jones, the Team, climategate, or the harry read me file?
I wonder if they will be the science wing of the IPCC.
From NPL in the head post: “Similar ground based instruments would be regularly calibrated traceable to a primary standard to ensure confidence in the measurements. This is much harder in space.”
Traceable to a primary standard means the measurements would have a known absolute accuracy, as contrasted to precision, with a known accuracy standard deviation. This is exactly what has been entirely missing from the surface air temperature measurements across the entire 20th century and right up through the present.
Comparing regional time series does not test for accuracy in surface air temperatures. Regional correlations in temperature records only imply that weather-induced systematic errors — most notably from solar loading and wind speed effects — can be as regionally correlated as the weather itself. Without an external reference standard, systematic sensor measurement errors are hopelessly convoluted into the surface air temperature record. And the surface air temperature record is one of the primary fine-tuners of climate models.
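The point that agreement between records cannot expose a shared systematic error is easy to demonstrate with synthetic data (the trend, bias and noise values below are assumptions chosen for illustration, not real station records):

```python
import random

random.seed(0)
truth = [15.0 + 0.02 * t for t in range(100)]           # hypothetical true temps
bias = 0.5                                              # shared systematic error (K)
a = [T + bias + random.gauss(0, 0.1) for T in truth]    # station A record
b = [T + bias + random.gauss(0, 0.1) for T in truth]    # station B record

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    return sxy / (sxx * syy) ** 0.5

# The two records agree closely with each other (high correlation)...
print(pearson(a, b))
# ...yet both carry the same 0.5 K systematic error, which the
# station-to-station comparison cannot see.
print(sum(a) / len(a) - sum(truth) / len(truth))
```

Only an external reference standard, not another similarly biased record, can reveal the shared offset.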
mpaul your question is at once highly appropriate and a tragic commentary on how the systematic dishonesty in climate science has cast a dark shadow of legitimate suspicion over the integrity of the entire scientific enterprise.
Twenty years ago, your question would have been so far beyond reason that it likely would have occurred to no one. Today, it’s perhaps among the first questions that come unbidden to mind. No one can trust these guys anymore, and for good evidential reasons. Not only that, but virtually every single major official science organization has taken a public stand in support of a position reached dishonestly. They are no longer worthy of trust to do the right thing.
Let’s eliminate 20 of the 22 climate models, take the funding and invest in high quality measurement of all things critical to climate science.. And while we are at it, let’s measure this “positive feedback” thingy and put this issue to bed.
See Nigel’s full 55 slide presentation
Accurate radiometry from space: An essential tool for climate studies Dr Nigel Fox 25 Jan 2011
http://www.bipm.org/utils/common/pdf/RoySoc/Nigel_Fox.pdf
That goes with the video of Nigel’s talk: Seeking the TRUTHS about climate change
For paper abstract posted at: Accurate radiometry from space: an essential tool for climate studies
http://rsta.royalsocietypublishing.org/content/369/1953/4028
Note especially Slide 13 of 55 showing the Feedback factor uncertainty
Cloud uncertainty 0.24 (simple ratio 93% of total) compared to Total uncertainty 0.26.
Slide 14 – impact of improving uncertainty on predictions
Slide 17 – Cooler sun = warmer earth because of opposite trends in Visible vs UV
How do they intend to prevent calibration loss during launch ?
I love the bit about the Thames barrier… has no one read all the literature from the 60s up to now? The south-eastern “plate” of England (which includes London) is subsiding at an incredibly well-defined rate. It is slowly dipping into the English Channel, so it is not a question of “if” but “when” a new barrier needs to be built.
Haven’t they just told us we won? What more needs to be said?
Is this guerilla science?
Hard data is the enemy of AGW, but if someone tried to make a case for a satellite based on the idea that it disproves AGW the chance of funding would be zero. By wrapping it up as a way of distinguishing between AGW models the chance of funding is good. We get improved data, and data is data!
If CLOUD had been described as an experiment to DISPROVE solar influence on climate then funding would have been less likely to be delayed. If the experiment happened to show the opposite then it is not the experimenter’s fault – that is what science should be about.
Desertyote says:
September 19, 2011 at 1:33 pm
“Sorry dude, old hat. This reads like a gee-whiz “New Scientist” article.
BTW, I have 35 years of experience with electronic test and measurement, including uncertainty measurement and calibration. Half of this was for space-based electronics, some of which are on other worlds.”
=====
Thanks for posting that. Let me add that many satellites have included on board calibration standards for some sensors since the very early days of satellites. Unfortunately, it’s been nearly 50 years since I worked with those, and I no longer remember specific examples. And worse yet, even if I did remember, I probably couldn’t remember what was classified and what wasn’t and therefore couldn’t cite examples if I wanted to.
That said, if some types of observations really are a bit shaky and could do with better calibration — on board or ground based — who can argue? I would ask however why the improved instruments can’t just be developed and deployed on future satellites within existing observation programs. Do we really need a new satellite program to make these observations?
Us normal people know man cannot control Earth’s climate, so our plan is to try to adapt as Mother Nature does her thing. But my big question is, what are these “man-caused climate change” idiots going to do when they finally realize there’s not a darn thing man can do to prevent “climate change”? Or are these people so stupid they will never realize their mistake?
Hank Zentgraf says:
September 19, 2011 at 3:44 pm
“Let’s eliminate 20 of the 22 climate models, take the funding and invest in high quality measurement of all things critical to climate science.”
This is an EXCELLENT suggestion. And I know at least one candidate for elimination…
I would also suggest an old fashioned approach. Instead of mooching relentlessly off the taxpayers, let’s have a telethon to raise needed funds for climate research! They could get entertainers, politicians, and movie stars to plead with the American people that climate research is vital. Maybe something like this…
Ted Dansen: “Just look at this poor little GCM, Martin!”
Martin Sheen: “[sniff] Yeah, it has no computer to run on…it just sits there, lonely, unfulfilled!”
Bono [to audience]: ” But YOU can help…don’t let this climate model die! Just a small donation of $10/month could give it a home on a nice parallel computer.”
Everyone [singing]: “We are the world! We are the children! We will be broke by 2012, unless you start givin’…….”
On average, the Earth receives 5,419,305,496,166,480,000,000,000,000,000,000,000 visible light photons from the Sun each day.
It also emits 32,515,832,976,998,900,000,000,000,000,000,000,000 Infra-red photons each day.
Each one of those solar photons spends time in 31,679,996,832 individual molecules on average before it escapes to space as an infra-red photon.
Imagine a climate model that can simulate that.
We have to measure what really happens instead of relying on a 5.35*ln(CO2/CO2orig)-based climate model.
—
If we do put new satellites up, the organization that is in control of the data has to be objective and not already committed to the 5.35*ln climate models and the theory.
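For reference, the formula the commenter alludes to is the simplified CO2 forcing expression of Myhre et al. (1998). The 280 ppm pre-industrial baseline below is a conventional choice, not something stated in the comment.

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified CO2 radiative forcing in W/m^2: dF = 5.35 * ln(C / C0),
    per Myhre et al. (1998)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# A doubling of CO2 gives 5.35 * ln(2):
print(round(co2_forcing(560.0), 2))  # -> 3.71 W/m^2
```

This logarithmic fit is what the commenter means by a "5.35*ln climate model": whatever the satellites measure, the models currently lean on this parameterisation for the CO2 term.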
P Walker says:
How do they intend to prevent calibration loss during launch ?
Exactly what my first thought was. The only way to really calibrate the instrument is to fly a calibrating instrument aboard a Shuttle-like craft that is in orbit briefly and can do “crossing runs”. This is where the short-term calibrating instrument takes data at the same time and place as the original satellite instrument does. The calibrating instrument is then returned to earth and closes the loop as its final calibration is determined. This is not absolutely guaranteed to catch drift, because jumps at launch and landing might not be caught, but there is some chance to get to the TRUTH.
Bernie McCune
Chaps we need a plan, something to keep us in the money for 20 odd years and also something to discredit those poxy satellites that keep showing nothing drastic is happening.
Interstellar Bill: The last thing the Warmistas want is any kind of empirical truth,
especially from a satellite called ‘Truth’. Only computer models need apply for that label.
Are you not familiar with the blockbuster Harries et al Nature paper of 2001 or any of several followups and confirmations? That’s all real, empirical data.
Runway joke? Let me volunteer a 30-year-old runway joke:
During the development of the F-16, the Air Force assigned an officer to ride herd on the weight budget for the plane. One day he visits the avionics department in a lather, certain that the plane was way over budget on weight. The manager goes over his table of avionics weights: “Nothing wrong that I see.”
The lieutenant says, “No, those figures are just for the computers, radar and display. You are spending just as much money on computer programs, and they are not on this table. That’s what’s missing.” Slowly, the manager said, “Software does not weigh anything.”
The flustered lieutenant left, only to return with a cart filled with trays of IBM punch cards. “These are only part of one of your programs. You want to tell me they don’t weigh something!?” The manager replied, “We only use the HOLES.”
“Estimates of global temperature increases by 2100 range from ~2–10 °C. Which of these is correct is important for making major decisions about mitigating and adapting to climate change”
The range does not include negative numbers, and that is because warming is presupposed.
Warming and cooling have been presupposed in decades past, and have turned out wrong.
What is there, that has been proven, about this time that makes it so special?