I was poking around the Hadley site and found this interesting announcement. It seems Hadley CRU / Dr. Phil Jones is looking for a candidate to do this project, with the goal of (as I read it) creating some sort of merger between surface data and MSU satellite data to create a wholly new (and one would hope, imporved) MSU data set. Sort of a “ground truth” for Lower Tropo data I suppose. Given Hadley’s latest antics of purging publicly available data, one wonders if we’ll even get to see the results of this. Any candidates out there? – Anthony

Can we create a better Microwave Sounding Unit climate record through the use of high-quality in-situ data?
School: Environmental Sciences
Supervisor(s): Professor Phil Jones; Dr Roland von Glasow
Application Deadline: 30th September 2009
Description:
SELF FUNDED STUDENTS ONLY. The Microwave Sounding Unit and replacement Advanced Microwave Sounding Unit have been operational since late 1978. They are flown on-board the NOAA series of polar-orbiting satellites and more recently the METOP platform. They measure upwelling microwave emissions from Oxygen molecules which are dependent almost entirely upon temperature. This makes their analysis as a fundamental climate data record appealing. However, the measurements have been made with forecast input in mind leading to inevitable and insidious non-climatic influences permeating the record.
The challenge is how best to remove these to retain an unambiguous estimate of the true long-term changes. To date MSU satellite climate datasets of bulk atmospheric temperature profile characteristics have been created solely by comparison between records from individual satellite platforms. Although this is adequate to identify the likely issues, such a two-point intercalibration is mathematically fundamentally ill-posed for the unambiguous removal of non-climatic influences from the time series.
To adequately remove the non-climatic influences requires multiple independent estimates of the true field value to be able to identify which instrument is behaving anomalously and then remove the non-climatic artefacts with minimal uncertainty. Furthermore, this historical approach runs a fundamental risk to the climate record if at any point there is either no satellite or only one satellite measuring, which could plausibly be the case at some point in the future and has been the case for a limited time in the historical record.
We have high quality processed (non-operational) time series available from a number of sites around the world. These sites include stations participating in the ARM program and a number of national observatories and special research sites. Their data cover most, if not all, of the MSU record which begins in 1979. We also have access to a wealth of information from reanalysis feedback files from the more recent reanalyses, and from the global radiosonde network. Taken together these data should be sufficient to allow a fundamentally different approach to be undertaken to MSU dataset development and hence to re-evaluate currently available MSU-based series. At a minimum they should permit a realistic assessment of the range of plausible time series evolution in a way that is well constrained.
Lessons learnt from the work would have the opportunity to significantly inform development of the recently instigated GCOS Reference Upper Air Network (GRUAN). It would also benefit major upcoming scientific assessments and reanalyses efforts. The student will be expected to spend substantial amounts of time both at the Met Office Hadley Centre and the DWD Lindenberg observatory facility that acts as the lead centre for GRUAN. However, the bulk of their time will be spent at UEA. Financial support will be available for these placements and they will afford a substantial opportunity for the right candidate to develop their skills and expertise.
The candidate will be given training in the critical analysis of disparate observational datasets, statistical analysis techniques, computing techniques as well as standard UEA and Met Office training courses as seen fit by the supervisory panel. The placement in different locations will aid teamwork, transferability of computing skills and problem solving development. The candidate will be expected to provide at least one presentation to a Met Office audience and one to a UEA audience and to attend one high profile conference or workshop. Opportunities may arise to attend further international meetings as funding permits.
References
GCOS, 2003: The Second Report on the Adequacy of the Global Observing Systems for Climate in support of the UNFCCC. Global Climate Observing System GCOS-82, WMO/TD No. 1143, 74 pp
GCOS, 2004: GCOS Implementation Plan for the Global Observing System for Climate in support of UNFCCC. GCOS-92, WMO/TD 1219, 136 pp
Mears, C.A. and F.J. Wentz, 2005: The effect of diurnal correction on satellite-derived lower tropospheric temperature. Science, 309, 1548-1551
Thorne P.W., et al., 2005a: Revisiting radiosonde upper air temperatures from 1958 to 2002, J. Geophys. Res., 110, D18105, doi:10.1029/2004JD005753
Thorne, P.W., et al., 2005b: Uncertainties in climate trends: Lessons from upper-air temperature records. Bull. Amer. Meteor. Soc., 86, 1437–1442
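For readers wondering what the advert means by a two-point intercalibration being "ill-posed", here is a toy sketch of the point (my own illustration with made-up numbers, nothing to do with the actual Hadley/UEA, RSS or UAH processing): comparing two satellites against each other only constrains the difference of their spurious offsets, not which platform is drifting, whereas an independent in-situ reference pins each offset down separately.

    # Toy illustration (not any real MSU processing): two satellites see the same
    # true temperature series plus unknown non-climatic offsets. Differencing the
    # satellites recovers only (bias_a - bias_b); an independent ground reference
    # is needed to attribute the offset to a particular platform.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 240                                    # 20 years of monthly anomalies
    truth = 0.015 * np.arange(n) / 12 + 0.2 * rng.standard_normal(n)

    bias_a, bias_b = 0.30, -0.10               # assumed, unknown-to-us offsets (K)
    sat_a  = truth + bias_a + 0.05 * rng.standard_normal(n)
    sat_b  = truth + bias_b + 0.05 * rng.standard_normal(n)
    ground = truth + 0.10 * rng.standard_normal(n)    # independent in-situ record

    # Two-point intercalibration: only the difference of the offsets is observable.
    print("sat_a - sat_b :", round(float(np.mean(sat_a - sat_b)), 2))   # ~0.40, but which one drifted?

    # With an independent reference, each offset is separately constrained.
    print("sat_a - ground:", round(float(np.mean(sat_a - ground)), 2))  # ~0.30
    print("sat_b - ground:", round(float(np.mean(sat_b - ground)), 2))  # ~-0.10

Whether the advertised project would do anything as simple as this is anyone's guess; the point is only that a second, independent line of evidence is what makes the attribution possible at all.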
Govt job, civil service, international conferences, pension? Where do I sign up?
I guess they need someone to remove those insidious non-climatic factors, like temperature, that keep contaminating the data and making it deviate from the HadCRUT truth. At last, a bid for independence from the pro-skeptic RSS and UAH data.
Good job for another mole.
However, the measurements have been made with forecast input in mind leading to inevitable and insidious non-climatic influences permeating the record.
Is he talking about Al Gore?
Yep, I’m sure it will be imporved.
Mike McMillan (22:17:14) :
Govt job, civil service, international conferences, pension?
I don’t think “self funded students” get any of the civil service perks. Unless the British university graduate programs are radically different from those in other countries.
Well, this little job posting is nice, but I want to see the July stats!
There has been no small amount of controversy over the site last month, so the numbers should be impressive!
Or, looking at Alexa, maybe not 🙁
“Although this is adequate to identify the likely issues, such a two-point intercalibration is mathematically fundamentally ill-posed for the unambiguous removal of non-climatic influences from the time series. ”
I dunno. Sounds like he is looking for a chiropractor to “adjust” the satellite data in order to remove “non-climatic influences”. Sounds like they mean “the data is currently lying so we need to waterboard it and get it to tell the truth”.
Having read the description twice, I do not see what the student will be doing. It is just a long preamble without anything resembling a ‘job description’. Perhaps the message is all buried in what appears to be a language that only the chosen few can understand.
@ “……….merger between surface data and MSU satellite data to create a wholly new (and one would hope, imporved) MSU data set………..”
Surely you have had enough experience of (UK) bureaucrats to know that’s not how it works. Adjustments will need to be made………
My bet is that they will define the satellite as 1 record and each of the hundreds of ground stations will be 1 record EACH. And so the troubling discrepancy between ground & sat records disappears, like state-sponsored magic!
I’m a bit confused as to why this is even necessary. I looked up some of the specifications of the AMSU hardware on each of the NOAA satellites, just to try to understand the mechanism by which they calculate a temperature, and it would seem that each unit on each satellite performs a self-calibration every 8 seconds by comparing against an on-board black body source at ~300 K and against the cosmic background radiation at ~2.73 K (a rough sketch of this two-point calibration follows below). It would also appear that the European Radiocommunications Committee, as far back as 1997, was aware of the difficulty in calibrating the AMSU measurements to prevent interference from being interpreted as artificial warming of the atmosphere, and that is why it was “necessary to correlate microwave measurements and conventional radiosonde measurements each 12 hours. This is often done near large cities where interference could lead to wrong calibration, retrievals, and statistics for NWP.”
So if this is indeed the case, and it would appear from even a cursory look at the hardware that extensive self-calibration routines are run against both on-board and external sources, why would they need to rework the data set to do something that should already be done every 8 seconds and every 12 hours?
Of course, even with the internal and external calibration, it would appear that the electronics have an accuracy of about 0.2 K, and there are acceptable levels of interference in the microwave bands that could contribute another 0.06 K of error as well. They also state that “interference up to 5 K can not be detected at the data processing level because it does not differentiate from the values normally expected.” In order to reject 5 K of potential error, the origin and location of interfering sources must be perfectly known.
It just seems rather ridiculous to me to try to deduce fractions of a degree of warming when you have a sensor that could potentially be 5.26 K off and has a spatial resolution of 40 km, and you “calibrate” that against a station located on the tarmac of some airport.
For those interested, the ERC report is at http://www.erodocdb.dk/Docs/doc98/official/Word/REP046.DOC
JM
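For what it’s worth, the on-board self-calibration JM describes amounts to a two-point linear interpolation between the warm-load and cold-space views in each scan cycle. A much-simplified sketch of that idea, with made-up counts and without the non-linearity and antenna-pattern corrections the real processing applies:

    # Simplified, illustrative AMSU-style two-point radiometric calibration:
    # the radiometer views an on-board warm blackbody (~300 K) and cold space
    # (~2.73 K); Earth-scene counts are then mapped to brightness temperature
    # by linear interpolation between those two reference points.
    # All numbers below are made up for illustration.

    def calibrate(counts_scene, counts_cold, counts_warm,
                  t_cold=2.73, t_warm=300.0):
        """Two-point linear calibration: raw counts -> brightness temperature (K)."""
        gain = (t_warm - t_cold) / (counts_warm - counts_cold)   # K per count
        return t_cold + gain * (counts_scene - counts_cold)

    c_cold, c_warm = 1500.0, 24000.0     # hypothetical calibration-view counts
    c_earth = 20500.0                    # hypothetical Earth-view counts
    print(round(calibrate(c_earth, c_cold, c_warm), 2), "K")   # ~253.8 K with these numbers

The calibration itself is not the issue JM raises; it is the absolute error budget around it (electronics accuracy of roughly 0.2 K, tolerated interference of roughly 0.06 K, and interference of up to about 5 K that the processing cannot distinguish from real signal) set against the tenths of a degree per decade being argued over.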
Things would be quite simple if we stopped assuming that the rate of photon excitations in the Sun is equal to the rate of photon de-excitations (Γ heating = Γ cooling). The latter is not true: sometimes the relation is Γ heating > Γ cooling, and at other times it is Γ heating < Γ cooling.
Those imbalances affect the temperature and density of the interplanetary medium (IPM), bearing in mind that the temperature and density of the IPM (around 100 thousand K) decrease in inverse proportion to the square of the distance from the Sun.
We cannot expect any small change of temperature on the surface of the Sun, and consequently of the IPM, to affect the Earth’s surface temperature instantaneously, because the cooling or heating events of the IPM are modified by other factors like GCRs, plasma density, shock waves from the solar wind, etc.; those changes are manifested minutes, hours, months, years or decades after the photonic excitation/de-excitation imbalance in the Sun occurs.
The IPM is modulated by solar activity, and the Earth sits in the middle of the field where those events take place. That is why I cannot understand why some solar physicists insist on ignoring the clear interaction between solar activity and the thermal conditions of the Earth.
Please, let me know if this is off topic before you erase or relocate it. Thank you… 🙂
Am I just being cynical, or is this an attempt to take satellite data that shows a current cooling trend and sanitise it to fit the AGW alarmists’ mantra?
I find some of this hard to fathom – someone please tell me where I’m getting it wrong …
“They measure upwelling microwave emissions from Oxygen molecules which are dependent almost entirely upon temperature. This makes their analysis as a fundamental climate data record appealing. However, the measurements have been made with forecast input in mind leading to inevitable and insidious non-climatic influences permeating the record. ”
So – you forecast what the microwave emissions (“input”) will be, then you measure them, and that gives you temperature. Simple. But in that case the “insidious non-climatic influences permeating the record” are your own forecasts, so in order to remove them all you have to do is measure the microwaves without having made a forecast in the first place.
If it was simple, now it’s simpler!
But then it says “To adequately remove the non-climatic influences requires multiple independent estimates of the true field value ..”. But this is saying that to remove the influence of the forecast you need multiple independent estimates. It seems to me that in this context an “estimate” can’t be a forecast (the forecast is what you’re trying to get rid of), so it must be a measurement.
So it’s saying that wherever possible you need multiple measurements. Well that’s good science. But if you’ve got multiple measurements then you have no need for the insidious forecast in the first place.
Phillip Bratby (23:24:15) : “I do not see what the student will be doing.”
The student’s duties are clear: 1. Spend substantial amounts of time at Hadley and DWD Lindenberg. 2. Spend the bulk of their time at UEA. 3. Provide at least one presentation to a Met Office audience and one to a UEA audience. 4. Attend one high profile conference or workshop.
Nice work if you can get it.
They’ve left out a crucial part of the job description, such as it is: they haven’t given us a clue as to what “result” they are expecting, so how can the data be processed efficiently without knowing the required outcome? Isn’t that how it works?
Hmm, I wonder if this site could fund Steve McIntyre to take the job, or maybe some realist institutes might take him on. Certainly, imagining Phil Jones’s face as he sees S M’s name on the application is entertaining.
OT, but the “SOHO MDI Continuum Latest Image” has not been updated for a few days and is fixed at 28 July. Any trouble?
I can see three possibilities.
1. Let’s look at the strengths and weaknesses of satellite and ground-based systems and see if we can come up with something that best represents the real-world situation and gives us the best data possible.
The science option.
2. Let’s find some ways of minimising the annoying satellite data and ‘reconcile’ it with our own superior measurements.
The ideological option.
3. Let’s use a ‘blended’ system to quietly move our data to a position that lets us off the hook of the discrepancy between what people are experiencing and what we postulate.
The CYA option.
Any guesses? I understand there might be an election next year in the UK.
Is this the best they can do to solve a known problem - try to find a self-funded student to do the donkey-work?
What are they paid for if not to do this work themselves?
They need look no further. The ideal candidate, a self-funded student who will fulfill all their expectations, is none other than Al Gore.
Movie not included.
This is a blatant attempt to CORRECT the satellite data, just as these so-called climate scientists are CORRECTING the pre-1945 temperatures to get rid of that pesky natural global warming.
This CORRECTION of the temperature record is simply being done to make the data fit the models. That belief must be sustained at all costs.
This is a direct result of climate science transforming itself into a severe form of political-CORRECTNESS.
Skepticism is growing, but don’t be too surprised if a state sponsored climate-of-fear is once more generated to CORRECT the public mind over AGW.
They missed out the most critical part of the advert: ” Candidates with the initials S.M. or A.W. need not apply”……
A self funded student would not have to make his methodology or data available to any FOI request….
They want to adjust the good satellite data with bad surface data.
It’s a winning formula. Look where it has got them with the surface stations.
Hard to believe that Phil Jones & Co are so unsure about the accuracy of their current global surface temperature numbers, with world governments set to waste trillions of our hard-earned cash based on this supposedly accurate data.
Just shows what a sorry state the whole of climate science has got itself into.
This smacks of a political solution to the problem of temperature divergence without addressing the real issue of identifying the cause of the differences and fixing the problem. We will want to know how they propose to weight the different series, how they propose to merge the series, and whether any “adjustments” are going to be made.
Lots of opportunities to sweep the current adjusted data sets under the carpet and hide the extent to which they have been contaminated.
But a good opportunity to make a fresh start without admitting the serious problems the current data sets have.
It will be interesting to see how the project develops.
Let the student do it, then blame all mistakes on the student. CYA. BTW, does anybody else here want the records updated by a student?