Claim: Researchers create means (a model) to monitor anthropogenic global warming in real time

From the UNIVERSITY OF CALIFORNIA – SAN DIEGO and the two most manipulated datasets on the planet: Karl’s “pause buster” and NASA GISS.

Study points to primacy of Pacific Ocean as a global climate force

A research team including a Scripps Institution of Oceanography at the University of California San Diego climate scientist simulated in a computer model, for the first time, the realistic evolution of global mean surface temperature since 1900.

This graph shows observed global mean surface temperature (GMST) based on three datasets (black curves, in °C), and the new estimates of anthropogenic global warming (AGW). The simulated GMST change without considering tropical Pacific internal variability is plotted for reference (white curve, with blue shading indicating the uncertainty). CREDIT Nature Geoscience

In doing so, the researchers also created a new method by which researchers can measure and monitor the pace of anthropogenic global warming, finding that the contribution of human activities to warming in the surface waters of the Pacific Ocean can be distinguished from natural variability.

Former Scripps researcher Yu Kosaka, now at the University of Tokyo, and Shang-Ping Xie, the Roger Revelle Chair in Environmental Science at Scripps, created the simulation by forcing sea surface temperature over the tropical Pacific to follow the observed variability.

“The climate system includes naturally occurring cycles that complicate the measurement of global warming due to the anthropogenic increase in atmospheric greenhouse gases,” said Xie. “We can isolate the anthropogenic warming by removing the internally generated natural variability.”
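The idea can be sketched in a few lines. The toy below is purely illustrative (synthetic numbers, not the paper’s pacemaker method): build a temperature series from a trend plus an ENSO-like oscillation, regress it on a “Pacific index,” and subtract the fitted part; the residual tracks the underlying trend more closely than the raw series does.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustration only (not the paper's pacemaker method): a linear
# "forced" trend plus an ENSO-like oscillation standing in for tropical
# Pacific variability, with observational noise on top.
years = np.arange(1900, 2015)
forced = 0.008 * (years - 1900)                  # steady trend, deg C
enso = 0.15 * np.sin(2 * np.pi * years / 4.5)    # crude ENSO-like cycle
gmst = forced + enso + 0.05 * rng.standard_normal(years.size)

# "Observed" Pacific index: the same oscillation, imperfectly measured.
pacific_index = enso + 0.02 * rng.standard_normal(years.size)

# Regress GMST on the index and remove the fitted part; what remains is an
# estimate of the externally forced signal.
slope = np.polyfit(pacific_index, gmst, 1)[0]
residual = gmst - slope * pacific_index

err_raw = np.abs(gmst - forced).mean()       # raw series vs true trend
err_res = np.abs(residual - forced).mean()   # residual vs true trend
print(err_res < err_raw)
```

In this contrived setup the subtraction works because the index really does capture the oscillation; how well that holds for the real Pacific is exactly what the commenters below dispute.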

Climate policymakers have sought to limit the rise of global temperatures to 2° Celsius higher than pre-industrial levels. That figure is considered a threshold beyond which society and natural systems are virtually assured of experiencing significant and dangerous instability. Scientists have estimated that the planet is already roughly 1° C warmer at the surface than before the Industrial Revolution.

The 2° C target was reaffirmed during the 2015 Conference of the Parties, known as COP21, that was held in Paris in December. Kosaka and Xie’s research could provide an easily generated and more accurate means to measure society’s success in keeping temperatures below that threshold.

The research is further confirmation of the primary importance of the Pacific in controlling global-scale climate, which researchers have come to understand in recent decades. Kosaka and Xie plotted the rise of global mean temperatures over the past 120 years. Temperatures ascend in staircase fashion, with the steps becoming larger over the past 50 years.

When Kosaka and Xie removed as a variable the natural warming and cooling of the Pacific Ocean, the rise of global mean surface temperature became a more linear increase, one that began to accelerate more sharply in the 1960s. It had been natural Pacific decadal variations that temporarily slowed down or speeded up the warming trend, leading to the staircase pattern.

For example, global mean surface temperature changed little from 1998 to 2014, a period known as the hiatus that has been tied to naturally occurring tropical Pacific cooling. Raw data show a warming of 0.9° C for the recent five-year period 2010-2014 relative to 1900, while Kosaka and Xie’s calculation yields a much higher anthropogenic warming of 1.2° C after correcting for the natural variability effect.

“Most of the difference between the raw data and new estimates is found during the recent 18 years since 1998,” said Xie. “Because of the hiatus, the raw data underestimate the greenhouse warming.”

Kosaka and Xie suggest that though Pacific Ocean trends are an essential variable control on global temperature rise, the accuracy of their warming estimate will be improved in the future as other climate modes are added as variables. An international initiative involving more than a dozen climate models is being planned to improve the estimates included in upcoming assessments by the Intergovernmental Panel on Climate Change (IPCC).

###

Ron Clutz
July 18, 2016 10:01 am

Assumption: Remove warming from ocean oscillations and what remains is man-made.

Ron Clutz
Reply to  Ron Clutz
July 18, 2016 10:04 am

Here’s a more rigorous approach to temperature sensitivity to natural factors.
https://rclutz.wordpress.com/2016/06/22/quantifying-natural-climate-change/

Aphan
Reply to  Ron Clutz
July 18, 2016 10:05 am

Because Nature is totally and completely understood, predictable, and static. *snark*

MarkW
Reply to  Ron Clutz
July 18, 2016 12:17 pm

Assumption 2: We know all the ocean oscillations.
Assumption 3: We also know these ocean oscillations with sufficient accuracy to fully model them.

Ron Clutz
Reply to  MarkW
July 18, 2016 1:12 pm

Agree. A quasi-60 year cycle is only a first approximation.

Reply to  MarkW
July 18, 2016 2:45 pm

Ah no. They actually argue that other cycles can and should be added.
The approach is pretty straightforward and functions much like data assimilation in weather forecasting.

Reply to  MarkW
July 19, 2016 9:05 pm

Mosher says, “much like data assimilation in weather forecasting”
Ah right, three days good, ten days crapshoot, anything longer: fantasy.

george e. smith
Reply to  Ron Clutz
July 19, 2016 2:44 pm

“””””….. Scientists have estimated that the planet is already roughly 1° C warmer at the surface than before the Industrial Revolution. …..””””””
Out of a surface Temperature range from say +60 deg. C on the floor of the hottest deserts, to the -94 deg. C over a large area of the Antarctic highlands, that seems like a piddling amount to me.
So was the industrial revolution 150 years ago or would it go back as far as say 1492 ??
Notice the “estimated” and the “roughly”.
Izzere also a “maybe” ??
So now what data set do we have of actual surface Temperatures, rather than say lower troposphere air temperatures or SSTs which are NOT representative of real surface Temperature range.
This stuff gets tiresome. When do they plan on actually knowing something ??
g

george e. smith
Reply to  Ron Clutz
July 19, 2016 2:50 pm

How do you determine what the warming from ocean oscillation is?
So I have two coins that add up to 15 cents, and one of them is not a nickel ??
How do you separate variables when you know none of them for sure ??
G

Ron Clutz
Reply to  george e. smith
July 19, 2016 3:29 pm

George, to be clear I do not agree with the study authors’ assumption.
But we do have measures of oceanic oscillations going through warming and cooling phases. The linked article I provided above shows one way to analyze these as a fluctuation of heat content and temperatures internal to the climate system.
The remaining secular, long term rise in temperature is a different matter, involving external heating. Again, the linked article provides an alternative explanation for that, other than CO2.

July 18, 2016 10:02 am

“forcing sea surface temperature over the tropical Pacific to follow the observed variability”
So let me get this straight. When you force (curve fit) a simulation to follow natural variability, this somehow isolates the human fingerprint, which for all intents and purposes is far below the threshold of detection?

Mark from the Midwest
Reply to  co2isnotevil
July 18, 2016 12:05 pm

Yes, as with all science your explanation of a residual is as good as mine. The important factor in determining the appropriate explanation of the residual is, in descending order, 1) The likelihood that the explanation leads to publication, 2) The likelihood that the explanation leads to more funding, 3) The likelihood that the explanation gets you invited to speak at a cushy conference in a popular tropical location, (complete with first class airfare for you, your partner, and 10 of Leo DiCaprio’s friends), 4) The logical consistency and validity of the explanation.

Reply to  Mark from the Midwest
July 18, 2016 1:03 pm

The insane gyrations they go through to prove something that can’t be proven would be humorous if only the consequences weren’t so detrimental to science, economics and politics.

Ernest Bush
Reply to  Mark from the Midwest
July 18, 2016 6:58 pm

Not to mention the continuation of the human race.

Reply to  co2isnotevil
July 18, 2016 2:52 pm

Err no.
The ocean covers 70% of the globe. In a typical GCM run you will see (ask Bob Tisdale) that the predictions, averaged globally, don’t look so good, and are biased high.
And if you look at various basins they are also off…
Now imagine that you take 10% of that entire ocean and force it to follow observations.
In weather modelling this is called data assimilation.
In engineering it would be something akin to a Kalman filter.
Now imagine if you force that 10% to match and every other part of the ocean suddenly snapped into place.
It would be one thing if you prescribed the whole ocean to match… that would be “fitting”.
But when you only force a small part to match and the rest lines up, you can surmise two things:
A) the physics of how that 10% is tied to the rest WORKS
B) you’d better study that region really, really well, because it may be a key to unlocking the initialization problem
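For readers unfamiliar with the term, a scalar Kalman filter is a minimal sketch of the data-assimilation idea invoked above (a toy example, nothing to do with any particular GCM): each step blends the model forecast with a noisy observation, weighted by their relative uncertainties.

```python
import numpy as np

rng = np.random.default_rng(1)

# Minimal scalar Kalman filter, purely as an illustration of data
# assimilation: track a slowly drifting "true" state from noisy
# observations by blending forecast and observation at each step.
n = 200
truth = np.cumsum(0.1 * rng.standard_normal(n))   # true state: random walk
obs = truth + 0.5 * rng.standard_normal(n)        # noisy observations

q, r = 0.1 ** 2, 0.5 ** 2   # process and observation noise variances
x, p = 0.0, 1.0             # state estimate and its variance
est = np.empty(n)
for t in range(n):
    p += q                   # forecast step: uncertainty grows
    k = p / (p + r)          # Kalman gain: weight given to the observation
    x += k * (obs[t] - x)    # analysis step: nudge forecast toward obs
    p *= (1 - k)
    est[t] = x

# The assimilated estimate beats the raw observations on average.
print(np.abs(est - truth).mean() < np.abs(obs - truth).mean())
```

The point of contention in the replies below is not whether this blending works — it demonstrably does — but what it buys you once the observations stop coming in.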

Reply to  Steven Mosher
July 18, 2016 7:06 pm

Steven,
The problem with your analysis is that even if the rest ‘snapped’ into place, you are not predicting the temporal behavior, i.e. the future behavior, because you are essentially resetting the initial conditions on every time step. You will be able to get the proper future answers only when you have the 10% of the measurements available for that future time.

Reply to  Steven Mosher
July 18, 2016 7:42 pm

Steven,
One more point is that this kind of re-initialization per time step to avoid the divergence problem will even match highly maladjusted data, as long as it’s consistently maladjusted. Keep in mind that accurately simulating the climate at the level of detail GCMs apply is, at best, an NP-complete problem with many similarly valid solutions, which is why weather forecasting models diverge after a short amount of time.
In any event, the second you stop recalibrating the model to the data, the simulated data will diverge just as before.
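This divergence point can be demonstrated with a toy chaotic system. The logistic map below stands in for a forecast model (an illustration only, not a GCM): while the run is nudged toward imperfect “observations” it tracks the truth, and once the nudging stops the trajectories separate.

```python
# Toy demonstration: a chaotic logistic map as a stand-in for a forecast
# model. Nudging toward observations keeps the run on track; stopping the
# nudging lets the chaotic dynamics pull the trajectories apart.
def step(x):
    return 3.9 * x * (1.0 - x)   # logistic map in its chaotic regime

truth = 0.4
model = 0.4 + 1e-6               # almost, but not exactly, the same start
nudge_until = 50
diffs = []
for t in range(100):
    truth = step(truth)
    model = step(model)
    if t < nudge_until:
        obs = truth + 1e-4               # imperfect observation of the truth
        model += 0.9 * (obs - model)     # assimilation-like nudging
    diffs.append(abs(truth - model))

# Close while nudged; drifting apart once the nudging ends.
print(max(diffs[:50]) < 1e-3, max(diffs[50:]) > max(diffs[:50]))
```

The nudging strength (0.9) and observation error (1e-4) are arbitrary choices for the sketch; the qualitative behavior — small error under assimilation, growing error without it — is the point.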

Reply to  Steven Mosher
July 18, 2016 9:01 pm

wrong

schitzree
Reply to  Steven Mosher
July 19, 2016 9:48 am

My computer model for the performance of a NASCAR racer doesn’t come very close to the actual speeds measured on the track, but if I force the model to continually match the data for the RPM of the tires, everything else falls into place.

Reply to  Steven Mosher
July 19, 2016 10:44 am

schitzree,
And as soon as you stop matching the model to the RPM of the tires, it will diverge at the same rate it was before. My point is that this does nothing to help predict the future much past the last adjustment.

schitzree
Reply to  Steven Mosher
July 19, 2016 9:48 pm

CO2isnotevil
Indeed. And just as the revolutions of the tires are a primary driver of the speed of the car, the ENSO is a primary driver of world temperature. Of course most of the rest of the model will fall into alignment if you force that one piece to fit. But that tells you nothing more than that your model has at least some of the gross physical interactions right. If my NASCAR model can’t produce the tire RPMs correctly by itself then it’s worthless for making predictions. And if a climate model can’t produce ENSO correctly by itself then it’s also worthless for making predictions.

Tom Halla
July 18, 2016 10:05 am

Kosaka and Xie still looks stepped on in historical temperatures, as no peaking in the 1930’s is evident, or any real decline from the late 1940’s to the 1970’s. Why not just use unadjusted data?

njsnowfan
Reply to  Tom Halla
July 18, 2016 10:12 am

They use adjusted & manipulated data so they can make their models show the results they want.
Mann-made feedback effect.

Reply to  Tom Halla
July 18, 2016 2:16 pm

How stupid do they think we are?

Michael of Oz
Reply to  Stephen Greene
July 18, 2016 5:38 pm

As stupid as a politician.

Reply to  Stephen Greene
July 18, 2016 5:50 pm

Stephen Greene asks: “How stupid do they think we are?”
Better phrased I think by Firesign Theater, who asked “What kind of fool do you take me for!” and received the answer “First Class!”

Reply to  Stephen Greene
July 18, 2016 5:59 pm

I think that was “Nick Danger; Private Eye”. Not sure.
Did we just pass a gas station?
No, but the Fox did…

Reply to  Stephen Greene
July 18, 2016 9:01 pm

Don’t worry, you exceed our expectations.

July 18, 2016 10:06 am

If they went back and did Bayesian Inference, Bayes Factors (you have a hell of a lot of Posterior data), or even a Likelihood test with well tested and confirmed models for comparison, then they might be on to something. Otherwise, it’s just garbage in–>garbage out and filling up scientific journals.
They haven’t yet tried to fit their models to reality either…….so that throws out the Likelihood Test.
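For what a likelihood comparison might look like in miniature (synthetic data, purely illustrative): fit a no-trend model and a trend model to the same series and compare BIC scores, which reward fit but penalize the extra parameter.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative model comparison on synthetic data: does a series support a
# linear trend, judged by Gaussian log-likelihood plus a BIC penalty?
n = 100
t = np.arange(n)
y = 0.01 * t + 0.2 * rng.standard_normal(n)   # data with a genuine trend

def gauss_loglik(resid):
    # Maximized Gaussian log-likelihood given the MLE residual variance.
    s2 = resid.var()
    return -0.5 * n * (np.log(2 * np.pi * s2) + 1.0)

# Model A: constant mean only. Model B: linear trend (one extra parameter).
ll_a = gauss_loglik(y - y.mean())
coef = np.polyfit(t, y, 1)
ll_b = gauss_loglik(y - np.polyval(coef, t))

bic_a = 2 * np.log(n) - 2 * ll_a   # k=2 params (mean, variance)
bic_b = 3 * np.log(n) - 2 * ll_b   # k=3 params (slope, intercept, variance)
print(bic_b < bic_a)               # lower BIC: trend model preferred here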

Reply to  Eric Slattery (@Technos_Eric)
July 18, 2016 2:18 pm

Real data, they just manipulate it!

Reply to  Eric Slattery (@Technos_Eric)
July 18, 2016 5:53 pm

Eric I have to admit I’ve never been or ever will be an acolyte of the Bayesian Mystique, but really? Posterior Data? As in data pulled directly from your posterior? That I might be able to get “behind”. 🙂

Stephen Singer
July 18, 2016 10:45 am

I’m noticing that since about 1998 it does a lousy job of matching reality. So much for putting any credence in claims about future forecasts by this model.

AnonyMoose
July 18, 2016 10:46 am

Yup, they’re totally monitoring warming in real time. Except when they’re not, decades at a time.

John W. Garrett
July 18, 2016 11:03 am

Assumptions built on guesses on top of estimates based on conjectures filled with suppositions.

BallBounces
Reply to  John W. Garrett
July 18, 2016 11:24 am

John W. Garrett – You misspelled suppositories. 😉

Reply to  John W. Garrett
July 19, 2016 4:56 am

Don’t forget my favorite, the ever present proxy data, standing in for actual real empirical temperature data.

July 18, 2016 11:05 am

We can isolate the anthropogenic warming by removing the internally generated natural variability.

Nonsense; as mentioned, it implies they can distinguish between the two.
Another point that isn’t mentioned: CO2 forcing is going to depend on the surface temperature underneath it. Nobody talks about that.

NW sage
Reply to  micro6500
July 18, 2016 4:36 pm

What they actually remove is their own unsubstantiated values/version of what they assume is ‘internally generated natural variability’. As in “Lets see, this bit of data doesn’t result in the conclusion we like so we will leave it out”. But it’s OK because those who were invited to COP 21 voted that it was all right to do that. The science must be good because of the vote.

Reply to  micro6500
July 18, 2016 6:05 pm

“Natural variability”, as anyone who is experienced in any way with physical science, is code for “the stuff we don’t understand”.
To suggest they now understand the stuff we don’t understand is, to put it bluntly, completely round the bend.

July 18, 2016 11:10 am

“Claim: Researchers create … a model … to monitor anthropogenic global warming in real time”
Ah, another computer simulation with faked data. And we should believe that has any real world value? I think “Pokemon Go” is more “real world”.
When will mankind learn that biased speculation using a computer is just biased speculation?

george e. smith
Reply to  markstoval
July 19, 2016 3:00 pm

Well if they are going to monitor anthropogenic global warming in real time, they better get started, because real time begins now, so you can’t rely on anything from the past, which was back before I wrote this. Too late to take notice of the past in real time.
G

jorgekafkazar
July 18, 2016 11:13 am

This is insane behavior. Reminds me of the ancient Chinese curse: “May you live in times when people believe in ancient Chinese curses.”

John
July 18, 2016 11:26 am

Something seems wildly amiss. Model… real time… aren’t these opposites?

Reply to  John
July 18, 2016 2:56 pm

err no.
google nowcast

BallBounces
July 18, 2016 11:27 am

We need an app that monitors data tampering in real time.

Ian Macdonald
Reply to  BallBounces
July 18, 2016 11:48 am

Just detect key presses. (On the same principle that if a politician’s lips are moving, then…)

July 18, 2016 11:46 am

OK, so for a start, the historical temp record is mostly made up, we know that much, with the SH being entirely made up.
Then this made-up monthly ?data? has been changed so many times that it bears no resemblance to the original final, much less raw, data.
So right, we have the fact that the temperature record is entirely unreliable pre-1960, really, and its reliability just gets worse the farther back you go; we still have massive “E” values in data sets.
So that’s the main problem: the lack of any real certainty in their data from the outset. They fed this into a model that does not replicate the system they are observing and ran it, and I can say any results from this are completely bogus, nothing but shot-to-nothing guessing, giving idiots another junk paper to point to.

DredNicolson
Reply to  Mark - Helsinki
July 18, 2016 1:29 pm

Like the Texas sharpshooter, if the targets drawn around his shots were as big as the Pacific Ocean.

schitzree
Reply to  Steven Mosher
July 19, 2016 10:01 am

http://i.stack.imgur.com/Z3XwI.jpg
First thing I thought of. ^¿^

george e. smith
Reply to  Mark - Helsinki
July 19, 2016 3:05 pm

Well, actually the measured temperature record is unreliable pre-1980, when they started that study of oceanic water and air temperatures in the same place.
And that is also about the start of the satellite data age, which I think begins in 1979.
G

July 18, 2016 11:49 am

Peer review no longer demands that your results be testable; and since it’s for profit, they are enjoying this climate change nonsense.
Springer published papers with gibberish from an MIT program where just the abstract was rewritten, meaning Springer peer reviewers never read the paper, just the abstract; IEEE published 100 of these junk papers.
Science is well and truly in trouble, really.

July 18, 2016 11:50 am

A study built on a pile of turds will only produce turds. This study stinks

A C Osborn
July 18, 2016 12:12 pm

CO2 warming the Oceans ha Ha ha.

July 18, 2016 12:19 pm

With similar logic and science: We used a random number generator to predict the winning lottery number set for the July 17, 2099 drawing. I’ll bet my life on the accuracy of the result.

Reply to  Joel O'Bryan
July 18, 2016 2:59 pm

Some people bet their lives, the lives of soldiers, and whole businesses on the notion of feeding observed data back into a forecast, or constraining forecasts with observed data.
It’s called data assimilation:
https://en.wikipedia.org/wiki/Data_assimilation

Yirgach
Reply to  Steven Mosher
July 19, 2016 6:35 am

From the Wiki article:

Dealing with biased data is a serious challenge in data assimilation.
Further development of methods to deal with biases will be of particular use.

Can you say biased?

george e. smith
Reply to  Steven Mosher
July 19, 2016 3:12 pm

Presumably the data is fed back into a model that is actually a model of the system from which the data was measured.
Of the 57 widely accepted computer climate models, or GCMs if you will, none of those models is a model of the actual earth.
Maybe it’s 97 widely accepted models; presumably of 97 different systems, none of which is the earth.
How do you get even 57 different models of purportedly the very same system ?? And most notably, no two of them agree with each other.
G

July 18, 2016 12:28 pm

The graph they present has the previous models in blue-and-white, and their supposedly improved curve as a thick magenta line. The raw (adjusted) data are plotted as thin black lines.
The new and supposedly improved model has done absolutely nothing to eliminate the post-1998 divergence between model and “reality”. The only visible improvement is a very slightly better match to the 1940-1970 cooling.
They can’t be serious, can they? Perhaps somebody can tell me I’m reading the graph wrong.
Apparently, the text is saying that the Pacific Ocean surface temperature is all that controls the natural climate variation. So the Atlantic doesn’t matter and the sun is irrelevant. I must be reading it wrong; they can’t be that out of touch. Can they?

July 18, 2016 12:37 pm

Another effort to explain the pause that Karl says does not exist. This one apparently logically flawed in two ways that escaped peer review. 1. The data say the tropical Pacific is not cooling. See Bob Tisdale’s post earlier today. 2. The Argo system says OHC 0-700 meters is increasing.

Wim Röst
July 18, 2016 12:53 pm

Do I read the graph right: was the estimated anthropogenic global warming negative until 1905? Anthropogenic cooling?

Greg Cavanagh
Reply to  Wim Röst
July 18, 2016 5:13 pm

If I’m reading the chart correctly; the “Estimated AGW” is negative until about 1985.

July 18, 2016 1:04 pm

It wasn’t that many years ago that the warmists would never have admitted that natural cyclic variation even existed to any significant degree.

willhaas
July 18, 2016 1:14 pm

The climate change we have been experiencing is caused by the sun and the oceans. They are trying to somehow include the ocean effects in their model but are forgetting the sun. What about the Earth’s albedo? They are assuming that CO2 is somehow responsible for the residual but are not including the effects of the primary so called greenhouse gas, H2O. There is no real evidence that CO2 has any effect on climate.

Bruce Cobb
July 18, 2016 2:00 pm

As always, the ends (pushing the CAGW ideology) justifies the means. In this case they created them as well.
Models all the way down.

Chris Hanley
July 18, 2016 2:38 pm

Oh, now I understand: by “realistic”, AGW alarmists mean model output, not mere “raw data”.

Michael Jankowski
July 18, 2016 3:46 pm

“…simulated in a computer model, for the first time, the realistic evolution of global mean surface temperature since 1900…”
Well at least they admitted climate models have never otherwise been accurate at getting temps right.

Michael Jankowski
July 18, 2016 3:57 pm

…”We can isolate the anthropogenic warming by removing the internally generated natural variability”…
Gee, why didn’t anybody ever think of that before? Must be a true genius.

Michael Jankowski
July 18, 2016 4:02 pm

“…created the simulation by forcing sea surface temperature over the tropical Pacific to follow the observed variability…”
Why not just force all temperatures to follow the observed variability? Or is that the next trick/paper/funding opportunity?
Of course we all know that would be an exercise in over-fitting and has been done by any number of people. But if you only do the teenie-tiny Pacific Ocean (lol), it’s somehow valid.

M Seward
July 18, 2016 4:26 pm

Another stall in sideshow alley. Should fit well between the “throw the ball in the clown’s mouth” and “shoot the tin ducks” to win your special eco cuddly-toy prize. It’ll be the Go Poke a Moron of the CAGW world.

catweazle666
July 18, 2016 4:39 pm

Oooh, another computer game! Is it available for Xbox?

JohnKnight
July 18, 2016 4:51 pm

Cool . . er, I mean great . . Now we can have daily updates, with yellow, orange or red alert symbols, depending on “real time” track keeping of our impact on global temperature! Oh the wonders of modern Siants . . bestest buddy of our friend Big Brother.
(Let’s all cut one at once, and see if we can impact the climate, eh?)

July 18, 2016 5:46 pm

“Most of the difference between the raw data and new estimates is found during the recent 18 years since 1998,” said Xie. “Because of the hiatus, the raw data underestimate the greenhouse warming.”
This must be the most completely worthless observation ever recorded in history. Who writes this crap?

Robert from oz
July 18, 2016 7:43 pm

Does this mean they can now predict the weather ?

Robert from oz
Reply to  Robert from oz
July 18, 2016 7:44 pm

Oops should have added “accurately”.

July 18, 2016 9:31 pm

Moderately interesting, but the temperature record is probably not known with enough resolution for this to be very useful.
Hey, that reminds me — did Zeke’s team ever decide on a single temperature for July 1936? Or is that still changing pretty often?

July 18, 2016 10:08 pm

I wonder if the relative flatness of the new “guessed” AGW temperatures up to the mid-1900s, versus the steep late-1900s and 2000s, is because they forgot to take out the aerosols used in the models to make the old hindcasts?
Absent from their “natural” variation (Pacific Ocean oscillation) is the natural rebound of temperatures from the Little Ice Age. I don’t know its magnitude, potential oscillation or even if it is still operating. Does anyone else know? I was surprised to learn that the authors believed that the oscillation of one ocean basin was the sole driver of worldwide climate, other than protean man’s assault.
Additionally, as noted above, they ignored AMO. I was under the impression its oscillation profoundly affected the Northern Hemisphere climate, at a minimum.
I’d like to see their analysis of the lack of model-predicted increased water vapor in the upper troposphere. Does that track Pacific Ocean temperatures?
Maybe Bob Tisdale can tease out some relevant data? If so, I’ll contribute to his tip jar again!
Dave Fair

PA
Reply to  dogdaddyblog
July 19, 2016 1:21 am

You don’t get -0.65°C AGW in 1908 by teasing the data.
The data has to be waterboarded to get this kind of result.

Reply to  dogdaddyblog
July 21, 2016 8:53 am

For some fun, plot all the different GISS temps for the period 1900-1940 since they started reporting on the same chart. It’s a spaghetti graph.

July 18, 2016 10:46 pm

I do know at least this; I’m an existentialist. In short, that means whatever seems to float my boat is probably real.
So here’s the deal; I’m alive. Looks like that might continue for awhile. Life, as they say, is good. Anyone have a problem with that? Talk to my attorney. Butch? Please escort these folks to the conference room. Thanks.

July 18, 2016 11:14 pm

I honestly wish I could blow off stuff like this. I really do. I would like nothing more than to think this is something I don’t need even be concerned about. I’d like to think there aren’t thousands of “activists” pounding the sidewalks trying (and succeeding) to convince Grandma to cough up just $25 to make sure her great grandchildren won’t suffer a horrible death frying alive on the streets of Taft.
Yep. I’d like that. I’d like it if a house fell on the Wizard. And his little dog. Just once.

July 19, 2016 12:53 am

How do they distinguish between man-made CO2 increases and man-made energy/power production increases, the latter being the so-called Heat Island effect? Has anyone plotted man-made CO2 increases and man-made power/energy generation increases over the last 200 years to see which has the best correlation?
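Someone with the actual series could run that comparison in a few lines. The sketch below uses made-up stand-in curves (an assumption, not real CO2 or energy data), and it also shows why the exercise may not settle much: two smoothly rising drivers are nearly collinear, so both correlate strongly with a rising temperature series.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative only: with made-up series, compare how strongly two candidate
# drivers correlate with a temperature-like record. Real CO2, energy-output,
# and temperature series would be substituted for these synthetic ones.
years = np.arange(1900, 2015)
co2 = 295 + 0.006 * (years - 1900) ** 2       # accelerating CO2-like curve
energy = np.exp(0.02 * (years - 1900))        # exponential energy-like curve
temp = 0.004 * (co2 - co2[0]) + 0.1 * rng.standard_normal(years.size)

def pearson(a, b):
    # Pearson correlation via the off-diagonal of the correlation matrix.
    return np.corrcoef(a, b)[0, 1]

r_co2 = pearson(co2, temp)
r_energy = pearson(energy, temp)
print(abs(r_co2) > 0.5 and abs(r_energy) > 0.5)
```

Both correlations come out high here, which is the catch: correlation alone cannot separate collinear candidate drivers, so a plot of the kind proposed would likely show two good fits rather than a winner.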

July 19, 2016 2:11 am

With a bit of luck non linear equations will follow in a few years time…;-)

July 19, 2016 6:51 am

As neither global warming (AGW/CAGW) nor man-made climate change exists in reality, being nothing but a left-wing construct, it does not matter how many computer climate models are developed; they are all meaningless. In addition, the differential equations needed to define the models are non-linear and cannot be solved by mathematicians (Laplace, Lagrange, Bessel, etc.).

July 20, 2016 6:52 am

Producing a real-time climate model with all its algorithms and stuff might be really clever and time-consuming as an exercise in maths, but apart from that it must be absolutely meaningless.