Michael Jonas
In March, my article “Traffic Lights and Roundabouts – Why the Climate Models will never work” was presented on WUWT. That was a somewhat light-hearted analogy between road traffic and climate, saying in essence that the techniques used in climate models wouldn’t work for road traffic, so why would you trust them to work for climate. The reason for writing it was to give people an argument that could be used in conversation with those whose eyes would glaze over if you tried to talk about the inner workings of climate models.
Another reason for writing it was that I had been putting off writing a proper critique of climate models, knowing how much work it would be. Well, comments on “Traffic Lights and Roundabouts” have spurred me into action, and I have now written up a proper analysis, and it has been published – General Circulation Models cannot predict climate.
The paper is based on Chaos Theory, of course, and two very interesting (to my mind) facts emerged:
1. Climate is sufficiently complex that its various parts have different ‘prediction horizons’. A prediction horizon is the length of time beyond which we can no longer accurately forecast a chaotic system’s behaviour. So some parts of a climate model, like hydroclimatic processes (the water cycle), break down very quickly, while other parts, like tropical ocean surface temperatures, can work for quite a long time.
2. For climate, prediction horizons are nested. If you get past the short term prediction horizon of maybe a few weeks, you hit a new one of maybe a few years. Past that, there is a decadal horizon, then centuries, then millennia, etc. It may even be better to think of prediction horizons as a continuum rather than as nested.
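As a purely illustrative aside (my own toy numbers, not from the paper), the arithmetic behind a prediction horizon is easy to sketch in R: a tiny initial error grows roughly exponentially until it swamps the forecast, and the horizon is simply the time at which it reaches some tolerance.

# Illustrative only: exponential error growth in a chaotic system.
# The numbers (initial error, growth rate, tolerance) are made up for
# the example; real climate processes each have their own values.

e0        <- 1e-6   # initial error (e.g. degrees C)
lambda    <- 0.5    # error growth rate per day (a Lyapunov-type rate)
tolerance <- 1.0    # error level at which the forecast is useless

# error after t days: e(t) = e0 * exp(lambda * t)
error_at <- function(t) e0 * exp(lambda * t)

# prediction horizon: solve e0 * exp(lambda * t) = tolerance for t
horizon <- log(tolerance / e0) / lambda
cat(sprintf("Prediction horizon: about %.1f days\n", horizon))

# Halving the initial error only adds log(2)/lambda days to the horizon,
# which is why better initial data buys surprisingly little extra range.
cat(sprintf("Horizon with half the initial error: %.1f days\n",
            log(tolerance / (e0 / 2)) / lambda))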
In “Traffic Lights and Roundabouts”, I said that I was not the first person to say that the climate models will not work. In this ‘GCMs cannot predict climate’ paper, I also recognise that I am not the first person to make many of the points in the paper, and hopefully I have made this clear via references. However, I might be the first to put it all together in a journal. If not, I apologise, I couldn’t find it in the literature.
In summary, the main point is that the grid-based physical processes and parameterisations in the GCMs cannot predict climate because there is a short prediction horizon for most of what goes on in climate. That is, a tiny error will very quickly increase in size until it has completely swamped the predictions. It has been shown that GCM results can be dramatically improved if a grid-level process is replaced by a higher-level parameterisation (see “seasons” in the paper). My argument is basically that this applies to just about all longer term climate features in the GCMs (I actually think it really is all). In other words, when the physical processes and small-scale parameterisations in the GCMs (I’ll call these their “grid-level processes”) hit a prediction horizon for a particular feature, the barrier can be overcome by analysing the feature externally and then feeding it back into the model. There is no point at which the model, after being fed with a number of such longer term features, can ever reliably predict any other longer term features, because it necessarily hits a new prediction horizon when it steps outside the areas that it has been given.
The end result is that the grid-level processes in a GCM cannot predict anything into any kind of longer term future. All longer term features must be analysed externally and then be fed into the GCM if the GCM is to produce reasonable results. But then the grid-level processes in the GCM aren’t predicting anything. If the grid-level processes are still in the GCM, they are now simply ‘obeying orders’.
Even longer term features, like ocean oscillations, have their own prediction horizon. Will they speed up or slow down, get stronger or weaker, or even stop for a while – we don’t know. So there is a limit to how far we can extrapolate them into the future. For example, we are used to the 11-ish year cycle of sunspots, but for several decades within the Maunder Minimum they virtually stopped. Maybe Earthly cycles can do that too. Maybe William Herschel was right, that there really was a causal connection between what we now call the sunspot cycle and wheat prices, it’s just that things changed at the end of the Dalton Minimum. Today’s scientists often claim that William Herschel was wrong, based on the fact that the correlation he observed did not continue, but they do not take into account the fact that the Dalton Minimum did not continue either.
Many years ago, a well-known climate scientist told me they didn’t know the mechanisms that caused periods like the Medieval Warm Period (MWP) or the Little Ice Age (LIA), so they could not code them into the climate models. My paper says that they can now put in the MWP/LIA pattern without knowing the mechanisms.
The paper ends up arguing that a GCM calculates weather at each time step and this is then amalgamated into a final prediction of climate, but a realistic long term climate model would instead calculate climate and then weather would be deduced from the climate.
The abstract of the paper:
Abstract
This study draws on Chaos Theory to investigate the ability of a General Circulation Model to predict climate. The conclusion is that a General Circulation Model’s grid-level physical processes and parameterisations cannot predict climate beyond maybe a few weeks. If a General Circulation Model is to be used at all, longer term climate features can be analysed externally and fed into the model but they cannot be represented by the model any better than by the external analysis. The external analysis, which is likely to be simpler, has the added advantage that the assumptions that are used, and the uncertainties in the results, are much more likely to be explicitly identified, quantified, and understood. Consequently it would be clear which aspects of the climate are being predicted, and how reliable those predictions are. The longer the timescale is, the less relevant the grid-level physical processes and parameterisations in a General Circulation Model become. Although a General Circulation Model can be made to represent climate over a longer time scale, its grid-level physical processes and parameterisations cannot predict the climate. A General Circulation Model calculates weather at each time step and this is then amalgamated into a final prediction of climate. This process is back to front. A realistic long term climate model would calculate climate and then weather would be deduced from the climate.
The full paper is here.
Maybe no-one has ever put it all into a paper before because, once you see it, it is all so blindingly obvious – except that the way the climate models are revered it seems not to be so blindingly obvious to some people. Well, now that there is a paper that states explicitly that GCMs can’t predict climate, and explains why, will it make any difference? I doubt it. As Upton Sinclair said nearly a century ago: It is difficult to get a man to understand something when his salary depends upon his not understanding it.
IPCC TAR 14.2.2.2 Balancing the need for finer scales and the need for ensembles
In sum, a strategy must recognise what is possible. In climate research and modelling, we should recognise that we are dealing with a coupled non-linear chaotic system, and therefore that the long-term prediction of future climate states is not possible.
________________________________________________________________
Even our understanding of relevant fundamental molecular forces is inadequate/incomplete. https://www.pnas.org/doi/full/10.1073/pnas.2312751120
You mean the science isn’t settled?!
Oh, but algorithms and computers have gotten so much better since the TAR that this is no longer true. (/sarc necessary?)
The next sentence is also important and it is dishonest not to include that context.
The problem is that the averaging out of individual runs, which they admit are never real, into “ensembles” of statistics of individual runs does not solve the problem either since the models are not purely deterministic runs of “basic physics” equations anyway. The long term behaviour is INPUT into the models by the choice of “parameters” representing the key processes which ARE the earth’s climate, mainly the water cycle.
Many values of dozens of parameters are tried and the ones which give the desired (expected) results are retained as being “best” parameter values.
So, me not being a scientist (though not unfamiliar with science), what can be concluded? At least for me, the conclusion is that we have no idea what the ECS is, if it is even a valid concept. Since we have no reason to believe it’s high, given the extreme complexity of the climate, there is no reason to rush to any conclusion- like a desperate need for net zero.
Back in the 1990s, physicists and mathematicians were saying the same thing and were ignored.
But they weren’t Climate Scientists.
Ah, but James Hansen of NASA, who did the 1988 presentation to Congress, was just a physicist.
And some a lot more recent than that. Happer comes to mind!
The models do predict the climate
Anyone can predict the climate
The average model from the 1970s predicted warming of +3.0 degrees C. per CO2 x 2
That’s about one third faster than the actual warming rate since 1975
Of course the global average temperature is far from useful as a proxy for local climates, where people live and work
The propaganda magic of a long term prediction is you need to wait a long time to see if it was correct.
There are only two general climate predictions
A warmer climate or a cooler climate
With the accelerating CO2 emissions from the 1950s to mid-1970s, a prediction of warming was more likely to be accurate than a prediction of cooling
The warming could be predicted to be small, medium or large. The Climate Howlers covered a wide range with +1.5 to +4.5 degrees C. per CO2 x 2.
Do humans have any ability to predict the climate in 100 years, or 10 years?
Is it even possible to predict a long term climate trend with far more climate knowledge than is available today?
Never mind that.
Predictions of global warming doom can be used to create fear and control people.
These predictions seem more believable when they are presented by government scientists and their climate models.
Both are props for a political strategy of controlling people by controlling their energy use. Government control is better described as Leftist Fascism
The models are doing exactly what they are intended to do: Leftist climate scaremongering propaganda. The Russia INM model may be an exception.
Climate models are a political tool.
They are just computer games
That’s why I started calling them Climate Confuser Games in 1997
When I first started reading WUWT and Climate Audit 15 years ago, many people were careful to make the distinction that models were NOT predictions, they were projections based on our imperfect knowledge of the climate. That point seems to have vanished since then. People are treating the model projections as predictions because they get the answers they want.
There is a subtle difference between a prediction and a projection. A prediction is what will happen. Whereas a projection is what will happen for a given scenario, which cannot be predicted.
Even back then, the so called climate scientists were relying on the fact that 99% of the population doesn’t know the difference between a projection and a prediction.
They knew back then that they were deliberately confusing people, but as long as they were successful in protecting their paychecks, all was good.
oeman,
Around when WUWT and Climate Audit started, we had many more startup blogs. These days, Jo Nova is strong, Notalotofpeopleknowthat is there, Judith Curry does sterling service at Climate Etc, Jeff Condon’s The Air Vent is quiet, Kenskingdom continues, to name but a few. There was, I suspect, a greater volume of sceptical blogging say 7 years ago than now.
But, the quality of the commenters on such blogs was, IMO, higher back then. Especially on Climate Audit and WUWT, there were several experienced statisticians who kept us on the straight and narrow. There was a higher proportion of comment on good science versus bad back then. Later, there was more socio/political comment. Of course people wrote about what concerned them and wrote about pathways to a better world, but there has been a swing from high level science to a lower level in quantity and quality. People get old. Educational standards have declined. Growing pressure from The Establishment to silence comment has taken its toll.
Geoff S
It has become harder to discuss the science, since only one side was participating, and we lost. The social/political articles are more prevalent now, as there may actually be some participation by “the other side”. However, I think that the place to go now is to publish real analysis about the effects of Net Zero, and the general public will understand that, as the costs increase.
However, alarmists seem to want a steady-state climate.
Are you saying that natural systems are cyclical around a mean?
Look at the chart of Global mean temperatures over the last 500,000 years. “https://www.researchgate.net/figure/Global-mean-temperatures-over-the-last-500-000-years-11_fig3_356606430”
Should be intuitively obvious.
That was my point. I don’t know any natural systems that are static.
A prediction of warming was just as accurate as a prediction of less cooling. When you can’t account for all of the significant factors, you cannot say either way.
Temperature is not a useful proxy for any climate; temperature is a feature of climate. There are very hot deserts and very cold deserts. There are tropical rain forests and rain forests like the US pacific northwest rain forest with large temperature differences between the types. These examples have different median temperatures and different standard deviations. Temperatures seem more a result than a cause. Other factors are more prominent than temperature.
Mid-range temps don’t tell you anything either. Vastly different climates have the same mid-range temperature. How can this be a useful metric for climate?
Regarding global warming, never have so many believed in so much for so long based on so little. (Apologies to Winston Churchill! Although I suspect that if he were still alive Winston Churchill probably would have agreed.)
The “water cycle” is very important and something climate models continue to get completely wrong. Doesn’t matter what else they do until they get this key part of the science fixed.
“Global rainfall increases typically cause an overall reduction of specific humidity (q) and relative humidity (RH) in the upper tropospheric levels of the broader scale surrounding convection subsidence regions. This leads to a net enhancement of radiation energy to space over the rainy areas and over broad areas of the globe. ”
https://tropical.atmos.colostate.edu/Includes/Documents/Publications/gray2012.pdf
One should read this book by Mototaka Nakamura that is available on Amazon.
Confessions of a climate scientist The global warming hypothesis is an unproven hypothesis (Japanese Edition) . Kindle Edition. Published in 2019.
The translation from Japanese is not perfect, but you get the drift.
The biggest mistake being made in everything to do with climates is to treat them as THE climate, which results in THE science.
There are hundreds if not thousands of different climates all around the world.
Averaging the metrics of all their characteristics is arrant nonsense.
One day soon I’m hoping that some celebrated child genius comes out and declares that THE science has no clothes.
If averaging measurable parameters of climate is “arrant nonsense,” then how does one characterize any of the many regional climates that you acknowledge? The issue is one of finding regions that behave similarly for long periods of time (if such exist) so that opposite trends aren’t cancelled out by the averaging. A good place to start would be to break down the weather parameters into the various Köppen–Geiger climate classes.
I’m afraid that the first and biggest mistake occurred even before the false transition from “global warming” to “climate change”. To claim that the calculated “global average of the annual average of the daily average of sporadic recordings of selected local thermometer readings of Tmax and Tmin” was a useful proxy for complex simultaneous global phenomena was a huge error. “Climate science” has proceeded downhill from there.
Plus 100.
See my comment above “Averages of averages are Garbage”
P.S. Over 60 years ago I noticed that the high temperature for the day was normally within +/- 5 degrees of the temperature one hour after sunrise, with a probability better than 75%. I have found that to be true even today, and in the eight cities and three different countries I have lived in.
Clyde, I wouldn’t attempt to “characterize” the prevailing climates of any particular localities.
They are what they are – individuals.
Just like people.
(Despite what one of Brian’s acolytes said) –
AND CHAOTIC!
Don’t know why the down votes.
From what I have found in my studies, there are probably few if any “climates” scattered around the globe that are sufficiently similar to provide anything approaching a “usable” average.
“Averaging the metrics of all their characteristics is arrant nonsense.”
Averages of averages are Garbage. Using Tmax – Tmin increases the absurdity.
This can be demonstrated by creating a list of 1,000 numbers between -70 degrees and +100 degrees. Add up these numbers and divide by 1,000. _______
Now take the list and evenly divide the list into 10 lists of 100 numbers. Determine the Average of each list THEN, Find the Average of these 10 lists. _______
Now take the list and randomly select 100 numbers in the list crossing out each number selected. Make 9 more lists of 100 randomly selected numbers, do not select any number already selected. Determine the Average of each list THEN, Find the Average of these 10 lists. _______
Compare these three numbers.
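For anyone who wants to run the exercise rather than do it by hand, a minimal R sketch of the three calculations described above (the 1,000 values are generated with runif() purely for illustration):

# Reproduce the three-average exercise described above.
# The 1,000 values are drawn uniformly between -70 and +100 for illustration.
set.seed(42)
x <- runif(1000, min = -70, max = 100)

# 1. Plain average of all 1,000 numbers
avg_all <- mean(x)

# 2. Split the list in order into 10 lists of 100, average each, then average those
groups_ordered <- split(x, rep(1:10, each = 100))
avg_of_ordered <- mean(sapply(groups_ordered, mean))

# 3. Randomly assign the 1,000 numbers to 10 lists of 100 (no repeats),
#    average each list, then average the 10 list averages
shuffled      <- sample(x)                      # random order, each value used once
groups_random <- split(shuffled, rep(1:10, each = 100))
avg_of_random <- mean(sapply(groups_random, mean))

print(c(all = avg_all, ordered = avg_of_ordered, random = avg_of_random))

With ten equal-sized groups the three results come out essentially identical apart from rounding; the divergence becomes visible once the groups are of unequal size or are weighted differently, which is closer to the situation with real station averaging.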
The problem with computer chosen random numbers is that the default is to use a Gaussian distribution. You can force some packages to use other distributions but you need to know the distribution of the data you are observing to know which one to use. Skewed distributions are hardly treatable at all using computer random choices. You need to determine a PDF that fits the skew and use it to choose numbers.
The variance of Tmax/Tmin is in the tens digit place. That makes the uncertainty so high as to be unusable. Climate science has decided that averaging disparate stations somehow cancels this variance. What a joke.
“The problem with computer chosen random numbers is that the default is to use a Gaussian distribution.”
In excel, the sampling appears to be evenly distributed. Equiprobable, i.e., if I graph the frequency of 100 rankings of 10000 individual rand()s, they are quite even, with no gaussian bulge in the middle. Same with freecalc.
I’m not intimately familiar with the sampling of more sophisticated packages, except to note that some MC packages allowed you to fix your seed to find more precisely the outcome differences with varying inputs. To which packages are you referring?
Sorry. I did misspeak. The default of many is a uniform distribution including excel.
In R:
1. Uniform Random Numbers – Function: runif()
2. Normal (Gaussian) Random Numbers – Function: rnorm()
3. Random Integers – Function: sample()
4. Binomial Random Numbers – Function: rbinom()
5. Poisson Random Numbers – Function: rpois()
6. Exponential Random Numbers – Function: rexp()
7. Gamma Random Numbers – Function: rgamma()
Other packages have similar varieties of random number generators.
The problem when it comes to temperature distributions is that they seldom have a uniform distribution. If you are trying to evaluate a statistical process using random numbers you need to generate numbers that match the distribution of your data.
In some cases, like skewed temperature data, there is no method to obtain a pseudo-random distribution matching a skewed distribution from standard software. This requires some sophisticated math.
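For anyone who wants to experiment, a minimal R sketch of two common workarounds (not the only ones): assume a skewed parametric distribution such as a gamma, or simply resample the observed data itself so the simulated values inherit whatever skew the data has. All numbers below are made up for illustration.

# Two illustrative ways to generate pseudo-random values whose
# distribution is skewed, as daily temperature data often is.
set.seed(1)

# (a) Parametric: draw from an assumed skewed distribution (gamma here),
#     then shift/scale it onto a plausible temperature range.
sim_parametric <- 10 + 2 * rgamma(10000, shape = 2, rate = 0.5)

# (b) Non-parametric: resample the observed data directly (bootstrap),
#     so the simulated values inherit whatever skew the data has.
observed <- c(12.1, 13.4, 11.8, 15.0, 19.6, 14.2, 13.1, 22.7, 12.9, 16.4)  # made-up sample
sim_bootstrap <- sample(observed, size = 10000, replace = TRUE)

# Compare skewness of the two simulations (simple moment estimate).
skewness <- function(v) mean((v - mean(v))^3) / sd(v)^3
print(c(parametric = skewness(sim_parametric), bootstrap = skewness(sim_bootstrap)))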
Check the GUM to read about uncertainties with uniform distributions.
The NIST Uncertainty Machine also allows uniform distributions to be input and will evaluate standard uncertainty for it.
I have asked many times how the global climate has “changed”… apart from a slight beneficial warming since the coldest period in 10,000 years.
Never seen a satisfactory answer, particularly not one with any evidence of human causation.
In fact, I don’t think I’ve ever seen an answer at all.
Climate modelling is the equivalent of central economic planning and control – both coincidentally (?) based in Socialist ideology and its aim, the control of Human behaviour.
F A Hayek called it (economic planning) the fatal conceit, that anyone could have sufficient knowledge of all transactions and results to predict outcome and plan a route towards it.
He pointed out, there are millions and millions of tiny pieces of information, and knowledge of them is local.
Climate modellers when confronted with the fact their models have always failed, make the same excuse as Socialists when confronted with the fact that Socialism has always failed – it just hasn’t been done correctly yet, needs more time, more effort, more money then Bingo.
Good comment.
Your last sentence reminds me that I heard Biden kept shouting “Bingo” at the G-7 meeting.
I have come to believe that what distinguishes progressives from sane people is their detachment from reality and the arrogance to believe that only they are smart enough to have solutions for the world’s problems.
Here’s a good article on why socialism is still a ‘thing’ despite its consistent record of horrific failures.
https://mises.org/mises-wire/socialists-it-doesnt-matter-if-socialism-works-what-matters-power
The author states that socialism can only be successfully contested by focusing on the certain toxicity of its invariant means rather than its historic inability to deliver on its presumably noble ends.
I think the same approach is applicable to climate alarmism, i.e., many of us understand the ‘science’ behind it is junk. But it will remain a ‘thing’ until the public realizes that the means fronted as necessary to combat climate change are every bit as toxic as those required to implement socialism.
And faster, bigger computers.
GI -> faster computer -> even more GO !
From the paper :
This is an example of what I believe is known as “rewriting the dictionary”, which climate science has been doing for over 30 years now.
This is what the IPCC does when it makes a “projection” along the lines of “If we follow the SSP5-8.5 emissions pathway until 2100 then GMST will ‘very likely’ rise by 3.3 to 5.7 degrees Celsius” (see AR6 WG-I Table SPM.1, page 14).
Your renaming it a “prediction” again … which I personally happen to prefer as well … doesn’t change the fact that the conditions around that “projection” are clearly stated.
Here you are “rewriting the dictionary”, and conflating “weather” with “climate (= averaged weather)”.
This made following your paper difficult, for me at least, as I had to keep “translating” your uses of the word “climate” throughout to “…a few weeks (/ years) ??? … hang on a second, that actually means ‘weather’ …”.
For illustrative purposes only, I’ll use the standard (and default) WMO “integration period” of 30 years and apply it to the definition “Climate = averaged Weather”.
Your “realistic climate model”, with “time-step = 1 year”, calculates the GMST proxy of “Climate” of, say, 16°C (289K) for a specific (simulated) 30-year time period.
Please provide a worked example of exactly how you propose to “deduce” the 30 input values of “Weather” that are averaged to make up that 16°C value.
Now show exactly how you propose to “deduce” the “regional weather patterns” that make up the GMST values for each of those 30 time-steps.
I think your questioning shows that the “definition” of climate as the “average of weather” is meaningless because it is not rigorously defined.
In the paper, I tried to work logically step by step from the starting point (a grid-based model with tiny time steps) to a genuine climate model. Necessarily, along the way there were remnants of the original – in fact I sort of had to leave them in to prove they weren’t needed.
The “1 year time step” came in along the way to much longer prediction periods with no time steps.
Deduction of future weather from a long term climate prediction is a new field (ie, doing it top down instead of the current bottom-up). As I say in the paper “With higher-level models as described in this paper, a model would predict climate at a future point in time. From that, it can be estimated what kind of weather each place or region can expect. The estimate could be carefully based on knowledge of what kind of weather each region experienced in the past when conditions were similar, or it could use any other meaningful technique.”. Note: no mention of time steps. So I’m indicating that estimating weather from climate is possible, but I’m not telling anyone how to do it. As things stand, I think it is all a reasonable assessment of the current primitive state of climate science.
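As a purely hypothetical illustration of the analogue idea described above — find past periods whose climate state resembled the predicted one and use their observed weather as the estimate — here is a minimal R sketch; every name and number in it is invented.

# Hypothetical sketch of an analogue approach: given a predicted climate
# state for a region, find past years whose climate state was closest and
# summarise the weather those years actually produced. All data are invented.
set.seed(7)
past <- data.frame(
  year        = 1951:2020,
  mean_temp   = rnorm(70, 15, 0.6),    # regional climate descriptors
  mean_precip = rnorm(70, 800, 80),
  storm_days  = rpois(70, 12)          # a "weather" outcome for each year
)

predicted <- list(mean_temp = 15.8, mean_precip = 760)  # the model's climate prediction

# Distance of each past year from the predicted climate state (standardised)
d <- sqrt(((past$mean_temp   - predicted$mean_temp)   / sd(past$mean_temp))^2 +
          ((past$mean_precip - predicted$mean_precip) / sd(past$mean_precip))^2)

analogues <- past[order(d)[1:10], ]    # the 10 most similar past years

# Expected weather under the predicted climate = what the analogue years delivered
print(summary(analogues$storm_days))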
NB : I don’t know if this is still an “active” post. One lives in hope …
You appear to once again be inverting the “commonly understood” … i.e. “as ***I*** think about the subject” (!) … definitions of (short-term, timescales of hours to weeks) “weather” and (long-term, multi-decadal averages) “climate”.
As Mark Twain (allegedly ?) put it : “Climate is what we expect, weather is what we get.”
You are claiming that from a calculated value of (local / regional) “climate” — e.g. the “30-year Reference Period” values used to subsequently calculate “anomaly values” during a given simulation — you are able to “deduce” on which occasions you do (and do not) get a local “extreme weather event”, e.g. an unusually severe downpour in a given 24-to-48 hour period leading to flash-flooding.
I remain “sceptical” of that claim, and reiterate my request for a “worked example” of exactly how you imagine that “deduction” process could be implemented.
I would have thought it pretty obvious that the weather derived from a future climate prediction would not go as far as predicting anything on a particular date. We’re in new territory. The picture of climate would depend on how far ahead we are predicting, and that would affect what sort of weather estimates we could make. Stormier or cloudier, perhaps, rather than a storm on a particular Monday afternoon. Whether that really is weather, not climate, depends on the definitions used.
PS. You say ” the conditions around that “projection” are clearly stated”, but my argument was about the underlying assumptions not being visible. The underlying assumptions are in the model code itself and are indeed not visible.
Researchers keep coming up with new facts on how the climate system works. For example, sulfur emissions were creating more clouds cooling the climate and smog was cooling the climate plus the way that the oceans and the atmosphere interact was just recently discovered.
Any good model will have errors above and below the actual values. Their models run hot so something is wrong with them.
Story tip – Study: Biden Administration’s EPA Rules Could Cause Blackouts for Millions of Americans › American Greatness (amgreatness.com)
I don’t remember the details, but speaking of the problem of resolution, a decade or two ago, one of the organizations that maintain one of the GCMs acquired a more powerful computer. This allowed them to decrease the size of the cells in their model without increasing the time it took for a run.
To test the change, they first ran the model with the old cell size, then they ran the model with the smaller cell size. Then they compared the results of the two runs.
The only change was the size of the cells, but the results were dramatically different.
Thank you, Mr. Jonas, for sharing all your painstaking, hard, work with us. You provide further confirmation that the climate models are, as Bob Tisdale pointed out in his ebook, Climate Models Fail, unskilled, thus, unfit for purpose.
The following is for anyone who reads your paper and is left wishing for something easier to understand for a non-technical audience:
Six Impossible Things Before Breakfast and Climate Models — lecture by Christopher Essex
For excerpts to share from Essex’s lecture, here are my notes:
NOTES: “Six Impossible Things … “ – Re: CO2 Models
Dr. Christopher Essex lecture
https://wattsupwiththat.com/2015/02/20/believing-in-six-impossible-things-before-breakfast-and-climate-models/
[5:24] “Scientific thinking is about things and political thinking is about what other people are doing.”
[5:31] “A consensus is wrong way to think about a scientific question.”
[5:50] (quoted in slide) “’In climate research and modeling, we should recognize that we are dealing with a coupled non-linear chaotic system, and therefore that long-term prediction of future climate states is not possible.’”
[6:25] (Source of above quote: IPCC (Intergovernmental Panel on Climate Change), Third Assessment Report (2001), Sec. 14.2.2.2, p. 774)
[7:08] Mentions: book by Essex and Dr. Ross McKitrick, Taken by Storm: The Troubled Science, Policy, and Politics of Global Warming (Paperback – May 28, 2008)
[9:25] “An extraordinarily complex problem [] is represented [] as an extremely simple one.” (cites the thermometer in a shoebox, grade school, “science” experiment re: greenhouse effect)
[12:03] When you plot a .04 deg. C increase/decade on a thermometer graph – unlikely to be significant.
[14:00] “There is absolutely no physical argument [] to connect this value to any of the things that people talk about as climate impacts.”
[14:50] “There is a cultural problem with science in general.” — “Cultural carbon” – Example: “Certified carbon free” (formula for sucrose = C12H22O11 (without: 11H2O))
[15:40] “Oxygen-free carbon dioxide” – (re: ignorant use of “carbon” to mean CO2)
[17:14] Domino Sugar TV commercial (re: “carbon-free sugar”) (Note: they do say that it is because of 0 CO2 emissions that they call it “carbon-free”)
[18:20] By modifying language, you modify thinking – result: junk science (carbon in auto glass makes car hot quote of David Suzuki, “London Free Press,” May 12, 1990)
[20:20] Classic atmosphere-earth energy flow diagram – misleading because wide v. narrow arrows mislead.
[20:40] Energy flow diagram done more accurately IN: Solar radiation OUT: (1)Radiation (2)Fluid Dynamics –
[21:55] “Greenhouse Effect” (introduce gas that changes how Radiation can flow out) versus
[22:22] How greenhouses really work (Fluid Dynamics – flow of air stopped by glass) – THIS is a KNOWN physical effect, governed by the laws of radiative transfer = “Completely Certain Outcome”
[23:19] The “Greenhouse Effect” = Fundamental, Unsolved Scientific Problem — the temperature gradients could cause enough cooling in this NOT-closed system to compensate for warming (unlike in an actual greenhouse)
[25:20] List of fundamental unsolved math problems.
2 Math Equation Problems Not Yet Solved (needed for meaningful climate modeling)
1) [26:10 – 27:00] Navier-Stokes Equations ([27:08] non-linear differential equations — unsolved) – they govern the flow of fluids (e.g., air and water). If you don’t have a handle on how air and water move [] then you really can’t [] have an intelligent conversation about climate [].
2) [27:15] Computer Science unsolved problem is the P v. NP (Polynomial v. Non-polynomial Time Problem of Computational Complexity) math problem – this limits how well (not at all, at this time) a computer can be used to solve the math equations needed to solve climate model’s queries –
[27:40] BOTH the above must be solved to be able to meaningfully do climate simulations re: CO2
Physics Problem Not Yet Solved
[27:50] Closure Problem of Turbulence — thus, cannot use Navier-Stokes flow equation to solve flow even in a closed pipe if there is any turbulence from first principles.
[28:40 – 29:08] Cannot even determine average flow with turbulence (because to average, you have to do the entire original calculation anyway) from first principles.
[29:09 – 29:51] Experience or data cannot overcome the non-closure problem because we have far too little data and or the time of measurement given what is being measured, climate, is far too short.
[29:53] People DO use models to do empirical “closures,” but they are not doing so from first principles.
[30:18] (James Cameron’s) Computer Water Versus Real Water – [32:10] The point is: there are no math or physics equations that give the result pictured (it is fake water and not an accurate representation of the physical world).
[Continued in next comment — this site cut off my pasting — didn’t happen when I’ve pasted in my Essex notes before…. hm…]
[32:40]Issue: Finite Representation of Computers –
[34:02] Red spot graph – point is (0,0) on grid representing Error – Re: 2 variables with 2 unknowns, computer can solve, however, computer only has finite # of decimal places to use to solve, thus, rounding errors will sometimes occur [35:00] – Residual error –
[35:25] – Demonstration of residual error using computer plotting of 2 mathematically equal ways of solving an equation done 100,000 times each on graph – blue and red not overlapping shows computer’s finite representation of the 2 equal math methods (magnified = plotted along lines, not a true scatter – the machine epsilon ɛ indicates the finite representation power of a given computer, i.e., the smallest number, ɛ, such that ɛ + 1 > 1; if you add a number < ɛ to 1, it = 1)
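(In case anyone wants to see that machine-epsilon point on their own machine, a minimal R check — my illustration, not part of the lecture:

# Machine epsilon: smallest eps such that 1 + eps is distinguishable from 1.
eps <- .Machine$double.eps
print(eps)                 # about 2.22e-16 for standard double precision
print(1 + eps     >  1)    # TRUE : the addition is representable
print(1 + eps / 2 == 1)    # TRUE : anything smaller is rounded away
)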
[40:33] Finite representation of computers goes beyond GIGO (Garbage In Garbage Out), they can easily give you NGIGO (NOT-Garbage In Garbage Out).
[42:15] – Re: Turbulent Flow – [43:00] for the “swirls” in air, turbulence at the smallest scale (the Kolmogorov microscale) is ~1 mm – thus, to do a proper calculation, the grid must be smaller than 1 mm
[43:20] GIVEN, you have a grid < 1mm [NOTE: for aerosols and other factors, you would need a micrometer-sized grid – leaving that aside…. And also leaving aside that you would have to be able to “stop action” the air situation (dogs running around, cars, etc.)], TO CALCULATE A FLUID DYNAMICS CALCULATION 10 YEARS OUT (what air turbulence would be) [44:12] (easily 10 variables, using 1 floating point calculation per variable (that is likely only a minimum)), [45:00] the number of calculations per second (billion or so), [45:22] YEARS TO CALCULATE 10 YEAR FORECAST: 10^20 years (the universe is 10^10 years old). IOW: cannot be done.
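(A rough back-of-envelope version of that arithmetic as an R sketch; every input is an assumed round number rather than Essex’s exact figure, and the only point is that the answer lands absurdly far beyond the age of the universe:

# Back-of-envelope: cost of simulating the atmosphere at 1 mm resolution.
# Every input below is an assumed round number for illustration.
cell_size     <- 1e-3                      # m, roughly the Kolmogorov microscale
atm_volume    <- 5.1e14 * 1e4              # m^3: Earth's surface area x ~10 km depth
n_cells       <- atm_volume / cell_size^3  # number of grid cells

vars_per_cell <- 10        # prognostic variables per cell
flops_per_var <- 1         # flops per variable per time step (a bare minimum)
steps_per_sec <- 1         # time steps per simulated second (very generous)
sim_seconds   <- 10 * 3.15e7               # ten simulated years

total_flops <- n_cells * vars_per_cell * flops_per_var * steps_per_sec * sim_seconds

computer_flops_per_sec <- 1e9              # the "billion or so" in the notes
run_time_years <- total_flops / computer_flops_per_sec / 3.15e7

print(run_time_years)   # around 5e20 years with these inputs; the universe is ~1e10
)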
[46:05] Re: Parameterizations – engineering uses in their models – to approximate a calculation in a reasonable time – [46:25] Engineers tune using data from experiments (e.g., wind tunnel data) – Can’t put the earth into a laboratory (“wind tunnel”) –
[46:42] Thus, climatologists are using a non-empirical model for an empirical problem.
[46:51] Best resolution for climate model grid is HUNDREDS OF KILOMETERS. You will miss a lot, e.g., thunderstorms (! – much energy transfer done by them – there are ~ 2 million thunderstorms a year on earth)
[47:25] All significant weather phenomena are beneath the resolution of the parameterizations of the models — FAKE PHYSICS.
[49:07] Finite Representation – again – “Numerical schemes don’t usually conserve energy!” – [49:35 – equation for ideal oscillation of a pendulum (the energy is conserved – unlike a real grandfather clock which is losing energy to friction, etc. and must have weights keeping it going)
[50:47] Note: the equation below the line – it is not the same as the ideal oscillation equation – it is what you must use to compensate for the finite representation of a computer
[51:40] When you run the oscillation equation (as modified for FR, i.e., the computer’s ɛ ), it not only conserves energy (as per “ideal” equation – not likely going to be such in physical world), it accelerates [51:50] and gains energy infinitely (upward sloping curve) –
[53:33] Implication for the Navier-Stokes equations – to account for all the quantities which must be conserved, you would have to write an algorithm which would solve the N-S equations – not yet done by anyone.
[54:30] Re: the Constancy of Climate – example of a differential equation plotted on graph next to computer’s calculation (using a modified version of that differential equation) – NOT the same after time passes (for awhile, tracks pretty much exactly, then, WHOA, diverges [55:10] – becomes unstable, i.e., issue: computer instability.
[55:52] To prove that chaos was not just an artifact of computer FR, Essex, et al. (1991) used the computer FR to do the reverse, i.e., over-stabilize (or suppress known instability) – [56:00] they took a chaotic system and turned it into a harmonic oscillator.
[56:30] Cf. IPCC models trying to handle ENSO [56:45] different models’ outputs on graph – focus on one section [57:06] blow it up – everything (all the model runs’ paralleling each other) is flat
[57:23] – known as a white spectrum, Fourier power spectrum ([57:40] which per the Wiener–Khinchin theorem says that the individual output which produces that spectrum has to be uncorrelated from moment to moment – take one model out 1,000 years (using PCMDI (Livermore) climate model) — FLAT
[58:28] Comparing Observed to Model data – Note: Observed up and down with definite peaks and troughs while the model sluggishly mimics with much less amplitude, much flatter than observed [59:00] IMPLICATION: Climate doesn’t change on its own; it just stays STABLE, unless something “pushes” it.
[59:20] Clear to Essex that the IPCC code overstabilized the climate models – in AR4 IPCC trumpeted their “fake” energy flows to stabilize the system – appears to overstabilize –
[1:00:32] – There is no proof that climate is naturally stable – and there are good arguments that there are internal cycles making climate system change on its own without any external forcing.
[1:01:12] – Climatological timescales demonstrated using timelapse photography – sun’s travel over 6 months – cars on road, people “invisible” Q: Would you see “weather” on this time scale? No one knows and there is no physical basis to say you can –
[1:04:33] Niagara River using stop image, 1 image/15 sec. (a lot of events happening in the level 6 rapids in a short time, so, gives an idea of what it’s like to see only in terms of long timescales) [1:05:30] created a 3-minute average (as if only seeing once every 3 minutes what is happening) – invariance introduced, “flattened” implication: physics of what is being observed changes – [1:06:12] need to formulate the physics of long versus short (more information) time-scale activity.
[1:06:38] “There are no experts on what nobody knows. So, the whole idea of using ‘experts’ to decide on matters of this type is completely foolhardy, because there really aren’t any experts on it.”
[1:07:06] 8 Main Points – Summary
1. Solving the closure problem.
2. Computers with infinite representation.
3. Computer water and cultural physics.
4. Greenhouses that don’t work by the greenhouse effect.
5. Carbon-free sugar.
6. Oxygen free carbon dioxide.
7. Nonexistent long-term natural variability.
8. Nonempirical climate models that conserve what they are supposed to conserve.
Thanks, Janice. I love your [1:06:38] “There are no experts on what nobody knows. So, the whole idea of using ‘experts’ to decide on matters of this type is completely foolhardy, because there really aren’t any experts on it.”.
Thanks for taking the time to let me know. 😊
FYI, I had an email from a professor in the climate field referring to my paper, saying “I believe it is unsafe to use those models for real-world problems.”. There are a lot more people who ‘get it’ than the powers that be would like us to know about.
PS. The professor also said “I would say that climate models cannot predict climate at all. While prediction for a couple of weeks is possible, that’s weather prediction, not climate prediction.”.
Your text that said:
I tried to explain that to a person on another thread here. There is a difference between doing a “Mathematical Proof” and a proof of a physical phenomenon. It appears the mathematicians working on modeling don’t have a clue about how to validate the outputs of models. It is more than saying, “this is what I expected, so the model is working perfectly!”
Mr. Gorman:
After reading many of your excellent, technically accurate, comments, I know that if that person couldn’t understand your explanation they were either:
1. stupid (or, perhaps, drunk…)
OR
2. pretending not to understand in the hopes of keeping others fooled.
Thanks for taking the time to read those notes! 😀
Thank you for the mention.
Your notes were right on point. Climate science should concentrate on the physical aspects of this old world. It has been led down the wrong path by mathematicians and statisticians who wouldn’t know a measurement if one bit them on the rear end.
By contrast, contrarians are predictable.
So are gormless twits.
Oh, hullo Mr. Nice. How’s Australia these days?
Why is that a bad thing? People who deal with the World as it is reality tend to be.
And another excellent lecture demolishing the GCM’s (a bit more technical than Essex’s, overall, but, understandable to a non-tech):
“No Certain Doom” – re: error propagation in GCM’s by Dr. Patrick Frank
“Yes, a genuine expert can always foretell a thing that is five hundred years away easier than he can a thing that’s only five hundred seconds off.”
Mark Twain. A Connecticut Yankee in King Arthur’s Court.
Climate predictions via forcings and feedback are fundamentally flawed because they assume Climate does not wander.
Like Joe Biden at the G7, or the meander of a river, Climate can wander without any need for forcings or feedback.
How about using a jailbreak AI to predict climate? Probably more reliable than GCMs.
Very nice keep up the good work.
These things claim to produce ‘projections’
But really they make prophesies
Tarot cards would be just as useful for that.
Some people “believe” in them, too !
Not sure that Tarot cards really ‘cut the mustard’ except for the most gullible
Minimum requirements to make some headway seem to be:
A] A model :

B] Special training:
https://en.wikipedia.org/wiki/Haruspex
I am always bemused by the large number of people here who simultaneously appear to believe that it is impossible to predict the climate while also believing in the prediction that if you double the amount of CO2 in the atmosphere nothing much will change.
Of course, if Mike has enough money to waste and is foolish enough to publish in bottom-feeding journals, then he surely shouldn’t expect that people will take any notice of what he is claiming.
We have almost doubled CO2 and nothing much has changed. No prediction necessary. Lol.
Do you have any evidence that CO2 in the atmosphere does ANYTHING except enhance plant growth.
No fantasy conjectures…. they are from the realm of fairy-tales.
You obviously have enough time to waste making gormlessly stupid comments…
… and no brain to realise that is what you are doing.
You surely don’t expect anyone to take any notice of any of the gibberish that a sewer-dweller like you types, do you !!
And of course, you are TOTALLY INCAPABLE of countering anything said.
So you KNOW it is correct, but just don’t it.. boo-hoo !!
missed word…… but just don’t like it.
There is a lot that is missing from the paper. It is just a bunch of assertions without any rigorous proof. If you want to prove anything about a nonlinear system then you would need to start with the underlying equations to begin with. And since Mike does not do that — there is not a single equation in the whole paper — there is no way of proving anything that he asserts.
For starters there is no evidence that the climate is chaotic, and in fact there is a lot of evidence that it isn’t. The very fact that it makes sense to talk about things like a Mediterranean climate shows that the climate in that region has been stable and hasn’t varied significantly for hundreds of years. Similarly, global temperature reconstructions show that the temperature has not varied by more than a degree over the last couple of thousand years — again suggesting that the climate is stable and not chaotic.
Now even if the climate is chaotic, that does not mean that you can’t say anything meaningful or useful about its long term behaviour. For starters, chaotic systems often have a strange attractor, which means that no matter where you start you always end up on the attractor and thus the long term dynamics are known. So you can predict with confidence the average behaviour of the system, since that is the same no matter where you start, even if you cannot predict where you will be on the attractor. In the Lorenz system, for example, trajectories oscillate around known fixed points and so the average behaviour can be predicted both analytically and numerically.
Then there is the question of the size of the chaotic attractor. It is certainly possible that the climate might be chaotic but only in a very small region of phase space, smaller than what can be measured, while a numerical approximation to that chaotic attractor would end up in the same region of phase space but not at the same point, meaning that it would be possible to predict the climate within the bounds of error.
Thanks for telling everyone that ..
THE CLIMATE IS STABLE
And THERE IS NO CLIMATE EMERGENCY.
Apart from that, the rest was just more empty gibberish, pertaining to nothing.
Basically just WASTED SPACE.
So if you think that the climate is stable then you must also think that Mike is wrong when he claims that it is chaotic. It can’t be both.
And the fact that the climate is stable if there are no large perturbations applied does not mean that it will be unchanged if it is perturbed.
Stupid argument as usual
Climate is stable within bounds
Within those bounds it is chaotic.
e.g. the double pendulum problem is totally unpredictable, except that you know it can only act within certain bounds.
So yes, you can have both.
The rest of your comment is just more gibberish.
Again you are agreeing with me. I stated clearly above that an important issue is the size of the chaotic attractor. If the climate is stable within bounds then it could be the case that the bounds are small enough so that the system is predictable.
You are an idiot..
You said absolutely NOTHING of any importance to anything
“then it could be the case that the bounds are small enough so that the system is predictable”
BUT IT ISN’T.. Stop your moronic and empty fantasy musings..
You are the one that said the climate was stable..
… so you must agree there is absolutely ZERO climate emergency…
And all the “climate science” mumbo-gumbo about CO2 is just a total fantasy…
Well done… you have just destroyed your whole reason for making moronic comment.
Reported to the Church Nonbirthing People.
Mike is simply quoting what they/them wrote.
Better check your social credit score, Izaak, before they take your internet away – or worse.
“For starters there is no evidence that the climate is chaotic”
There is no evidence CO2 causes atmospheric warming, either.
But that hasn’t slowed your idiotic chicken-little yapping
Noted that yet again, you failed utterly to answer the question…
Do you have any evidence that CO2 in the atmosphere does ANYTHING except enhance plant growth.
Izaak Walton says “For starters there is no evidence that the climate is chaotic”. Not true. Firstly, I don’t have to show that the climate is chaotic, I only have to show that the models are chaotic. The paper references Kay, which shows massive instability – a change in initial conditions of less than a trillionth of a degree resulted in regional temperatures differing by more than 5 degrees. That is indeed a chaotic model.
“The paper is based on Chaos Theory, of course”
Well, I read the paper, and did not see any Chaos Theory expounded. Not even any differential equations, which is where the theory comes from.
Lorenz was mentioned. Most people know of his butterfly, which is actually the solution of a non-linear three-equation set. And yes, for a given trajectory you can’t predict far ahead exactly where you will be. The trajectories diverge, which impresses Mike greatly.
But you can predict that you will be on the butterfly. Weather is that unpredictable point, but the butterfly is the climate. The equation has three parameters; Lorenz chose an arbitrary set which produces the pictures you see. But a small change to those parameters will produce a different butterfly (climate). You can predict that, but not the weather.
Fluid dynamics, with CFD, routinely deals with turbulence (chaos). You can’t predict where points of fluid will go, and in fact they get thoroughly mixed. But you can usefully predict the behaviour of the fluid. This is major engineering.
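To make the butterfly point concrete, here is a minimal R sketch of the Lorenz 1963 system with its standard parameters: two runs that start one part in a billion apart diverge point by point, yet their long-run averages over the attractor come out almost the same. Whether such a long-run average is a useful analogue of “climate” is, of course, exactly what is in dispute in this thread.

# Lorenz 1963 system, integrated with a simple fixed-step RK4 scheme.
# Standard parameters sigma = 10, rho = 28, beta = 8/3.
lorenz_rhs <- function(s, sigma = 10, rho = 28, beta = 8/3) {
  c(sigma * (s[2] - s[1]),
    s[1] * (rho - s[3]) - s[2],
    s[1] * s[2] - beta * s[3])
}

rk4_run <- function(s, n_steps, h = 0.01) {
  out <- matrix(NA_real_, nrow = n_steps, ncol = 3)
  for (i in seq_len(n_steps)) {
    k1 <- lorenz_rhs(s)
    k2 <- lorenz_rhs(s + h/2 * k1)
    k3 <- lorenz_rhs(s + h/2 * k2)
    k4 <- lorenz_rhs(s + h   * k3)
    s  <- s + h/6 * (k1 + 2*k2 + 2*k3 + k4)
    out[i, ] <- s
  }
  out
}

a <- rk4_run(c(1, 1, 1),        n_steps = 50000)   # baseline run
b <- rk4_run(c(1, 1, 1 + 1e-9), n_steps = 50000)   # perturbed by one part in a billion

# Point by point the runs end up completely different ("weather")...
print(tail(abs(a[, 1] - b[, 1]), 1))
# ...but averages over the attractor are nearly identical ("the butterfly").
print(c(mean_z_a = mean(a[, 3]), mean_z_b = mean(b[, 3])))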
You seem to be almost totally clueless about fluid dynamics and engineering… as usual.
What you have just said is a bunch of gibberish…
You cannot predict climate .. period. !
“But you can predict that you will be on the butterfly.”
NO, you cannot.. best you can do is create a totally imaginary “butterfly.”..
And climate is NOT a butterfly… probably one of the most absurdly ludicrous and gormless comments even you have ever made.
Put a butterfly in any decent turbulence, and it will get mangled. (as often happens with wind turbines.)
“But a small change to those parameters will produce a different… blah blah..”
Yes, and climate models have many garbage parameters..
ie GIGO !! dependent on random guesses at garbage parameters.
Nick can’t distinguish between a “result”, i.e. a state of being, and a “cause”, the factors that generate the state of being.
You can’t predict the future state of being without knowing the causes in detail.
from wikipedia: “Small differences in initial conditions, such as those due to errors in measurements or due to rounding errors in numerical computation, can yield widely diverging outcomes for such dynamical systems, rendering long-term prediction of their behavior impossible in general.[7] This can happen even though these systems are deterministic, meaning that their future behavior follows a unique evolution[8] and is fully determined by their initial conditions, with no random elements involved.[9] In other words, the deterministic nature of these systems does not make them predictable.[10][11] This behavior is known as deterministic chaos, or simply chaos. The theory was summarized by Edward Lorenz as:[12]”
Fluid flow problems, as mentioned by Nick, have defined boundaries with initial conditions that can be defined with small uncertainties. This simply doesn’t apply to the climate.
Lorenz said: “The phenomenon that a small alteration in the state of a dynamical system will cause subsequent states to differ greatly from the states that would have followed without the alteration.”
Climate is perpetually exposed to small alterations to its current state. It is those small alterations that are unknowable and yet determine future climate.
You keep mentioning THE climate Nick.
Which one are we discussing?
(there’s heaps of them)
Do you even abstract concepts, Mr. Puppet?
What???
I think…. Willard mis-posted that non-sequitur… 😏
You keep using that expression, dear Janice.
It may not mean what you make it mean.
No need to play dumb, Mr. Puppet.
In “weather is that unpredictable point, but the butterfly is the climate,” the expression “the climate” refers to an abstract entity. That entity helps build theories about specific climates, like the Earth’s climate.
Nick’s claim works in general.
Wrong, as usual. All one can say is that the state of the system is constrained. The behaviour of the system within those constraints is completely unpredictable.
Every time I fire my kettle I think about that, Cat.
*Sips tea.*
In case you need to be told this, Willard:
the coupled atmosphere-ocean system of the Earth is not a closed system like, say, for instance, a tea kettle.
There is no need to tell me any irrelevancy, my dear Janice. More so when it’s false: imagine if the kettle was truly closed! The point you try to dodge is fairly simple: no need to predict the various atom configurations to know that it will boil.
Even you got to admit that there’s a certain elegance in refuting Cat’s silly argument with a phenomenon every contrarian could observe every day, don’t you think?
“There is no need to tell me any irrelevancy”
That is all your posts ever have, Dullard !!
Irrelevant gibberish.
No U, Mr. Nice.
Mr Dullard: Is the assertion true or false?
If I gave you a pendulum with a second pendulum attached underneath (the simplest chaotic system), would you be able to predict its state after more than a few seconds?
Oh, Cat. That’s so nice of you to presume that GCMs are analytical tools, missing the whole point of building them along the way.
You’re about to rediscover the very first lessons of an introduction to Earth Sciences. Do continue!
Mr Dullard: OK, let’s use your analogy. Predict where and when the first bubble of steam will occur on the base of your kettle.
*sips tea*
A “bubble of steam,” Cat – is that the *system* you want to predict?
You have quite a vivid imagination!
The paper references Kay, which shows massive instability – a change in initial conditions of less than a trillionth of a degree resulted in regional temperatures differing by more than 5 degrees. That is indeed a chaotic model. I didn’t need any formulae.
Climate is a RESULT, it is a state that results from multiple *causes*. Climate is *NOT* a cause. The butterfly is a CAUSE, not a RESULT.
The state of a flowing fluid is also a state, it is not a cause. You can only predict the behaviour of a fluid by first defining the causes that *result* in the state.
Until you can distinguish between a state of being, a RESULT, and the factors causing the state of being all you have to offer is gibberish.
Well, actually, the “butterfly effect” comes from a phrase used for effect in a talk Lorenz gave. See https://www.bbvaopenmind.com/en/science/leading-figures/when-lorenz-discovered-the-butterfly-effect/ or https://en.wikipedia.org/wiki/Butterfly_effect
The butterfly plot is one of the pretty pictures which can be produced from some paths around some strange attractors.
Nice paper, Mr. Jonas. I would take issue with only one of your statements, and that is that “…there is no reason to suppose that these GCMs will vary randomly from the correct result.”
You noted that Kay et al had varied an initial condition in a GCM run by 1 trillionth of a degree C, and that model run differed substantially from the baseline. I submit that two GCM runs having identical initial conditions, with no change in the model and running on the same machine, have some probability of differing substantially. Some number of such runs would eventually produce a completely different result from all the rest.
My reasoning lies in the nature of computers. They’re “deterministic” in the eyes of the average user, so that the same program using the same inputs should always produce the same results. But in fact, they are subject to bit errors having a number of causes. Computers used in space (my area of expertise) are plagued by “single event upset”, where a cosmic ray will change the state of a bit in memory or in transit on a bus, causing a system crash. Radiation hardened processors and old-fashioned core memory are used in critical space computer applications. Highly developed error detection and correction algorithms (beyond a simple parity bit) are also part and parcel of modern computers, but they have their limitations. And Earth-bound computers are not immune to single event upset.
A GCM has to be run on a very large computer. The larger it is, the more vulnerable the hardware, and the higher the bit error rate. Error detection and correction can handle most of them, but at least one study has shown that it cannot handle them all. I think this is especially true for a computer running a 100 year climate simulation, where the computer has to be blazingly fast, and thus can’t afford the full-blown error detection and correction overhead theoretically available. Since a trillionth of a degree difference in an initial condition can cause a divergence from a baseline case, it is reasonable to think that an undetected bit error somewhere during the billions of calculations in a GCM run would cause an output that was truly random when compared to other runs on the same machine.
It would be an easy thing to test. No one I know of has shown any interest in doing so.
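Short of instrumenting a real GCM, the proposed mechanism is easy to see in miniature: in any chaotic iteration, perturbing one number by a single unit in the last place is enough to produce a completely different run. A minimal R sketch using the logistic map as a stand-in (emphatically not a GCM):

# A one-bit-scale perturbation amplified by a chaotic iteration.
# The logistic map is a stand-in for "billions of chaotic calculations";
# a real GCM is vastly more complex but shares the sensitivity.
logistic_run <- function(x0, n = 200, r = 4) {
  x <- numeric(n)
  x[1] <- x0
  for (i in 2:n) x[i] <- r * x[i - 1] * (1 - x[i - 1])
  x
}

x0   <- 0.123456789
ulp  <- .Machine$double.eps * x0           # roughly one unit in the last place
runA <- logistic_run(x0)
runB <- logistic_run(x0 + ulp)             # "bit error" in the initial state

# Find the first step at which the two runs differ by more than 0.1
first_diverge <- which(abs(runA - runB) > 0.1)[1]
print(first_diverge)                        # typically well under 100 iterations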
“Error detection and correction “
How do you find any real error if you have a pre-conceived result . !
Correction always depends on what the programmer wants to correct …
re Varying non-randomly from the correct result: This is about a lot more than the Kay finding re a trillionth of a degree or a bit change in a computer. It’s about the things that go into the models introducing biases, ie, making the errors non-random. And also the biases that come from things being left out of the models.
I think you missed my point. The model biases in any given collection of runs make the models give results that differ from model to model. Since the differences are biases, averaging doesn’t produce a “better” result. In between ensemble averages, the various modellers may tweak their parameters and initial conditions, and if the code hasn’t changed, it will still give results differing from the previous run. In some sense, then, each successive ensemble average might be considered to have randomness in addition to the model biases. My hypothesis adds actual randomness to successive runs of the same code, on the same machine, with the same parameters and the same initial conditions. If it is investigated and found to be significant, then each model would have to be run several times, with the final result being an average of all of the run results. That would be legitimate, though it would still leave the model biases which don’t average out from model to model.
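The distinction being drawn here — run-to-run scatter averages away, per-model bias does not — is easy to see in a toy calculation. A minimal R sketch with entirely made-up numbers:

# Toy illustration: averaging removes random run-to-run noise, but it
# cannot remove per-model bias. All numbers are invented for illustration.
set.seed(99)
true_value <- 1.0                      # the quantity the ensemble is trying to estimate
n_models   <- 20
runs_each  <- 10

model_bias <- rnorm(n_models, mean = 0.5, sd = 0.3)   # each model's fixed bias

# Each model's runs scatter randomly around (true value + its own bias)
run_results <- sapply(model_bias, function(b)
  true_value + b + rnorm(runs_each, mean = 0, sd = 0.2))

model_means   <- colMeans(run_results)   # averaging runs removes the 0.2 noise...
ensemble_mean <- mean(model_means)       # ...but the ensemble mean keeps the bias

print(c(truth = true_value, ensemble = ensemble_mean))
# The gap between the two is essentially the average model bias (~0.5 here).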
Just to give you a flavour of the journal that this paper is published in have a look at
the article entitled “Conceptual study of vatakantaka” which includes the paragraph
“Vatakantaka is mainly vyadhi of vitiated vata dosha. Acharya charaka has mentioned it under Vatavyadhi chikitsa adhyaya. The local Vayu enraged by making a false step on an uneven ground, finds lodgment in the region of the ankle (Khudaka, instep according to others), thus giving rise to a disease which is called Vata Kantaka5. Treatment explained in Ayurveda for Vatakantaka are Bahya and Abhyantara Chikitsa such as Snehana, Upanaha, Agnikarma, Raktavsechana, Bandhana Suchikarma(Viddhakarma) and Abhyantartaha Erandsnehapaana.”
So I am guessing that the sort of person who believes that foot pain is caused by the actions of Hindu gods are the sort of people that Mike is hoping to convince with this paper.
Izzydumborwhat obviously can’t argue against the actual contents of Mike’s paper.!
His pathetic non-arguments convince nobody of anything, except that he is empty of content.
We have all seen some of the absolute dross that is published in propaganda rags like Nature Climate etc. !
Seems Izzy has a follower who is just as incredibly ignorant as he is.
Is that you Luser, ???
And of course Izzydumb has zero comprehension of what he has just posted about
“Walking in irregular or uneven ground, the structural deformity of foot or excessive strain on foot lead to provocation of vata and brings about severe pain in ankle joint which is referred to as Vatakantaka.”
Vatakantaka => Calcaneal Spur in western medicine.
Nothing to do with Hindu gods or anything.
(PDF) Conceptual study of vatakantaka (researchgate.net)
Way to make an ABJECT FOOL of yourself, Izzy!!
Here is a similar paper, a bit easier to understand.. remember, English is not their first language.
Management of Calcaneal Spur _Vatakantak_ – A Case Study (ijtsrd.com)
Poor pathetic Izzydumb.
Caught out being a complete moron…
Best it can manage is a red thumb.
How do you live in that tiny little mind of yours ?
Ayurveda, traditional system of Indian medicine.
Ayurvedic medicine is an example of a well-organized system of traditional health care, both preventive and curative, that is widely practiced in parts of Asia.
Ayurveda has a long tradition behind it, having originated in India perhaps as much as 3,000 years ago.
Today it remains a favoured form of health care in large parts of the Eastern world, especially in India, where a large percentage of the population uses this system exclusively or combined with modern medicine.
What is being described in this paper, in a journal published in India, is a Traditional Indian treatment regime for a very well-known and painful heel problem.
—-
Poor Izzy.. maybe a bit of basic education would stop him making idiotic comments. !
But probably not. !
Talking to yourself again, Mr. Nice?
Don’t forget your meds.
Well done, B.! 🙂 I am very grateful for all your clearly well-informed, accurate, powerful, comments — often, I am too busy (or lazy) to write responses and how WONDERFUL it is that you stalwartly stand up and fire off riposte after riposte.
WUWT has dwindled (due to causes I won’t elaborate on, here) to a shadow of its former self. A few years ago, you would have been joined by several others. Now, you are often left alone to defend data-driven science.
Again, well done!
(((APPLAUSE!)))
P.S. Your comments are needed to prevent the trolls from confusing/misleading readers. Sometimes, though, as with “Willard,” their nonsense does not deserve to be dignified with a response. So, good for you to IGNORE when you choose, also. 🙂
Thanks Janice 🙂
I could have gone further into what some of the other Indian terms mean, but just the basics was more than enough to show Izzy for the ignorant clown he really is.
Isn’t that nice, Mr. Nice?
You got a cheerleader!
Luser !!
Hey, Mr. Nice. I need to ask –
What was your favorite name among all your previous sock puppets?
Thanks for admitting I was totally correct about the ignorance of your comrade Izzydumb. !
Unable to counter a word I said.
Sorry, Mr. Nice. I usually do not pay much attention to what sock puppets proffer. What did you say, and how much are you willing to pay to be proofread?
At what point during the spending of tens of billions on ‘GCM model development’ over the past 34 years should the skeptical questions be posed, namely: