Since there has been a lot of discussion about Monckton here and elsewhere, I’ve offered him the opportunity to present his views here. – Anthony
Guest post by Christopher Monckton of Brenchley
At www.scienceandpublicpolicy.org I publish a widely-circulated and vigorously-debated Monthly CO2 Report, including graphs showing changes in CO2 concentration and in global mean surface temperature since 1980, when the satellites went on weather watch and the NOAA first published its global CO2 concentration series. Since some commenters here at Wattsup have queried some of our findings, I have asked Anthony to allow me to contribute this short discussion.
We were among the first to show that CO2 concentration is not rising at the fast, exponential rate that current anthropogenic emissions would lead the IPCC to expect, and that global temperature has scarcely changed since the turn of the millennium on 1 January 2001.
CO2 concentration: On emissions reduction, the international community has talked the talk, but – not least because China, India, Indonesia, Russia, Brazil, and South Africa are growing so quickly – it has not walked the walk. Accordingly, carbon emissions are at the high end of the IPCC’s projections, close to the A2 (“business as usual”) emissions scenario, which projects that atmospheric CO2 will grow at an exponential rate between now and 2100 in the absence of global cuts in emissions:
Exponential increase in CO2 concentration from 2000-2100 is projected by the IPCC on its A2 emissions scenario, which comes closest to today’s CO2 emissions. On the SPPI CO2-concentration graph, this projection is implemented by way of an exponential function that generates the projection zone. This IPCC graph has been enlarged, its ordinate and abscissa labeled, and its aspect ratio altered to provide a comparison with the landscape format of the SPPI graph.
On the A2 emissions scenario, the IPCC foresees CO2 rising from a measured 368 ppmv in 2000 (NOAA global CO2 dataset) to a projected 836[730, 1020] ppmv by 2100. However, reality is not obliging. The rate of increase in CO2 concentration has been slowing in recent years: an exponential curve cannot behave thus. In fact, the NOAA's deseasonalized CO2 concentration curve is very close to linear:
CO2 concentration change from 2000-2010 (upper panel) and projected to 2100 (lower panel). The least-squares linear-regression trend on the data shows CO2 concentration rising to just 570 ppmv by 2100, well below the IPCC’s least estimate of 730 ppmv on the A2 emissions scenario.
The IPCC projection zone on the SPPI graphs has its origin at the left-hand end of the linear-regression trend on the NOAA data, and the exponential curves are calculated from that point so that they reach the IPCC’s projected concentrations in 2100.
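The construction just described reduces to a one-line calculation. As a minimal sketch (assuming only the figures given in this post: 368 ppmv in 2000 and the IPCC's A2 range of 836 [730, 1020] ppmv for 2100; the function name is mine), the exponential curves of the projection zone can be generated thus:

```python
import math

C0, YEAR0 = 368.0, 2000   # NOAA global CO2 concentration in 2000 (ppmv)

def exponential_curve(c_2100, year):
    """CO2 (ppmv) in a given year on an exponential path that starts at
    C0 in YEAR0 and reaches c_2100 in 2100."""
    k = math.log(c_2100 / C0) / (2100 - YEAR0)   # continuous growth rate per year
    return C0 * math.exp(k * (year - YEAR0))

# The three curves bounding the projection zone: IPCC A2 central and range.
for target in (836.0, 730.0, 1020.0):
    print(target, "->", round(exponential_curve(target, 2050), 1))
```

Each curve starts at 368 ppmv in 2000 and reaches its 2100 target exactly, which is all the projection-zone construction requires.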
We present the graph thus to show the crucial point: that the CO2 concentration trend is well below the least IPCC estimate. Some have criticized our approach on the ground that over a short enough distance a linear and an exponential trend may be near-coincident. This objection is more theoretical than real.
First, the fit of the dark-blue deseasonalized NOAA data to the underlying linear-regression trend line (light blue) is very much closer than it is even to the IPCC’s least projection on scenario A2. If CO2 were now in fact rising at a merely linear rate, and if that rate were to continue, concentration would reach only 570 ppmv by 2100.
Secondly, the exponential curve most closely fitting the NOAA data would be barely supra-linear, reaching just 614 ppmv by 2100, rather than the linear 570 ppmv. In practice, the substantial shortfall between prediction and outturn is important, as we now demonstrate. The equation for the IPCC’s central estimate of equilibrium warming from a given rise in CO2 concentration is:
∆T = 4.7 ln(C/C0),
where the bracketed term represents a proportionate increase in CO2 concentration. Thus, at CO2 doubling, the IPCC would expect 4.7 ln 2 = 3.26 K warming – or around 5.9 F° (IPCC, 2007, ch.10, p.798, box 10.2). On the A2 scenario, CO2 is projected to increase by more than double: equilibrium warming would be 3.86 K, and transient warming would be <0.5 K less, at 3.4 K.
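The arithmetic in the preceding paragraph can be checked directly. A minimal sketch, using only the equation and figures above (the function name is mine):

```python
import math

def equilibrium_warming(c_ratio):
    """IPCC central-estimate equilibrium warming (K): Delta-T = 4.7 ln(C/C0)."""
    return 4.7 * math.log(c_ratio)

doubling_k = equilibrium_warming(2.0)        # ~3.26 K at CO2 doubling
doubling_f = doubling_k * 9.0 / 5.0          # ~5.9 Fahrenheit degrees
a2_k = equilibrium_warming(836.0 / 368.0)    # ~3.86 K on the A2 scenario
print(round(doubling_k, 2), round(doubling_f, 1), round(a2_k, 2))
```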
But if we were to take the best-fit exponential trend on the CO2 data over the past decade, equilibrium warming from 2000-2100 would be 4.7 ln(614/368) = 2.41 K, comfortably below the IPCC's least estimate and a hefty 26% below its central estimate. The IPCC, then, appears to overestimate the growth in CO2 concentration. Moreover, applying the IPCC's methods for determining climate sensitivity to the observed increases in the concentration of CO2 and five other climate-relevant greenhouse gases over the 55 years 1950-2005 would project a transient warming 2.3 times greater than the observed 0.65 K. Combining these two results, anthropogenic warming over the 21st century could be as little as 1 K (less than 2 F°), which would be harmless and beneficial.
Temperature: How, then, has observed, real-world global temperature responded?
The UAH satellite temperature record shows warming at a rate equivalent to 1.4 K/century over the past 30 years. However, the least-squares linear-regression trend is well below the lower bound of the IPCC projection zone.
The SPPI’s graph of the University of Alabama at Huntsville’s monthly global-temperature anomalies over the 30 years since 1 January 1980 shows warming at a rate equivalent to 1.4 K/century – almost double the rate for the 20th-century as a whole. However, most of the warming was attributable to a naturally-occurring reduction in cloud cover that allowed some 2.6 Watts per square meter of additional solar radiance to reach the Earth’s surface between 1981 and 2003 (Pinker et al., 2005; Wild et al., 2006; Boston, 2010, personal communication).
Even with this natural warming, the least-squares linear-regression trend on the UAH monthly global mean surface temperature anomalies is below the lower bound of the IPCC projection zone.
Some have said that the IPCC projection zone on our graphs should show exactly the values that the IPCC actually projects for the A2 scenario. However, as will soon become apparent, the IPCC’s “global-warming” projections for the early part of the present century appear to have been, in effect, artificially detuned to conform more closely to observation. In compiling our graphs, we decided not merely to accept the IPCC’s projections as being a true representation of the warming that using the IPCC’s own methods for determining climate sensitivity would lead us to expect, but to establish just how much warming the use of the IPCC’s methods would predict, and to take that warming as the basis for the definition of the IPCC projection zone.
Let us illustrate the problem with a concrete example. On the A2 scenario, the IPCC projects a warming of 0.2 K/decade for 2000-2020. However, given the IPCC’s projection that CO2 concentration will grow exponentially from 368 ppmv in 2000 towards 836 ppmv by 2100, CO2 should have been 368e(10/100) ln(836/368) = 399.5 ppmv in 2010, and equilibrium warming should thus have been 4.7 ln(399.5/368) = 0.39 K, which we reduce by one-fifth to yield transient warming of 0.31 K, more than half as much again as the IPCC’s 0.2 K. Of course, CO2 concentration in 2010 was only 388 ppmv, and, as the SPPI’s temperature graph shows (this time using the RSS satellite dataset), warming occurred at only 0.3 K/century: about a tenth of the transient warming that use of the IPCC’s methods would lead us to expect.
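For readers wishing to verify this worked example, the chain of figures (399.5 ppmv, 0.39 K equilibrium, 0.31 K transient) follows from the quantities already given. A minimal sketch:

```python
import math

C0 = 368.0                                          # ppmv in 2000
c_2010 = C0 * math.exp(0.1 * math.log(836.0 / C0))  # one-tenth of the way along the exponential path
eq_warming = 4.7 * math.log(c_2010 / C0)            # equilibrium warming since 2000 (K)
transient = eq_warming * 0.8                        # reduced by one-fifth for transient warming
print(round(c_2010, 1), round(eq_warming, 2), round(transient, 2))
```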
Barely significant warming: The RSS satellite data for the first decade of the 21st century show only a tenth of the warming that use of the IPCC’s methods would lead us to expect.
We make no apology, therefore, for labelling as “IPCC” a projection zone that is calculated on the basis of the methods described by the IPCC itself. Our intention in publishing these graphs is to provide a visual illustration of the extent to which the methods relied upon by the IPCC itself in determining climate sensitivity are reliable.
Some have also criticized us for displaying temperature records for as short a period as a decade. However, every month we also display the full 30-year satellite record, so as to place the current millennium’s temperature record in its proper context. And our detractors were somehow strangely silent when, not long ago, a US agency issued a statement that the past 13 months had been the warmest in the instrumental record, and drew inappropriate conclusions from it about catastrophic “global warming”.
We have made one adjustment to please our critics: the IPCC projection zone in the SPPI temperature graphs now shows transient rather than equilibrium warming.
One should not ignore the elephant in the room. Our CO2 graph shows one elephant: the failure of CO2 concentration over the past decade to follow the high trajectory projected by the IPCC on the basis of global emissions similar to today’s. As far as we can discover, no one but SPPI has pointed out this phenomenon. Our temperature graph shows another elephant: the 30-year warming trend – long enough to matter – is again well below what the IPCC’s methods would project. If either situation changes, followers of our monthly graphs will be among the first to know. As they say at Fox News, “We report: you decide.”
Phil’s Dad says:
August 14, 2010 at 5:55 pm
To paraphrase James Sexton (August 14, 2010 at 12:59 pm) and CRS, Dr.P.H. (August 14, 2010 at 1:19 pm) on GeoFlynx (August 14, 2010 at 11:40 am)
a) The precautionary principle does not trump the scientific method.
b) Notwithstanding a), any proposed action in mitigation of the hypothecated CAGW must itself pass the precautionary test.
The burden, Mr Flynx, remains yours.
——-
Well spoken, sir! You paraphrased me nicely!
Don't forget the admonition of Hippocrates to "first, do no harm." (This was attributed to him, but it never actually appears in the Hippocratic Oath; see:
http://www.nlm.nih.gov/hmd/greek/greek_oath.html )
The AGW crowd believes “first, do something/anything to reduce reliance upon fossil fuels, and worry about the economic consequences, scientific proof, and impact upon society later.”
James Sexton says:
August 14, 2010 at 9:16 am
duckster says:
August 14, 2010 at 7:34 am
So I can assume your failure to mention evidence of past CO2 levels much higher than today isn’t an example of “cherry-picking”, but rather an oversight? Believe it or not, CO2 levels existed prior to 1960. I know, it’s strange, but true nonetheless.
http://www.anenglishmanscastle.com/180_years_accurate_Co2_Chemical_Methods.pdf and
At bottom of PDF file:
Quote: This is an unofficial extract of E-G Beck’s comprehensive draft paper and is for discussion not citing
Mods, would you please correct my link in the preceding post? An extra ) appears at the end of the link, which should be:
http://www.nlm.nih.gov/hmd/greek/greek_oath.html
Cheers & have a happy weekend!
Lord Monckton, thanks for your contributions to this conversation!
[SNIP. This mod is tired of the ad-homs. Comment on the science. ~dbs]
Well, this has been a nice and illuminating discussion.
thechuckr says:
August 14, 2010 at 2:34 pm
“Gone from Tamino also. Romm and Tamino are cowards.”
No shocker there. You didn’t really expect them to post that, did you? They have to call a group meeting and do a talking points brain storming session with the other chicken littles.
Speaking of cowards, I've got to say I'm incredibly disappointed that many of Christopher Monckton's critics have passed on an opportunity to directly challenge his assertions. Many times in the past, when WUWT runs an article about Monckton, they are all over the page whining about this or that, or stating he's been proven wrong about this or that. Yet today, when Viscount Christopher Monckton of Brenchley, Nobel Peace Prize recipient, former science adviser to Margaret Thatcher, gives them a chance for dialogue, proofs and counter-proofs, very few even bother to show.
Brad Beeson, Geoflynx, Bluegrue, and, of course, R. Gates were the only ones to directly ask for more clarity in, or directly challenge, his assertions (as far as I can tell). For them, well done; it isn't always easy to challenge or make statements you know will be met with certain disagreement by the majority of people posting thoughts here or anywhere.
I guess the rest really belong at RC or Romm’s blog where they can safely remain cheerleaders. I’ve heard of intellectual cowardice before, but never really saw it until today.
I did have one other disappointment today, though. I didn’t get my wish, nor even a response to one of my prior posts which I was sure would elicit something. I thought it a hoot, I didn’t expect Christopher to actually mention it here, but it would have been fun.
http://wattsupwiththat.com/2010/08/14/monckton-why-current-trends-are-not-alarming/#comment-456847
Off to finish reading about a terrible mishap on an ice rink. It seems someone went berserk, took a man’s hockey stick from him and slapped him so hard with it the hockey stick broke.
Cheers.
Monckton of Brenchley,
Thank you for posting this interesting analysis and also for responding to comments.
You have already acknowledged and partially addressed what is one of my main criticisms of your analysis; namely, that you are displaying and analysing temperature records for as short a period as a decade. Your defence appears to have been twofold: 1) you also published a graph that doesn’t suffer from this alleged defect, and 2) somebody else made a statement involving an even shorter time period.
However, I would suggest to you that citing other graphs or somebody else’s statements does not excuse the shortcomings of that particular graph. The latter defence is anyway like comparing apples to oranges, as that other person’s statement did not involve your practice of calculating a warming rate from the temperature data.
It is the calculation of warming rates over too short a time period that is the real issue for me. If the time period is too short then shorter term temperature fluctuations and analysis uncertainties can drown out longer term climate trends. This means the value obtained for the warming rate has some randomness and that it also depends greatly on the specific start and end dates. In short, a rate analysis done over too short a time period is not robust. Presenting analyses that are not robust is not good scientific practice.
This lack of robustness can be illustrated by doing a rate analysis over a time period even shorter than the decade period of your analysis, as follows.
According to the widely quoted analysis of NASA’s GISS, the average global surface temperature in calendar year 2008 was 0.44 degrees C above the average for the 1951-1980 base period, while that for 2009 was 0.57 degrees above that base. The difference between the two most recent calendar years, i.e. 0.13 degrees C over one year, is readily seen to correspond to a warming rate of 13 degrees C per century when expressed in the units you use.
You continue to perform temperature rate analyses over shorter periods than are recommended as good practice. Given that, what would be your objection, if any, to somebody presenting the above analysis over only one year as evidence that global temperatures are in fact rising at the rate of 13 degrees C per century and, hence, the IPCC has gravely underestimated the rate of temperature rise?
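The robustness point can be illustrated with synthetic data: assume a true underlying trend of 1.4 K/century plus random month-to-month scatter (the noise level here is an illustrative assumption, not taken from any dataset), then fit least-squares trends over windows of different lengths:

```python
import random

random.seed(0)
TREND_PER_MONTH = 1.4 / 1200      # 1.4 K/century expressed per month
NOISE_SD = 0.1                    # illustrative month-to-month anomaly scatter (K)

def monthly_anomalies(n_months):
    """Synthetic anomalies: linear trend plus Gaussian noise."""
    return [TREND_PER_MONTH * m + random.gauss(0.0, NOISE_SD) for m in range(n_months)]

def ols_slope(y):
    """Least-squares slope of y against 0..n-1."""
    n = len(y)
    xbar, ybar = (n - 1) / 2, sum(y) / n
    num = sum((i - xbar) * (v - ybar) for i, v in enumerate(y))
    den = sum((i - xbar) ** 2 for i in range(n))
    return num / den

data = monthly_anomalies(360)     # 30 years of monthly data
for years in (1, 10, 30):
    slope = ols_slope(data[-years * 12:]) * 1200   # K per century
    print(f"{years:2d}-year window: {slope:+.1f} K/century")
```

The one-year window returns an almost arbitrary rate, while the longer windows converge on the true 1.4 K/century, which is exactly the lack of robustness described above.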
I am peeved that James Sexton didn’t include me among the Lord’s critics.
Anyway, another try. It was conceded that the current CO2 increase rate was 200 ppm/century, and also the current temperature trend was 1.4 C/century. These add up to a feedback factor of 2.5 when you do the calculations. Of course, Monckton realized he was conceding something like this and attributed the warming to a “natural” cloud variation. However, the generally accepted reason for the brightening is the “manmade” aerosol reduction that went with various national clean-air acts, together with a lack of recent large volcanoes. Nobody has shown a scientific basis for how natural cloud variation can account for something of this magnitude.
So you might think, a-ha, the warming is because aerosols are still decreasing. Well, take that to its limit of reducing to pre-industrial aerosols, is the temperature returning to those levels? No, it is going to be warmer. Why? CO2, I would suggest.
Jim D says:
“Nobody has shown a scientific basis for how natural cloud variation can account for something of this magnitude.”
We don’t really know that much about what drives the climate. But we’re learning. Prof Richard Lindzen has a peer reviewed paper on the iris effect, in which clouds moderate temperature.
Check out the latest WUWT article. The McShane & Wyner paper states:
Natural climate variability is not well understood and is probably quite large.
You will learn that Michael Mann was just winging it, and got caught.
And if you learn something about the Scientific Method, you will begin to understand that the burden is on the alarmist contingent to provide a convincing argument for their
theory/hypothesis/conjecture. So far they have failed.

s. wing says at August 14, 2010 at 7:39 pm … basically that the time period is too short.
No, the time period is exactly right to provide information on recent events. And yes, that information may not be “robust” as you indicated. However, it is what it is and nothing more or less. Extending the time period, however, is also not robust if one is looking for indications of recent changes.
There is no right or wrong way to look at data. The key is to make sure one's conclusions do not go beyond the data. Your example did exactly that, whereas the presentation above clearly added the proper caveats.
@Fred H. Haynie
Extrapolating this model into the future suggests that CO2 at Mauna Loa will max out within 2 ppmv of 499 in March of 2091.
It's clever that you are able to get your model to show this. Wouldn't global atmospheric CO2 content be determined largely by what CO2 we are adding to it, and the ability of carbon sinks to absorb the difference? Otherwise it just seems like wishful thinking.
Bill Illis says:
August 14, 2010 at 5:49 pm
Now RealClimate says some model runs don’t have much warming in the last ten years so they accurately reflect the current 50% response trends (and somehow that says the models, as a whole, are therefore accurate). Well, that really says that the low temperature growth models/runs are the more accurate models so we should throw out the high temperature growth ones and focus on the low temperature growth ones. Or more accurately, we should be using the accurate almost-no-temperature-growth models.
They’re probably the same models, just different initial conditions. In any event, just because one model, or one model run, gets the last 10 years correctly is meaningless.
To use a gambling analogy … I could write two simulations of playing blackjack. One would use a good counting technique and the other would use only basic strategy. If I run these sims (models) several times using a random dealing algorithm, I could get exactly the same results over a short time period. However, over a long time period they will diverge significantly (one is a winner and the other a loser).
If one claims both programs accurately model a good counting technique then a short duration run cannot provide any verification one way or the other. This is so simple and obvious I find it hard to believe that anyone would claim climate models behave differently.
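The blackjack analogy can be sketched as two random walks with a small, opposite drift (a toy illustration of the point about run length, not a simulation of blackjack or of any climate model; the drift and noise values are arbitrary):

```python
import random

random.seed(1)

def run(drift, steps):
    """Cumulative sum of unit-variance noise plus a small systematic drift."""
    total, path = 0.0, []
    for _ in range(steps):
        total += drift + random.gauss(0.0, 1.0)
        path.append(total)
    return path

winner = run(+0.05, 10_000)   # "good counting technique": small positive edge per hand
loser = run(-0.05, 10_000)    # "basic strategy": small negative edge per hand

# Over a short stretch the two bankrolls look alike; over a long run they diverge.
print("after 100 hands:", round(winner[99], 1), round(loser[99], 1))
print("after 10,000 hands:", round(winner[-1], 1), round(loser[-1], 1))
```

After 100 steps the noise dominates the drift, so the two paths are statistically indistinguishable; after 10,000 steps the drift dominates, which is the divergence the comment describes.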
You probably noticed by now that a reduction in cloud cover, if related to global warming, would be a positive feedback, opposite to what Lindzen and Willis have suggested. If I was them, I would be very troubled by that reduction, if true, and they might want to look into it.
Well and succinctly said! Thank you.
JER0ME says:
August 14, 2010 at 5:35 pm
PJP says:
August 14, 2010 at 10:26 am
RE: Feedback.
In a true closed-loop feedback system (I don't believe the climate is one), all positive-feedback systems are unstable (from an EE prof @ MTU). You have a defined input parameter, say speed or temp, to which the feedback is added and sent to the system. So in your house, if you set the thermostat at 70 and subtract (negative feedback) the measured temp, say 68, you send a value of 2 to the system and the heat turns on. When it gets to 70 you send a 0 and it shuts off. If the temp is 72, you send a value of -2 and the AC turns on. If you have positive feedback, the input is 70, the actual is 68, you send 138 to the system and it goes to high. As the temp rises, the signal to the system increases exponentially until it blows up (in a short period of time). Note: negative feedback systems can also be unstable.
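The thermostat description above can be turned into a toy simulation. This is a sketch of the control-loop point only (the gain values are arbitrary illustrations), not a model of any physical system:

```python
def simulate(gain, setpoint=70.0, start=68.0, steps=20):
    """Toy room: each step the actuator responds in proportion to the error.

    gain < 0 is negative feedback (the output opposes the error);
    gain > 0 is positive feedback (the output reinforces it)."""
    temp = start
    for _ in range(steps):
        error = temp - setpoint
        temp += gain * error
    return temp

print(simulate(gain=-0.5))   # negative feedback: converges toward the 70 setpoint
print(simulate(gain=+0.5))   # positive feedback: runs away from the setpoint
```

With negative gain the error shrinks geometrically each step; with positive gain it grows geometrically, which is the instability the comment describes.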
DL says:
August 14, 2010 at 8:49 pm
JER0ME says:
August 14, 2010 at 5:35 pm
PJP says:
August 14, 2010 at 10:26 am
RE: Feedback.
Yes, I think I missed that point. Closed loop is another thing altogether, and as you say, the climate does not seem to act as one.
Well yes, anyone familiar with interstitial condensation in the fabric of buildings would know that you can’t apply the simple radiative physics of CO2 to climate attenuation. Earth is a water planet. CO2 must always follow temperature, no need for any of this machination of minutia.
Last time I looked there was no polythene vapour barrier 10 feet above our heads covering the entire planet. Obviously climate “scientists” don’t live in houses and spend far too much time on field jollies in cheap single walled tents!
Bill Illis says:
August 14, 2010 at 10:40 am
CO2 is increasing at a slightly exponential rate. It is growing at 1.97 ppm per year and that rate is accelerating at 0.0017 ppm per year. So, next year it should increase at 1.987 ppm per year.
These rates have been fairly consistent for the last 60 years, but CO2 does increase slightly faster in warm (let's say El Nino) years (with a slight lag behind temperatures) and less fast in cooler years. The last time CO2 actually fell was in WW II, when it declined by about 2 ppm from 1940 to 1946.
++++++++++++++
I understand from the links provided by this discussion that the methods used to measure CO2 in the 1940s did not have a resolution of 2 ppm, so the claim seems unlikely. The resolution did not reach 1 ppm until 1964.
Further, chemical methods are consistent and show CO2 concentration during the period 1940-1946 was significantly higher than at present, to an accuracy of 1%.
It would be interesting to have your opinion on the decline in CO2 and its rate from 1942 to 1960 and whether or not you can detect a change in the present rate of increase as Monckton has shown here.
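For what it is worth, the rate-and-acceleration figures quoted above can be extrapolated directly. A sketch, taking the quoted acceleration as ppm per year per year and assuming a starting concentration of roughly 388 ppmv around 2010 (the figure used in the head post):

```python
def projected_co2(c0=388.0, rate=1.97, accel=0.0017, years=90):
    """CO2 (ppmv) after `years`: add the annual increment each year and
    let that increment itself grow by `accel` ppm/year each year."""
    c, r = c0, rate
    for _ in range(years):
        c += r
        r += accel
    return c

print(round(projected_co2(), 1))   # concentration around 2100 on these figures
```

On those figures the concentration reaches roughly 572 ppmv by 2100, which happens to sit close to the linear 570 ppmv extrapolation in the head post rather than the IPCC's A2 range.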
Jim D says:
August 14, 2010 at 7:53 pm
“I am peeved that James Sexton didn’t include me among the Lord’s critics.”
My sincerest apologies. Please note, that I’m just a hack poster on this blog. I’m real sorry I missed your critiques. Do come back to another article and I’ll try to get you proper credits.
” Jim D says:
August 14, 2010 at 8:25 pm
You probably noticed by now that a reduction in cloud cover, if related to global warming, would be a positive feedback, opposite to what Lindzen and Willis have suggested. If I was them, I would be very troubled by that reduction, if true, and they might want to look into it.”
Depends on whether the clouds are high or low. You might want to look into the newest book by Spencer.
Dear Lord Monckton,
You write:
quote
The SPPI’s graph of the University of Alabama at Huntsville’s monthly global-temperature anomalies over the 30 years since 1 January 1980 shows warming at a rate equivalent to 1.4 K/century – almost double the rate for the 20th-century as a whole. However, most of the warming was attributable to a naturally-occurring reduction in cloud cover that allowed some 2.6 Watts per square meter of additional solar radiance to reach the Earth’s surface between 1981 and 2003 (Pinker et al., 2005; Wild et al., 2006; Boston, 2010, personal communication).
unquote
Googling around the references, I came upon this:
“Consequently, a much more logical conclusion would be that the primary driver of the global warming of the 1990s was the large increase in global surface-level insolation.” Co2Science
We need to present an alternative theory which lessens the role of CO2 but still preserves the fact of 20th century warming, which accounts for cloud albedo changes and which explains the undoubted fact of carbon isotope variation. If it were also to explain plankton decrease, the collapse of the Newfoundland cod fishery and the paucity of European eels then that would be a bonus.
Might I commend to you the Kriegesmarine Hypothesis? Not only does it have wide-ranging explanatory power, it also suggests solutions which are practical, unlike the economy-killing prescriptions of the CO2 hypothesis.
The carbon advocates have their 19th century Arrhenius: we ocean surface pollution advocates can adopt Benjamin Franklin and his Clapham pond.
JF
Whilst I don't consider myself qualified to enter into the arcane mathematical arguments above, I am surprised that no one has made reference to the forthcoming paper by Ross McKitrick, Mark Strazicich, and Junsoo Lee, with the abstract:
“Total global carbon emission forecasts span such a wide range as to yield little guidance for policy. Global per capita emissions, by contrast, are well-constrained on both theoretical and empirical grounds. We find per capita emissions are trendless around a stationary mean of 1.15 tonnes, and analysis at the country-level indicates any nonstationary tendencies are cointegrated across nations. Gaussian, simulation and Bayesian methods all yield prediction intervals that imply the high emission scenarios currently in use by the Intergovernmental Panel on Climate Change are improbable. Hotelling resource price dynamics in a Ramsey growth model help explain these findings, by showing that income growth does not imply per capita emissions growth, while convergence implies declining average emissions. We conclude that greenhouse gas emission trajectories on the low end of the current forecast range are the most likely to be observed over the next 50 years.”
Their conclusion is that per capita emissions have been stable for some time and thus increases will follow population growth.
There are also these telling comments on the IPCC scenarios:
“In developing their forty SRES emission scenarios the IPCC used a qualitative “storyline” methodology where future possible socioeconomic states of the world were narrated. The required time-paths of consumption and output needed to reach the projected end-state were then inferred. The quality of economic analysis underpinning these storylines is difficult to gauge since they are not based on conventional growth theory or theoretical resource models. These scenarios are used as inputs to IPCC climate change simulations and directly influence the range of global warming predictions. This, in turn, has an impact on policy decisions (and media coverage) related to climate change, including debates over the Kyoto Protocol. The upper end of these forecasts has been the subject of considerable media and policy interest as well as some criticism. Among other things, the IPCC scenarios have been criticized for making international comparisons based on market exchange rates rather than purchasing power parities, which may bias emission estimates upward”
So we have:
1 Vindication of Chris's approach
2 IPCC models built on sand. By the way, they all tend to assume continuous growth, in some cases improbably so, which can only be delivered over the next 20 years, as a minimum, by fossil fuels.
3 Population growth is slowing and, as third world countries become richer, as the models forecast, the growth will continue to slow. So therefore will emissions.
4 All of the IPCC scenarios are based upon the false assumption that CO2 drives temperature/climate, for which the IPCC has adduced no evidence.
Of course, the West can lower its emissions by implementing the barmy policies of Huhne (UK climate change minister) and Obama, with the result that our economies are destroyed.
Cheers
Paul
R. Gates says:
August 14, 2010 at 12:53 pm
“http://tamino.files.wordpress.com/2010/04/mloco21.jpg
Which of course shows that there is an exponential growth rate”
It certainly does not. What it indicates is rather some sort of hyperbolic curve (one that starts off curving upwards like an exponential but straightens out to a linear relationship). It could also be the rising part of a sine curve.
While I admire Lord M’s relentless work and his command of the facts, I am frequently put off by the arguments he chooses to make, since they often fall somewhat short of bullet-proof.
In this case, the impression one is invited to take away from his post is that the carbon-dioxide increase has become less exponential, and he cites certain ten-year differences for that proposition. But other measures support just the opposite conclusion. When I compute the exponential trends from the data found here ftp://ftp.cmdl.noaa.gov/ccg/co2/trends/co2_annmean_mlo.txt, I get trends that start at less than 30% per century for the 1959-68 decade and rise to over 70% per century for the 2000-09 decade. I get increasing trends when I base the calculations on 15-, 20-, 25-, 30-, 35-, 40-, 45-, and 50-year intervals.
To me, this is at least as good a test of whether the trend is “decaying from exponentiality,” and to me it suggests more exponentiality (if there is such a thing as exponentiality) rather than less.
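The windowed trend calculation described above amounts to a log-linear least-squares fit. A sketch (with synthetic data standing in for the Mauna Loa file; the function name and the 0.53 %/year growth figure are illustrative assumptions):

```python
import math

def exp_trend_pct_per_century(years, ppmv):
    """Least-squares fit of ln(CO2) against year; returns the implied total
    percentage increase over a century at that continuous growth rate."""
    n = len(years)
    logs = [math.log(c) for c in ppmv]
    xbar, ybar = sum(years) / n, sum(logs) / n
    num = sum((x - xbar) * (y - ybar) for x, y in zip(years, logs))
    den = sum((x - xbar) ** 2 for x in years)
    b = num / den                                  # continuous growth rate per year
    return 100.0 * (math.exp(100.0 * b) - 1.0)

# Synthetic sanity check: data growing at a constant 0.53 %/year compounds
# to roughly a 70 % rise per century, the order of the figure quoted above.
years = list(range(2000, 2010))
ppmv = [368.0 * math.exp(0.0053 * (y - 2000)) for y in years]
print(round(exp_trend_pct_per_century(years, ppmv), 1))
```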
” James Sexton says:
August 14, 2010 at 10:45 am
Alexej Buergin says:
August 14, 2010 at 9:58 am
‘If we look at liquid water that is boiled and escapes into the air, the answer is yes and yes. The water molecules replace others, mainly nitrogen (78%) and oxygen (21%), which makes moist air lighter than dry air; the replaced molecules have to go somewhere else (up).’
Isn’t the total H2O in the earth’s atmosphere basically fixed? Are you stating this moves the O and N out into space?”
No. The warmer the air is, the more water it can hold, and the water VAPOR can come from the sea, a lake or some other source of LIQUID water. When air rises, it cools, can hold less water, and you get clouds or rain.
And no. When you change some liquid into vapor and add it to the atmosphere, you will have more air, which will exert a slightly greater pressure and extend higher up. But the difference will be very, very tiny. (We can forget about that.)
So if there is global warming, there will be more water in the atmosphere, mainly from the sea, which might produce
1) further warming due to more greenhouse gases
2) cooling due to more low clouds
The question is how much 1) and how much 2) and what else?