Guest Opinion: Dr. Tim Ball

Courts will not listen to or judge scientific disputes. The basic argument is that it is “your paper” against “their paper” and they are not qualified to judge. This was the issue when I participated in appeals to the US Supreme Court over actions of the Environmental Protection Agency (EPA). It is also the case in the three lawsuits filed against me. They are charges of defamation and not about the science. The lawsuits are effectively Strategic Lawsuits against Public Participation (SLAPP) or a legal form of ad hominem attack. The question is, if I am so wrong about the science, as they claim, then why the lawsuits? The answer is because they cannot say I am not qualified, although they tried, and my ability to explain the complexities of climate science in a way the public understands threatens them.
The same problems confront any discussion of climate science in a formal hearing. Politicians are no better equipped or qualified to judge a scientific confrontation than the Courts. Scientists who participated in the December 8, 2015, Senate hearing of the Subcommittee on Space, Science, and Competitiveness were exposed to exactly this dilemma. They found themselves in a forum where they were not qualified to provide the answers to the questions asked and thus effectively lost the case. There is a solution to the challenge, but neither they nor those who organized the hearings understood it.
My presentation at the first Heartland Climate Conference in New York explained the challenges faced by the scientists identified pejoratively as Deniers in the Senate Hearings. Many listened, but the few who understood were people involved in communicating science to the public. They knew the pitfalls and the techniques necessary. I underscored my point by saying that Al Gore's movie deserved an Oscar because it was a brilliant piece of propaganda produced in the fantasyland of Hollywood. They knew how to dramatize the science to catch and hold people's attention. Mr Justice Burton of the UK High Court ruled it was propaganda in the week before Gore received his Nobel Prize, but did not order that it not be shown in schools, even though it contained nine scientific errors. Instead, he ordered that students be apprised of the problems and then shown another documentary for balance. Unfortunately, this assumes that students and teachers can determine who is right and who is wrong.
My message in New York was that if the Skeptics are to counteract Hollywood, they must understand and apply the same basic techniques. They must abandon the idea that getting access to Washington and participating in public hearings before Congress will achieve the goal of educating the public to the scientific truth. They must show how "Their Paper" was deliberately falsified, in terms the public can understand. The recent US Senate hearings failed because the "Deniers" explained the scientific problems with "Their Paper." The politicians and public didn't understand the difference. Even if they entertained the idea that "Their Paper" was wrong, they were confronted with the question of whether the errors were from incompetence or corruption, something the presenters of "Your Paper" were not able or willing to answer.
The title of the Senate Hearing, "Data or Dogma? Promoting Open Inquiry in the Debate over the Magnitude of Human Impact on Earth's Climate," guaranteed it would fail with most people. It failed despite the imbalance in presenters, with four, Dr. John Christy, Dr. Judith Curry, Dr. Will Happer, and Mark Steyn, arguing for Data, and Dr. David Titley tepidly arguing for Dogma. The imbalance is not surprising. I know it is difficult to get someone to debate the Dogma, and that those willing to debate it usually know very little about the science. The failure was not the fault of the presenters; rather, it was the entire problem of arguing science in a political forum. It is similar to why courts won't consider scientific disputes.
Dr. Curry signaled the defeat when she correctly demanded the right to defend her scientific integrity against being called a denier. I was surprised, because I thought Dr. Curry had learned the lesson about how nasty people are when you challenge the prevailing wisdom. Early in the ongoing saga about global warming, Dr. Curry leaned toward the AGW theory and IPCC science, thus making her acceptable to her academic colleagues. Then, to her everlasting credit, Dr. Curry tried to pursue proper scientific method by inviting Steve McIntyre to her university, Georgia Tech, to make a presentation on his analysis of the 'hockey stick'. McIntyre commented on the reaction.
Readers of this blog should realize that Judy Curry has been (undeservedly) criticized within the climate science community for inviting me to Georgia Tech. Given the relatively dry nature of my formal interests and presentation (linear algebra, statistics, tree rings etc.) and that I've been invited to present by a panel of the National Academy of Sciences, it seems strange that such a presentation to scientists should provoke controversy, but it did.
You cannot imagine how nasty people get until you experience it by challenging their prevailing wisdom. Dr. Curry learned as I did that the surprising and most emotionally disturbing attacks came from colleagues.
Skeptics were pleased with the performance of the presenters in Washington because they know about the science and the issues. What they forget is that the majority don’t know. Skeptics must first step out of their bias and view events objectively. Then they must understand the subjective perspective of those who don’t understand the science, which includes most of the public, the media, and the politicians.
I respect the science and integrity of those who appeared to explain the Data; however, it was difficult to watch them struggle with the political posturing. This observation is not a criticism, because they are scientists and want to avoid politics as much as possible. It is as basic as the fact that simply appearing for the Data side automatically placed them in the Republican camp. Their presence and arguments made them political. They also lacked understanding of the nature of the debate and how it exposed the Dogma side.
I regret to say the Dogma side won because the Data side failed to deal with the real questions implied in their argument. From the perspective of the Dogma side and of most citizens, science is science, so why are there disagreements? They see the Data presenters as representatives of a political perspective. They think this because they ask: why would scientists at the IPCC present misleading data, or worse, manipulate the data? What is their motive? To them, the Data presenters' political motive is clear: they are directly or indirectly under the political or financial influence of the energy sector.
For most people, the proof that the Data presenters were political was their failure to answer the questions posed by Senator Markey and others about the 97% consensus and the warmest year on record. In fact, they could not answer them, because the questions require a political answer explaining why the data was falsified and what the motive was. What answer would you give?
Mark Steyn gave an erudite, humorous, blunt assessment of the politics involved. The problem is he began by saying he is not a climate scientist. Unfortunately, this only served to underscore the view that his fellow Data panelists were also political. There was no political spokesperson for the Dogma side: Senator Markey knew it wasn't necessary.
The problem for Data presenters is they are climate scientists, specialists each in one small area of the complex, generalist discipline of climatology. It would require dozens of such specialists to cover the subject and be prepared to answer all the questions and still they could not answer the political or motive questions.
How To Manage A Debate
For approximately three years the Roy Green radio program in Canada offered unlimited airtime for anyone who would debate the issue of Global warming with me. Nobody took the offer! Elizabeth May, leader of the Green Party in Canada, told Roy she wouldn’t debate, but would get someone to do it. A month later she told him that she couldn’t get anyone. This is why I was surprised when Ms. May agreed to a debate with me on the Ian Jessop radio program a few months ago. I won’t speculate on her motive.
I know I won the debate because Ms. May resorted to a personal attack at the end with a veiled threat of “another lawsuit.” It didn’t surprise me, although I was amazed and pleased about how many people picked up on it.
The problem with a global warming debate between two scientists is that the public would not understand; they wouldn't even know who won. In a debate between a scientist and anyone else, the scientist inevitably loses because it becomes about emotions, especially the exploitation of fear. Besides, there is always the fallback of the precautionary principle, that we should act regardless of the evidence.
These conditions formed the basis of my thinking in preparing for my debate with Ms. May. I knew as a lawyer she would try to use detail, to find an “error” to justify rejecting the entire case. I also knew that Ms. May believed, as co-author of Global Warming For Dummies, that she knew the subject.
Ms. May did as I expected and discussed the scientific data and detail. I knew this would go over the heads of most listeners, as the Data specialists' information did in Washington. I acted with discipline by not even correcting the many errors Ms. May made. There was no point in getting bogged down in data and detail that few understood. Besides, few would even know who was correct even after the explanation. This state of confusion and uncertainty about whom to believe is common for most people.
I did the opposite and provided general comments and examples speaking to the Dogma. The first thing was to undermine the credibility of the IPCC. Most people think that the IPCC studies climate and climate change in their entirety. Once they learn that the definition given to it by the United Nations Framework Convention on Climate Change (UNFCCC) directed it to study only human causes of change, they realize the limitations of its work. You then reinforce the point by explaining that you can't possibly determine human causes unless you know and understand natural causes. You can reinforce that point by saying that failed weather and climate forecasts prove we don't understand the natural causes.
Next, you counter the myth that a majority of scientists (97%?) agree. It is too technical to present the arguments so well laid out in Lord Monckton's analysis. It is easier and more effective to explain that very few scientists ever read the IPCC Reports. They accept, not unreasonably, the results of other scientists without question, just as the public does. The confession by Klaus-Eckhart Puls does not require data or scientific understanding. It expresses emotions people understand, including surprise, shock, and then anger that anyone can appreciate. The IPCC produced scientific documents that violated all scientific rules, regulations, and practices. You don't need to know the details when a scientist publicly admits his failure in accepting, and passing on, their corrupted science without question.
“Ten years ago I simply parroted what the IPCC told us. One day I started checking the facts and data—first I started with a sense of doubt but then I became outraged when I discovered that much of what the IPCC and the media were telling us was sheer nonsense and was not even supported by any scientific facts and measurements. To this day I still feel shame that as a scientist I made presentations of their science without first checking it.”
He wasn’t fooled; he just didn’t look.
I wrote much of my book The Deliberate Corruption of Climate Science years before it was published. I delayed publication because I knew, from teaching science to Arts students for 25 years, giving hundreds of public lectures, and participating in several documentaries, that the public was not ready. In many ways, it was still released too early. I know many skeptics shy away from it because it dares to answer the question implied but avoided by the Senate Hearing: if the data is falsified, who would do that and why? The question that automatically arises when you argue the Data or Skeptic side is MOTIVE. Not only did the Dogma side win, but they also scored points from the inability of the Data specialists to answer questions that required providing the motive.
Elaine Dewar provided the motive after spending five days with Maurice Strong at the UN where he created the United Nations Environment Program (UNEP), the UNFCCC and the IPCC.
Strong was using the U.N. as a platform to sell a global environment crisis and the Global Governance Agenda.
Even if the Data specialists or Mark Steyn spoke to this motive, Senator Markey would argue, falsely but effectively, that it was necessary to save the planet, and dismiss them as conspiracy theorists. We are still a long way from the point when explaining the motive to the public would resonate. It requires exposition of the crimes first, and that requires explaining the science in ways the public understands. One factor that deters those who are able is seeing what happened to Dr. Curry, myself, and several others. Just ask Dr. Richard Lindzen. The price paid for even seeking the truth in climate science is financially and emotionally high, and few are willing to pay it. Besides, the Dogmatists and the public believe energy companies reward them well for their efforts.
Tim,
It's too bad the courts don't enforce the laws of physics, three of which are consistently denied by the consensus.
1) Stefan-Boltzmann – The Earth looks like an ideal gray body (emissivity = 0.62) from space, with a corresponding sensitivity whose upper bound is less than the lower bound claimed by the consensus.
2) COE (conservation of energy) – The only way that more than 12 W/m^2 of feedback can arise from 3.7 W/m^2 of forcing to sustain a 3C surface temperature increase is to violate COE. (Consensus feedback theory assumes an amplifier with an unlimited external power supply, which the climate does not have.)
3) The second law – The heat engine producing the planet's weather is the net result of evaporation, water vapor GHG effects, clouds, rain and weather, and since its net result is to cool the planet (evidenced by hurricanes), the net feedback from water vapor and all its consequences must be negative.
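(A minimal back-of-envelope check of point 2, sketched in Python; the 287 K surface temperature and the Stefan-Boltzmann constant are inputs I have assumed for illustration, not figures from the comment itself.)

SIGMA = 5.67e-8    # W/m^2/K^4, Stefan-Boltzmann constant
T_SURF = 287.0     # K, assumed mean surface temperature
FORCING = 3.7      # W/m^2, canonical forcing for doubled CO2
DELTA_T = 3.0      # K, the claimed surface warming

# extra surface emission needed to sustain a surface 3 K warmer
extra_flux = SIGMA * ((T_SURF + DELTA_T)**4 - T_SURF**4)   # ~16 W/m^2
feedback_needed = extra_flux - FORCING                     # ~12+ W/m^2
print(extra_flux, feedback_needed)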
CO2good ,
Let’s get our physics right starting with step 1) .
One of the profound misconceptions which got me involved in this global statist nonscience is the notion that a gray sphere, i.e., one with a flat spectrum, comes to any different temperature than a black body. It will come to the same temperature no matter how light or dark it is. This is the fundamental finding of Ritchie's 1830s experiment, from which Kirchhoff, Stewart and, I think, others abstracted the fact that absorption and emission are just two directions through the same filter. That is the temperature given by simply summing the impinging energy fluxes over the sphere. In our orbit it is about 278.6 ± 2.3 K from perihelion to aphelion. This value is the one of any use in calculations. (Yet NASA does not list this gray body temperature for planetary orbits.)
I find it downright weird that I know of no other presentation of the calculations for arbitrary spectra than the one I presented at Heartland's ICCC9, http://cosy.com/Science/HeartlandBasicBasics.html , and elsewhere on my website. It boils down to a ratio of the dot products of the object's absorptivity=emissivity spectrum with the source spectrum and with a Planck thermal spectrum. Surely someone in this crowd knows some radiant heat transfer textbook with the computation expressed and explained in classical notation. This is distinctly undergraduate STEM level basic physics, yet it seems to be generally very poorly understood.
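(For readers who want the flat-spectrum case worked out, here is a minimal Python sketch of the idea; the 1361 W/m^2 solar constant and the perihelion/aphelion distances are my own assumed inputs, and an arbitrary spectrum would replace the flat assumption with the ratio of dot products described above.)

SIGMA = 5.67e-8   # W/m^2/K^4
S0 = 1361.0       # W/m^2, assumed solar constant at 1 AU

def gray_ball_temp(distance_au):
    # flat (gray) spectrum: absorptivity cancels emissivity, so the
    # equilibrium temperature is the black-body value for the impinging
    # flux, divided by 4 for the sphere's geometry
    flux = S0 / distance_au**2
    return (flux / (4.0 * SIGMA))**0.25

print(gray_ball_temp(1.0))     # ~278.3 K at 1 AU
print(gray_ball_temp(0.983))   # ~280.7 K near perihelion
print(gray_ball_temp(1.017))   # ~276.0 K near aphelion

That range is consistent with the 278.6 ± 2.3 K quoted above; the exact mean depends on the solar constant assumed.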
Bob,
Here's some data that shows how the output path of energy between the surface and space appears to be an ideal gray body whose temperature is 287K and whose emissions are 239 W/m^2 (emissivity = 0.62). Shown are the SB relationship for a black body and a gray body, the sensitivities of each, and the sensitivity presumed by the IPCC.
http://www.palisad.com/co2/tp/fig1.png
The 20K small red dots are the monthly average emissions vs. temperature for constant-latitude (constant solar insolation) slices. The larger dots are the average across the 3 decades of weather satellite data for each slice. Ironically, the data comes from the ISCCP project at GISS. The slope of this measured relationship is the sensitivity limit set by the energy path from the surface to space (about 0.3C per W/m^2).
When the input power is plotted against surface temperature, the slope of the curve becomes the slope of a black body at the surface temperature (about 0.2C per W/m^2). This sets the bounds of the actual sensitivity at somewhere between 0.2 and 0.3C per W/m^2, rather than between 0.4 and 1.2C per W/m^2 as claimed by the IPCC.
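(A sketch of the slope arithmetic behind those bounds, assuming a 287 K mean surface temperature; these are illustrative numbers of mine, not values read off the plot.)

SIGMA = 5.67e-8   # W/m^2/K^4
T = 287.0         # K, assumed mean surface temperature

def dT_dP(emissivity, temp):
    # slope of temperature vs. emitted power for P = emissivity*SIGMA*T^4
    return 1.0 / (4.0 * emissivity * SIGMA * temp**3)

print(dT_dP(1.00, T))   # ~0.19 C per W/m^2, black body at the surface temperature
print(dT_dP(0.62, T))   # ~0.30 C per W/m^2, gray body with emissivity 0.62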
All I'm interested in, first, is the basic venerable experimentally verifiable quantitative physics. That 255K number I see in your graph is just a gross extreme value calculated assuming a step function spectrum with an absorptivity=emissivity, ae, of 0.7 with respect to the bulk of the solar spectrum and 1.0 over longer wavelengths, not an ideal gray body. It is nowhere near accurate enough to explain the 4th and 5th decimal place variations in temperature all this noise is about. (The ae you cite, 0.62, corresponds to a temperature of about 247K.) There is no point to these extreme step function estimations when actual measurements of our spectrum as seen from the outside have been made to some accuracy, and the associated temperature can be calculated from the ratio of dot products I present. Does anybody dispute or confirm that basic calculation? It is implicit in the computation of the 255K number or the 247K corresponding to the (0.62 ; 1.0) step function spectrum you put forward. It is the very minor change in this spectrum as seen from the sun that is the only effect of GHGs on the planet's mean temperature.
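(For what it's worth, here is the step-spectrum arithmetic as I read it, in a short Python sketch; the 1361 W/m^2 solar constant is my assumption.)

SIGMA = 5.67e-8
S0 = 1361.0   # W/m^2, assumed solar constant

def step_spectrum_temp(a_solar):
    # absorptivity a_solar over the solar band, emissivity 1.0 over thermal
    # wavelengths; absorbed solar flux averaged over the sphere = emitted flux
    return (a_solar * S0 / 4.0 / SIGMA)**0.25

print(step_spectrum_temp(0.70))   # ~255 K
print(step_spectrum_temp(0.62))   # ~247 K, the figure cited for ae = 0.62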
The real kicker comes from the fact that whatever temperature is calculated from the sun's spectrum and a planet's "Top of Atmosphere" spectrum, the Divergence Theorem says that the average interior temperature must match it. No electromagnetic phenomenon can "trap" and hold a higher kinetic energy density inside.
Only gravity can , and as HockeySchtick has presented , does in a straightforwardly computed amount .
Bob,
The 255K is the EQUIVALENT temperature of an ideal BB radiating the same 239 W/m^2 average that the planet receives from the Sun and emits to space. This is an EQUIVALENT quantification of the system. Equivalent modelling is a very powerful tool for analyzing a complex system by distilling its behavior down to its average behavior, which nonetheless must conform to physical laws. This works because, when using this method, the EQUIVALENT temperature of the surface very closely matches its average measured temperature.
An ideal BB surface (which the Earth's surface approximates) whose average temperature is 287K emits about 385 W/m^2. Multiply this by an emissivity of 0.62 and you get the emissions of the planet. The grayness of the planet has nothing to do with the surface itself and is simply an artifact of the atmosphere, as it delays emissions from the surface from reaching space. Consider the AVERAGE behavior of the planet if it had no atmosphere, or had an inert atmosphere consisting of only N2 and O2 (which we can quantify exactly), and then incrementally add additional components one at a time and see what happens. Quantifiably, the emissivity will decrease as the surface temperature increases.
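(A quick numerical check of that chain, sketched in Python; the 287 K and 0.62 figures are the ones used in this exchange, the rest is my own arithmetic.)

SIGMA = 5.67e-8
surface_emission = SIGMA * 287.0**4            # ~385 W/m^2 from an ideal BB surface
planet_emission = 0.62 * surface_emission      # ~239 W/m^2 leaving to space
equiv_temp = (planet_emission / SIGMA)**0.25   # ~255 K equivalent BB temperature
print(surface_emission, planet_emission, equiv_temp)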
Spectral characteristics do not affect the warming capacity of a joule; as I said, a joule is a joule regardless of its source, and the amount of work a joule can do is constant, although a joule can heat different materials by different amounts based on heat capacity. The point that the planet behaves LIKE an ideal gray body should be self-evident from my previous plot, given the alignment of measurements over a very wide range of temperatures with the ideal behavior of a gray body whose temperature is that of the surface and whose emissivity is 0.62.
More importantly, the graph I presented shows the nature of the error that led to an excessively high sensitivity: the linearization of the relationship between temperature and forcing, which ignores the T^4 dependence dictated by the Stefan-Boltzmann Law. AR1 justifies expressing sensitivity in units of degrees per W/m^2 by claiming the relationship is linear over a small range, which of course it is, except that the slope over that small range is the slope of SB and not an implicit slope passing through the origin.
And yes, gravity dictates the temperature profile of the atmosphere, as it does the temperature profile of our ocean and the temperature profile of the Venusian CO2 ocean (atmosphere), whose mass is roughly equivalent to the mass of Earth's oceans. More importantly, the lapse rate related to the temperature of the N2 and O2 in the atmosphere has nothing to do with the radiative balance; emissions leaving the planet either pass directly from the surface or clouds to space, or come indirectly from GHG emissions that are not re-absorbed by other GHG molecules.
CO2 ,
I guess we are not as far apart as I thought. It sounds like you are basing the 255K number on actual measurement of our full ae spectrum as seen from the sun, rather than the endlessly parroted (solar 0.7 ; thermal 1.0) step spectrum which is used to "derive" it. I would very much appreciate a link to the spectral data and how it is collected, because that does not strike me as easy. So you are agreed that, except for the minor and easily calculated change in our spectrum as seen from the outside, CO2 has no effect on our mean temperature, and the GHG "hypothesis" for the difference between that "ToA" temperature and our surface temperature is false at a very fundamental level?
Bob,
Yes, the 255K is based on actual measurements of planet emissions from 3 decades of weather satellite data. The planet's emissions are directly measured by satellite sensors. The satellite sensors are relatively broad band and respond to energy within a spectral band. A spectral gap in emissions just means that fewer photons hit the sensor and the total power measured is lower, thus the equivalent temperature is lower. Sensors are also calibrated for absolute power given an ideal Planck spectrum representing a calibration temperature. Knowing how far the sensor is from the surface, 1/r^2 can be applied to ascertain the total emitted power.
This is corroborated by indirect measurements of input power from the Sun based on reflection and direct solar measurements which results in the same magnitude of average power.
I do not agree that the only effect GHGs have is a minor change to the emitted spectrum. Atmospheric GHGs absorb photons emitted by the surface, and that does affect the emitted spectrum. The overall shape of the emitted spectrum follows Planck emission at a temperature higher than 255K, so if you tried to infer the temperature from the wavelength of peak emissions (Wien's Law), you would get a much higher temperature, since Wien's Law assumes an ideal Planck spectrum.
In the steady state, energy that enters the atmosphere must leave, as the atmosphere has a limited capacity to store energy. The energy that exits the atmosphere can either leave the top into space and add to the direct emissions passing through the spectrally transparent regions of the atmosphere, or be returned to the surface and accumulated with the incident solar power to warm the surface beyond what the solar power can do on its own.
There are other problems with the IPCC definition of forcing at TOT/TOA (this is never sufficiently disambiguated in the ARs). For example, an instantaneous 1 W/m^2 increase in solar power passing through the atmosphere is treated as equivalent to an instantaneous 1 W/m^2 decrease in the power passing through the transparent window owing to increased GHG concentrations. These do not have the same effect because, in the steady state, the entire W/m^2 of solar 'forcing' passes through to the surface, while only about half of the GHG-related forcing is returned to the surface; the remaining half is ultimately sent out into space. This is the difference between energy entering the surface and having only one way out, and energy entering the atmosphere and having two ways out.
Another problem is that the IPCC definition of forcing obfuscates the negative feedback effect of ice and cloud reflection, which leads to the bogus claim that GHGs and clouds warm the surface by 33C. They neglect the roughly 22C of cooling that clouds and ice provide by reducing the input power via reflection, which is also part of the response of the system to forcing and cannot be ignored.
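(Ballpark numbers behind that comparison, sketched under my own assumptions of a 1361 W/m^2 solar constant, 0.30 albedo, and a 288 K surface; slightly different inputs give the ~22C figure cited above.)

SIGMA = 5.67e-8
S0 = 1361.0      # W/m^2, assumed solar constant
ALBEDO = 0.30    # assumed planetary albedo

T_with_albedo = ((1 - ALBEDO) * S0 / 4 / SIGMA)**0.25   # ~255 K
T_no_albedo = (S0 / 4 / SIGMA)**0.25                    # ~278 K
print(288.0 - T_with_albedo)        # ~33 C attributed to GHGs and clouds
print(T_no_albedo - T_with_albedo)  # ~23-24 C of cooling from reflection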
Thank you for your answer. It's perhaps the most detailed description I've seen of how our lumped planetary+atmosphere spectrum is measured from the outside. I think a text on these issues aimed at the quantitatively educated layman would go far in making the rational case. Roy Spencer had an excellent description of the many issues in "cleaning" satellite data a while ago when describing his group's move to their new, under-10,000-lines-of-Fortran system. But an overview text would be a very useful resource.
But, if the resolution of the data is only to the degree, then explaining 10ths of a degree temperature variations is beyond current capabilities. Further, if 255K, by coincidence the value calculated for a 0.7 ae ratio, is the measured value, how come in one of your earlier posts you gave a value of 0.62 for the ratio of ae over the solar spectrum to that over the longer wavelengths? That ratio corresponds to a temperature of about 247.2K. Of course, what I'd call the effective radiative surface is not simply the ToA. It's wavelength dependent, and over the visible spectrum it is generally either the actual surface of the earth or cloud layers. (I think Monckton gave a very evocative description here a while ago.) However, that "surface" defines the boundary on which the Divergence Theorem applies. There is no argument that CO2 and other "radiatively active" gases transduce radiant and thermal energy back and forth with their fellow atmospheric molecules, but that changes the variance of the temperature, not the mean. (The relative lack of discussion of the effect of GHGs on the inertia of atmospheric temperature rather than the mean is another of those red flags which diverted me into this mosh pit. The discussion is just not on the equation-by-equation derivation I expect in any other branch of applied physics. I come at this from the perspective of such a nerdy APL programmer that I'm rolling my own: https://youtu.be/0u2_jKfo0A8?t=2h48m30s .)
In terms of the temperature gradient from the tops to bottoms of atmospheres, I think HockeySchtick's computations leave very little variance to be explained by anything but gravity. I actually have done little more than glance at his posts, but recognizing that the next excruciatingly obvious parameter to add to the half dozen K expressions calculating the temperature of a colored ball in our orbit is gravity, I'm content to leave it aside until I have some motivation and time to implement it in my 4th.CoSy. I'd rather someone just download 4th.CoSy, and I'd support them in implementing this physics themselves.
Bob,
The resolution of each pixel is 8 bits for most of the satellite channels. However, many thousands to millions of pixels go into a single measurement, which significantly extends the resolution by the laws of averaging. Theoretically, this data can extract trends of hundredths of a degree, but it covers only 3 decades, which is insufficient for establishing any kind of long term trend. It's still quite good for establishing the average response to change, which applies even more averaging and is how I've primarily used the data.
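(A toy illustration of that averaging argument, in Python; the quantization step, noise level, and pixel count are made-up values of mine, not the instrument's.)

import random

STEP = 0.5          # assumed quantization step, in kelvin
TRUE_TEMP = 287.13  # hypothetical true scene temperature

def quantize(x):
    return round(x / STEP) * STEP

# each "pixel" sees the true value plus sensor noise before quantization;
# averaging many such readings recovers a mean far finer than one step
n = 100000
readings = [quantize(TRUE_TEMP + random.gauss(0, 1.0)) for _ in range(n)]
print(sum(readings) / n)   # close to 287.13, well inside one quantization step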
Absolute accuracy is hampered by GISS processing (http://isccp.giss.nasa.gov), which relied on a sloppy cross-satellite calibration methodology that manifests significant jumps and drifts in absolute calibration when it loses continuous polar orbiter coverage, when a polar satellite exhibits any drift, or when a replacement has a significantly different sensor response curve; this introduces more than 1C of noise into the data. I have my own tools to analyze the ISCCP satellite data and know how to fix this, but haven't had the chance to do so, although I do have the TB of minimally processed data (close to raw) already downloaded and should be able to back out the ISCCP 'adjustments' and apply a better cross calibration algorithm. The tool is what I used to generate the plot in the earlier post, and it provides me with a significant amount of other functionality and many ways to analyze and present data and compare data to models.
The distributed nature of the emitting surface of the planet is why it's best to model the atmosphere as a bulk object. This is where equivalent modelling really shines. The surface itself is better defined and more ideal, with an emissivity at the boundary with the atmosphere close to unity, which makes an equivalent gray body a very good match to the system, especially since changes to the surface temperature are what we are interested in.
It’s also no coincidence that the equivalent temperature of 255K corresponding to 70% of the 341.5 W/m^2 average input from the Sun is also the equivalent temperature of the energy emitted by the planet. Again, this is where equivalent modelling comes through since an EQUIVALENT temperature is a precise quantification of a rate of energy in joules per second (watts) and joules must be conserved.
I agree that CO2 does not change the average kinetic temperature of the atmosphere, i.e., the ideal gas temperature, since no NET energy from photon absorption/emission by GHGs is transferred. Sometimes a CO2 molecule speeds up if the emission wavelength is longer than the absorption wavelength, but equally and oppositely, some CO2 molecules slow down when the opposite occurs; moreover, only a tiny fraction of the photon energy is converted at once and, as you said, this increases the variance, not the mean. But it's not the temperature of the atmosphere that matters, it's the temperature of the surface.
The value of 0.62 is the equivalent emissivity of an ideal gray body whose temperature is the average temperature of the surface of Earth and whose emissions are the average emissions of Earth into space (equiv to 255K) based on the Stefan-Boltzmann Law.
Wow, you should see the email I got back trying to get some things clarified. Warmists don’t like you looking behind the curtain. Watch the video above and it makes every case you need to debunk the AGW theory.
1) It claims that the tilt of the earth warmed the N Hemi. (note, it hasn’t tilted back, so it is still warming)
2) It claims that the slight warming got the oceans to release CO2, priming the CO2 engine. Problem is, increasing from 180 to 220 ppm only trapped about 1.2 W/m^2 (a sanity check is sketched below). That isn't enough to warm the globe.
3) In reality, what did the warming do? A) As the ice receded, it exposed more ocean to the warming sun. B) It turned a very, very, very dry ice age climate into a normal N Hemi climate that is conducive to life and growth. That means H2O was added to the dry air. Using the default H2O for the N Hemi, that added 70 W/m^2, or over 50X the trapped heat of CO2. H2O, not CO2, is what warmed the atmosphere.
4) As the glaciers melted they slowed the warming of the oceans, but eventually they stopped cooling the oceans, and the oceans warmed more rapidly, releasing even more CO2 and H2O.
That is a CO2-free explanation of why the earth warmed and has been warming, and it doesn't need CO2 at all. That is a far better explanation than what is given in the video. I would encourage everyone to start promoting this CO2-free explanation and watch the warmists freak. As I've said, you should see my email. Warmists don't want you looking at things this way. It is blasphemy.
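(For point 2 above, the number can be sanity-checked with the commonly used simplified forcing expression dF = 5.35 ln(C/C0) from Myhre et al.; this is my own check, and it lands around 1.1 W/m^2, in the same ballpark as the ~1.2 W/m^2 cited.)

import math
dF = 5.35 * math.log(220.0 / 180.0)   # CO2 rising from 180 to 220 ppm
print(dF)                             # ~1.07 W/m^2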
When are we scheduled to tilt back?
I think that Ferdberple made an excellent suggestion, which is to start a campaign against the use of private jets. This will start to show who is genuinely concerned about greenhouse gases and who is just grandstanding. I would suggest, also, that in talking with others we state clearly that we do not believe in global warming and, what is more, no one else does either if we look at their actions. We need to drive home the disconnect between the words and the actions.
Patrick
The skeptics' written testimonies were excellent, but, sadly, Data did not trounce Dogma, as discussed here: http://www.globalwarming.org/2015/12/21/climate-change-hearing-lessons-from-data-vs-dogma/. Skeptics can win the live debate in the hearing room, but to do so they will need to limit the number of witnesses to just one or two and strive to pursue a single line of inquiry from the start to the end of the proceeding. Using that strategy, former House Small Business chairman Jim Talent (R-Mo.) thoroughly discredited the Clinton administration's Kyoto Protocol economic analysis in a pair of hearings during the 105th and 106th Congresses. I saw it with my own eyes and tell the tale here: https://www.masterresource.org/climate-economics/climate-hearings-in-the-112th-congress-gop-chairmen-will-need-talent-like-jims/
Great link.
Fine article.
Easy to follow observation.
http://centerforriskcommunication.org/risk-communication-consultants/
The above link has been used to make the message for many a client. Can’t hurt to know how they do it.
Demosthenes was practicing rhetoric with pebbles in his mouth. Young Japanese employees mount an elevated place in the city and sing the anthem of their company, tears of shame on their faces. And folk bands play backstage behind iron roller blinds protecting against the bottles thrown at them by the animated crowd. Lawyers humiliate their clients: the goal is public standing. That's what you're experiencing right now, Dr. Tim Ball.
Wish I had your nerves – Hans
The scientific community needs to police itself. Unfortunately, we have created a $9 billion per year incentive for scientists of all levels to look the other way. I left the field once I realized that it is not possible to question the “consensus” without severe repercussions to one’s career prospects. Those of us who left and are on the outside looking in have no influence or credibility. Working scientists are going to have to save science.
You do have credibility .
The hearing was the first time that key facts were put on record in a public forum: the most important being that the satellite temperature is far more credible than the surface temp being falsely used to show warming.
As such no one can now claim they did not know the surface data was corrupted and that the satellite temperature must be the first port of call when assessing current warming.
Like the butterfly effect, this might not seem a large change but the effect will be profound.
Here's a summary of the video evidence: https://youtu.be/Kpbkj0iac6M