Greenland Near-Surface Land Air Temperature Data from Berkeley Earth Present Some Surprises

I enjoy surprises in data, especially when they might make alarmists unhappy.

Willis Eschenbach’s post Greenland Is Way Cool at WattsUpWithThat prompted me to take a look at the Berkeley Earth edition of the Greenland TAVG temperature data. See Figure 1, which presents the graph of the annual Berkeley Earth TAVG temperature (not anomaly) data for Greenland from 1900 to 2012, the last full year of the regional Berkeley Earth data.

Note: I used the monthly conversion factors listed on the Berkeley Earth data page for Greenland to return the monthly anomaly data to their original (not anomaly) form and then determined the annual averages. [End note.]
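The note’s procedure can be sketched in a few lines. The twelve climatology values below are illustrative placeholders, NOT the actual Berkeley Earth conversion factors for Greenland; the structure (add the month’s climatology back to each anomaly, then average complete years) is what matters:

```python
# Sketch of the note's procedure. The climatology numbers are
# illustrative placeholders, NOT the actual Berkeley Earth values.
climatology = [-19.0, -19.5, -18.0, -12.0, -4.0, 1.0,
               4.0, 3.0, -1.0, -8.0, -14.0, -17.5]  # Jan..Dec, deg C

def to_absolute(monthly_anomalies):
    """monthly_anomalies: iterable of (year, month, anomaly_degC) tuples."""
    return [(y, m, a + climatology[m - 1]) for (y, m, a) in monthly_anomalies]

def annual_means(monthly_absolute):
    """Average the twelve monthly absolute values for each complete year."""
    by_year = {}
    for (y, _, t) in monthly_absolute:
        by_year.setdefault(y, []).append(t)
    return {y: sum(v) / len(v) for y, v in by_year.items() if len(v) == 12}
```

Incomplete years are dropped rather than averaged from fewer than twelve months.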

Figure 1

With the relatively large multidecadal variations in the data, it seemed as though I was taking the trend of one and a half cycles of a sine wave. That prompted me to start the graph in January 1925 and check the linear trend again. See Figure 2. Imagine that: a flat linear trend of Greenland land surface TAVG temperatures from 1925 to 2012.
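The trend check here is an ordinary least-squares fit of annual temperature against year. A minimal version, shown with synthetic data rather than the Berkeley Earth series:

```python
def linear_trend(years, temps):
    """Ordinary least-squares slope of temps vs. years, in deg C per year."""
    n = len(years)
    mx = sum(years) / n
    my = sum(temps) / n
    num = sum((x - mx) * (y - my) for x, y in zip(years, temps))
    den = sum((x - mx) ** 2 for x in years)
    return num / den
```

Changing the start year simply means passing a shorter slice of the series to the same function.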

Figure 2


Are you wondering if the climate models stored in the CMIP5 archive (which was used by the IPCC for their 5th Assessment Report) showed a similar not-warming trend for the same time period? Wonder no longer. See Figure 3.

The climate model outputs, the CMIP5 multi-model mean, are available from the KNMI Climate Explorer. For Greenland, I used the coordinates of 59.78N-83.63N, 73.26W-11.31W, which are listed on the Berkeley Earth data page for Greenland, under the heading of “The current region is characterized by…”. The simulations presented use historic forcings through 2005 and RCP8.5 forcings thereafter. I use the model mean, because it represents the consensus (better said, group-think) of the climate modeling groups who provided climate model outputs to the CMIP5 archive, for use by the IPCC for their 5th Assessment Report.
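Averaging gridded model output over a latitude-longitude box like the one above requires area weighting, conventionally by cosine of latitude. A generic sketch of that step, with placeholder grid values; this is not the Climate Explorer’s actual code:

```python
import math

def box_mean(grid, lats, lons, lat0, lat1, lon0, lon1):
    """Cosine-of-latitude weighted mean of grid[i][j] over a lat-lon box.
    grid: 2-D list indexed [lat][lon]; lats/lons: cell-centre coordinates."""
    wsum = tsum = 0.0
    for i, lat in enumerate(lats):
        if not (lat0 <= lat <= lat1):
            continue
        w = math.cos(math.radians(lat))  # area weight shrinks toward the pole
        for j, lon in enumerate(lons):
            if lon0 <= lon <= lon1:
                wsum += w
                tsum += w * grid[i][j]
    return tsum / wsum
```

For the Greenland box, the bounds would be `lat0=59.78, lat1=83.63, lon0=-73.26, lon1=-11.31` (west longitudes negative).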

Figure 3

For those who would prefer to see the full spread of the model outputs in a model-data comparison, you may want to think again. See Figure 4. It’s a graph created by the KNMI Climate Explorer of the 81 ensemble members of the climate models submitted to the CMIP5 archive, with historic and RCP8.5 forcings, for Greenland land surface temperatures, based on the coordinates listed above. There appears to be roughly a 7- to 8-deg C spread from coolest to warmest model. Well, that narrows it down. It’s just another example of how the climate models used by the IPCC for their long-term prognostications of global warming are not simulating Earth’s climate. Each time I plot a model-data comparison, I find it remarkable (and not in a good way) that anyone would find climate model simulations of Earth’s climate, and their simulations of future climate based on bogus crystal-ball-like prognostications of future forcings, to be credible.

Figure 4

Back to the multi-model mean: Figure 5 presents the observed and climate-model-simulated multidecadal (30-year) trends in Greenland near-surface land air temperatures, from 1900 to 2012. Once again, the models are clearly not simulating Earth’s climate.
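The 30-year trends in a graph like Figure 5 can be produced with a rolling least-squares window over the annual series. A sketch with made-up data; the trailing-window convention (each point plotted at the year its 30-year span ends) is an assumption on my part:

```python
def rolling_trends(years, temps, window=30):
    """Least-squares slope (deg C/decade) for each trailing window-year span."""
    out = []
    for end in range(window, len(years) + 1):
        xs = years[end - window:end]
        ys = temps[end - window:end]
        mx = sum(xs) / window
        my = sum(ys) / window
        slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                 / sum((x - mx) ** 2 for x in xs))
        out.append((xs[-1], slope * 10.0))  # year the window ends, C/decade
    return out
```

Note that with annual data starting in 1900, the first 30-year trend is only available for the window ending in 1929, which is why such a curve starts almost three decades after the data do.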

Figure 5

I’ll let you comment about that. We’ve already discussed and illustrated back in 2012 how climate models do not (cannot) properly simulate polar amplification. See the post Polar Amplification: Observations Versus IPCC Climate Models. The WattsUpWithThat cross post is here.

That’s it for this post. Enjoy the rest of your day and have fun in the comments.


Please purchase my recently published ebooks. As many of you know, this year I published 2 ebooks that are available through Amazon in Kindle format:

And please purchase Anthony Watts et al.’s Climate Change: The Facts – 2017.

To those of you who have purchased them, thank you. To those of you who will purchase them, thank you, too.


Bob Tisdale

92 thoughts on “Greenland Near-Surface Land Air Temperature Data from Berkeley Earth Present Some Surprises”

  1. “See Figure 1, which presents the graph of the annual Berkeley Earth TAVG temperature (not anomaly) data for Greenland from 1900 to 2012”
    It is anomaly data, and it is the spatial average of anomalies. Fig 1 just plots it with a shift to the y axis (as the note says).

    • Nick, it is NOT as you wrote “the spatial average of anomalies”. You’re once again thinking of GISS, NCEI and HADCRUT data or you’re attempting to intentionally mislead the readers here.

      Steve Mosher of Berkeley Earth was very specific in his comment here that Berkeley Earth does NOT average anomalies:

      “All other methods use station temperatures and then they construct station anomalies and then they combine those anomalies.
      “Our approach is different. we use kriging…”

      And my note in this post was very specific when I stated:
      Note: I used the monthly conversion factors listed on the Berkeley Earth data page for Greenland to return the monthly anomaly data to their original (not anomaly) form and then determined the annual averages. [End note.]

      Good-bye, Nick. You waste my time, and I don’t like having my time wasted.


      • Bob,
        “Nick, it is NOT as you wrote “the spatial average of anomalies”.”
        It is. The data file you link is specific (my bold):
        “Temperatures are in Celsius and reported as anomalies relative to the
        % Jan 1951-Dec 1980 average.”

        “% Note that all results reported here are derived from the full field
        % analysis and will in general include information from many additional
        % stations that border the current region and not just those that lie
        % within this region. In general, the temperature anomaly field has
        % significant correlations extending over greater than 1000 km, which
        % allows even distant stations to provide some insight at times when
        % local coverage may be lacking.”

        “% For each month, we report the estimated land-surface anomaly for that
        % month and its uncertainty.”

        • And once again, Nick, it does not state in the quote you provided that they are derived from anomalies. You boldfaced an “In general” comment, not a discussion of methodology.

          Maybe you should read and try to understand their methods paper, without your willful misinterpretations.

          Good-bye, Nick.

        • “Our approach is different. we use kriging”
          But what do they use kriging on? Their methods paper explains clearly why they apply it to anomalies (my bold):

          “One approach to construct the interpolated field would be to use Kriging directly on the station data to define T(x,t). Although outwardly attractive, this simple approach has several problems. The assumption that all the points contributing to the Kriging interpolation have the same mean is not satisfied with the raw data. To address this, we introduce a baseline temperature bᵢ for every temperature station i; this baseline temperature is calculated in our optimization routine and then subtracted from each station prior to Kriging. This converts the temperature observations to a set of anomaly observations with an expected mean of zero. This baseline parameter is essentially our representation for C(xᵢ). But because the baseline temperatures are calculated solutions to the procedure, and yet are needed to estimate the Kriging coefficients, the approach must be iterative.”

          • My apologies, Nick. It appears that Mr. Mosher wasn’t complete in that comment, which he concluded with, “The Temp we give you is the absolute T. the source is the observations.”

            The Berkeley Earth “methods” paper continues beyond what you quoted:

            “In our approach, we derive not only the Earth’s average temperature record T_avg, but also the best values for the station baseline temperatures bᵢ. Note, however, that there is an ambiguity; we could arbitrarily add a number to all the T_avg values as long as we subtracted the same number from all the baseline temperatures bᵢ. To remove this ambiguity, in addition to minimizing the weather function W(x,t), we also minimize the integral of the square of the core climate term G(x) that appears in Equation (8). To do this involves modifying the functions that describe latitude and elevation effects, and that means adjusting the 18 parameters that define λ and h, as described in the supplement. This process does not affect in any way our calculations of the temperature anomaly, that is, temperature differences compared to those in a base period (e.g. 1950 to 1980). It does, however, allow us to calculate from the fit the absolute temperature of the Earth land average at any given time, a value that is not determined by methods that work only with anomalies by initially normalizing all data to a baseline period.”

            So, in layman’s terms, it appears Berkeley Earth takes the temperature data in absolute form, converts it to anomalies, does their magic to it, then converts back to absolute temperatures.
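That pipeline (subtract a per-station baseline, interpolate the resulting anomalies, add a local baseline back to recover an absolute temperature) can be illustrated with a toy example. Inverse-distance weighting stands in here for kriging, and all numbers are made up; this is in no way the Berkeley Earth code:

```python
# Toy illustration of the described pipeline: subtract a per-station
# baseline, interpolate the anomalies (inverse-distance weighting as a
# stand-in for kriging), then add a local baseline back to get an
# absolute temperature. Not the Berkeley Earth code.
def interp_absolute(stations, target):
    """stations: list of (x, y, observed_temp, baseline).
    target: (x, y, local_baseline). Target must not coincide with a station."""
    tx, ty, local_baseline = target
    num = den = 0.0
    for (x, y, obs, base) in stations:
        anomaly = obs - base                       # station anomaly
        w = 1.0 / ((x - tx) ** 2 + (y - ty) ** 2)  # inverse-distance weight
        num += w * anomaly
        den += w
    return num / den + local_baseline              # back to absolute
```

Two stations each 2 C above their own baselines yield an interpolated anomaly of 2 C, which added to the target’s baseline gives the absolute estimate.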

            In some respects, we’re both correct.


          • Thank you, Alan! Made me laugh.

            As I wrote in the post, “Note: I used the monthly conversion factors listed on the Berkeley Earth data page for Greenland to return the monthly anomaly data to their original (not anomaly) form and then determined the annual averages. [End note.]”

            Cheers, Bob

          • Thanks, again, Bob.

            I suggest people visit Bob on his site – Climate Observations – get his free stuff, buy a couple of his books and donate to him. He does a helluva lot of work and receives scant appreciation.

          • And to think I included “(not anomaly)” in the text so that I could avoid using the term “absolute”, which other visitors here complain about, even with my qualifying the term as commonly used by the climate science industry.

            Maybe I’ll use both next time to tweak them all.


    • Is it possible in today’s world to obtain actual facts, without the use of “Models” ?

      A very important question.

      And the shocking answer is that no, it isn’t. But….

      Even something as simple as taking the temperature of a pond with a thermometer, relies on a model we have of the world in which ponds, thermometers and temperatures actually in some sense exist.

      The thing that will probably arouse the most ire in all the hard scientists with no background in philosophy of science, is that the notion of ‘temperature’ is a model – a categorisation of experience into a concept that is not directly accessible to the consciousness which conceives of it.

      This is an important idea. Most people casually assume that ‘gravity’ is as real as the world they inhabit. It is not. It is an abstraction derived from it, or rather a proposition that seems to explain it. We never experience ‘gravity’, just the ‘forces’ that it exerts on ‘bodies’.

      The world is not turtles all the way down, it is, as far as our rational selves are concerned, models all the way down.

      And MOST of the more flagrant mistakes we make involve confusing our ideas about the world with the world in itself.

      Models are idealised simplified mental constructs that represent the world.

      We say ‘there is next-door’s black cat’ without realising just how many assumptions and simplifications, and how much downright anthropocentric bigotry, are contained in that statement.

      Information theory would show that even if it were possible to define an exact boundary around the ‘cat’ that contained only the ‘catness’, so that its smells and its fishy breath and so on were excluded from it (and ignoring the fleas: are they part of ‘the cat’?), we are still left with a massive amount of information to represent the ‘cat’ at just a single moment in time, let alone in some kind of temporal persistence. Examination of its spectral output with respect to reflected visible light would reveal a complex and transient set of spectral emissions of which ‘black’ would be a gross and inelegant approximation.

      As far as its ownership goes, to say that it ‘belongs to’ ‘next-door’ implies massive anthropocentric and cultural ideas of ownership and tribal relationships that surely are completely irrelevant to the creature.

      Those who are familiar with IT will realise that ‘next door’s black cat’ is essentially metadata: a pointer to a far, far richer and more complex data set that is the experience of the cat in its raw form.

      And that is the problem. Before we can do physics, we need metaphysics. Metaphysics to tell us what the facts are, that we are using as data to confirm or disprove our theories. BUT the metaphysics is in itself arbitrary. And facts are in some sense an unknown and unknowable Reality transformed – perhaps convoluted is a better term, by applying a subconscious metaphysical process to it.

      The term a posteriori was used to distinguish all the knowledge we have that applies to a (presumed) external world. What we know of the world is at first hand raw experience. Part of the process of becoming an adult is to acquire the ability to structure and categorise it and develop a sense of time and place. Yet even time and place are not raw sensation. They are internalised models that we use to categorise experience into metadata: ‘entities’ that can have temporal persistence.

      This is the metaphysics of the classical world that we think we live in: a world of objects in space-time, all of which relate to each other via laws of Nature which are in essence partial differential equations of behaviour (causality) expressed with respect to time.

      This model is of course in the limit known to be wrong and totally inadequate. Even classical physics of atoms shows that ‘point masses’ do not exist…

      Approximations are OK up to a point (sic!)… BUT it is important to realise that they are approximations.

      This view of the world and science as a hierarchy of models, some of which are more basic (metaphysical) than others, is, I think, a valid approach: it is on those to whom all knowledge is level and one-dimensional that the greatest frauds may be perpetrated.

      Those who cannot distinguish the warmth on their skin (sensation) from temperature (a notion) and therefore think that the hypothetical entities of science live in the same space as the more basic entities of the facts of their daily lives, and who utterly fail to realise that even those facts are the product not just of some presumed reality out there, but also of our means of interpreting it into a coherent collection of approximate entities in an approximate time and space matrix, are infinitely gullible.

      It is models all the way down, BUT they come in layers. And the reason why we can say such things as ‘climate science is refuted by the facts’ is that climate science and its refutation lie above and purport to be about a set of agreed ‘facts’ which in this context are the notions that such a thing as a global average temperature – or even a series of local temperatures – exists and has meaning and is measurable.

      Insofar as these things may be said to exist and have meaning and can be measured we can then say ‘these idealised notions that we have nonetheless measured reasonably accurately, refute the notion that carbon emissions are the biggest influence on them, at this point in time’.

      That is, our basic ‘facts’ at this level are derived from metaphysical models of the world in which such abstract entities as ‘temperature’ and ‘carbon dioxide’ have been given meaning.

      That is, what we call facts are simply the product of models deeper towards the metaphysical end of the spectrum that we agree to treat as facts, because they accord reasonably well with our mutual experience of the world.

      That last bit is extremely important. As scientists we can partially accept the New Left’s insistence that Truth is relative to culture, in that the way we interpret the world does indeed vary from place to place and from one time to another, BUT the product of that culture has to be a worldview that is not inconsistent with experience.

      In this context climate change has sought to make itself a metaphysical assumption: true because it was believed, and used as a metric to judge the world. Unfortunately it depends on a testable proposition, namely that the world is in fact getting warmer, and warmer in lockstep with rises in CO2 concentrations in the atmosphere, and that all these entities are as we who ‘deny’ climate change understand them to be. So we measure the temperature and Lo! it has not risen in accordance with predictions.

      Climate change is inconsistent with the ‘facts’ it purports to represent. That these ‘facts’ are in fact (sic!) the output of other metaphysical models, is not relevant in this precise case. Climate science sets out to propose a hypothesis based on certain ‘facts’. If those very same facts refute it, then those promoting it have only themselves to blame.

      • Hmm.
        Dogs may have owners.
        Cats have servants.

        Or am I delving too deeply?
        Or not deeply enough?

        If ‘the cat that habituates next-door’ scratches me when I go to pick it up, is that some sort of a model?
        Probably I am one-dimensional; certainly, I am deplorable, and, likely, a ‘swivel-eyed loon’, too.
        I have read – and re-read – but obviously not understood Leo’s contribution. Sorry.


        • I have read – and re-read – but obviously not understood Leo’s contribution.

          It goes something like this: “There is no true statement except this one, that there are no true statements.” A proposition that contradicts itself on its very face.

      • A simple time-tested model, such as a thermometer that models temperature based on the expansion of mercury, is a far cry from a recent computer model that attempts to model complex and chaotic processes like climate based on the limited understanding of its programmer. The programmer is forced to make educated guesses about complex relationships that are not well known and leaves out influences that, as yet, are completely unknown. It requires hundreds, if not thousands, of years to time-test such a model to verify whether or not the educated guesses are correct or have missed the mark completely. Many useful idiots assume that computers are “smart” and can supply all missing information that humans do not yet understand. They are misinformed. Computers are slaves to their masters. They only do what their programmers tell them to do and reflect the biases of their programmers.

  2. I enjoy surprises in data, especially when they might make alarmists unhappy.

    So do I . I really do.
    But we should fight it.
    And take the data as it is.

  3. At this point in the climate hustle, the modelling groups’ continued modelling with their Climate Cargo Cultist outputs simply demonstrates that they are merely a jobs program for over-produced climate PhDs and their associated supercomputer centers.

    The only way forward to sanity and a return to rationality is to cut government funding, both extramural and intramural, to climate modeling research activities by at least 90%. Then hold that reduced funding level for at least 4 FY budget cycles to clean out the climate hustle government-industry cabal, and then try and start over with just 2 NSF-funded groups in the US.

  4. It’s clear Gore was brilliant when he banked on the late boomers buying his spiel: temps trended down while they were children and then trended up as they got out of school (a “V” trend, the larger trend irrelevant). I remember well how CO2 was going to save us all from GW in the late 70’s… the spiel was executed well and effectively around 20 years before Gore lost his election. Twenty years is the period that the propaganda folks plan/execute by; the last generation will be irrelevant and the next ignorant but in the “know”, aka MSM cool, aka OcrzyO… will she take on male circumcision in the USA? Seems more relevant than GW?

  5. Is it not fascinating that they use historical data and tack on RCP8.5 speculations to induce the public to think the inflection represents an “acceleration”?

    Anyone who checks the outcome of speculations with lesser copies of the RCPs will soon find curves of lesser value, both as thermal catastrophes and as eye candy.

    What value does RCP8.5 bring to the present or the future? Perhaps it has a place as a target of late show TV ridicule – its only net present value.

  6. Whenever you look at results from BEST, consider the station numbers and distances they use in their kriging. Note that there were between 1 and 5 regional stations prior to WWII. There is also a huge drop in the last two years. For the year 2000, there were about 20 stations in the region, kriged with another 30 stations outside the region out to 500 km, and another 40 stations 500-1000 km away. Regional smearing at best. GIGO. For Greenland, from their site:

  7. “We’ve already discussed and illustrated back in 2012 how climate models do not (cannot) properly simulate polar amplification.”

    They have it backwards, an increase in climate forcing cools the AMO and Arctic. There were AMO and Arctic cold anomalies in the mid 1970’s, mid 1980’s, and early 1990’s, when the solar wind was the strongest. And a warming AMO and Arctic from the mid 1990’s from when the solar wind weakened.

  8. Anomalies represent the trend at a station. The global average of anomalies presents the average trend. This is not affected by the magnitude of station temperatures. Temperature averages are dominated by the Sun’s movement and local climate-system effects.

    Global average — temperature anomaly is appropriate
    Station average and homogeneous zone [the article under discussion represents this] — temperature magnitude is appropriate.

    Dr. S. Jeevananda Reddy

  9. The models also grossly underpredict ice loss by a factor of 2-3, hence they are using the wrong albedo. They would be running even hotter if they accounted for the proper level of ice loss.

  10. The figures show factual data. What these don’t show is causation of the ups and downs in the data.

    As far as climate models go, do any show future ups and downs? If I remember correctly, they all show up, up, up, with no up/down variations. This I would question, and it brings me to have little to no trust in them.

  11. The “models” do not have elevation, i.e., the height of land above the geoid and all the effects that wind fields – pressure and temperature – and GRAVITY bring. The “model” domain is a 2-D “sheet” — a numerical array, no z-dimension. Talk about the REAL flat-earth society! There it is!

    Ha ha

    • “The “model” domain is a 2-d “sheet” — a numerical array, no z-dimension.”
      Complete nonsense. The models are 3D and have typically 32-40 vertical layers of cells. They take proper account of altitude, using terrain-following coordinates.

    • If in fact this is a cyclical effect, the next year or two should be a plateau, and then within at most five years we should see definite, significant cooling.

      Imagine that, a falsifiable hypothesis! Check back in 2025.

      I notice that Nick avoids any comment on a point that is relevant here.

    • Tom, you’ve misread the graph in Figure 5, which you linked in your comment.

      Look at the units of the y-axis. The units are deg C/decade, not deg C. Thus the title block reads, “Model-Data 30-Year Trend Comparison.”

      So it is the warming rates for the 30-year periods ending around 1940 (not the temperatures themselves) that are comparable to the warming rates in more recent 30-year periods.


      • Thanks for that correction, Bob.

        Just so I don’t look like a complete fool, here are a couple of surface temperature charts from the same area that do show the warmth of the 1930’s.

        These charts show the unadjusted data and show how NASA/NOAA took this data and changed it to make it appear cooler in the past and hotter in the present in their efforts to promote the CAGW narrative that things have been getting hotter and hotter for decade after decade. It’s just not so.

        Unmodified charts from around the world show the 1930’s to be as warm or warmer than subsequent years. Unfortunately, I chose a bad example above with Bob’s chart, but there are plenty of others that make the point.

        It seems to me if it can be shown that the 1930’s were as warm as today, then that destroys the CAGW story. I think it is a pretty good bet that just about all unmodified local and regional surface temperature records show this same temperature profile with the 1930’s being as warm or warmer than subsequent years We ought to take up a collection, and we should probably being guarding this data. I would also bet that none of those unmodified local and regional surface temperature charts would look like a “hotter and hotter” Hockey Stick chart.

        The Hockey Stick is a lie meant to promote the CAGW speculation. Historical evidence shows the Hockey Stick does not represent the real surface temperature profile. The historical evidence shows there is no unprecedented warming in the 21st century. The Hockey Stick is not evidence of anything but fraud.

        It seems to me that if it can be shown that the 1930’s were as warm as today, then that destroys the CAGW story. I think it is a pretty good bet that just about all unmodified local and regional surface temperature records show this same temperature profile, with the 1930’s being as warm or warmer than subsequent years. We ought to take up a collection, and we should probably be guarding this data. I would also bet that none of those unmodified local and regional surface temperature charts would look like a “hotter and hotter” Hockey Stick chart.

      • So it is the warming rates for the 30-year periods ending around 1940 (not the temperatures themselves) that are comparable to the warming rates in more recent 30-year periods.

        So the 30 year warming rate ending 2012, as projected by the models, is around 0.5C per decade *slower* than that seen in the observations?

  12. Bob, how does this compare to Vinther’s 2006 long instrumental record of Greenland? It is a very long study, over 200 years, and the co-authors include prominent alarmist UK scientists Dr. Jones and Dr. Briffa.

    Looking at temps over this long period we find that much earlier decades are warmer than the last few decades and they even hold up well against some of the decades over one hundred years ago, back in the 1800s. See TABLE 8.

    So what will be their excuse when the AMO changes to the cool phase, perhaps sometime in the 2020s? Or has it started already? Who knows? See TABLE 8 from the study comparing decades.

  13. Bob, you’ve also found something else with this graph. The shape of this curve faithfully follows the real temperature record of the US, Canada, Europe, South Africa, Paraguay, Bolivia, etc. (i.e., the world) before the clime syndicate pushed the 20th Century’s high temperatures of the 1930s-40s down, which also took out the 35-yr deep cooling period that had the same clique panicking about an ice age in the making.

    What is the significance of this? It means that in actuality essentially all the global warming we’ve had took place between 1850 and 1940, with no CO2 rise! THAT is the reason for the criminal destruction of the temperature record. The actual record is a slam-dunk falsification of our post-modern climate theory. The “big” warming from 1979 to the end of the century is in actuality all a warming up from the deep cooling. The 1998 El Nino was not a new temperature record at the time. This is when Hansen’s GISS went to work and, like T. Karl, did the major overhaul just before his retirement in 2007.

    • The 20th Century high warming period, with a suitable lag from the ocean warming, likely resulted in outgassing of CO2 on a significant scale after 1950.

      • Gary
        You said, “… likely resulted in outgassing of CO2 on a significant scale after 1950.”

        I completely agree with you.

    • “we’ve had took place between 1850 and 1940 with no CO2 rise! ”

      Err no. There is an important rise in CO2.

      Don’t forget the CO2 effect is ln().

      Between 1845 and 1939 (Law Dome) you have 286 to 309 ppm of CO2.

      That’s 5.35 × ln(309/286), or 0.41 W/m².

      CO2 alone (not counting CH4) added 0.41 W/m² of forcing.

      So from 1850 to 1940, ~0.35 C from CO2 alone.

      Anomaly in 1940: 0.05 (±0.09)
      Anomaly in 1850: −0.55 (±0.14)

      Difference = 0.6 C

      CO2 explains about 0.35 C of this rise.

      The balance, 0.6 C − 0.35 C, is 0.25 C.

      Wanna know what explains this residual?
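The arithmetic in the comment above can be checked directly. The 5.35 × ln(C/C0) form is the standard simplified expression for CO2 radiative forcing (Myhre et al.); whether that is exactly what the commenter used is an inference from the numbers:

```python
import math

# Simplified CO2 radiative forcing: dF = 5.35 * ln(C / C0)  [W/m^2]
dF = 5.35 * math.log(309.0 / 286.0)  # Law Dome CO2, 1845 -> 1939
print(round(dF, 2))                   # 0.41, matching the comment

# Residual: observed 1850-1940 rise minus the claimed CO2 contribution
residual = round(0.6 - 0.35, 2)
print(residual)                       # 0.25
```

The claimed ~0.35 C from 0.41 W/m² implies a transient scaling of roughly 0.85 C per W/m², which the comment does not state explicitly.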

      • Steven, my point still substantially survives. Outgassing was probably a significant factor in the CO2 rise after 1850, with the LIA warm-up, and probably is a measure of the CO2 drawdown by the LIA from similar high CO2 of the MWP a thousand years ago. Until the goal posts were moved from 1950 to 1850 only a couple of years ago, 1950 was considered the beginning of “human influence”. World population was one-third of today’s in 1950, and per-capita energy use was quite a bit lower. World population in 1850 was about a fifth of today’s, and our per-capita energy use was close to negligible. We largely burned sustainable wood and fired our transportation with sustainable oats and hay.

  14. Bob
    You said, “There appears to be roughly a 7- to 8-deg C spread from coolest to warmest model.”

    In other words, the uncertainty of the predictions is about +/- 4 deg C for a mean temperature commonly reported with more precision than 1 deg.

  15. “Figure 5 presents the observed and climate-model-simulated multidecadal (30-year) trends in Greenland near-surface land air temperatures, from 1900 to 2012″.

    This looks like a mistake: I look at Figure 5 and see data starting in 1928 or 1929.

    • Such cases where an unusually strong low pressure area penetrates to the high Arctic have always happened occasionally.

      Peter Freuchen describes such an event that happened around 1900. The snow in North Greenland partially melted and then froze again. The ice layer prevented the caribou from getting at any food, and they died out over most of northern Greenland.

  16. We can learn from the media that the Arctic region warms at double the speed of global warming. Where does that come from?

  17. Griff: I do not know about eskimos, but we in Finland, definitely including the Sami people (the Lapps), do not call temperatures between minus 20 and minus 10 degrees a heat wave.

  18. .
    ❶①❶① . . . The “Temperature Range Comb” . . .


    You might be interested to see how Greenland’s temperature ranges compare to other places in the world.

    The “Temperature Range Comb” (a type of graph which looks like a “comb”) displays temperature ranges for more than 24,000 real locations on the Earth.

    And I am talking about REAL, ACTUAL, ABSOLUTE temperatures. Not those weak, pale, temperature anomaly things.

    The temperature range for each location, goes from the temperature of the month with the highest average high temperature, to the temperature of the month with the lowest average low temperature.

    To make it easier to refer to the temperature ranges, I will call the month with the highest average high temperature, the hottest month. And I will call the month with the lowest average low temperature, the coldest month.

    The temperature ranges are sorted by the hottest temperature, followed by the coldest temperature.

    Each hottest temperature, has a range of coldest temperatures. Because of this, the sorting causes the graph to have the appearance of a “comb”.

    I have given each location a sequence number, based on its position in the sorted list of hottest temperatures. So it goes from 1, for the location with the highest “hottest” temperature, to just over 24,500 for the location with the lowest “hottest” temperature.

    There are 23 Greenland locations, with sequence numbers between 24,524 and 24,604.

    A word of warning. The X-axis is “reversed”, so zero is on the right side of the graph, and high numbers like 24,500 are on the left side of the graph.

    So all of Greenland’s locations are near the left edge of the graph.

    I hope that makes sense. Having to explain hottest, highest, high, average, low, lowest, and coldest, in various combinations, is difficult.

    I am happy to answer any questions, if you have any.
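For readers who think in code, the ordering described above can be sketched in a few lines. This is a guess at the construction from the description alone, with invented station names and temperatures, not the commenter's actual 24,500-location dataset:

```python
# A sketch of the "comb" ordering described above: sort locations by the
# hottest month's average high (descending), breaking ties with the
# coldest month's average low, then assign sequence numbers from 1.
# Station names and temperatures below are invented for illustration.

def comb_order(stations):
    ordered = sorted(stations,
                     key=lambda s: (s["hottest"], s["coldest"]),
                     reverse=True)
    return [dict(s, seq=i + 1) for i, s in enumerate(ordered)]

# Toy data in degrees C; the real comb covers ~24,500 locations.
stations = [
    {"name": "Tropical coast",  "hottest": 33.0, "coldest": 22.0},
    {"name": "Mid-latitude",    "hottest": 28.0, "coldest": -5.0},
    {"name": "Greenland fjord", "hottest": 10.0, "coldest": -25.0},
]

for s in comb_order(stations):
    print(s["seq"], s["name"], s["hottest"], s["coldest"])
```

With this ordering, the Greenland-like station lands at the end of the sorted list, which is why, on the reversed X-axis described above, Greenland locations cluster near the left edge.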

  19. I went to Greenland in August 2019 on an expedition with 7 young professors of climate, glaciers, weather, rocks, science, birds, and oceans. I am not a scientist so it was great to be with young scientists with the latest data.
    We were on a two-week science expedition through the Arctic, north of the Arctic Circle, plus 6 landings on Greenland. I learned that for the first time in 900 years, Greenland can grow gardens (potatoes, leeks, turnips), access oil and gas underground, and once again process minerals that were too far under snow in the past. Northern Canada had their best wheat crop ever, and Siberia is experiencing much the same as Greenland and northern Canada.
    Some countries like global warming!
    Not us, we have a home on the ocean front near Miami Florida so we fear global warming. But at least someone else is doing well with it.
    I think we need to use the word pollution to discuss our sense of urgency. Pollution and overpopulation are the concerns. Then we do not have to fight over scientific measures of how much the globe has warmed.
    All the sarcastic and hate-filled comments above and on so many of these blogs are a waste of time. What shall we each do about pollution and overpopulation? That is the item I wish to discuss.

    • Wow, you have a time machine? That’s big news: “I went to Greenland in August 2019”. Please go 100 years into the future, see if your house is still there or not, and report back.

    • Did you also find out that a lot of “things” that used to be under Snow & Ice are no longer under Snow & Ice just as they used to be before the Snow & Ice came?
      So Population & Pollution are the concerns?
      Not a Billion Starving People, Not 2 Billion without Cheap Energy or proper Medical Care?
      Those people are already here, what do you suggest we do about them?

      Which forms of “Pollution” would you like to discuss?
      What are your personal priorities?

    • Mary

      People here are very anti-Malthus, and many are not overly concerned about population growth.

      Pollution subjects receive very mixed comments here as with a recent article about plastic in general and plastic in the seas.

      Incidentally I do a lot of research at places like the met office and as a generalisation if material has not been digitised it does not exist to many modern researchers. A lot of very interesting material has not yet been digitised and evades published papers.

      Greenland authorities noted a rapidly warming climate back in the 1930s, and it was discussed at an international conference just after the war. Spinks noted that vegetables were being grown again in Greenland during that decade, for the first time since the 1200s or so.


    • Mary, you’ve come to the right place to understand why there are sceptics and what they have to say. Oh yes, there is a lot of unpleasant noise on a blog where our precious eroding freedom still flourishes. Skip the ugly stuff and try to understand some of the excellent science told in an understandable way that a fair number of posters here offer.

      The actual article itself is a legitimate critique of the “consensus” science (an oxymoron in true science). Scepticism has been a hallmark of true science since The Enlightenment. Here is a quote from an essay on TE:

      “The Enlightenment has been defined in many different ways, but at its broadest was a philosophical, intellectual and cultural movement of the seventeenth and eighteenth centuries. It stressed reason, logic, criticism, and freedom of thought over dogma, blind faith, and superstition.”

      Believe it or not, traditionally, scepticism was part and parcel of the scientific method itself. In the excesses of climate science, inextricably and unhealthily bound up with politics, scepticism became a pejorative term. The “consensus” (like a Synod of Bishops) does not tolerate straying from the official line. See the Climategate emails for examples of editors fired for publishing articles sceptical of the science, gatekeeping of publishers, and punishment of their own who question. Don’t read whitewash damage-control articles on Climategate. Read actual examples. They have been catalogued for easier reference.

      WUWT was a natural reaction to the consensus, and is the foremost of the many sceptical blogs that sprang up to supply the scepticism which was absent. I myself am a geologist and engineer, and I even studied paleoclimate in geology. I’m not particularly special here. There are giants of science, including some real Nobel scientists, who have come to this site.

      A challenge! Suspend belief for a month and be a devil’s advocate. Research the case against unequivocal catastrophic global warming and the evidence that the case is in no way settled. Check out warmer periods in our Holocene (the roughly 15,000 years since the end of the last glacial period) when open seas washed the north shore (now ice-locked) beaches of Greenland, strewing them with driftwood! Read the most egregious emails. Read about professors fired for criticizing the science, and more. You will get a wonderful education and, if you still think we are unequivocally headed for a disaster for the human race, habitat, and agriculture, and for mass extinctions, you will at least be speaking from some authority instead of from instilled fearfulness. Respectfully, Gary P

  20. Willis often talks about going back to the raw data. I agree we should examine data at least as a spot check to make sure conclusions are reasonable. With that in mind, I urge you to investigate the B.E.S.T. page for one of the few Greenland stations that provide data prior to 1940. It is

    SW. Greenland coastal fjord station, MITTARFIK ILULISSAT (JAKOBSHAVN LUFTHAVN)

    The record runs from Jan 1866 to Oct 2013. Long records are precious to climate science. But look what BEST did to it!
    There are no fewer than 30 empirical breakpoints in the 35-year period from 1903 to 1938!

    Either the method of determining empirical breakpoints is hideously flawed, or this station’s record keeping is so fundamentally flawed that the station should not be used at all. I prefer the former. This is a station on an arctic fjord, and there is little reason to believe it will behave in synchronicity with a sparse regional kriging network. Regardless, you cannot tease out any “climate signal” from temperature record snippets only a year or two long.

    Ironically, this station has an unusual number of quality control failures after 1980 and a gap from 1990-2005, all in the period of “recent warming”.
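The over-segmentation worry above can be illustrated with a toy example. This is emphatically not BEST's actual empirical-breakpoint algorithm; it is a deliberately naive mean-shift detector with a loose threshold, run on pure synthetic noise, to show how such a scheme can flag "breakpoints" where no climate shift exists:

```python
# NOT the BEST algorithm: a naive mean-shift "breakpoint" detector,
# run on pure noise, to illustrate how a loose threshold can slice a
# noisy monthly record into snippets only a year or two long.
import random

def naive_breakpoints(series, window=24, threshold=0.5):
    # Flag month i when the mean of the `window` months after i differs
    # from the mean of the `window` months before i by more than
    # `threshold` degrees.
    points = []
    for i in range(window, len(series) - window):
        before = sum(series[i - window:i]) / window
        after = sum(series[i:i + window]) / window
        if abs(after - before) > threshold:
            points.append(i)
    return points

random.seed(0)
# 35 years of synthetic monthly anomalies: weather noise, no real shifts.
series = [random.gauss(0.0, 1.0) for _ in range(35 * 12)]
print(len(naive_breakpoints(series)), "flags from noise alone")
```

Every flag here is spurious by construction, which is the commenter's point: a breakpoint scheme tuned too sensitively for the local noise level manufactures discontinuities.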

  21. You have data from 1900 through 2012. Why not show a running 30-yr average of 11-yr smoothed data instead of a linear trend for the entire period? Wouldn’t that be more climatically relevant?
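The smoothing suggested above is straightforward to compute. A minimal sketch in plain Python, using an invented sine-wave stand-in for the annual anomalies (not the actual Berkeley Earth series), of an 11-yr running mean followed by a 30-yr running mean:

```python
# A sketch of the suggestion above: an 11-yr running mean of annual data,
# then a 30-yr running mean of the result. The annual values here are an
# invented sine stand-in, not the actual Berkeley Earth series.
import math

def running_mean(values, window):
    # Simple trailing running mean; returns len(values) - window + 1 points.
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]

years = list(range(1900, 2013))                       # 113 annual values
annual = [0.5 * math.sin(2 * math.pi * (y - 1900) / 60) for y in years]

smooth11 = running_mean(annual, 11)                   # 103 points
clim30 = running_mean(smooth11, 30)                   # 74 points
print(len(smooth11), len(clim30))
```

Note the cost of the suggestion: each pass shortens the series, so the doubly smoothed curve spans only about 74 of the original 113 years, and multidecadal swings are damped along with the noise.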

  22. Once again, we see that, on a regional scale, climate models fail miserably to synchronise with observations, and natural variability – in this case multidecadal natural variability – dwarfs the secular warming trend to such an extent that it becomes difficult indeed to positively identify a secular warming trend. Climate alarmists merely sniff dismissively, pointing out that it’s GLOBAL warming we’re talking about and OF COURSE there will be large regional variability. They forget to also mention that the rapid warming of Greenland, accompanied by rapid surface ice melt and the shrinkage of Arctic sea-ice is attributed almost entirely to the accumulation of GHGs. In reality, a significant portion, if not a majority of Arctic warming in recent years has very likely been due to multidecadal variability. The ‘catastrophic’ Greenland surface ice melt which is predicted to continue uninterrupted by alarmists has been largely due to changes in atmospheric circulation (NAO), occasioning clearer skies above Greenland and enhanced summer surface melting due to direct solar insolation. Likewise, warmer North Atlantic waters have in part been driven by changes in ocean circulation and overturning.
