GOLDSTEIN: Feds scrapped 100 years of data on climate change

From The Province

Lorrie Goldstein

Published: September 17, 2019

Updated: September 17, 2019 2:32 PM PDT

Canadians already suspicious of Prime Minister Justin Trudeau’s carbon tax are likely to be even more suspicious given a report by Ottawa-based Blacklock’s Reporter that Environment Canada omitted a century’s worth of observed weather data in developing its computer models on the impacts of climate change.

The scrapping of all observed weather data from 1850 to 1949 was necessary, a spokesman for Environment Canada told Blacklock’s Reporter, after researchers concluded that historically, there weren’t enough weather stations to create a reliable data set for that 100-year period.

“The historical data is not observed historical data,” the spokesman said. “It is modelled historical data … 24 models from historical simulations spanning 1950 to 2005 were used.”

These computer simulations are part of the federal government’s ClimateData.ca website launched by Environment Minister Catherine McKenna on Aug. 15.

She described it as “an important next step in giving our decision-makers even greater access to important climate data for long-term planning. The more each of us uses this type of information, the more it will help.”

They don’t hold back.

Blacklock’s Reporter, which describes itself as “the only reporter-owned and operated newsroom in Ottawa” focusing on intensive reporting of government documents, notes that in many cases the observed temperatures scrapped by Environment Canada in creating its computer models were higher in the past than today.

For example, Vancouver had a higher record temperature in 1910 (30.6C) than in 2017 (29.5C).

Toronto had a warmer summer in 1852 (32.2C) than in 2017 (31.7C).

The highest temperature in Moncton in 2017 was four degrees cooler than in 1906.

Brandon, Man., had 49 days where the average daily temperature was above 20C in 1936, compared to only 16 in 2017, with a high temperature of 43.3C that year compared to 34.3C in 2017…

And balanced as well.

To be fair, the fact that Environment Canada omitted observed weather data from 1850 to 1949 in developing its computer models is not, in and of itself, evidence of an attempt to mislead the public.

Omitting observed historical weather data from computer models is common in climate science because of differences in the quality of the reporting of weather data today, compared to 1850 when historical records started being kept.

Also, weather is not climate.

Computer climate models don’t claim to predict what the weather will be like on any given day, month or year.

They predict long-term weather and climate patterns.

And then boom.

Having said that, McKenna and other politicians give the public inaccurate information about climate change all the time.

Full article here.

HT/Cam_S

Ed Zuiderwijk
September 19, 2019 1:07 pm

It is modelled historic data. Made with models we know are faulty.

KT66
September 19, 2019 1:36 pm

You can’t make this stuff up…Oh..that’s exactly what they are doing.

Mark Broderick
September 19, 2019 1:44 pm

“Katie Pavlich on the cost of Democrats’ climate change plans”

https://video.foxnews.com/v/6087820162001/#sp=show-clips

J Mac
September 19, 2019 2:03 pm

Scrapping 100 years of weather data was a necessary adjustment for Canada, to mitigate equipment calibration inconsistencies and resolve siting errors. Yeah…. that’s the ticket!

knr
September 19, 2019 3:00 pm

In one way they are right: there are indeed many problems with data from past collections, and there are problems in trying to compare it to the way this data is collected now.
But then the problem is that the ‘worst ever, hottest ever’ climate doom screams are based on the exact same data, and when it suits them they are happy to use it.

Bottom line: there is, even now, nowhere near the level, range, accuracy or any other factor of good measurement practice needed to come up with any ‘global temperature’ or sea level. Large parts of the earth have no measurements, and large parts have few. And past ‘missing information’ is the very reason ‘proxies’ like magic tree rings are used in the first place.

And yet none of this stops the claims of ‘settled science’ and unquestionable predictions to two decimal places for many years ahead, in an area where they cannot tell you whether or not it will rain next Wednesday because of the uncertainty involved.

MarkW
September 19, 2019 3:58 pm

Since the data didn’t match the models, it was obvious that the data was flawed and had to be discarded.

September 19, 2019 4:23 pm

Those old, scrapped observations might have been too few to give a Global or Canada wide picture of past temperatures, but they sure as (fill in the blank) gave a picture of the LOCAL past temperatures! (The locals might want to know!)
Why delete them?
Actual past observations at one of those locations might have been different from what the computer-generated fantasy says it was?
Computaguess away where there is no actual data. But delete the actual local data in favor of the Computer Guess of what the local temps were?
Who and how could anybody defend that??!!??

September 19, 2019 4:48 pm

Re. ICISIL, does that mean that all historical data, the written-down stuff, cannot be used?

Sadly that was decided long ago in Villness in Austria.

So known facts such as the Thames freezing over and having fairs held on it simply did not happen, plus the very detailed records from both Egypt and China. The MWP and the Little Ice Age too, plus the Minoan, Greek and Roman civilisations, all the result of warmer periods, simply never happened.

Eric the Red never got to Greenland, and probably Canada and today’s America.

Next thing we will be told is that the parting of the Red Sea never happened either.
Sarc.

Seriously, it’s obvious that the left wing, in its never-ending quest for world government, will lie and cheat till they get their way.

MJE VK5ELL

icisil
Reply to  Michael
September 19, 2019 5:58 pm

Measured temps can be used to accurately determine what temperatures were in their local areas, and historical anecdotal data can be used to get a general idea what temps were, but those are different than making up temperature data when no measurement data exist.

PaulH
September 19, 2019 5:19 pm

Of course they deleted historic data. What else would you expect them to do?

DanQuébec
September 19, 2019 5:40 pm

Not surprising, this disappearance of 100 years of data from McKenna, our minister of “environment and global warming”, known here as “climate barbie”. Typical lying Libtard techniques to suppress the truth.

xsnake
September 19, 2019 5:52 pm

“The end justifies the means.”
Dem DNA.

Steven Mosher
September 19, 2019 7:49 pm

Boy what a classic misreading

1. They are talking about a web page
2. Here it is https://climatedata.ca/

Basically it is a climate services portal.
nothing here to see.

https://climatedata.ca/about/

So for their WEBPAGE that provides climate services they use data from climate models and ANUSPLIN.
ya, real data

icisil
Reply to  Steven Mosher
September 20, 2019 1:15 am

Steve thinks fake, computer-generated numbers are real data. This exemplifies how far science has fallen, and is nothing more than postmodern alchemy of the mind, i.e., change the perception of reality rather than reality itself.

icisil
Reply to  icisil
September 20, 2019 2:34 am

This link describing ANUSPLIN gives a good idea how climate scientists are living in a world of virtual reality that’s completely detached from the real world. This is the type of computer simulation program that generates their fake temperature numbers where no real data exist. How do you measure the average temperature of, let’s say, 100 million sq. mi. (the southern hemisphere) with 100 thermometers? Easy, you use those 100 real data points and make up all the rest. Science has no meaning in this kind of context. It’s fantasy world.

https://fennerschool.anu.edu.au/research/products/anusplin
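To make the objection concrete: ANUSPLIN itself fits thin-plate smoothing splines, but the basic move of synthesizing grid values from a handful of stations can be sketched with a much simpler stand-in, inverse-distance weighting. Everything below (the station coordinates and readings) is invented purely for illustration; this is not ANUSPLIN’s actual algorithm.

```python
import math

# Hypothetical station readings: (x, y, temperature in C).
# Three "real" data points standing in for sparse coverage.
stations = [(0.0, 0.0, 12.0), (10.0, 0.0, 18.0), (5.0, 8.0, 9.0)]

def idw_estimate(x, y, stations, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from sparse stations."""
    num = den = 0.0
    for sx, sy, temp in stations:
        d = math.hypot(x - sx, y - sy)
        if d < 1e-9:              # exactly on a station: return its reading
            return temp
        w = 1.0 / d ** power
        num += w * temp
        den += w
    return num / den

# Fill a coarse grid: every cell value away from a station is synthesized,
# not measured -- which is exactly the point being argued above.
grid = [[idw_estimate(i, j, stations) for i in range(0, 11, 5)]
        for j in range(0, 9, 4)]
```

Note that every interpolated value is bounded by the station readings it was built from; the grid can never contain information that was not already in the handful of real measurements.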

Scott W Bennett
Reply to  Steven Mosher
September 20, 2019 2:51 am

No!

They used ANUSPLIN only to test the robustness of the BCCAQ method:

“About the BCCAQv2 time-series and maps:
All results displayed are from an ensemble of 24 climate models. Each climate model simulates the climate for the historical period, 1950-2005, and for plausible futures, 2006-2100, in response to three emissions scenarios representing different atmospheric concentrations of greenhouse gases (RCP2.6, RCP4.5 and RCP8.5).”

Misreading! It is brazen propaganda of the highest order. Just type in a place name and read the dishonest shite that comes up. The so called “historical data” isn’t data and it doesn’t match real observations and makes no pretence of doing so! Temps are all lower, it is always better in the past and the future is always a disaster! As for the RCP scenarios, even the IPCC has disowned them as completely unreal.

Adahman
Reply to  Steven Mosher
September 20, 2019 10:23 am

Climate MODELS are not actual verifiable data. Never have been, never will be.

Reply to  Steven Mosher
September 20, 2019 2:32 pm

So … a curious John or Jane Q. Public can go to the web page, but they won’t see the observed temperature (if there was one) for a locale, only a modelled temperature?
And you’re OK with that because, if they know the ins and outs of digging deeper into the web, they might stumble upon it?
That’s worse than giving the impression that the most recent year a record high/low for the day was tied is the year it was set.

Patrick MJD
September 19, 2019 9:53 pm

“Climatedata.ca is a collaboration between Environment and Climate Change Canada, the Computer Research Institute of Montréal (CRIM)…”

CRIMinal!

Data from computer models is real data, since when? All I see at that About link is models, models, models all the way down, but they do say it’s all “quality controlled and unadjusted”.

Your comedy class is coming along well.

DocSiders
September 20, 2019 12:02 am

There WERE not enough weather stations prior to 1950 to determine global average temperatures (with satellite or radiosonde accuracies), period. No pre-1950 proxies have enough accuracy or resolution to produce valid trends with the accuracy and resolution claimed by today’s climate science crowd. The data is not there. Models may claim to be able to “fill in the holes”, but that is a lie unsupportable by standard statistical analysis.

Models are not able to provide data good enough to “tack onto” the more complete and higher resolution post 1950 climate data. To claim otherwise is a lie.

We are in a war for the preservation of western civilization. Bullets are being fired right at us. How do we fight this…they own the battlefields (Media, Academia, Education, Entertainment, Government, Big Data) and have all the weapons and ammunition (Media and Academia and Education disseminating lies and propaganda).

Patrick MJD
Reply to  DocSiders
September 20, 2019 1:52 am

How NOAA was able to measure a global land and sea average is beyond me. Oh wait! They made it up! Like all averages!

Editor
September 20, 2019 12:28 pm

So, as we see, another misuse of “climate models” — they model the past instead of reporting the past.

Models similarly cannot model (project, predict, anything future-pointed) past the point where the model mathematics breaks down into the chaotic realm discovered and highlighted by Edward Lorenz, “who established the theoretical basis of weather and climate predictability, as well as the basis for computer-aided atmospheric physics and meteorology. He is best known as the founder of modern chaos theory, a branch of mathematics focusing on the behavior of dynamical systems that are highly sensitive to initial conditions. His discovery of deterministic chaos “profoundly influenced a wide range of basic sciences and brought about one of the most dramatic changes in mankind’s view of nature since Sir Isaac Newton,” according to the committee that awarded him the 1991 Kyoto Prize for basic sciences in the field of earth and planetary sciences.”

As there are no valid real-world predictions of future climate states — there is nothing real-world about climate model “predictions” of the past.

Reply to  Kip Hansen
September 20, 2019 2:46 pm

Kip, I think the term you’re looking for is the horizon of predictability.

horizon of predictability: Roughly speaking, the time up to which a chaotic system’s behavior can be successfully predicted despite some uncertainty in its initial state. For times much longer than this, accurate prediction becomes impossible. Mathematically, the horizon of predictability is defined as the reciprocal of the system’s Lyapunov exponent.

Lyapunov exponent: The rate at which nearby trajectories diverge in a chaotic system. Defined mathematically as λ, the rate of separation, in the formula d(t) ≈ d₀ · exp(λt), where d(t) is the exponentially growing distance between the trajectories and t is the time for which they’ve been separating.

Planetary orbits are chaotic. The horizon of predictability of planetary orbits is on the order of 1 to 5 million years. Unlike Newton’s clockwork universe, it appears that chaos rules the day.

The horizon of predictability of the Earth’s weather system is less than two weeks. Even if we were to use the exact same equations that the Earth’s weather system uses, we still couldn’t predict weather past that two-week horizon. Such are the limits of a chaotic system.

Jim
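Jim’s two definitions can be checked numerically. A minimal sketch using the logistic map at r = 4 (a standard textbook chaotic system whose Lyapunov exponent is known to be ln 2): two trajectories starting 10⁻⁶ apart track each other for a while, then diverge, and (1/λ)·ln(0.1/d₀) ≈ 17 steps gives the horizon beyond which the “prediction” is off by more than 0.1.

```python
import math

def logistic(x, r=4.0):
    """One step of the logistic map, fully chaotic at r = 4."""
    return r * x * (1.0 - x)

# Two trajectories that start almost identically.
x, y = 0.400000, 0.400001
d0 = abs(x - y)                        # initial separation, ~1e-6

lam = math.log(2.0)                    # Lyapunov exponent of the r = 4 map
horizon = math.log(0.1 / d0) / lam     # steps until separation reaches ~0.1

separations = []
for _ in range(30):
    x, y = logistic(x), logistic(y)
    separations.append(abs(x - y))

# Early on the trajectories agree closely; after roughly the horizon
# (~17 steps here) they have decorrelated completely, no matter how
# precisely the map itself is known.
```

The same arithmetic, applied to the atmosphere’s much larger Lyapunov exponent, is what produces the roughly two-week weather-prediction horizon mentioned above: shrinking the initial error buys only logarithmically more lead time.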

Amber
September 21, 2019 4:11 pm

Sounds an awful lot like Climategate’s “Nature trick” applied to climate models: get the results to match the narrative.
Whew, what a relief. Liberals lying as usual, and now maybe Atwood won’t die in a car crash.
So now, after being caught, they admit they just made up data for 100 years.
The climate con-game relied on fake data and they got it gift-wrapped.

More and more of the outright deceit is going to come pouring out as the noose tightens around the fraudsters pumping the scam. We are going to see more volunteers falling on swords, like the James Comey set-up about leaking data.
“Err, well, the models, you know, are subject to complex variables that we cannot model.”
So while they do model, they are likely hundreds of percent wrong. Of course you knew that all along, right?
I mean, that is just settled science, right? “Oh no, we had no intention of completely bullshitting you for 25 years. Clearly you have misunderstood … we had no intent of enabling the world’s biggest fraud.”

Amber
September 21, 2019 4:22 pm

So they scrapped the data, just made crap up for 100 years, and enabled the world’s biggest fraud.
It takes someone doing an investigation to pull the curtain back on the climate sausage factory.
Then they put a little cutie face in front of it to huff and puff about saving the planet.
The planet is more in danger when con-men get away with this fabrication and deceit.
I don’t think it’s an accident this is coming out now. The Liberals are going to get smoked in a couple of months, and the last thing the climate manufacturers want is an audit trail of the crap they pulled.
Clear evidence of fraud.