New distance measurements bolster challenge to basic model of universe

Observational results inconsistent with theory

National Radio Astronomy Observatory

A new set of precision distance measurements made with an international collection of radio telescopes has greatly increased the likelihood that theorists need to revise the “standard model” that describes the fundamental nature of the Universe.

The new distance measurements allowed astronomers to refine their calculation of the Hubble Constant, the expansion rate of the Universe, a value important for testing the theoretical model describing the composition and evolution of the Universe. The problem is that the new measurements exacerbate a discrepancy between previously measured values of the Hubble Constant and the value predicted by the model when applied to measurements of the cosmic microwave background made by the Planck satellite.

“We find that galaxies are nearer than predicted by the standard model of cosmology, corroborating a problem identified in other types of distance measurements. There has been debate over whether this problem lies in the model itself or in the measurements used to test it. Our work uses a distance measurement technique completely independent of all others, and we reinforce the disparity between measured and predicted values. It is likely that the basic cosmological model involved in the predictions is the problem,” said James Braatz, of the National Radio Astronomy Observatory (NRAO).

Braatz leads the Megamaser Cosmology Project, an international effort to measure the Hubble Constant by finding galaxies with specific properties that lend themselves to yielding precise geometric distances. The project has used the National Science Foundation’s Very Long Baseline Array (VLBA), Karl G. Jansky Very Large Array (VLA), and Robert C. Byrd Green Bank Telescope (GBT), along with the Effelsberg telescope in Germany. The team reported their latest results in the Astrophysical Journal Letters.

Edwin Hubble, after whom the orbiting Hubble Space Telescope is named, first calculated the expansion rate of the universe (the Hubble Constant) in 1929 by measuring the distances to galaxies and their recession speeds. The more distant a galaxy is, the greater its recession speed from Earth. Today, the Hubble Constant remains a fundamental property of observational cosmology and a focus of many modern studies.

Measuring recession speeds of galaxies is relatively straightforward. Determining cosmic distances, however, has been a difficult task for astronomers. For objects in our own Milky Way Galaxy, astronomers can get distances by measuring the apparent shift in the object’s position when viewed from opposite sides of Earth’s orbit around the Sun, an effect called parallax. The first such measurement of a star’s parallax distance came in 1838.
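For readers who want to see the arithmetic, the parallax relation is simply small-angle geometry: an object whose annual parallax (conventionally, half the total apparent shift) is p arcseconds lies at a distance of 1/p parsecs. Below is a minimal illustrative sketch in Python; the function name and example value are ours, not from the article.

```python
# Parallax distance: d [parsecs] = 1 / p [arcseconds]
# Small-angle geometry over the 1 AU Earth-Sun baseline; by definition,
# a star with a 1-arcsecond parallax lies at exactly 1 parsec.

LIGHT_YEARS_PER_PARSEC = 3.2616

def parallax_distance_pc(parallax_arcsec: float) -> float:
    """Distance in parsecs from an annual parallax angle in arcseconds."""
    return 1.0 / parallax_arcsec

# Hypothetical star with a 0.1-arcsecond parallax:
d_pc = parallax_distance_pc(0.1)
print(f"{d_pc:.1f} pc  (~{d_pc * LIGHT_YEARS_PER_PARSEC:.0f} light-years)")
```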

Beyond our own Galaxy, parallaxes are too small to measure, so astronomers have relied on objects called “standard candles,” so named because their intrinsic brightness is presumed to be known. The distance to an object of known brightness can be calculated based on how dim the object appears from Earth. These standard candles include a class of stars called Cepheid variables and a specific type of stellar explosion called a Type Ia supernova.
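The same idea can be written down with the inverse-square law. In the magnitude system astronomers use, a candle of known absolute magnitude M observed at apparent magnitude m satisfies the distance modulus relation m - M = 5 log10(d / 10 pc). Here is a short, hedged Python sketch; the Type Ia absolute magnitude and the example apparent magnitude are illustrative values, not numbers taken from this article.

```python
def candle_distance_pc(apparent_mag: float, absolute_mag: float) -> float:
    """Distance in parsecs from the distance modulus: m - M = 5*log10(d/10pc)."""
    return 10.0 ** ((apparent_mag - absolute_mag + 5.0) / 5.0)

# Illustrative Type Ia supernova: absolute magnitude ~ -19.3,
# observed (apparent) magnitude 16.7
d_pc = candle_distance_pc(16.7, -19.3)
print(f"{d_pc / 1e6:.0f} Mpc")  # ~158 Mpc
```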

Another method of estimating the expansion rate involves observing distant quasars whose light is bent by the gravitational effect of a foreground galaxy into multiple images. When the quasar varies in brightness, the change appears in the different images at different times. Measuring this time difference, along with calculations of the geometry of the light-bending, yields an estimate of the expansion rate.
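The reasoning behind that last step can be reduced to a scaling argument: for a fixed model of the lensing galaxy, the predicted delay between images is inversely proportional to the Hubble Constant, so the ratio of the predicted delay (computed with a trial value of the constant) to the observed delay rescales that trial value. The sketch below is a deliberate simplification with invented numbers; real analyses model the full mass distribution of the lens.

```python
# For a fixed lens model, the time delay scales as 1/H0:
#   dt_observed / dt_predicted(H0_trial) = H0_trial / H0_estimate
H0_TRIAL = 70.0            # km/s/Mpc used to compute the model delay
dt_predicted_days = 90.0   # hypothetical delay predicted with H0_TRIAL
dt_observed_days = 85.2    # hypothetical measured delay between images

H0_estimate = H0_TRIAL * dt_predicted_days / dt_observed_days
print(f"H0 ~ {H0_estimate:.1f} km/s/Mpc")  # ~73.9
```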

Determinations of the Hubble Constant based on the standard candles and the gravitationally-lensed quasars have produced figures of 73-74 kilometers per second per megaparsec; in other words, for every additional megaparsec of distance (a unit favored by astronomers, equal to about 3.26 million light-years), a galaxy’s recession speed increases by 73-74 kilometers per second.

However, predictions of the Hubble Constant from the standard cosmological model when applied to measurements of the cosmic microwave background (CMB) — the leftover radiation from the Big Bang — produce a value of 67.4 kilometers per second per megaparsec, a significant and troubling difference. This difference, which astronomers say is beyond the experimental errors in the observations, has serious implications for the standard model.
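A brief note on what “beyond the experimental errors” means in practice: such discrepancies are usually quantified in standard deviations, dividing the difference between the two values by the quadrature sum of their uncertainties. The sketch below uses illustrative error bars (roughly ±2 on the local measurements and ±0.5 on the Planck-based value); these are assumptions for illustration, not uncertainties quoted in this article.

```python
import math

def tension_sigma(h0_a: float, sigma_a: float, h0_b: float, sigma_b: float) -> float:
    """Gaussian tension (in standard deviations) between two independent estimates."""
    return abs(h0_a - h0_b) / math.hypot(sigma_a, sigma_b)

# Illustrative numbers: local measurements ~74 +/- 2, CMB-based ~67.4 +/- 0.5
print(f"{tension_sigma(74.0, 2.0, 67.4, 0.5):.1f} sigma")  # ~3.2 sigma
```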

The model is called Lambda Cold Dark Matter, or Lambda CDM, where “Lambda” refers to Einstein’s cosmological constant and is a representation of dark energy. The model divides the composition of the Universe mainly between ordinary matter, dark matter, and dark energy, and describes how the Universe has evolved since the Big Bang.

The Megamaser Cosmology Project focuses on galaxies with disks of water-bearing molecular gas orbiting supermassive black holes at the galaxies’ centers. If the orbiting disk is seen nearly edge-on from Earth, bright spots of radio emission, called masers — radio analogs to visible-light lasers — can be used to determine both the physical size of the disk and its angular extent, and therefore, through geometry, its distance. The project’s team uses the worldwide collection of radio telescopes to make the precision measurements required for this technique.
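The final step of that geometry is again the small-angle relation: once the physical radius of the maser disk is known (from the orbital speeds of the maser spots) and its angular radius has been measured by the radio telescopes, the distance is their ratio. The sketch below shows only this last division; the disk radius and angular size are invented example values, not the project’s measurements.

```python
import math

MILLIARCSEC_TO_RAD = math.radians(1.0) / 3.6e6   # 1 milliarcsecond in radians
LIGHT_YEARS_PER_PARSEC = 3.2616

def maser_distance_pc(disk_radius_pc: float, angular_radius_mas: float) -> float:
    """Small-angle distance: D = physical radius / angular radius (in radians)."""
    return disk_radius_pc / (angular_radius_mas * MILLIARCSEC_TO_RAD)

# Invented example: a 0.2-parsec maser disk subtending 0.8 milliarcseconds
d_pc = maser_distance_pc(0.2, 0.8)
print(f"{d_pc / 1e6:.0f} Mpc  (~{d_pc * LIGHT_YEARS_PER_PARSEC / 1e6:.0f} million light-years)")
```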

In their latest work, the team refined their distance measurements to four galaxies, at distances ranging from 168 million light-years to 431 million light-years. Combined with previous distance measurements of two other galaxies, their calculations produced a value for the Hubble Constant of 73.9 kilometers per second per megaparsec.
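Once a geometric distance like this is in hand, the last step is Hubble’s own relation, v = H0 × d: divide a galaxy’s recession velocity by its distance. In practice the team fits all of the galaxies jointly and accounts for their individual, non-cosmological motions, but the basic arithmetic looks like the sketch below, which uses hypothetical numbers chosen only to illustrate the scale.

```python
# Hubble's law: v = H0 * d  =>  H0 = v / d   [km/s per Mpc]
# Hypothetical values for a single galaxy; the real analysis fits several
# galaxies together and corrects for peculiar (non-cosmological) velocities.

recession_velocity_km_s = 3800.0   # from the galaxy's measured redshift
distance_mpc = 51.4                # from the maser geometry

H0 = recession_velocity_km_s / distance_mpc
print(f"H0 ~ {H0:.1f} km/s/Mpc")   # ~73.9
```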

“Testing the standard model of cosmology is a really challenging problem that requires the best-ever measurements of the Hubble Constant. The discrepancy between the predicted and measured values of the Hubble Constant points to one of the most fundamental problems in all of physics, so we would like to have multiple, independent measurements that corroborate the problem and test the model. Our method is geometric, and completely independent of all others, and it reinforces the discrepancy,” said Dom Pesce, a researcher at the Center for Astrophysics | Harvard and Smithsonian, and lead author on the latest paper.

“The maser method of measuring the expansion rate of the universe is elegant, and, unlike the others, based on geometry. By measuring extremely precise positions and dynamics of maser spots in the accretion disk surrounding a distant black hole, we can determine the distance to the host galaxies and then the expansion rate. Our result from this unique technique strengthens the case for a key problem in observational cosmology,” said Mark Reid of the Center for Astrophysics | Harvard and Smithsonian, and a member of the Megamaser Cosmology Project team.

“Our measurement of the Hubble Constant is very close to other recent measurements, and statistically very different from the predictions based on the CMB and the standard cosmological model. All indications are that the standard model needs revision,” said Braatz.

Astronomers have various ways to adjust the model to resolve the discrepancy. Some of these include changing presumptions about the nature of dark energy, moving away from Einstein’s cosmological constant. Others look at fundamental changes in particle physics, such as changing the numbers or types of neutrinos or the possibilities of interactions among them. There are other possibilities, even more exotic, and at the moment scientists have no clear evidence for discriminating among them.

“This is a classic case of the interplay between observation and theory. The Lambda CDM model has worked quite well for years, but now observations clearly are pointing to a problem that needs to be solved, and it appears the problem lies with the model,” Pesce said.

###

The National Radio Astronomy Observatory is a facility of the National Science Foundation, operated under cooperative agreement by Associated Universities, Inc.

From EurekAlert!

82 thoughts on “New distance measurements bolster challenge to basic model of universe”

  1. “Observational results inconsistent with theory.”

    Redo those observations until they are consistent with theory.

      • They are probably underestimating the importance of CO2 in their model. It’s worse than we thought!

        They just need to compare to recent work on clouds and apply the same techniques to their maser measurements.

          • It’s a herd of Perfect Blackbody Bovines expelling CO2 through their farts as they cruise through the universe at light speed.

      • Not exactly, “their way” is to stop making those inconvenient observations and declare the science settled.

        Interesting new analyses. One way to make maps is to watch from afar and guess. Another is to go and measure. Early maps from the guessers didn’t look much like the maps from the actual explorers.

    • If there were Trillions of dollars of redistributed wealth to be skimmed by a small subset of elitists, then you can be assured they would be funding the efforts to adjust data to fit a profitable theory. As it is, science that remains “un-politicized” can remain largely uncorrupted as a field of honest and ethical inquiry.

      What has corrupted climate science into junk science is the vast wealth it promises to be re-distributed and who can profit from skimming a piece of the action.

        • Loydo,

          You make a mistake that is all too common these days, and which explains much of what is unfolding at every level of analysis in human affairs – namely the slow, relentless and insidious division of all human inquiry and endeavor into polarized camps; the death of nuance, and the birth of an integrated mass-movement – certainly in the West – towards trying to solve highly complex issues through a binary framing: – black/white, warming/cooling, east/west, for us or against us, good/evil. And the proposed remedy is always the same: Big Solutions, preferably via well meaning, enlightened “experts” (like you?)

          I’ve read your commentary for some time; you seem to be just another disciple of this perspective and your comment here simply underscores that. Not that many of the skeptics here on WUWT are exempt of course. Maintaining balance isn’t easy. No-one’s exempt; it takes a significant amount of cognitive and emotional wrangling to extricate oneself from drifting into binary patterns of thought and emotion, and there is in fact a strong and rapidly growing body of psychological and neurological literature showing how the internet, social media and the propensity to conveniently apply algorithm-driven solutions to every complex, interconnected issue these days undermine cohesion all the way from geopolitical issues right down to how the Big 5 personality traits are organised within our own minds.

          Against this backdrop, take another look at your knee-jerk “us vs them” framing on the subject of “Elites”. The (faux) Utopian hunt for a simple, elegant, “morally homeostatic” society is very old wine in a shiny new bottle. History has spoken clearly on where this kind of thinking leads – without exception.

          • History has spoken clearly on where this kind of thinking leads – without exception.

            Thanks Peter for your erudite analysis of the psychological state of modern humanity. Just a quick question on how history has spoken regarding where Lloydo’s kind of thinking leads – if it’s “without exception,” are you saying there’s a simple binary choice in the matter?

          • Thank you, Peter, for pointing this out. I definitely agree, and the concept is in no way limited to either end of the spectrum. Social media definitely seems to create polarising thoughts and behaviours.

          • Replying to syscomputing:

            if it’s “without exception,” are you saying there’s a simple binary choice in the matter?

            I see where you are coming from, but syscomputing was saying that history shows us this, so it has always happened. By all means provide evidence if you think it’s not correct.

            I agree that binary stances lead to conflict and no real solution (which is what I believe is being put forward). Compromise is not very evident on social media, and I believe that it is exacerbating the problem.

            I’m one of those weird individuals who holds self-conflicting views on many topics, so I’m probably wrong…

          • You are right Peter, they have even managed to make racial prejudice into a black and white issue !

            No, I’m not being sarcastic. Just look at the range of racially mixed individuals who “identify” as being “black”. Any fraction of non-white lineage seems to make you “black”. Indeed, you do not even need to have any African ancestry now. All you need is to feel black.

            All nuance of the racial melting pot has been banned. You are either “black” or you are part of the oppressive white patriarchy.

            Polarisation : divide and conquer.

          • Us vs Them is an innate characteristic of all humans. Working in coordinated groups allowed us to out-compete other groups over the millennia and survive. Put a group of humans together and within a short while there will be a hierarchy. When two groups are put near each other, they quickly become tribal (Us vs Them).

            Humans ALL have a rather large amount of confirmation bias built into their moral and emotional psychology. There is no way to rid ourselves of this bias. The best we can hope for is to partially recognize the bias and attempt to correct for it. Best scenario is to present our argument/theory and then allow others to argue, discuss and theorize about our work. That way the confirmation bias can be attacked from multiple angles at the same time.

          • Thank you for your thoughtful comments, Peter. Not much of what you say I could disagree with and of course your warnings are acknowledged. Can you not however see the irony of: “If there were Trillions of dollars of redistributed wealth to be skimmed by a small subset of elitists…” If? It’s already happened, is still happening and accelerating. That is not controversial, it’s not even “us and them”, it’s just a statistical truth. Pat wants to ‘splain it as capitalism envy but that’s the nub of the irony – the *actual* “small subset of elitists” already *are* the beneficiaries, but he wants to defend them even as he is being screwed over. Watch how well a rising stock price is received in the midst of an economic crisis suffered disproportionately by low-paid workers with insecure jobs.

            I would argue that rising wealth disparity is as much of a risk as, or even more than, the black and white, polarizing thinking you describe, and if you dissect the entrails of the various collapsed societies and civilisations, extreme stratification never ends well.

            You and sycomputing want to characterise me as someone with a “type of thinking” but you need to keep that in perspective. My opinions stand out here as “opposite” but bollocks should be challenged and I promise you there is a lot of bollocks written on WUWT that goes completely unchallenged. Sometimes just quietly smfh is not enough.

            After all that you go straight for the “they”, Greg? Really? For Weylan (and for most here), too much “us and them” is never enough.

          • Loydo:

            You and sycomputing want to characterise me as someone with a “type of thinking” but you need to keep that in perspective.

            What do you mean, “[y]ou and sycomputing”?

            Speaking of keeping things in perspective, I think somebody isn’t paying attention (and when I say “somebody,” that means you).

          • Binary thinking ignores the Rule of Three.

            Problems are not yes or no. They are yes, no or maybe. Positive, negative or neutral. True, false, or NULL.

          • Loydo ( June 16, 2020 at 2:14 am), “Pat wants to ‘splain it as capitalism envy …

            Nothing of the kind, Loydo. I merely pointed out that nearly everyone in the US is getting wealthier.

            That includes the immigrant class, as they assimilate into society. So Thomas Sowell has reported.

            My final sentence just observed that Euro anti-Americans insist on pejoratively misrepresenting the US, just as you have typically done.

            And like all socialists, you want to cure the curse of greedy corporate elites by turning all of society into one giant corporation, to the sole benefit of your own greedy corporate elites.

            None of you seem able to grasp the critical irony at the center of your econo-political stupidity.

          • It has been shown, in economic literature, that free market economies do far better than less free market economies. Half of lower quintile taxpayers move up into higher quintiles within ten years, as is the reverse – higher quintile taxpayers moving down into lower quintiles.

            https://www.treasury.gov/resource-center/tax-policy/Documents/Report-Income-Mobility-2008.pdf

            The people in lower and higher tax brackets are exchanging positions frequently within a dynamic free economy. Socialist economies, on the other hand, ensure that nobody moves up or down because a small cadre of apparatchiks determine how much everybody makes and ensures one remains where they’ve been allocated. I’m not sure there’s really a quintile in a good socialist country – there’s only the few and the rest who stand in bread lines.

          • Zig Zag Wanderer:

            I’m one of those weird individuals who holds self-conflicting views on many topics, so I’m probably wrong…

            Actually, you’re both right and wrong in this case. I guess that means it all evens out in the end. 🙂

            I see where you are coming from, but syscomputing was saying that history shows us this, so it has always happened. By all means provide evidence if you think it’s not correct.

            I think you meant “Peter was saying that . . . ” etc. Anyway, I’m happy to grant to Peter’s binary premise without comment:

            History has spoken clearly on where this kind of thinking leads – without exception.

            But that premise contradicts his criticism of Loydo:

            You make a mistake that is all too common these days . . . trying to solve highly complex issues through a binary framing

            My initial question to him was intended to lead, depending on how he answered, into how in the world his criticism of Loydo’s (assumed) binary framing could possibly make any sense when he used binary logic to criticize the idea of binary thinking.

            Alas, it wasn’t to be. Peter did a “Mosher.”

        • Typical socialist, he assumes that all wealth is stolen.

          Socialists view themselves as being the smartest, best people around. The fact that there are people out there who have more than the socialists do is proof that the system is broken and that the socialists need to be in charge so that they can finally be given everything they believe themselves entitled to.

        • If you ever decide to do the work Loydo, you can find US household income here: https://www.census.gov/library/publications/2019/demo/p60-266.html

          Table A-2. Households by Total Money Income, Race, and Hispanic Origin of Householder: 1967 to 2018

          Plotting out those data, one finds that the number of wealthy households is increasing across that range, while the number of poor households is nearly constant.

          That’s exactly what one expects in a wealth-producing society with accompanying large-scale immigration.

          One expects income disparity to increase, because as overall wealth increases new immigrant households enter the low income workforce.

          In his “Wealth, Poverty and Politics,” Thomas Sowell points out that studies of IRS tax records show large-scale movements between income groups. Rich people move down, and the poor become middle class. There is no permanent hierarchy of rich and poor in the US.

          European anti-Americans love to paint the US as a capitalist hellhole of greedy rich and oppressed serfs. They’re idiots (mostly compensating for their own elitist fantasies).

        • Dear Lloydo:

          Please get back to us after you liquidate all your assets above the worldwide average, and send it along with any income in excess of that other average to some agency to redistribute.

      • There wasn’t big climate science money in 1988, Joel, when Jim Hansen made his deceptive testimony before Congress.

        Nor in 1995 when Ben Santer made his deceptive changes to the IPCC SAR. Nor in 1998/99 when Michael Mann used false methods to make his deceptive paleo so-called temperature reconstruction hockey stick.

        Nope, they had to have a different reason back then. I think the reason is that they were, and are, bad people. Conniving little cacat olim.

    • The problem is the ‘limitations’ of mathematics.
      Maths uses many ‘tricks’ and in the very narrow field of planetary physics, it works.
      But try applying mathematics to the cosmological scale or conversely to the sub-atomic scale and quite clearly it fails.
      The very idea that there are any universal ‘Constants’ is risible.

      • That, of course, is the issue. Is the Hubble constant really a constant? A basic assumption in science is that the laws of physics, especially the various constants, are immutable – true for all time and everywhere in the universe. If they’re not, then we have a problem. An ongoing question is precisely that: have some of these constants changed since the Big Bang? If so, how do we detect it?

    • Astronomers are way more honest than that.

      They aren’t on an anti-industrial jihad to save the world, and they blow all their money on telescopes.

  2. Real science at work: the model does not match the data, hence there’s a problem with the model.

    Pseudo science at work: the model does not match the data, hence the data are wrong.

    • Ed, I think that you are being a bit dogmatic. History suggests a more complicated picture.
      Consider the Michelson-Morley experiment. The null result was in contradiction to the expected result from the ether model. Einstein showed that the observations were correct and the model wrong. However, more than 100 years later, observations of the passage of neutrinos indicated that they were travelling faster than light speed, contrary to models from the time of Einstein’s earliest relativity work. On this occasion further work showed that it was the observations that were in error, not the model.

      • First rule of climate ‘science’: when reality and models differ in value, it’s reality which is in error.

      • I assume you are referring to the CERN paper about a decade ago. That observation of fast neutrinos was shown to be an artifact of an incorrect measurement of the arrival times. Someone had overlooked a tiny systematic effect in the detector. There are no observations of superluminal neutrinos, I’m afraid.

          • My point, Ed. On this occasion the theory or model, i.e. that particles cannot travel faster than light, was correct; the observations were incorrect. Sometimes you should trust the models, not the observations. The tricky bit is knowing when. Going back to the subject of this post, two different methods give 73-74 km/sec/megaparsec, higher than the standard model predicts, strongly suggesting that the standard model is in need of attention. Might need another Einstein though.

          • So the one example you can come up with to support your theory proves to be inaccurate, but your theory is still accurate?

          • Not sure about that. I for one knew immediately that it was a spurious result the moment I heard of it. Why? It always struck me as extremely odd that it was such a tiny effect. If nature were to give us superluminal neutrinos, why not ones with a whopping 1.5 or 2 lightmach? No, only a measly lightmach 1.0…01. Poor show. Actually I also considered that CERN had made a major error in going public without thorough corroboration (like, for instance, happened when symmetry breaking was discovered); you could almost set your watch by the antisemitic comments directed at Einstein appearing on the web.

      • The Michelson-Morley experiment didn’t yield a null result.

        0.1 is not 0

        Read the paper rather than relying on people who wanted a null result.

        • Another very misleading comment by Zoe Phin. Stellar aberration disproves the ether drag theories of Fresnel and Stokes. Water-filled telescopes disprove the index-of-refraction explanation of Young for stellar aberration. The tiny result for the motion of the Earth WRT the ether is essentially a null result. As stated in Michelson-Morley 1887, the experiment was performed over a six-month period. Unless the ether was being dragged by the Earth, there shouldn’t be a tiny result at both locations in the Earth’s orbit that are six months apart, since the Earth’s orbital velocity is pointed in opposite directions.

          Jim

          • “Another very misleading comment by Zoe Phin.”

            When did I ever make a misleading comment?

            “Fresnel, Stokes, Young”
            Great 19th century physicists, but maybe you want to move into the 20th century?

            I suggest you read this with an open mind:
            http://www.orgonelab.org/miller.htm

            For example:
            “It is also notable that this was the second time Michelson’s work had significantly detected an ether, though in the first instance of Michelson and Gale (1925) the apparatus could only measure light-speed variations along the rotational axis of the Earth. These papers by Michelson and also by Kennedy-Thorndike have conveniently been forgotten by modern physics, or misinterpreted as being totally negative in result, even though all were undertaken with far more precision, with a more tangible positive result, than the celebrated Michelson-Morley experiment of 1887. Michelson went to his grave convinced that light speed was inconstant in different directions, and also convinced of the existence of the ether. The modern versions of science history have rarely discussed these facts.”

          • >>
            When did I ever make a misleading comment?
            <<

            Seriously?

            >>
            “Fresnel, Stokes, Young”
            Great 19th century physicists, but maybe you want to move into the 20th century?
            <<

            If you had read Michelson-Morley 1887, they discussed work done by Fresnel, Fizeau, Lorentz, and Stokes. I guess your chastisement of another commenter for not reading the 1887 paper didn’t mean that you actually read the paper too. I thought we were discussing a 19th century paper written by 19th century physicists. I didn’t know we needed to move into the 20th century.

            However, I’d like to quote from a textbook: “Introduction to the Theory of Relativity” by Peter Gabriel Bergmann. Its original copyright is 1942. It has a foreword by Albert Einstein who makes the following statement:

            “Much effort has gone into making this book logically and pedagogically satisfactory, and Dr. Bergmann has spent many hours with me which were devoted to this end. It is my hope that many students will enjoy the book and gain from it a better understanding of the accomplishments and problems of modern theoretical physics.”

            The textbook is approved by Einstein. Is that 20th century enough for you?

            So in Chapter 3: “The Propagation of Light” and in the section covering the Michelson-Morley experiment is the following:

            “. . . Any effect should have been clearly observable after all the usual sources of error, such as stresses, temperature effects, and so forth, had been eliminated. Nevertheless, no effect was observed.

            An impasse was at hand: No consistent theory would agree with the results of Fizeau’s experiment, the Michelson-Morley experiment, and the effect of aberration. . . .”

            In other words, a null result. An experiment designed to find a result, found no result.

            Jim

        • So now we can add the ether to the list of crazy ideas that Zoe Phin pretends to believe when trolling.

          Where do you stand on phlogiston?

  3. I remember reading a few years ago that “standard candles” are not as standard as had been presumed. That conclusion, apparently, hasn’t been accepted by the powers that be because they are still assuming that the intrinsic brightness of these standard candles is known.

    In any case, the new result of 73.9 kilometers per second per megaparsec is remarkably close to the 73-74 kilometers per second per megaparsec that was previously determined for the Hubble Constant based on standard candles and gravitationally-lensed quasars. It’s hard to imagine that coincidence alone could explain such a match.

    • “In any case, the new result of 73.9 kilometers per second per megaparsec is remarkably close to the 73-74 kilometers per second per megaparsec that was previously determined for the Hubble Constant based on standard candles and gravitationally-lensed quasars. It’s hard to imagine that coincidence alone could explain such a match.”

      That’s what I was thinking, too.

      This new result looks like a confirmation of one of the measurement methods.

    • Using the “standard candle” method, they came up with a value of 67.4 kps. The 73.9 kps was the value originally calculated by Hubble.

  4. Six data points for galaxies out to 1/25 the age of the universe? Doesn’t seem like a very robust set of data.

    • Has anyone recently replicated Hubble’s experiment? Repeating similar measurements with modern methods and tools might lead to a different answer.

      ps. my knowledge of astronomy is one step above squat.

      • It wasn’t an experiment, it was a proposed constant. Astronomers are always recalculating it based on the latest data. In this case two different ways of calculating the constant are giving significantly different results and no-one can figure out why.

  5. As I recall, they’ve been saying for a pretty long time now that there’s evidence that the expansion of the universe has accelerated over time — that’s what’s forced them to contemplate “dark energy” as the purely theoretical source of this inferred acceleration.

    Whether or not anyone understands what ‘dark energy’ is, this business of accelerating expansion has gotten to be almost a ‘standard’ idea? So these measurements with the ‘earlier’ Cosmic Background calculations showing a slower expansion constant, as compared to the ‘variable stars’ estimate, this is such a surprise? I’m not quite sure why the discrepancy in measurements is now being described as a mystery in itself.

    • David, from what I read above, the 73-74 value comes from several measurements, including from the farthest possible detectable objects like the earliest, gravitationally-lensed quasars. Then the 67 value comes from the standard “model” of the cosmic background radiation, also at the farthest regions. So, AFAIK, those two, different values are coming from data near/at the same “area”.

  6. In science the best answer is accepted as the truth only until a better answer comes along. Science is never settled.

    • It never is settled, is it? And, now that I read the article a little more closely, it seems the authors are not so much worried about a straightforward measurements discrepancy, as they are concerned with “predictions of the Hubble Constant from the standard cosmological model” indicating a significantly lower value than what is actually measured.

      So, the model might need to be revised!
      LoL, is *that* all?

  7. re: “Astronomers have various ways to adjust the model to resolve the discrepancy. Some of these include changing presumptions about the nature of dark energy, moving away from Einstein’s cosmological constant. Others look at fundamental changes in particle physics, such as changing the numbers or types of neutrinos or the possibilities of interactions among them. There are other possibilities, even more exotic, and at the moment scientists have no clear evidence for discriminating among them.

    Yes … and ANYTHING discovered, proposed, and even validated (to exist) by someone else (outside the field of astronomy) to explain, oh, say, (so-called) ‘dark matter’ will be disregarded, ridiculed and most importantly, ignored, because, N.I.H. (not invented here).

    ‘Science’ does not move forward with that sort of entrenched ‘practice’. The UPSIDE is, it leaves these so-called discoveries for some bright, energetic, fearless individual who enjoys challenging the ‘orthodoxy’ of the old science …

  8. This result is in the middle of prior observations of 74 +/-2. The problem is that Planck’s CMB-derived figure now falls below this range. It’s no longer centered on 70 but 67, with error a fraction of 1.0.

    Best guess now is that the Hubble “constant” has sped up. This implies that the proportion of dark energy has also changed. Does it change into dark matter? Who knows? Would help to have an inkling as to what those actually are.

    In real science, the more you learn, the more questions there are.

  9. I think the problem is that the standard model does not reflect the true nature of the universe.
    However big it might be, the universe has a centre of gravity derived from the distribution in space of all its available matter.
    That matter is not evenly distributed so that there are temperature and density differentials as one moves away from the gravitational centre.
    The matter in the universe is mostly gases which obey the Gas Laws so, inevitably, one must expect convective movements to develop, some away from and some towards the centre of gravity.
    Thus the appropriate model would be of a steady state universe but with regions of space containing matter which is moving away from the gravitational centre and expanding and other regions of space containing matter which is moving back towards the gravitational centre and contracting.
    The interface between a region that is expanding and a region that is contracting would involve changes in the speed of light such that we could never observe across such a boundary.
    I anticipate black holes in an expanding region absorbing light and matter with white holes in a contracting region spilling out light and matter.
    It seems to me that such a model would resolve the current puzzles.

    • Space/time according to Einstein is what is expanding, not just space. Plus time is affected by both velocity and gravity as is length. Plus space/time is curved by gravity, again according to Einstein. A very tricky puzzle in which to work!

      • I’m ok with that.
        Space and time would both be affected if the speed of light varies which is what I suggest happens if there is expansion or contraction of an individual region of convection.

    • All models are wrong – but some models are useful.

      I think that is a quote from someone – I can’t remember who, but it is very good to remember. I would simply like to extend the quotation by saying that some models are useful for a while.

      We build a model so that we can try and understand how a complex situation arises from a set of basic assumptions. When we first build the model, it probably “works” in the sense that predictions using it are more or less accurate. Over time, with more observations, tests etc. the model becomes less and less good at making correct predictions. At that stage, you can see if adjustments to the model will make it “useful” again, or if there is a better model altogether.

      The final paragraph of the article sums this up:

      “This is a classic case of the interplay between observation and theory. The Lambda CDM model has worked quite well for years, but now observations clearly are pointing to a problem that needs to be solved, and it appears the problem lies with the model,”

      Science at work: We had a model that worked well until we got better data and now we have to move on.

      However, if you have invested your whole career and achieved wealth and fame from a model you begin to defend the model instead of challenging it. This is human nature, but it is what a real scientist fights to avoid.

      • Better is, all models are wrong, but some are predictive.

        The latter are exclusively the physical models.

  10. Inflation, expansion acceleration, differing distance measurements, differing expansion speeds, dark matter, dark energy? When will it be admitted that the “standard model” is just plain wrong? Perhaps then more research could be done without the fear of being too far outside of the “consensus”.

  11. Astrophysicists exhibit scientific curiosity by wanting to understand a few percent difference between theory and data. Climate scientists lack this desire by ignoring differences in excess of 100% between a theoretical ECS claimed to be between 0.4 and 1.2 C per W/m^2 and a measured ECS of 0.3C per solar W/m^2 with less than 10% uncertainty. The motivation for this scientific malfeasance is that to accept the truth means that the IPCC and UNFCCC have no reason to exist, and it persists because the IPCC has become the arbiter of what is and what is not climate science, or more precisely, what does and what does not support their narrative.

    • co2isnotevil,
      RE: “The motivation for this scientific malfeasance is….”
      Well said, co2! +100!

  12. The mistake was made in 1929 with the conclusion based on Hubble’s redshift alone that the universe was expanding. Prior to that time there were 3 competing models of the universe – Expanding, Contracting and Steady State. Overlooked was the fact that from the vantage point of an astronomer on Earth, the light from most galaxies observable from Earth in a Contracting Universe would also be Doppler shifted to the red in proportion to distance exactly as Hubble observed. The reason is that as between the Milky Way and most other galaxies, one would be closer to the Singularity than the other and would therefore be moving faster towards the Singularity than the other due to the stronger pull of gravity on the closer one. This means the distance between the Earth and that galaxy would be increasing resulting in a redshift in the light between them proportional to distance even as they are both being pulled towards the Singularity. The Big Crunch contracting model also has simple explanations for Dark Energy and Dark Matter.
    See http://www.bigcrunchuniverse.com for a more complete explanation.

    • But the concept of a big crunch is as unnecessary as indefinite expansion if one simplifies it to regions of expansion and regions of contraction co-existing within a static universe containing mobile gases.

  13. Since the discrepancy between the various measured values is with the number derived from the Cosmic Microwave Background radiation, which has a lot of hypotheses, thermodynamics and particle physics entering into it, those calculations and the premises for the CMB will have to be rethought.

  14. Maybe there is a fifth fundamental force operating on a cosmic scale vast enough that it was not noticeable until of late.

  15. Seems simple to me. The idea that RedShift (RS) = Velocity is completely unproven but it is the basis behind everything from the Big Bang hypothesis onward.

    ALL the evidence starts with RS = V and they use the distance value derived from that to ‘prove’ RS = V.

    Halton Arp showed high RS objects with physical and energetic attachments to low RS ones and the best they could do was assert he was confused about 2D vs 3D observations.

    If RS =/= V, then pretty much all of the SMC is incorrect, based around and built on an error.

    And if you REALLY want a rabbit hole, go take a look at Maxwell’s ACTUAL equations instead of the ones fiddled with by Heaviside and Hertz. When you have an error going back that far, is it any wonder physics keeps being surprised and getting ever more complicated as they try to balance on shifting sands?

  16. This isn’t new… not at all.
    Cosmologists have known for decades that there’s a problem with the standard model, if only by the fact that it isn’t possible to completely reconcile it with the quantum model.
    Many lines of research have tried to reconcile the two, string theory, unifying theory, TOE, geometric unity, etc.

    There’s a gap in our understanding of physics… finding experiments to prove this is like finding an experiment to prove water is wet.
    So now we’ve been able to take a measurement proving that we had it right in knowing we had it wrong… woop tee doo….

    • I think the gap is Maxwell. When they changed his quaternion equations to vector ones and disappeared 2 forces, they set physics on a wrong track and ever since, instead of getting simpler as we learn more, it gets more complicated.

      Physics only gets more complicated when you are first making guesses; as you come to understand the underlying principles it is SUPPOSED to get simpler. Newton simplified things to where a mostly uneducated soldier could accurately calculate the trajectory of a cannon shot instead of using guesswork.

      Then there’s the insistence the universe is gravity-formed, despite the fact that more and more they are bringing magnetic and electric forces into it from observations.

      One day, mainstream physics will have their “DOH!” moment, slap their foreheads and go back to Maxwell and start over.
