From CERN, another science press release with a “could” and “preliminary” caveat. Sigh. I expected fireworks. It is encouraging though. 5 sigma isn’t anything to sneeze at.
I have to wonder, though, whether CERN's delaying this press release (from Monday, when the news became known) to today, the 4th of July, wasn't a final dig at the legacy of the failed US Superconducting Super Collider. They write at the CERN page:
Higgs within reach
Our understanding of the universe is about to change…
The ATLAS and CMS experiments at CERN today presented their latest results in the search for the long-sought Higgs boson. Both experiments see strong indications for the presence of a new particle, which could be the Higgs boson, in the mass region around 126 gigaelectronvolts (GeV).
The experiments found hints of the new particle by analysing trillions of proton-proton collisions from the Large Hadron Collider (LHC) in 2011 and 2012. The Standard Model of particle physics predicts that a Higgs boson would decay into different particles – which the LHC experiments then detect.

Both ATLAS and CMS gave the level of significance of the result as 5 sigma on the scale that particle physicists use to describe the certainty of a discovery.
One sigma means the results could be random fluctuations in the data, 3 sigma counts as an observation and a 5-sigma result is a discovery. The results presented today are preliminary, as the data from 2012 is still under analysis. The complete analysis is expected to be published around the end of July.
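For context, the sigma scale maps directly onto Gaussian tail probabilities. A minimal sketch, assuming SciPy is available and the one-sided convention particle physicists usually quote:

```python
# What each sigma level means as the probability that pure background
# fluctuated up to mimic a signal (one-sided Gaussian tail convention).
# Minimal sketch for orientation; not the collaborations' actual statistics.
from scipy.stats import norm

for sigma in (1, 3, 5):
    p = norm.sf(sigma)  # P(background fluctuates up by >= sigma)
    print(f"{sigma} sigma: p = {p:.2e}, roughly 1 in {1/p:,.0f}")
```

Five sigma works out to about one chance in 3.5 million that background alone produced the bump, which is why it is treated as the discovery threshold.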
The press release:
CERN experiments observe particle consistent with long-sought Higgs boson
Geneva, 4 July 2012. At a seminar held at CERN today as a curtain raiser to the year’s major particle physics conference, ICHEP2012 in Melbourne, the ATLAS and CMS experiments presented their latest preliminary results in the search for the long-sought Higgs particle. Both experiments observe a new particle in the mass region around 125-126 GeV.
“We observe in our data clear signs of a new particle, at the level of 5 sigma, in the mass region around 126 GeV. The outstanding performance of the LHC and ATLAS and the huge efforts of many people have brought us to this exciting stage,” said ATLAS experiment spokesperson Fabiola Gianotti, “but a little more time is needed to prepare these results for publication.”
“The results are preliminary but the 5 sigma signal at around 125 GeV we’re seeing is dramatic. This is indeed a new particle. We know it must be a boson and it’s the heaviest boson ever found,” said CMS experiment spokesperson Joe Incandela. “The implications are very significant and it is precisely for this reason that we must be extremely diligent in all of our studies and cross-checks.”
“It’s hard not to get excited by these results,” said CERN Research Director Sergio Bertolucci. “We stated last year that in 2012 we would either find a new Higgs-like particle or exclude the existence of the Standard Model Higgs. With all the necessary caution, it looks to me that we are at a branching point: the observation of this new particle indicates the path for the future towards a more detailed understanding of what we’re seeing in the data.”
The results presented today are labelled preliminary. They are based on data collected in 2011 and 2012, with the 2012 data still under analysis. Publication of the analyses shown today is expected around the end of July. A more complete picture of today’s observations will emerge later this year after the LHC provides the experiments with more data.
The next step will be to determine the precise nature of the particle and its significance for our understanding of the universe. Are its properties as expected for the long-sought Higgs boson, the final missing ingredient in the Standard Model of particle physics? Or is it something more exotic? The Standard Model describes the fundamental particles from which we, and every visible thing in the universe, are made, and the forces acting between them. All the matter that we can see, however, appears to be no more than about 4% of the total. A more exotic version of the Higgs particle could be a bridge to understanding the 96% of the universe that remains obscure.
“We have reached a milestone in our understanding of nature,” said CERN Director General Rolf Heuer. “The discovery of a particle consistent with the Higgs boson opens the way to more detailed studies, requiring larger statistics, which will pin down the new particle’s properties, and is likely to shed light on other mysteries of our universe.”
Positive identification of the new particle’s characteristics will take considerable time and data. But whatever form the Higgs particle takes, our knowledge of the fundamental structure of matter is about to take a major step forward.
Contact:
CERN press office, press.office@cern.ch
+41 22 767 34 32
+41 22 767 21 41
UPDATE: My friend John Coleman at KUSI-TV in San Diego has produced an interesting video report based on input from the WUWT thread. Watch it here
In your Cern link: “According to theory, the Higgs mechanism works as a medium that exists everywhere in space. Particles gain mass by interacting with this medium.”
Well, maybe that makes sense. But that reminds me a bit of the “luminiferous ether”, you know, that medium thru which light was supposed to be propagated? (before the Michelson-Morley experiment, and Einstein).
To borrow the famous playwright’s title, “Much Ado About Nothing”: when you have nothing of substance, you inflate the press release.
Peter Kenny: I think the better analogy is the Higgs field that is supposed to result from this particle’s virtual existence. The field is much like the electromagnetic field that propagates light, only the Higgs field interacts to give mass and gravity. I don’t know where exactly this ether language got dragged into the discussion, but there really is no analogue in the theory. A field is not a medium, but it may induce a force, in this case the force of gravity and inertia. An electric field induces the electric force and a magnetic field the magnetic force (though we state that those are intertwined, and the associated particle is the photon). In the same way, in the case of gravity/inertia, the particle is the Higgs boson. I still have a harder time visualizing the concept of the Higgs than I do the photon, since a photon at least looks like a “particle” of light. I can’t get a similar feel for the Higgs creating gravity.
JPY- You make an egregious error with that leap of illogic. One commenter at a site with an enormous number of readers apparently suffices in your mind to indict some unspecified portion (presumably all) as “uncritical” and imply that readers here are reading crackpottery. The only person I can be sure is incapable of critical thinking is you. On the other hand, one would certainly expect that the great mass of blog readers here would perhaps learn something by visiting sites on the blog roll. I suspect the majority of blog readers have no opinions about high energy physics, but it would behoove them, if they want to be informed, not to pay attention to nonsense from Smolin. I am sure you and he would get along quite well, being purveyors of ignorance.
I dare any blog reader to take their criticisms of physics, and I do see quite a few of you, to the blog of an actual physicist (The Reference Frame, it’s on the blog roll) and see how far you get. If you have no criticism of physics, check it out anyway if you want good information about this and other things.
George E. Smith;
July 4, 2012 at 2:30 pm
Hi George
I still look in here every day, but am now more busy on a physics blog. AGW, aka Climate Change, aka Climate Disruption, is no longer fascinating for me except sociologically. The science and data are more or less settled in my mind 🙂.
Peter Kenny says:
July 4, 2012 at 6:43 pm
“In your Cern link: “According to theory, the Higgs mechanism works as a medium that exists everywhere in space. Particles gain mass by interacting with this medium.”
Well, maybe that makes sense. But that reminds me a bit of the “luminiferous ether”, you know, that medium thru which light was supposed to be propagated? (before the Michelson-Morley experiment, and Einstein).”
It is a bit, except, and it is a big exception, that the mathematical construction/form is such that it is Lorentz invariant, i.e. the velocity of light is constant in all frames. The luminiferous ether was not.
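A one-line version of that invariance claim, standard textbook material added here only for reference: a Lorentz boost of velocity v along x preserves the spacetime interval,

\[
c^2 t'^2 - x'^2 = c^2 t^2 - x^2,
\qquad
t' = \gamma\left(t - \frac{v x}{c^2}\right),
\quad
x' = \gamma\,(x - v t),
\quad
\gamma = \frac{1}{\sqrt{1 - v^2/c^2}},
\]

so a light ray with x = ct automatically satisfies x' = ct' in the boosted frame. The field equations of the Standard Model share this symmetry; a mechanical ether, by contrast, picks out one preferred rest frame.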
Paul Westhaver
July 4, 2012 at 4:46 pm,
and all detractors of this great rejoicing in the physics community, please keep in mind what I said before: thousands of people have worked for decades on this project, hoping to find new, unseen physics that would advance our knowledge of the microcosm and beyond.
It is a huge effort in time and space. I myself worked on part of this from 1997 to 2001, when I retired. My work was acknowledged by signing the detector paper together with another 3,000 people. This is group work on the order of building pyramids or Parthenons. And it is not nine-to-five work. For most of them it is work consuming all their intellect and resources.
Please view it as such, and do not attribute it to hysteria or ulterior financing motives. If you watch sports, would you call the people who rejoice at a success hysterical? It is part of the human group makeup, and this is a group of highly educated and intelligent people yoked together for a common goal. Of course they rejoice and shout it to the treetops.
Peter Kenny says:
July 4, 2012 at 6:43 pm
“In your Cern link: “According to theory, the Higgs mechanism works as a medium that exists everywhere in space. Particles gain mass by interacting with this medium.”
Well, maybe that makes sense. But that reminds me a bit of the “luminiferous ether”, you know, that medium thru which light was supposed to be propagated? (before the Michelson-Morley experiment, and Einstein).”
In addition to Anna’s comment:
We also have good reason to believe that something sharing at least one property with the Higgs (a type of interaction) is present everywhere. The fact that the interaction-particles of the Weak nuclear force have mass implies that there is a symmetry in the high-energy limit which then breaks at lower energies. That (“Electroweak symmetry breaking”, or “EWSB”) implies that some particle is screwing it up at low energies, which implies that it appears, at least as a virtual particle, spontaneously in low-energy vacuum. That field of non-zero probability of the presence of virtual particles which screw up the Weak force is the medium in question.
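For orientation, the textbook cartoon of that symmetry breaking is the “Mexican hat” potential; this is standard Standard Model material, sketched here only to make the “medium” language concrete:

\[
V(\phi) = -\mu^2\,\phi^\dagger\phi + \lambda\left(\phi^\dagger\phi\right)^2,
\qquad
v = \sqrt{\mu^2/\lambda} \approx 246\ \text{GeV},
\qquad
m_H = \sqrt{2\lambda}\,v.
\]

The minimum of V sits away from zero field, so even the vacuum carries the non-zero value v everywhere; the W and Z bosons acquire their masses by interacting with it. A boson at 125-126 GeV corresponds to a self-coupling of roughly 0.13.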
Someone mentioned that the advancement of technology is accelerating. Indeed it is; the growth is exponential if you look at processing capability. My first “computer” was an Atari with perhaps an effective capability of a few Mflops. At my desk where I work (really just a table), I command a 4+ Tflop machine. Our lab has many of these machines, to the tune of something like 50-75 kW of power (they had to put in a new transformer last week due to the load), with rarely even 10 developers working at a time. This progress took about 25 years (Atari to Tesla release).
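Taking the comment’s two data points at face value, “a few Mflops” then and 4 Tflops now, 25 years apart (illustrative figures, not benchmarks), the implied doubling time is easy to back out:

```python
# Back-of-envelope doubling time implied by the two data points above.
import math

start_flops = 3e6   # "a few Mflops" (assumed value for illustration)
end_flops = 4e12    # "4+ Tflop machine"
years = 25          # "about 25 years (Atari to Tesla release)"

growth_factor = end_flops / start_flops      # roughly a million-fold
doublings = math.log2(growth_factor)         # ~20 doublings
months_per_doubling = years * 12 / doublings

print(f"{growth_factor:,.0f}x growth = {doublings:.1f} doublings, "
      f"one doubling every {months_per_doubling:.1f} months")
```

That comes out to a doubling roughly every 15 months, in the same ballpark as the classic Moore’s Law figure.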
If you read Kurzweil, he thinks this pace will continue and we will soon (2020-2030 IIRC) reach a singularity at which point none of this will matter. I wonder if he is correct… will our ability to process information outstrip the rate at which we discover it? Will we then know all that is knowable? Will we be bored? Will beer still be so good?
Mark
John Coleman
The premise of your request, or that of your editor, is wrong. The benefits of understanding the science of particle physics are their own reward. Journalists look for value in the results of scientific research rather than in the benefit of the result to the science itself. In today’s climate of climate research, the knowledge that comes from research has to have political and economic value and implications. The benefit of a scientific breakthrough such as the Higgs boson experiments is that the new knowledge builds confidence in the results and gives a hook for grasping deeper understandings of the science of particle physics. If the justification for spending money to conduct research in particle physics demands an immediate societal payback, the public will be disappointed.
For example, the application of principles derived from mathematical research often takes years of additional mathematical research to provide a payback, long after the mathematician has retired or died. But the breakthrough paid off because of how it was assimilated into the field of mathematical knowledge and used in subsequent research. I doubt that Newton issued a press release claiming the benefits of understanding gravity, and it later turned out that his ideas were radically corrected by Einstein. Even if he had released a blurb to the press, would that knowledge really have changed anything in people’s lives? The benefits of verifying the existence of the Higgs boson won’t immediately change people’s lives either, but it is a tremendous breakthrough for understanding particle physics. That knowledge, combined with the fabric of the rest of quantum and gravitational physics, may greatly alter the application of physics to life on this planet or on other planets.
Belgians François Englert and the recently deceased Robert Brout were first to publish the boson particle theory, in August 1964, followed within a month by Peter Higgs’s Physics Letters treatise.
A 1972 symposium at the US National Accelerator Laboratory is where the “Higgs boson” appellation took hold.
Kurzweil says 2045 for the singularity. My bad for not checking before posting.
Mark
I would like to make a relatively short statement, for the record.
On the pages of this blog and elsewhere, I have expressed, several times, my disbelief in certain dogmas of “mainstream” cosmology. Namely, I don’t believe in the Big Bang, in so-called “dark matter” and “dark energy,” in the commonly accepted interpretation of the red shift, or in the accelerating expansion of the Universe. I still hold these views. Being a music composer and a translator, I am not prepared to give a detailed physical explanation of the observed phenomena, but it is obvious even to a layman today that astronomical observations contradict the “consensus cosmology.”
Having said this, I must make the following important reservation.
On the pages of this blog and elsewhere, several people proposed to me and to others that a suitable explanation of the above-mentioned contradictions between observed phenomena and widely accepted theories is given by the so-called “Electric Universe Theory.” Since I am instinctively predisposed to suspect any “theory of everything,” I wasn’t inclined to engage in analyzing this theory. I had a feeling that, like many other simplistic “theories of everything,” it would prove to be a sham.
Today I overcame my internal resistance, and read with attention a synopsis describing fundamental tenets of the so-called “Electric Universe Theory.” I am glad that my premonitions were true.
The “Electric Universe Theory” is a sham of the first degree, on par with Hubbard’s Scientology. It has nothing to do with serious science. I don’t want to be associated with this theory in any way or form. I pray everybody who reads my occasional comments in this blog to remember that Alexander Feht and the “Electric Universe Theory” are two things incompatible and mutually repulsive.
Thank you for your attention.
P.S. As to the Higg’s boson, I don’t know. Too little information, too many premature conclusions. As long as it’s existence doesn’t contradict the logic of the Unitary Symmetry of the elementary particles, I wouldn’t mind it’s existence. I magnanimously allow it to exist if it wants to. Let it be…
Lubos Motl is convinced it’s right and real. He won a bet.
Stephen Hawking lost (another) bet.
http://motls.blogspot.com/2012/07/higgs-bets-i-won-500-gordy-kane-won-100.html#more
Maybe Stephen Hawking should shy away from theoretical physics and just stick to math. But no matter. He’s become ultra famous for theoretical physics—even though he doesn’t understand what he is talking about. What a world. Go figure.
OK, Higgs’ boson and its existence.
Forgive my Russian war with English apostrophes.
Higgs or no Higgs, or some other Higgs, but nobody can convince me that the Universe was created from a substance smaller than a pea 13-billion-plus years ago in a large “explosion”. How can you have a theory about the creation of something you don’t even understand and cannot measure, when the further out you look, the more you detect, without ever reaching an outer limit? I cannot be convinced that all the volume and mass of material already detected could just emanate from a “pea” in some particular place in a vast empty space (plus perhaps infinitely more, because nobody knows where the outer limit is, if there is one).
To me this has all the hallmarks of ‘we found the part of the brain that handles philosophical thought’ claims based on fMRI malarkey.
You take an enormous amount of noise, presuppose there has to be a signal X in it, build a statistical filter that has to find X, refine it over time, and hey presto! The signal you designed it to find is there.
Surprise, surprise. Btw, is funding drying up due to worldwide recession?
Color me skeptical.
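For what it’s worth, the worry voiced above has a standard name in particle statistics, the “look-elsewhere effect,” and it is precisely why the collaborations demand such a stiff threshold. A toy sketch with made-up numbers (nothing to do with the actual ATLAS/CMS analysis) shows how scanning many mass bins of pure noise produces spurious “bumps”:

```python
# Toy look-elsewhere demonstration: scan 100 mass bins of pure background
# (no signal injected anywhere) and report the largest local excess.
import numpy as np

rng = np.random.default_rng(42)
n_bins, expected_background = 100, 1000.0

# Pure background: Poisson-fluctuating counts in each bin.
counts = rng.poisson(expected_background, size=n_bins)

# Local significance of each bin's excess, in Gaussian sigmas.
local_sigma = (counts - expected_background) / np.sqrt(expected_background)

print(f"largest local excess: {local_sigma.max():.2f} sigma "
      f"in bin {local_sigma.argmax()} (no real signal present)")
```

With 100 bins of noise, the biggest “bump” typically lands around 2.5 to 3 sigma purely by chance, which is exactly why a 5-sigma excess seen independently by two detectors at the same mass is treated differently.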
Mark T;
If you read Kurzweil, he thinks this pace will continue and we will soon (2020-2030 IIRC) reach a singularity at which point none of this will matter. >>>>
Color me raging skeptic on that one. Computers do exactly what the programs tell them to do (which is frequently not what you wanted them to do). The notion that due to huge increases in processing speed they will at some point make the leap to intelligence, or consciousness, or some cognitive state that we poor humans cannot grasp or comprehend is ludicrous.
Further, Moore’s Law (transistor count on an integrated circuit doubles every 24 months, and processing power doubles every 18 months) is essentially broken already. CPUs stopped getting a whole lot faster a few years ago. The processing power you can buy in a single computer has continued to increase, not because the CPUs are a whole lot faster, but because we’re stuffing multiple CPUs (cores) into one big chip, plus each core is “multi-threaded” so it can run more than one task at the same time, provided the software was written to take advantage of this feature.
The problem relates to the speed of light. We can make the CPUs go faster, but there’s no point, because we can’t feed them data fast enough to use the speed. We cannot move data from mass storage (disk drives, or in highly scalable environments, storage arrays) to the CPU any faster than the speed of light allows. So making many small CPUs and letting each one do multiple computing tasks at the same time gives us the illusion of additional processing power. What is really happening is that the CPUs work on Problem A for a few clock cycles, then on Problem B for a few clock cycles, and while Problem B is being worked on, the next chunk of data required for Problem A has had the opportunity to arrive.
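The back-of-envelope numbers behind that argument (round figures for illustration, not vendor specs):

```python
# How far light travels in one clock cycle at a few representative clocks.
C_M_PER_S = 299_792_458  # speed of light in vacuum, m/s

for clock_ghz in (1.0, 3.0, 5.0):
    cycle_s = 1.0 / (clock_ghz * 1e9)     # seconds per clock cycle
    reach_cm = C_M_PER_S * cycle_s * 100  # cm light covers in one cycle
    print(f"{clock_ghz:.0f} GHz: light travels {reach_cm:.1f} cm per cycle")
```

At 3 GHz that is about 10 cm per cycle, so a round trip to RAM a few centimeters away already costs multiple cycles before any memory-controller latency is counted.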
This problem actually has “tiers”. The CPU has a certain amount of memory, or cache, right on the chip: blazing fast and close to the CPU, so speed-of-light limitations don’t impart massive latency. Next is RAM: not as fast as CPU cache, and farther from the CPU, so higher latency, but still faster than mass storage, which for the most part these days is spinning rust. Yes, yes, I know, SSDs (solid-state disks) are becoming economical, but they are not the panacea people think. Like all technologies, they do some things better than others. For high IOPS, the SSD wins hands down. Need to move massive amounts of data in big streams? Spinning rust is faster.
The technocrats will probably jump in and start saying yes, but you can do pipelining, and de-duplicated cache, and so on, and that is true. But at day’s end, until we have a cure for the speed of light, we are already pushing the bounds of processing power, because we just cannot move data to the CPUs fast enough.
Interestingly, this problem was identified by a mathematician named Grace Hopper, a programmer who worked on the earliest computers ever built, for the US Navy. She invented things like the compiler, which changed the face of computing for decades. She predicted (if memory serves, in the late 40s or early 50s) that computers would reach a limit of processing power due to the speed of light limiting data transfer. Back then, computers were constructed of physical relays, and she was already explaining that the smaller computers could be made, the faster they would go, and she was correct. (She also tracked down an arithmetic error due to a moth that got stuck between the contacts of a relay, pasted it into her lab book with clear cellophane tape, and wrote the now-famous words “first bug found”.)
John Peter,
Priests of the consensus will tell you that all the space and all the time were contained in that “singularity” at the moment of Creation (a.k.a. the “Big Bang”) — so there was no “empty space around it,” it was the space-time itself.
But you are right, they are talking nonsense. The Universe as we see it IS that singularity. The abstraction called “time” is an effect of observation (a way to use cyclical processes to compare various manifestations of entropy within the system of coordinates small enough for space-time curvature to be negligible).
Time is a component of space-time, separated from space only in our imagination, not in reality. It is as elastic as space, and it changes with the intensity of the gravitational field. In other words, gravity is a function of the change in the rate of time (not of the amount of time, but of the rate of time). Hence, the farther the generator of synchronizing signals is from an intense gravitational field, the higher the frequency of those signals appears to an observer receiving them within the intense field.
There could not have been a beginning of something that doesn’t really exist.
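For reference, the standard Schwarzschild expression for the frequency shift claimed above; this is textbook general relativity, quoted independently of the commenter’s broader interpretation:

\[
\frac{f_{\text{received}}}{f_{\text{emitted}}}
= \sqrt{\frac{1 - r_s/r_{\text{emitter}}}{1 - r_s/r_{\text{receiver}}}},
\qquad
r_s = \frac{2GM}{c^2}.
\]

A receiver deeper in the field (smaller r) sees a ratio greater than one, so the faraway signal generator does appear to tick faster, as stated.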
Paul Murphy says:
2 – what we seem to have is conditional, highly caveated confirmation that almost, but not quite, fits the current Standard Model. Those deviations, especially those noted by the Fermilab people, may be the most important results here, ultimately allowing somebody somewhere to break the mental logjams physics has faced since pretty much the late 1930s.
Of course, what many people ignore is that the logjam may actually be a dead end caused by following the WRONG theory! Enough monkeys doing math can make anything seem plausible until you actually have to match it with reality.
John Peter
I’m certain that no matter how it is finally determined that the universe “began”, it came from God.
But actually it isn’t believed (anymore) that everything came from something the size of a pea. There’s a hypothesis that the universe we’re in came from the leftover material of a previously existing universe. Or that universes are in the shape of sheets, and two universes can touch, like curtains brushing together on a windy day, and cause the beginning of another universe. Or that there is a much larger universe around our universe, and ours was formed from material in that larger body.
Whatever the case, it all came from God.
2 videos on “Brane” theory:
http://www.youtube.com/watch?v=TUf4pVg1Lts
Oops, forgot the last paragraph….
This notion that processing-power increases will eventually lead to some kind of leap to a cognitive state by computers that mere humans don’t understand is false on two fronts. The first is that the processing power for any SINGLE task isn’t increasing all that much, so there is no “singularity” in sight; the second is that the processors simply execute simple instructions very, very fast, and they have no ability to make new instructions. Techniques like fuzzy logic can give the illusion of “choice”, but at day’s end the computer cannot “choose” anything at all other than to execute the instructions given to it.
“””””…..davidmhoffer says:
July 4, 2012 at 11:36 pm …..”””””
Make that Admiral Grace Hopper, who was one of the longest-serving persons on active duty in US Navy history.
Re: some thoughts about Kurzweil’s predictions
What we call “self-awareness,” “consciousness,” or “personality” (and what some traditionally inclined people prefer to call “soul”) is, in evolutionary terms, an adaptive top-level subroutine (shell) responsible for optimal survival and reproduction of the colony of specialized cells and symbiotic bacteria that itself is a specific expression (phenotype) of the collection of genes contained in the biological organism.
Plants, probably, don’t have this subroutine — at least not in the form usually associated with autonomously moving gene carriers. All animals with central nervous systems have it — but already on the level of insects it is so complex that, most probably, it will take centuries to decompile.
I am not saying that raising computing systems to the level of complexity inherent in the human self-awareness subroutine, in the human underlying operational system, in the interfaces supporting the communication between the self-awareness and the operational system (collectively known as “subconsciousness”), as well as in multiple applications running our physiological processes, is impossible in principle. But it is obvious that we cannot create something as complex as ourselves without clearly understanding everything that makes us tick. This may take thousands of years, and by the time we build our first self-aware computer (which, most likely, will be an artificial life form), “human beings” will already cease to be us as we know ourselves.
Before fantasizing about “artificial intelligence,” try making dogs that speak. That would be nice.
I would like it very much if my conversations with dogs wouldn’t be so awkwardly one-sided sometimes.
“””””…..George E. Smith; says:
July 5, 2012 at 12:23 am
“””””…..davidmhoffer says:
July 4, 2012 at 11:36 pm …..”””””
Well, any increase in CPU speeds and multiple cores etc. is all for naught. Micro$oft will ensure that its flagship computer virus will continue to expand to use up any and all hardware gains.
My 8-core laptop isn’t any faster than when I was running M$ DOS 3.2 on my IBM PC/XT.