National Science Foundation backs Rice-led effort to create science-aware artificial intelligence
Rice University
HOUSTON — (Sept. 18, 2019) — A Rice University scientist and his colleagues are turning their search for dark matter into a study they hope will enhance all of data science.
Rice astroparticle physicist Christopher Tunnell and his team have received a $1 million National Science Foundation (NSF) grant to reimagine data science techniques and help push data-intensive physical sciences past the tipping point to discovery.
Experiments in the physical sciences are starting to produce thousands of terabytes of data, Tunnell said. “These datasets are fundamentally different from large datasets of everyday photos, text or video,” he said. “Ours relate to experiences of the natural world that only highly specialized instruments and sensors can ‘see.’”
In tackling this class of problem, the two-year project aims to influence the way data scientists use machine and deep learning in bioinformatics, computational biology, materials science and environmental sciences. Tunnell said the goal is to support these physical science communities through a “domain-enhanced” data science institute.
“In large astroparticle data sets, we often look for the faintest signals that anyone has ever attempted to measure,” said Tunnell, an assistant professor of physics and astronomy and computer science and lead investigator on the project.
“Science is incremental,” he said, explaining the domain-enhanced approach. “We have spent decades building up mankind’s most precise physical theories, which provide the foundation for these measurements. When using machine learning in this realm, the machine has to learn through its own ‘Phys 101.’ But the great artificial intelligence advancements of the last decade have been mostly in computer vision and natural language processing with a muted impact in physical sciences.”
Tunnell’s co-investigators are Waheed Bajwa, an associate professor of electrical and computer engineering at Rutgers University, and Hagit Shatkay, a professor of computer and information sciences at the University of Delaware. The team formed at an Ideas Lab run by the NSF and Knowinnovation that brought together scientists and engineers to facilitate novel data science ideas that did not fit any disciplinary mold.
The researchers argue that particle physics can serve as a driver for technological advances that are later used by other sciences in the same way that data-handling needs at the European Organization for Nuclear Research (known as CERN) led to the development of the World Wide Web.
“Our proposal focuses on one scientific application — in this case astroparticle physics — to test out multiple novel methods,” Tunnell said. “We are searching for solutions to a real-world problem rather than problems that fit our solution. That, in my view, is what interdisciplinary science is about.”
For the dark matter search, they need data science and machine-learning algorithms that improve measurements of particle interactions in their detectors. “This will simultaneously increase the ability to measure faint dark-matter signals while improving the precision of energy measurements,” Tunnell said. “It will help the experiment be sensitive to neutrinoless double-beta decay, a process that sheds light on the nature of neutrino mass and, potentially, why our universe is made of matter.”
He said they will employ probabilistic graphical models that allow them to encode their knowledge of science, as well as inverse problem formulations that teach machine-learning routines enough that they can learn the rest on their own.
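For readers wondering what those method names mean: a graph-regularized inverse problem recovers an underlying signal from indirect, noisy measurements while penalizing solutions that contradict known structure. The following is a minimal sketch, assuming nothing about the project's actual code, with a made-up chain-graph smoothness prior standing in for the encoded physics:

```python
# Illustrative sketch only (not the project's code): a graph-regularized
# inverse problem. We recover a signal x from noisy, underdetermined
# measurements y = A x + noise by penalizing roughness along a chain
# graph -- one way domain knowledge ("the signal is smooth") can be encoded.
import numpy as np

rng = np.random.default_rng(0)
n = 50
x_true = np.sin(np.linspace(0.0, np.pi, n))     # smooth underlying signal
A = rng.normal(size=(30, n)) / np.sqrt(n)       # forward model: 30 equations, 50 unknowns
y = A @ x_true + 0.01 * rng.normal(size=30)     # noisy measurements

# Graph Laplacian of a simple chain: x^T L x = sum_i (x[i+1] - x[i])^2
D = np.diff(np.eye(n), axis=0)                  # finite-difference operator
L = D.T @ D

lam = 1.0                                       # regularization strength
# Closed-form minimizer of ||A x - y||^2 + lam * x^T L x
x_hat = np.linalg.solve(A.T @ A + lam * L, A.T @ y)

err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {err:.3f}")
```

Here the smoothness prior plays the role of the “Phys 101” Tunnell describes: among the many signals consistent with the data, the estimator prefers those consistent with what is already known.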
Tunnell has already gained a foothold in the search for dark matter, even if the matter itself is not at hand. Earlier this year, he and colleagues at the XENON1T experiment announced in Nature they had found the first physical evidence of the material with the longest half-life ever measured. The sophisticated detector under a mountain in Italy discovered that xenon-124 has a half-life of 18 sextillion years, demonstrating that the experiment and subsequent data science can measure exotic physical signals.
He noted the grant incorporates funds for educational outreach and training of data scientists in the techniques under development.
Tunnell’s group was formed as part of Rice’s Data Science Initiative, with additional seed funding for research from two Rice Creative Venture grants. “This work has already led to one discovery: a strong friendly interdisciplinary team interested in trying something new,” he said.
###
Read the abstract at https://www.nsf.gov/awardsearch/showAward?AWD_ID=1940209
This news release can be found online at news.rice.edu
Follow Rice News and Media Relations via Twitter @RiceUNews
Related materials:
Elemental old-timer makes the universe look like a toddler: http://news.rice.edu/2019/04/24/elemental-old-timer-makes-the-universe-look-like-a-toddler-2/
Rice Astroparticle (Tunnell group): https://astroparticle.web.rice.edu
Inspire Lab (Bajwa group): http://www.inspirelab.us
Computational Biomedicine Lab (Shatkay group): https://www.eecis.udel.edu/~compbio/
Rice Computer Science: https://cs.rice.edu/
Rice Department of Physics and Astronomy: https://physics.rice.edu/
Image for download:
https://news-network.rice.edu/news/files/2019/09/0909_DATA-1-web-1.jpg
CAPTION: Christopher Tunnell. (Credit: Jeff Fitlow/Rice University)
Located on a 300-acre forested campus in Houston, Rice University is consistently ranked among the nation’s top 20 universities by U.S. News & World Report. Rice has highly respected schools of Architecture, Business, Continuing Studies, Engineering, Humanities, Music, Natural Sciences and Social Sciences and is home to the Baker Institute for Public Policy. With 3,962 undergraduates and 3,027 graduate students, Rice’s undergraduate student-to-faculty ratio is just under 6-to-1. Its residential college system builds close-knit communities and lifelong friendships, just one reason why Rice is ranked No. 1 for lots of race/class interaction and No. 4 for quality of life by the Princeton Review. Rice is also rated as a best value among private universities by Kiplinger’s Personal Finance.
OK, the comment “neutrinoless double-beta decay” got me sufficiently curious to look it up; here it is: “Neutrinoless double-beta decay is a forbidden, lepton-number-violating nuclear transition whose observation would have fundamental implications for neutrino physics…” (Dolinski et al.). I stopped reading at that point because I realized any comment I could possibly make would reveal my tendencies toward snarkism, so, never mind!
I tried utilizing AI in image processing, but the training requirements (of the computer, not me) were prohibitive.
Perhaps you used the wrong learning algorithm?
Actually, Ed Z., what we learned by controlled experiment, i.e., flying to GPS coordinates in a helicopter and taking samples, was that digital data, when modified by algorithm, cannot produce the same result as the pure, direct data. In image processing this means using supervised classification instead of any of the popular image-analyzing algorithms. My experience managing a research group shows that the further you stray from the pure data, the less likely you are to maintain scientific purity, i.e., get results that are correct and not misguided.
Hi Ron. Same results here. Nothing matches real data! The only difference is that I’m using UAVs and probably on a smaller scale than you.
This is reminiscent of the photographer’s observation: you can only take a picture once.
The better the camera (resolution, lens precision, image size, which kind of sensor for digital, which film and where in the production run it was spooled, mechanical design and stability, etc.), the harder it is to take two apparently identical pictures.
A cheap, low resolution camera can’t collect enough data to make a clear picture, even when examined at the native image size. Two pictures can look identical just because there isn’t enough information to discern differences.
A bigger, better camera can take much better pictures, but the cost goes up exponentially with image size. The larger sensor with larger pixels can capture more light, and the sensor can be designed to deliver more bits of data per pixel. The last few bits are usually almost completely random, but in very precise situations they can show legitimate differences that aren’t the same for multiple frames taken just milliseconds apart.
Analog sensors and outputs have defined accuracy, fixed by the design of the sensor and the design of the output. A mercury thermometer marked in one-degree intervals has an overall accuracy of one degree and, depending on how it’s built, perhaps an estimate to a quarter degree.
Digital sensors have a fixed accuracy based on how many places there are in the output. Technically they can’t “see” anything finer than the number of places, but better sensors often register more places than they report and use the extra places for calibration.
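The point about output “places” can be made concrete with a toy N-bit digitizer; this is an idealized ADC sketch, not any particular sensor:

```python
# Toy N-bit digitizer: nothing finer than one least-significant bit (LSB)
# of the input range can be resolved.
def quantize(value, full_scale, bits):
    """Round an analog value to the nearest code of an ideal ADC."""
    levels = 2 ** bits
    lsb = full_scale / levels
    code = round(value / lsb)
    return min(code, levels - 1) * lsb

# Two inputs closer together than one LSB digitize to the same value.
a = quantize(0.50000, 1.0, 8)
b = quantize(0.50100, 1.0, 8)
print(a == b)   # True: 8 bits over a 1 V range gives ~3.9 mV resolution
```

Registering extra internal bits, as better sensors do, shrinks the LSB used for calibration even when the reported word stays the same width.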
I imagine it can get pretty complex looking for “dark matter” that doesn’t affect light or other electromagnetic radiation.
Science not constrained by scientific rigor and intellectual honesty to map only what is real, creates its own “reality”, or ψευδωνύμου γνώσεως, i.e. falsely-named knowledge (who will understand my reference?). Call me cynical, but there are too many examples of false science created by science practitioners, who were seduced by the expediency of technology and their own hubris, to make me think this exercise will produce anything of practical worth.
“Dark Matter and Energy” always sounded Gnostic to me….
The hard nosed CERN physicists see no trace of it.
They’ll have more luck finding the Loch Ness Monster.
The conflation of reality with image is, unfortunately, so prevalent/endemic in the world.
It’s called “dark” because stuff is happening and there’s nothing thus far in all of known physics that fits the observations. There have been lots of attempted solutions, and they all fall flat somewhere.
Dark Energy at least might be explained as a quantum vacuum pressure, and such a pressure would fit pretty well with the observations. Even so, how do you directly measure the energy state of literally nothing?
As for dark matter, would you prefer to call it “WTF matter”? That really is where we are at. Whether by Newtonian physics or Relativity, our galaxy, and nearly every other galaxy observed, shouldn’t exist. 80% of the gravity in galaxy sized objects has no identifiable source.
Is it all contained within Sagittarius A*? Nope, not even close. The mass of Sagittarius A* can be determined by the orbits of nearby stars.
What about a bunch of other supermassive black holes? Nope, they would perturb other stars nearby.
Lots of tiny black holes? Nope, or they would be whizzing through planets all the time, leaving wakes of shredded nuclei behind them.
Lots more gas and dust than expected? Nope, as the amount required would noticeably interfere with our telescopes.
Lots of planetary objects in deep space? Nope, the amount needed would also be noticed.
Etc.
If you figure out what is going on, you’re guaranteed a Nobel prize.
icisil September 26, 2019 at 3:29 am
I looked up the phrase; it was interesting reading. Thank you.
michael
I did a search to see what you might have found interesting, and learned two new things. Irenaeus’ anti-gnostic tome Against Heresies is actually Against ψευδωνύμου γνώσεως. And Clement of Alexandria wrote that some gnostics rejected 1 Timothy because Paul used the phrase ψευδωνύμου γνώσεως therein. So thank you.
Billions wasted on searching for something that does not exist so philosophical world views can have their equations balanced. This is not science.
“even if the matter itself is not at hand” No kidding.
The entire point to science is to understand natural phenomena. Thus far, such things as Newtonian physics, Doppler Effect, Quantum Mechanics, and Relativity are understood, and relied upon heavily in our daily lives. Those equations are important for the advance of technology.
Unlike CAGW, this branch of physics is freely admitting that they just don’t know what’s going on. That’s why they call it “dark”. They have been trying to figure out what the “dark” stuff is for years, and nature isn’t being very forthcoming.
At this point, we have conflicting data between what we know and use already and what is observed on galactic scales. On top of that, Quantum Mechanics and Relativity don’t mesh together.
This all may seem like total BS, but shrugging and being satisfied with only knowing less than 20% of the fundamentals of how the universe works is a far more lame attitude to take.
I’m naturally skeptical on most of the projects like this, but in this area specifically we are getting huge amounts of data that may contain very interesting results, if only we can recognize them. The only way is to develop methods to parse through it to get something meaningful.
It may not pan out or it may. Nothing ventured, nothing gained. I would hate to stop all basic research just because the climate people take the desired results and work backward. All people in research aren’t this corrupt.
Of more concern is Google’s supposed quantum computing being able to hack, crack and interfere with all sorts of present data.
Glorious.
Computer-aided apophenia!
Academics like to laugh at “undergraduates” (normal folks) who blindly use the old statistical software (SPSS, I think it was called?) to search out every possible correlation in tiny stacks of data.
Like going door-to-door for an afternoon, asking a few questions at each door, and taking the data set down to the data center, where you ask the operator (no personal access to an actual computer in those days, the ’70s) to find all possible statistical matches in it. Then publish your paper that proves conclusively that peanut butter sandwiches cause dementia. Correlation equals causation.
I don’t understand the physics of this article at all but I know what happens if you use a computer to blindly whack at a data set until some faint pattern drops out. Ignorance, stupidity and false conclusions, pseudo-science of the most pernicious sort. All made holy and perfect because a computer is involved! Impossible to replicate or test. But replete with the latest buzz words, like “AI” and “machine learning”.
Instead of one garbage can full of unmentionable articles out for one can in, now we can process whole landfills at a time!
I am reminded of the paper which claimed scary warming of some of part of Antarctica, only for Steve McIntyre to find its data residuals looked like Chladni figures.
Will AI be able to learn and identify false precision?
The Big Lie of AI is that it “programs itself”.
A program that can alter its own code rapidly becomes utterly useless.
Machines can’t “learn” the way we do. The big AIs and the little chips that make your car window go up and down are all programmed to do what they do. Sometimes the program can be very sophisticated, using advanced statistical techniques on huge data sets and so on.
But the moment they change their own code they crash. If you are a programmer, try to imagine a single, completely random line change that could improve a program. Not possible. i++ to i--. Sure.
All the mad monkeys in all the wild world at all the most sophisticated typewriters with the most hyper-buzz-word names will still not produce the works of Shakespeare. “Do be or not to be, that is the PbSFDSFRRsskl”.
The evidence for the causal effect of co2 causing global warming and the evidence for the existence of dark matter is currently of equal strength.
Dark Matter is not going to be found by whiz bang AI, in the same ways that gravity will not be cracked by better telescopes, in the same way that having all of the worlds information digitized does not lead to new knowledge of the fundamental kind – that is a creative process of the human mind.
Tens of scientists know what dark matter is. But they are analytic chemists, experimental physicists etc so they don’t speak out.
Having reached out to about 5 actual dark matter physicists this is untouchable.
Happy to discuss twenty eight years of published work in plasma physics and other journals.
One of my team members wrote an article that is an outstanding introduction to dark matter.
https://medium.com/@brett.holverstott/dark-matter-and-the-frontier-of-euv-astronomy-460f92d6ca84
It’s a fascinating idea and observers don’t have an explanation for the heat and light that Mills has been producing in his lab. There doesn’t appear to be any fraud or falsification but naturally everyone who knows about this are highly skeptical. Mills has been quite open and transparent about what they are doing. Is someone trying to duplicate this yet? I don’t know but those who have observed this process are perplexed.
If dark matter didn’t exist we’d have to invent it. Oh wait…
My neighbours dog ate his copy of Lee Smolin’s ‘The Trouble with Physics’. I suspect the dog had less trouble digesting it than most people.
“If dark matter didn’t exist we’d have to invent it. Oh wait…”
That was funny. 🙂
Dark matter does not exist, as the concept of dark matter does not explain the observational paradox that all spiral galaxies’ angular momentum increases linearly with spiral galaxy mass:
This is discussed in Disney’s “Galaxies appear simpler than expected”:
For the Big Bang mechanism of collapsing gas clouds with dark matter, angular momentum should be distributed Gaussian around a low base.
Larger-mass galaxies should not have more angular momentum.
What is physically required by the observations (not an option, something that is observed and hence must be explained) is a mechanism that creates angular momentum. Dark matter does not create angular momentum.
http://www.nature.com/nature/journal/v455/n7216/abs/nature07366.html
https://arxiv.org/abs/0811.1554
“Galaxies appear simpler than expected
Galaxies are complex systems the evolution of which apparently results from the interplay of dynamics, star formation, chemical enrichment, and feedback from supernova explosions and supermassive black holes1.
The hierarchical theory of galaxy formation holds that galaxies are assembled from smaller pieces, through numerous mergers of cold dark matter2,3,4. The properties of an individual galaxy should be controlled by six independent parameters including mass, angular-momentum, baryon-fraction, age and size, as well as by the accidents of its recent haphazard merger history.
Here we report that a sample of galaxies that were first detected through their neutral hydrogen radio-frequency emission, and are thus free of optical selection effects5, shows five independent correlations among six independent observables, despite having a ….
… This implies that the structure of these galaxies must be controlled by a single parameter, although we cannot identify this parameter from our dataset. Such a degree of organization appears to be at odds with hierarchical galaxy formation, a central tenet of the cold dark matter paradigm in cosmology6.
…Consider spin alone, which is thought to be the result of early tidal torquing. Simulations produce spins, independent of mass, with a log-normal distribution. Higher-spin discs naturally cannot contract as far; thus, to a much greater extent than for low-spin discs, their dynamics is controlled by their dark halos, so it is unexpected to see the nearly constant dynamical-mass/luminosity ratio that we and others14 actually observe.
Hierarchical galaxy formation simply does not fit the constraints set by the correlation structure in the Equatorial Survey.”
Large bulgeless spiral galaxies (a great example is the Milky Way) in fact cannot be created by the Big Bang mechanisms.
The Big Bang mechanisms with dark matter create spiral galaxies that have a massive bulge and rotate too slowly.
http://arxiv.org/abs/1009.3015
BULGELESS GIANT GALAXIES CHALLENGE OUR PICTURE OF GALAXY FORMATION
BY HIERARCHICAL CLUSTERING
We inventory the galaxies in a sphere of radius 8 Mpc centered on our Galaxy to see whether giant, pure-disk galaxies are common or rare. We find that at least 11 of 19 galaxies with Vcirc > 150 km s-1, including M101, NGC 6946, IC 342, and our Galaxy, show no evidence for a classical bulge.
We conclude that pure-disk galaxies are far from rare. It is hard to understand how bulgeless galaxies could form as the quiescent tail of a distribution of merger histories.
Recognition of pseudo bulges makes the biggest problem with cold dark matter galaxy formation more acute: How can hierarchical clustering make so many giant, pure-disk galaxies with no evidence for merger-built bulges?
All we need is more Element 115… And a non stick skillet.
“Science is incremental”
That’s strange, we’ve been told the (climate) science is settled. 😉
Just be sure to do the real science before the AI learns leftist politics.
“Rice astroparticle physicist Christopher Tunnell and his team have received a $1 million National Science Foundation (NSF) grant to reimagine data science techniques and help push data-intensive physical sciences past the tipping point to discovery.”
Mumbo jumbo buzz phrase nonsense. (My bold)
If the CDM signal rose above the noise of an immensely large graphed data set, the human eye-brain would see it. Then it would be up to computer analysis to determine the actual statistical significance for confidence (to eliminate spurious signals).
In the CERN LHC search for the Higgs mass, relevant Higgs boson data finally began to emerge out of the mass of random fluctuations around 125 GeV from two different groups using two different detectors of the LHC. Then complex statistical processing set the confidence limits finally to better than 5 sigma around 125 GeV. It was massive amounts of data, and it didn’t take AI to resolve.
So bootstrapping your search with an AI machine-learned algorithm humans don’t “understand” is a path toward self-deception. Very much like a modern GCM being tuned to find high CO2 sensitivity that is not there. Self-deception.
”The first principle is that you must not fool yourself – and you are the easiest person to fool.”
– Richard Feynman
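The “5 sigma” arithmetic mentioned above is simple in its textbook form. The numbers below are made up for illustration and are not the actual LHC counts:

```python
# Toy counting-experiment significance (illustrative numbers only).
import math

def significance(n_obs, b):
    """Gaussian significance of an observed count over an expected background b."""
    return (n_obs - b) / math.sqrt(b)

def sigma_to_pvalue(z):
    """One-sided tail probability of a standard normal at z sigma."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

print(significance(550, 400))   # a 7.5 sigma excess over background
print(sigma_to_pvalue(5.0))     # ~2.9e-7, the "5 sigma" discovery threshold
```

The real analyses use likelihood ratios and look-elsewhere corrections, but the underlying question, how unlikely is this excess under background alone, is the same.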
The eye-brain may not be able to see it, but the nose-brain can sure smell it.
It would be nice to believe that machines will do more useful stuff with artificial intelligence than most people do with their natural intelligence. But on the evidence we’ve seen so far, I somehow doubt it.
“Reimagine data science techniques and help push data-intensive physical sciences past the tipping point to discovery.”
A very noble goal, but not entirely novel. This box is usually labeled “then a miracle happens”
Does anyone else see the great possibilities for invalid conclusions–a la the hockey stick statistical analysis exposed by Steve McIntyre?
Dark Matter is the CO2 of astrophysics.
Now along comes AI to the rescue.
Maybe some really clued-in climatologist will take the point with Climate AI.
When I hear “tipping point” it’s deja-vu.
Dark matter and string theory, seems to be religious to me, but I am a nuts and bolts kind of guy.
I smiled when I saw the ‘ Knowinnovation’ reference, parsing it as ‘No-win-ovation’ phonetically.
Algorithms, models, software coding, breathless/sappy promotional language (e.g., “This work has already led to one discovery: a strong friendly interdisciplinary team interested in trying something new.”), and a head-shot suitable for an ad for BVLGARI = GIGO.
How do they know it is not 18 sextillion and 42 years?
If trying to discuss science with young emotionally challenged high school drop-outs seemed tough, even after years honing one’s skills by arguing pin-headed run-of-the-mill warmistas to a dead standstill, just imagine what a fun time is to be had when refuting A.I. acquired “knowledge”?
I cannot wait to be told “If you have a real mind, you are unqualified to comment on these findings!”
There are alternatives to the standard model
You might like this one
https://www.youtube.com/watch?v=p8lKQMEYYLw
Death of the big bang.
“Reimagine data science techniques and help push data-intensive physical sciences past the tipping point to discovery.”
Which realistically and probably means “push over the cliff”
I’ve been wondering for a while just how programmers can include proper scientific skepticism into AI. If it assumes all the data it assimilates is accurate and reliable, it will be pretty useless. But what kind of algorithm would be able to detect incompetence, bias or fraud?
Here is my explanation of Dark Matter. Generated by musing in a comfortable armchair.
Two objects each travelling at near the speed of light converge. Then they converge at a speed greater than the speed of light.
If they were diverging then they would be diverging at greater than the speed of light and thus would be unknown to each other, or, to put it otherwise, would be dark with respect to each other.
The number of particles that are moving away from earth at greater than the speed of light must be very large indeed. Hence lots and lots of Dark Matter in the universe from the perspective of the earth. QED.
Cheers everyone.
Alasdair
Dark matter may be nonsense, but your thought experiment is in violation of special relativity. Your two objects do not “converge/diverge at a speed greater than the speed of light” in their own reference frames which see the events as sub-c.
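For anyone who wants to check, the special-relativity velocity-addition formula w = (u + v)/(1 + uv/c²) keeps any combination of sub-light speeds below c:

```python
# Relativistic velocity addition: two objects each moving near c never
# see each other move faster than c in their own reference frames.
C = 299_792_458.0  # speed of light, m/s

def combine(u, v):
    """Relative speed of two objects approaching at speeds u and v."""
    return (u + v) / (1 + u * v / C**2)

u = v = 0.9 * C
w = combine(u, v)
print(w / C)   # ~0.9945 -- still below c, not 1.8
```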
Spare me. Some journalistic BS just leaps out of the screen at you and gives you a real good slapping.
Like with climate claptrap, I’ll believe in their unexplained but new-found computational powers, whatever they may be, when it produces real verifiable results. Not just because “this is great because a million dollar grant says so”.
Dark Matter, Dark Energy, Black Holes.
Do you see a theme here?
Things that can’t be seen, things that can’t be touched, things that go bump in the night.
And a million dollar government budget.
I say we are being scammed.
I say it would make a great title for the new Batman movie …
“He said they will employ probabilistic graphical models that allow them to encode their knowledge of science, as well as inverse problem formulations that teach machine-learning routines enough that they can learn the rest on their own.”
That just sounds like bafflegab. They’re using the million bucks research money for something else….
“This project will develop innovative domain-enhanced data science methods that will be based on probabilistic graphical models and graph-regularized inverse problems.”
========
Sounds like a bargain at 1 million.
Just make the data open-source.
Dark Matter = Electron as a particle
Dark Energy = Electron as a wave
Dark matter/dark energy is nothing more than a fudge factor, applied individually to each galaxy just to “fit” the gravity effects detected and the space acceleration they think they detect. This is NOT science, it is just a kludge.
The whole dark matter/dark energy is nothing but unscientific fudge factors to prop up a theory that isn’t working quite right.
I prefer quantum inertia as a calculable theory that either matches the findings or doesn’t (so if it doesn’t work it can be discarded, as science is supposed to do) and is not created just to match what is detected with what is expected.
“Its residential college system builds close-knit communities and lifelong friendships” – in other words: networks.
Anyway: what’s the use of supercomputer searches when there ain’t the slightest idea of what to search for.