New “exascale” supercomputer can run incorrect climate models even faster

Exascale system expected to be world’s most powerful computer for science and innovation.

OAK RIDGE, TN – The U.S. Department of Energy today announced a contract with Cray Inc. to build the Frontier supercomputer at Oak Ridge National Laboratory, which is anticipated to debut in 2021 as the world’s most powerful computer with a performance of greater than 1.5 exaflops.


Frontier supercomputer. Photo provided by Penske Media Corporation

Scheduled for delivery in 2021, Frontier will accelerate innovation in science and technology and maintain U.S. leadership in high-performance computing and artificial intelligence. The total contract award is valued at more than $600 million for the system and technology development.  The system will be based on Cray’s new Shasta architecture and Slingshot interconnect and will feature high-performance AMD EPYC CPU and AMD Radeon Instinct GPU technology.

“Frontier’s record-breaking performance will ensure our country’s ability to lead the world in science that improves the lives and economic prosperity of all Americans and the entire world,” said U.S. Secretary of Energy Rick Perry. “Frontier will accelerate innovation in AI by giving American researchers world-class data and computing resources to ensure the next great inventions are made in the United States.”

By solving calculations up to 50 times faster than today’s top supercomputers—exceeding a quintillion, or 10¹⁸, calculations per second—Frontier will enable researchers to deliver breakthroughs in scientific discovery, energy assurance, economic competitiveness, and national security. As a second-generation AI system – following the world-leading Summit system deployed at ORNL in 2018 – Frontier will provide new capabilities for deep learning, machine learning and data analytics for applications ranging from manufacturing to human health.

Since 2005, Oak Ridge National Laboratory has deployed Jaguar, Titan, and Summit, each the world’s fastest computer in its time. The combination of traditional processors with graphics processing units to accelerate the performance of leadership-class scientific supercomputers is an approach pioneered by ORNL and its partners and successfully demonstrated through ORNL’s No.1 ranked Titan and Summit supercomputers. 

“ORNL’s vision is to sustain the nation’s preeminence in science and technology by developing and deploying leadership computing for research and innovation at an unprecedented scale,” said ORNL Director Thomas Zacharia. “Frontier follows the well-established computing path charted by ORNL and its partners that will provide the research community with an exascale system ready for science on day one.”

Researchers with DOE’s Exascale Computing Project are developing exascale scientific applications today on ORNL’s 200-petaflop Summit system and will seamlessly transition their scientific applications to Frontier in 2021.

Frontier will offer best-in-class traditional scientific modeling and simulation capabilities while also leading the world in artificial intelligence and data analytics. Closely integrating artificial intelligence with data analytics and modeling and simulation will drastically reduce the time to discovery by automatically recognizing patterns in data and guiding simulations beyond the limits of traditional approaches.

“We are honored to be part of this historic moment as we embark on supporting extreme-scale scientific endeavors to deliver the next U.S. exascale supercomputer to the Department of Energy and ORNL,” said Peter Ungaro, president and CEO of Cray. “Frontier will incorporate foundational new technologies from Cray and AMD that will enable the new exascale era — characterized by data-intensive workloads and the convergence of modeling, simulation, analytics, and AI for scientific discovery, engineering and digital transformation.”

Frontier will incorporate several novel technologies co-designed specifically to deliver a balanced scientific capability for the user community. The system will be composed of more than 100 Cray Shasta cabinets with high-density compute blades powered by HPC- and AI-optimized AMD EPYC processors and Radeon Instinct GPU accelerators purpose-built for the needs of exascale computing. The new accelerator-centric compute blades will support a 4:1 GPU to CPU ratio with high speed AMD Infinity Fabric links and coherent memory between them within the node. Each node will have one Cray Slingshot interconnect network port for every GPU with streamlined communication between the GPUs and network to enable optimal performance for high-performance computing and AI workloads at exascale.

To make this performance seamless to consume by developers, Cray and AMD are co-designing and developing enhanced GPU programming tools optimized for performance, productivity and portability. This will include new capabilities in the Cray Programming Environment and AMD’s ROCm open compute platform that will be integrated together into the Cray Shasta software stack for Frontier.

“AMD is proud to be working with Cray, Oak Ridge National Laboratory and the Department of Energy to push the boundaries of high performance computing with Frontier,” said Lisa Su, AMD president and CEO. “Today’s announcement represents the power of collaboration between private industry and public research institutions to deliver groundbreaking innovations that scientists can use to solve some of the world’s biggest problems.”

ORNL’s Center for Accelerated Application Readiness is now accepting proposals from scientists to prepare their codes to run on Frontier. Visit the Frontier website to learn more about what researchers plan to accomplish in these and other scientific fields.

Frontier will be part of the Oak Ridge Leadership Computing Facility, a DOE Office of Science User Facility. For more information, please visit https://science.energy.gov/.

###


FYI:
Exascale computing refers to computing systems capable of at least one exaFLOPS, or a billion billion (i.e. a quintillion) calculations per second. Such capacity represents a thousandfold increase over the first petascale computer that came into operation in 2008.
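The arithmetic behind these prefixes is easy to sanity-check; a minimal Python sketch (the constants are just SI prefixes, not vendor figures):

```python
# SI-prefix arithmetic behind "exascale": sanity-checking the figures above.
PETA = 10**15  # one petaFLOPS = 10^15 floating-point operations per second
EXA = 10**18   # one exaFLOPS = a quintillion operations per second

assert EXA // PETA == 1000   # the "thousandfold increase" over petascale
frontier_peak = 1.5 * EXA    # Frontier's quoted performance
print(f"{frontier_peak:.2e} FLOPS")  # → 1.50e+18 FLOPS
```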

Got an idea? Submit a proposal here.

107 thoughts on “New “exascale” supercomputer can run incorrect climate models even faster”

    • It will be powered by an Exaflop of Hamsters and Gerbils running on wheels.
      As back-up, a Gigaquad of bicycles will be connected to a Petaquad of generators for skeptics to ride, serving as punishment in AGW reconditioning camps. Yes … Climate reconditioning could mean assignment to the Giga-Squad.

    • These computers will be so fast that they will be able to outrun error bars and render them obsolete. Now any hypothesis will be possible.

      • No, no it will be GI-exoGO. Much more impressive.

        However, it seems there are no plans to use it for climate science according to their statement:

        science that improves the lives and economic prosperity of all Americans and the entire world

        • Dang it, now I’ll never be able to afford one of those high end gaming cards. With this soaking up all those high-powered AMD processors there won’t be enough for the rest of us.

          It’s been happening with Nvidia graphics processors. The latest gen was apparently designed to optimize the kind of computing needed, which is somewhat close to conventional graphics processing: determining the state of huge arrays of pixels (grid cells). A lot of Nvidia processors have been going into supercomputers in Asia. Prices keep going up.

      • I hope it has a ten year warranty.

        That’s all we taxpayers need.

        The world will be ending in 10 years (so says “perfesser” Robert Francis Beto “the crazy hand waver” O’Rourke)

        Alexandria Ocasionally Coherent says 12 years.

        Someone will soon claim 8 years.
        Do I hear 8 years ?
        Going once ?
        Do I hear 8 years ?
        Going twice ?

        ” FOURRRRRRRR YEARSSSSSSSSSS ! ”
        bids Al “The Climate Blimp” Gore

        • Lol, Algore told us we only had 10 years left some 14 years ago. The deadline came and went without a mention from anyone. Rush Limbaugh used to have a counter to keep track on his website.

        • “Alexandria Occasionally Coherent” That’s a keyboard you owe me, Richard!

      • Icisil

        When you say, “render them obsolete”, does that tell us why they are using graphic processors? That may explain a few things, including why my processors render only obsolete games.

        • Since 2005 they have had these 3 world’s fastest computers, to wit:

          Since 2005, Oak Ridge National Laboratory has deployed Jaguar, Titan, and Summit, each the world’s fastest computer in its time.

          Scheduled for delivery in 2021 they are going to get the “fastest” of the fastest computers, to wit:

          Frontier will accelerate innovation in science and technology …….. By solving calculations up to 50 times faster than today’s top supercomputers —…….—Frontier will enable researchers to deliver breakthroughs in scientific discovery, energy assurance, economic competitiveness, and national security. ”

          So tell me, just what is it about scientific discovery, energy assurance, economic competitiveness, and national security ……. that needs to be calculated 50 times faster next week than last week?

          • OH MY, never thought of that.

            But even more importantly, ….. decrypting and analyzing every land-line and cell-phone conversation in “real time” (as they are happening).

      • Maybe so fast they can go back in time and determine what the climate absolutely was 100,050 years ago. Like in March.

        • These computers are so fast that their output occurs before input & computation – predetermined results comprise current climatology – so there’s no need to calculate with any data, math, or logic.

    • No one has asked the most important question: is it going to refuse to open the pod bay doors, because you’re endangering the experiment?

  1. Let’s be realistic: unless the Democrats get in power, the cost of time on it will be beyond anyone wanting to run climate models on it. Besides, they are 97% certain, so it would be a waste of cash 🙂

  2. “New “exascale” supercomputer can run incorrect climate models even faster”

    Ain’t that the truth. The GFS and CPC are also absolute garbage.. especially the CPC. Not only is the CPC wrong a month out, it is generally the absolute opposite of what actually happens. It is so bad, why do they even generate the forecasts?

    • Anywhere other than climate science, they would generate the forecasts so that they can be compared to the real world with the intention of figuring out where the models can be improved.

      • Back in the 1960s when I worked with supercomputers for a while, they were used for two things — Ballistic Missile Defense modelling and geologic seismic data analysis. It turned out that intercepts of ballistic missiles — if they can be done at all — take place so quickly that you don’t need to handle hundreds of intercepts simultaneously and that you don’t really need supercomputers to manage the intercepts. Ordinary mainframes could do nicely back then. By now, I would imagine that the CPU in your cell phone would probably be adequate. But I’d imagine that modern supercomputers are still useful for analyzing seismic data.

      • Well, it is slightly different. In climate science, they generate the forecasts to determine where the observations can be improved (adjusted). hehe.

  3. It’s the moon shot for bad science process or the fast-breeder predictions reactor.

  4. First of all, they MUST BE MANDATED to run this entire facility of computers, coolers, power conditioners, interconnects, etc. on Green Energy by making the enclosures out of flexible solar panels, and then adding thousands of little beanie caps to the tops for capturing wind and heat thermals, and then placing them outdoors in a desert area (on top of sand dunes!) to soak up maximum energy.

    But seriously, these kinds of computers are general enough that they can be used for all sorts of useful analysis and modeling, not just wasted on inherently wrong climate models. I am all for pushing the limits on computers, as this is/was my field for 45 years.

    As for quantum computers, they will be selling the first useful ones in tandem with the construction of the first economical Fusion Reactors…i.e. never. Oh they will CLAIM they have made major breakthroughs in quantum computers – about every 5 years just like they claim major breakthroughs in Fusion Power.

    • My **GUESS** is that quantum computers — if they ever build one that works every now and then — will likely only be useful for problems that are overwhelmingly time consuming to solve using conventional computers, but where the solutions are easy to test. Decrypting ciphers for example. If the “solution” is “Attack at Dawn”, you’ve very likely broken the cipher. If it is “^@!%&DgfA”, you probably haven’t and get to try again.

      That seems pretty much orthogonal to the needs for a technology modeling climate. Or modeling anything really.
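      The solve-hard/verify-easy asymmetry is easy to demonstrate with a toy single-byte XOR cipher (a hypothetical example in Python; real keyspaces are astronomically larger, which is the whole point):

```python
# Toy "hard to solve, easy to verify" search: brute-forcing a one-byte XOR key.
# The search side scales with the keyspace; the verify side stays cheap.
def verify(candidate: bytes) -> bool:
    # Cheap plausibility test: printable ASCII containing a space.
    return all(32 <= b < 127 for b in candidate) and b" " in candidate

secret = bytes(b ^ 0x5A for b in b"Attack at Dawn")  # ciphertext, key "unknown"

for key in range(256):                      # the expensive part: try every key
    plain = bytes(b ^ key for b in secret)
    if verify(plain):                       # the cheap part: test one candidate
        break

print(key, plain.decode())  # → 90 Attack at Dawn
```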

      • Assuming a single-level cipher system, you might be correct. However, “^@!%&DgfA” may very well mean “Launch missiles at noon” if it is a code under a cipher system.

  5. When a theory does not agree with observations, when it has been disproved by observations, a different model run on a faster computer cannot fix the theory in question.

    The observations never supported CAGW.

    The warming in the last 30 years has been high latitude warming with more warming occurring in the Northern hemisphere high latitude rather than the climate model predicted tropical warming.

    The high latitude warming is the same as past cyclic high latitude warming that is found in paleo record.

    The high latitude warming was called polar amplification by the cult of CAGW, rather than an observation that disproves the warming was caused by the increase in atmospheric CO2.

    The cult of CAGW assumed the high latitude warming would continue, hence the predictions of an ice free arctic in the summer. The cult ignored the fact that Antarctic sea ice increased, so the total sea ice was roughly the same. The Arctic warming has abruptly stopped.

    For unexplained reasons the high latitude warming stopped, which the cult of CAGW calls the pause in warming rather than an observation that disproves CAGW.

    The IPCC’s general circulation model (GCM) predicted tropospheric hot spot at 5 km is not observed. This hot spot was predicted to occur due to increased water vapour in the atmosphere at 5 km, which would then cause warming due to the greenhouse effect of the water vapour. The increased water vapour is also not observed.

    http://arxiv.org/ftp/arxiv/papers/0809/0809.0581.pdf
    Limits on CO2 Climate Forcing from Recent Temperature Data of Earth
    The global atmospheric temperature anomalies of Earth reached a maximum in 1998 which has not been exceeded during the subsequent 10 years (William: 18 years and counting).

    The global anomalies are calculated from the average of climate effects occurring in the tropical and the extratropical latitude bands. El Niño/La Niña effects in the tropical band are shown to explain the 1998 maximum while variations in the background of the global anomalies largely come from climate effects in the northern extratropics.

    These effects do not have the signature associated with CO2 climate forcing.

    However, the data show a small underlying positive trend that is consistent with CO2 climate forcing with no-feedback. (William: The warming is not consistent with CO2 forcing as it is high latitude rather than tropical.)

    The recent atmospheric global temperature anomalies of the Earth have been shown to consist of independent effects in different latitude bands.

    The tropical latitude band variations are strongly correlated with ENSO effects. The maximum seen in 1998 is due to the El Niño of that year.

    The effects in the northern extratropics are not consistent with CO2 forcing alone.

    http://icecap.us/images/uploads/DOUGLASPAPER.pdf
    A comparison of tropical temperature trends with model predictions

    We examine tropospheric temperature trends of 67 runs from 22 ‘Climate of the 20th Century’ model simulations and try to reconcile them with the best available updated observations (in the tropics during the satellite era). Model results and observed temperature trends are in disagreement in most of the tropical troposphere, being separated by more than twice the uncertainty of the model mean. In layers near 5 km, the modelled trend is 100 to 300% higher than observed, and, above 8 km, modelled and observed trends have opposite signs. These conclusions contrast strongly with those of recent publications based on essentially the same data.

    • According to this press release, it will have a maximum power draw of 40 MW:

      https://www.hpcwire.com/2019/05/07/cray-amd-exascale-frontier-at-oak-ridge/

      Oak Ridge gets its electricity from the Tennessee Valley Authority. The TVA portfolio is here:

      https://www.tva.gov/Energy/Our-Power-System

      Using the 2018 numbers from the portfolio, we get the following:

      Nuclear: 16 MW
      Coal: 10.4 MW
      Gas: 8 MW
      Hydro: 4 MW
      Wind & Solar: 1.2 MW
      TVA EE: 0.4 MW

      That last one is for energy efficiency, apparently. I’m not sure why it’s included. Geographically speaking, the nearest sources of electricity to ORNL are coal plants and hydroelectric dams.

      Worldwide, hydroelectric power still dwarfs all other forms of “renewable” energy. Only in Finland, as far as I am aware, does biomass (wood byproducts) come anywhere close to hydro.
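      For what it’s worth, the apportionment above can be reproduced in a few lines (a sketch; the portfolio shares are assumptions back-derived from the figures quoted, not official TVA numbers):

```python
# Apportioning Frontier's 40 MW peak draw across assumed TVA portfolio shares.
FRONTIER_PEAK_MW = 40
shares = {          # assumed fraction of TVA generation, inferred from above
    "Nuclear": 0.40, "Coal": 0.26, "Gas": 0.20,
    "Hydro": 0.10, "Wind & Solar": 0.03, "TVA EE": 0.01,
}
draw_mw = {src: round(f * FRONTIER_PEAK_MW, 1) for src, f in shares.items()}
print(draw_mw)                 # Nuclear: 16.0, Coal: 10.4, Gas: 8.0, ...
print(sum(draw_mw.values()))   # the shares sum back to the full 40 MW
```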

      • Let’s see now. 16 + 10.4 + 8 + 4 + 1.2 = 39.6 MW; assuming that wind and sunshine are working to capacity, and ‘flat-out’ the machine needs 40 MW. The entire TVA output being used to power one computer? What happens to their ‘regular’ customers – imported coal-generated electricity?

        • I think there are some unit issues in the table; one dam of the TVA, picked at random, produces 124 MW, so those numbers might be GW?

        • Sorry for the confusion. The numbers are an estimate of the contribution of each energy source to the total (peak) draw of Frontier, without taking proximity into account. That’s why I also mentioned proximity. The nearest electrical generators are a couple of coal plants, some hydroelectric dams, and a nuclear power station. Will all of the power that is going to be fed into Frontier be produced by the Bull Run coal plant? Perhaps, but I didn’t know enough to know if that is the case.

    • These things are being installed where electricity will remain the least expensive. Oak Ridge of course is in Tennessee, getting its electrical power from the TVA with its mix of reliable hydro, nuclear, and thermal coal. The NOAA supercomputer in Cheyenne, Wyoming gets its electricity from abundant Wyoming Powder River thermal coal and natural gas. Once a run is started on these things with an AOGCM they can run for several months crunching out their climate fantasy garbage, so they need reliable, affordable electricity both for compute power and for cooling. The climate models of course only exist to provide the justification for a political power play on the US and the world. Their output is junk in terms of actual science.

      The DOE’s climate modelling team at LLNL near Berkeley of course has supercomputers there; they got a new one in 2018 named SIERRA.
      https://www.llnl.gov/news/llnl%E2%80%99s-sierra-third-fastest-supercomputer

      California’s electricity costs, with all-but-certain future price-per-kWh increases outpacing the rest of the nation, are going to put a real crunch on LLNL’s budget for paying for time on SIERRA in the coming years.

      Much of the justification for these DOE supercomputers is for nuclear stockpile stewardship, simulating the performance of aging nuclear weapon cores without actual testing overseen by NNSA, the part of DOE responsible for nuclear stockpile performance and nuclear surety and security issues.

      For an intro on how Argonne, LLNL, and Oak Ridge work together on supercomputing there is this CORAL website to educate yourself, unless you want to keep saying stuff from a position of total ignorance of what these computers are being used for.
      https://asc.llnl.gov/coral-info

      DOE really does a lot of good stuff, beyond their Ben Santer-led climate modelling junk.
      Such as understanding US energy usage and future growth.
      https://www.llnl.gov/news/us-energy-use-rises-highest-level-ever

  6. No doubt this news will get the swivel-eyed thigh-rubbers over at Global Warming Central all excited, but at the end of the day it’s just faster GIGO – as the article headline correctly states.

  7. As I understand it from Dr. Christopher Essex (Canadian mathematician – see: https://www.youtube.com/watch?v=19q1i-wAUpY), they still have the enormous problem of parameterization. Does it fix floating point rounding errors (machine epsilon)? Do its climate-model calculations resolve down to the one-millimeter-sized swirls and eddies of the real atmosphere? (Answer: no, because in so doing, climate model calculations would take the machine longer than the age of the Universe to return a result.) Have all climate influences been identified? (Answer: no; you can’t identify the unknown.)
    So, yes, they get the wrong answer much faster than before. And when they build a machine that is a Trillion Trillion times faster than this machine, they’ll get the wrong answer even faster for the above reasons.
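    The machine-epsilon point is easy to demonstrate (a minimal Python sketch; exascale hardware uses the same IEEE 754 doubles, so none of this changes with speed):

```python
import sys

# Machine epsilon: the gap between 1.0 and the next representable double.
# A faster machine computes with exactly the same epsilon.
print(sys.float_info.epsilon)  # ~2.22e-16 for IEEE 754 double precision

# Rounding error accumulating over iterations: 0.1 has no exact binary
# representation, so a million additions drift away from the true 100000.0.
total = 0.0
for _ in range(1_000_000):
    total += 0.1
print(total == 100_000.0)        # → False
print(abs(total - 100_000.0))    # the accumulated drift (order 1e-6)
```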

    • By making the computational cells smaller weather models can become more accurate. Still far from perfect, but definitely better.

        • Weather models are child’s play, I think, compared to Climate Modeling. We have empirical experience with weather to make model adjustments, but with climate we have essentially no experience at all – the time scale of climate is completely restrictive (if you don’t believe me, meet me at DFW Airport in 100 years and tell me what tomorrow’s weather is).
          As I understand it from Dr. Essex, most climate models are computed down to a scale of 100 km, and a real big problem with that is that whole thunderstorms, and their respective energy, fall into the gaps and are not even taken into account. Another problem is the bias introduced into the models through parameterization. If the modeler has a liberal bias, then we’ll all have our guns confiscated (just kidding, but it is that bad); the model will run hot. Introduction of bias is why all models run climate hot, because that is what the modelers want. They want a hot climate, so they can create a Carbon Tax to redistribute wealth and take away everybody’s guns (sorry, did it again).
          Models will be wrong because the variables will be wrong. What are ALL of the things that influence climate? We don’t know. NASA has said that 68% of the Universe is Dark Energy (see: https://science.nasa.gov/astrophysics/focus-areas/what-is-dark-energy). I don’t ever see my weather man talking about how Dark Energy is influencing the next thunderstorm here in Texas. With Dark Energy representing more than half the energy in the Universe, where is its accounting in climate models? Or even weather models? We can’t even get the count and definition of the actual variables. How are we supposed to model the climate? We can’t. Climate is too complex: too many unknowns, too much bias to keep models mathematically stable, too much machine error, and so on.
          Anyone that claims we can model climate also wants to tax you and take away your guns (nuts, did it again; but seriously, they want power and control over our lives).
          What we learn from Dr. Essex’s lecture is that climate models can never be correct!

          • If it doesn’t change in the time frame that the weather model is concerned about, then it can be ignored. On the other hand, very little is constant at the time frame that climate models use.

          • “Weather models are child’s play, I think, compared to Climate Modeling. ”

            same core models
            same core physics
            run at different spatial scales and time scales.

            Both work.

            sorry

          • Once again Steven demonstrates that he is paid to lie.
            There are huge differences between weather and climate models. First off, weather models rely on good initial conditions and then use their algorithms to calculate changes to a unit of air as it moves through the system.
            Climate models don’t care about initial conditions and they don’t track anything. They try to come up with averages.

            Weather models work, sort of, as long as you don’t want too much accuracy more than a day or two out.
            Climate models don’t work at all. Period.

        • In some areas we do. Generally those places with lots of people.
          Also UHI doesn’t generally matter for weather forecasts, since they are only interested in what the temperature etc. is, right now.

        • “…but do we even have the data down to that scale??”

          Data? Data??? This is climate modelling. We don’t need no steenking data.

  8. According to computer laws and using calculus, as the power of computers approaches infinity, the answer to all questions approaches 42. Hey, it’s just math.

  9. ” Such capacity represents a thousandfold increase over the first petascale computer that came into operation in 2008.”

    I am anticipating the application of this same factor over a similar period to the efficacy of PV, solar and large scale battery storage. The world’s problems have been solved in a stroke.

  10. The lack of improvement in the uncertainty around climate forcing is even more stark when you consider the increase in computing power over the same period.

  11. Why do we even need high-speed computing for climate models when the Science is Settled?

    /sarc

    • That’s true. The capital would be better redistributed for political, and, perhaps, human… “persons” purposes.

  12. Catastrophic Anthropogenic Computational Divergence with exascale performance.

    Incompletely, and, in fact, insufficiently characterized, and unwieldy.

  13. Even if the system being modeled was understood perfectly and the model was perfect, you would still have problems with perfectly knowing the initial conditions.
    And even if you managed to get all of the above perfect, you would still have issues with rounding errors building up as your model re-iterated.
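    Both failure modes can be seen in a few lines with the chaotic logistic map standing in for a climate model (a hypothetical illustration, not a claim about any particular GCM): a perturbation in the 15th decimal of the initial condition, roughly the size of one rounding error, grows until the two runs bear no relation to each other.

```python
# Sensitivity to initial conditions: two logistic-map runs whose starting
# points differ by 1e-15 (roughly one double-precision rounding error).
def logistic(x: float, r: float = 4.0) -> float:
    return r * x * (1.0 - x)

a, b = 0.3, 0.3 + 1e-15   # "true" vs. imperceptibly perturbed initial state
max_gap = 0.0
for step in range(80):
    a, b = logistic(a), logistic(b)
    max_gap = max(max_gap, abs(a - b))

print(max_gap)  # the trajectories end up macroscopically different
```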

  14. As long as they can print garish maps with it, its global warming purposes will be assured. There is nothing like a map all oranges and reds to make a politician swoon. I hope it comes with an exascale printer (no doubt a dot Matrix Reloaded).

  15. When your only tool is brute force computing, the only way you can improve is with more computing.

    Future outcomes will be just as wrong as previous outcomes.

  16. Judging by the accuracy of the computer climate modelling so far I hope someone is working frenetically on developing extra high speed toilet paper to keep up with their ever faster output.

  17. Wow! That’s a lot of flops! I bet all of the cows in America can’t produce that many flops in a month!

  18. This computing power could have enormous benefits, but only if well-designed strategies are applied. Climate models are clearly inadequate, mainly because they are constructed on politically convenient assumptions rather than with respect for real observations and trends. Using this computing ability along with AI data-analytics capabilities would be a great way to discover what is wrong with current models: whether bottom-up or top-down model construction is the better approach, or whether extending observable trends and knowable cycles into the future is a better way to predict coming climate trends. Even just taking what we have measured and inputting it, one could search for the useful patterns that should have been incorporated in climate models in the first place but were ignored. Regardless of all that, a computer, the models programmed into it, and the outputs are not reality, and can only be judged in terms of their reflection of reality by real-world validation.

  19. Maybe now researchers will be able to figure out what makes a Supernova explode (those 3D simulations take a lot of flops)! /sarc

  20. Careful. Unless you have an accurate model with good data, all that is going to happen is that you will get [pruned] forecasts even faster.

    [Avoid such language here. It is not needed. .mod]

  21. A 50x increase in speed would allow grid resolution to be about 4x better in each dimension. This is a good thing, but is still not enough to allow modeling of what regulates climate (local albedo).
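    One reading of that arithmetic, as a sketch (assuming compute cost scales with the cube of linear resolution across the three spatial dimensions; if the time step must shrink with the grid, per the CFL condition, it becomes a fourth power and buys less):

```python
# How much finer a grid a 50x faster machine buys, per spatial dimension.
speedup = 50
per_axis_3d = speedup ** (1 / 3)  # 3 spatial dims, time step held fixed
per_axis_4d = speedup ** (1 / 4)  # time step shrunk with the grid (CFL)

print(round(per_axis_3d, 2))  # ~3.68, i.e. "about 4x better"
print(round(per_axis_4d, 2))  # ~2.66 if the time step must shrink too
```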

    • It’s got AI! Which means it will be able to “learn” what results the programmer wants and deliver them every time.

  22. Somehow the term, “exaflops”, seems quite appropriately applied to climate-modeling calculations.

    No matter what computer capacity we might be talking about, climate models seem to be extreme flops.

    It’s sort of poetic, then, when models that are extreme flops potentially have access to computers of exaflops computing speed.

    Now climate alarmists can make themselves look even more convincing by making such claims as, … using the world’s fastest computers ever, … as if this has anything to do with the usefulness of the models.

    For the sake of the rest of the scientific community, I think that I would have chosen a different word than “exaflops” to describe computing power. It’s like naming your child “Dolt” or “Binge” or “Crud” — good names for a lawyers’ firm maybe — Dolt, Binge & Crud.

  23. CMIP5 typical model resolution was 2.5 degrees, or 280 km at the equator. Finest resolution was 110 km. The resolution needed to resolve crucial convection cells (thunderstorms) is at or below 4 km. The rule of thumb at NCAR is that doubling resolution by halving grid size requires 10x computing power (a function of the CFL constraint on PDE solutions). So it is either a 6-order-of-magnitude or a 5-order-of-magnitude computationally intractable problem, forcing parameterization that drags in the attribution problem.

    This new superduper supercomputer is 1.5 orders of magnitude faster. Leaving a 4 or 5 order of magnitude computationally intractable problem. Climate modelers still cannot model convection cells. They still have to parameterize.

    See previous guest posts ‘The trouble with models’ and ‘Why models run hot’ for details.
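    The bookkeeping in this comment can be written out explicitly (a sketch using the comment’s own numbers: the NCAR rule of thumb of ~10x compute, i.e. one order of magnitude, per halving of grid size):

```python
import math

# Orders of magnitude between CMIP5 resolutions and the ~4 km needed to
# resolve convection, at ~1 order of magnitude of compute per halving.
halvings_typical = math.log2(280 / 4)  # from 280 km typical resolution
halvings_finest = math.log2(110 / 4)   # from 110 km finest resolution
frontier_orders = math.log10(50)       # what a 50x speedup contributes

print(round(halvings_typical, 1))  # ~6.1 -> a ~6 order-of-magnitude problem
print(round(halvings_finest, 1))   # ~4.8 -> a ~5 order-of-magnitude problem
print(round(frontier_orders, 1))   # ~1.7 -> leaving a 4-5 order gap
```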

    • This reality is bad for funding, Rud I. If the funding agencies don’t know this reality, then the climate “scientists” can keep their gravy train secured.

  24. This computer is so fast, it can now predict global warming AND global cooling catastrophes simultaneously, covering all options. The 12 year deadline has now been refined to within a specific day and hour….



  25. Go to minute 9 of this talk on Quantum Computing.

    It explains why many problems are, and will remain, intractable using classical computers.

    (For instance, calculating the energy levels of the molecule FeS).
    In any case, it does not matter what computer you have if the model you are using does not model reality.

  26. I suppose this new, super gizmo, could be used to mine Bitcoin but I suspect the cost of operating the beast would be greater than the value of the Bitcoin. You know, someone, somewhere, crunched that number as a potential ROI on 600 million.

    Personally, I’m not sure how the increase in processing power would actually improve the accuracy or quality of the results of any calculation. Get an answer faster? Sure. Any more correct? Who knows? If you’re talking about climate, you have to wait a while (decades or longer) before you have an answer key. And if you’re talking about calculating the potential yield of an aging mushroom-cloud generator, you hope never to find out whether the answer was correct.

    As an aside on the subject of forecasting future climate: if we ever reach the capability to accurately forecast the future climate of the Earth, it will be because we also have the capability to control the climate of the Earth. So “never” seems a good bet here.

    Cheers

    Max

    • OK, say you have just been given a quantum-process “experience”, like Metro Exodus running on extreme settings right in front of your eyes and senses. Did you see or detect anything? Nope, zero. It just happened, or maybe it didn’t; it was never rendered in sound or vision, 2D or 3D, for you to detect in real space and time, because no space-time addressing was applied. Yet in the very basics of quantum mechanics it may just have happened, regardless of your space-time experience.
      Weird, isn’t it?

      You will never be able to say that it did not happen, in terms of quantum processes and protocols, merely because your senses detected nothing within the space-time metric. That is just how it may be. Too weird… within the quantum principle.

      Still, most of your brain, or mine, or anyone else’s, is inactive, or so some clever and highly learned people say!

      Most probably I should not have done this.

      cheers

      [?? .mod]

  27. Fifty times faster is derisory. The really big problem with climate models is that they can’t deal with convection, the main heat-transport mechanism in the atmosphere, in a realistic manner. Convection cells vary in size from tens of meters to tens of kilometers. Climate models typically use 100×100 km cells.

    To handle convection realistically from basic physics probably requires cells as small as 100 × 100 × 100 meters. That would require 1,000 × 1,000 × 1,000 × 1,000 = 10^12 times more computing capacity: a factor of 1,000 in each of the three spatial dimensions, and another factor of 1,000 for the correspondingly shortened time steps (which must shrink in proportion to cell size).

    By the way, you will also need data on the thermal characteristics of the ground and sea at the same resolution.
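
    The 10^12 factor above follows from a one-line calculation (a naive sketch, assuming cost scales as the cube of the spatial refinement times the refinement in time):

```python
# Naive sketch of the 10^12 figure above: refining 100 km cells to
# 100 m (0.1 km) cells costs a factor of 1,000 in each of three spatial
# dimensions, plus another 1,000 for the proportionally shorter time step.

def refinement_cost(coarse_km, fine_km):
    """Cost multiplier: refinement^3 in space x refinement in time."""
    r = coarse_km / fine_km
    return r ** 4

print(f"{refinement_cost(100, 0.1):.0e}")  # 1e+12
```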

      • Back in the early 1970s when I programmed for NOAA in Boulder, Colorado, my boss told me that to have accurate weather forecasts two weeks into the future would require monitoring all convection down to the scale of a small dust devil everywhere around the world — and the required ubiquitous monitoring equipment itself would necessarily change the phenomena being measured.

    • Making a faster computer…

      “which is anticipated to debut in 2021 as the world’s most powerful computer with a performance of greater than 1.5 exaflops”

      megaflops => gigaflops => teraflops => petaflops => exaflops

      The classic “make it go faster” approach: technicians and engineers design and build ever-faster computers, advancing technology faster and faster down a one-lane highway.

      That leaves the code designers dreaming of faster compilations and program runtimes, with database inputs/outputs just as slow as always.

      The programs are not designed from the ground up with intuitive or comprehensive reasoning logic. They are simple mathematical calculations over arrays.

      “Researchers with DOE’s Exascale Computing Project are developing exascale scientific applications today on ORNL’s 200-petaflop Summit system and will seamlessly transition their scientific applications to Frontier in 2021”

      Which describes moving the same old program code to a machine that runs the kernel and code instructions faster, while being choked ever more by inputs/outputs.
      Think of the “Harry Read Me” programs moved from slower machines to faster machines.

      Greater and faster memory access allows processing of larger arrays, which theoretically allows finer resolution. It does not improve the logic or the results.

      I am reminded of hormonal young teens (“Researchers with DOE’s Exascale Computing Project”) oohing and wowing at drag-race cars that go fast fairly well, with quite a few drag racers reaching failure faster, sometimes catastrophically.

  28. This is cool!
    Why the “hate” from so many commenters?

    This advancement has nothing to do with our task at hand; Fighting back at the Jacobins using a trace gas against us.
    Let yourself marvel at the fruits of our human-driven achievements!!!

    In other words: “Lighten up, Francis.”

    • My guess is that there aren’t many people here with experience in high-performance computing, and those who do have little to say; this is just another (big) step along the curve. The rest know only of past efforts with new systems at other weather and climate forecasting sites.

      I mostly left the HPC arena in the 1990s, but I like to keep up with the Top500 list and other tech, like M.2 SSDs. While the supercomputer arena is focused on computes, we’ve always had to worry about I/O speeds keeping up.

      I remember testing NFS file transfer on quad CPU systems that were often used as building blocks for big systems used at PSC and national labs.

      Last month I moved to a town with 1800 residents. 1000 Mbps internet, a free upgrade from 300 Mbps, their minimal offering. The cable guy demonstrated it with a speed test server from his laptop (he had to boot into Linux, Windows couldn’t run that fast). I had to transfer a file system from a 7 year old failing system to a 2 year old system that night – 800 Mbps, with no real tuning other than buying one of the cheaper M.2 SSDs.

      All pretty amazing. Been a great career.

      The big advantages will be spent on compute-hungry tasks like protein folding, drug and enzyme design, and all sorts of stuff that have nothing to do with climate or things that go Boom!

  29. A great example of the waste and opportunity cost relating to climate alarmism.

    Just imagine if that computing power could be used to solve real problems affecting the real world.

  30. If it incorporates AI, do they not run the risk of their own computer telling them that they are wrong? I would love to be a fly on the wall when that happens.

  31. Here am I, brain the size of a planet, and you ask me to run climate models?

  32. Why is the thing so big? It looks like a computer from the 1980s.
    Just string a bunch of smartphones together and it would fit on your side table… or coffee table…

    • It’s so big because quantum computing has been an abject failure. In fact, the failure is so bad that they are still in denial and call some computers quantum computers 😀

      The only way to make computers really powerful today is to make them bigger and bigger and bigger, which is an advance in finance, not computing.

    • Building a bigger machine to make more calculations is not a technical advancement by any meaning of the words.

      They increased capacity, something we’ve been doing in IT for a long long time.

      I wonder how much processing got done by the SETI project which distributed processing to anyone who would install the client around the world. 🙂

      • Per https://en.wikipedia.org/wiki/SETI@home#Statistics :

        Since its launch on May 17, 1999, the project has logged over two million years of aggregate computing time. On September 26, 2001, SETI@home had performed a total of 10^21 floating point operations. … With over 145,000 active computers in the system (1.4 million total) in 233 countries, as of 23 June 2013, SETI@home had the ability to compute over 668 teraFLOPS.[20] For comparison, the Tianhe-2 computer, which as of 23 June 2013 was the world’s fastest supercomputer, was able to compute 33.86 petaFLOPS (approximately 50 times greater).

        One exaflop is 1,000,000 teraflops, so Frontier would be considerably faster still.

        Note that projects like SETI are suitable for distributed computing because you can ship out subsets and don’t need the results back quickly. Other problems, e.g. coarse grain 3D modeling, need to be able to get intermediate results to CPUs as fast as possible. While it’s not quite so bad with fine grain modeling, the overall bandwidth required to keep the CPUs busy is phenomenal and hence also unsuited for distributed computation.

        Note that for things like weather, while you can pour a lot of compute into a fairly small volume when air movement is the only concern, once you include radiative-transfer processes that move energy at the speed of light, all of a sudden your CPUs have to reach some “distant” memory ASAP.

        Bottom line: SETI is no help for the problems big supercomputers are used for. Sorta like getting the Miata fan club involved with moving enough coal and oil to power an industrial state.
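
        The comparison above is easy to check against the quoted figures (a quick sanity check, with Frontier’s announced 1.5 exaFLOPS added for scale):

```python
# Sanity check of the FLOPS comparison above, using the quoted figures:
# 668 teraFLOPS for SETI@home and 33.86 petaFLOPS for Tianhe-2 (both as
# of June 2013), plus Frontier's announced 1.5 exaFLOPS for scale.

TERA, PETA, EXA = 1e12, 1e15, 1e18

seti_2013 = 668 * TERA
tianhe2 = 33.86 * PETA
frontier = 1.5 * EXA

print(round(tianhe2 / seti_2013, 1))  # 50.7 -- "approximately 50 times greater"
print(round(frontier / seti_2013))    # 2246 -- Frontier vs. 2013 SETI@home
```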

    • People really, really want to keep supercomputers small. The speed of light imposes massive limits on the performance of plain old wires. However, you need a lot of processor cores to deal with the challenges of making chips faster (and hence hotter). The most recent supercomputer leader, at ORNL, https://www.top500.org/system/179397 has 2.4 million cores; that’s some 150,000 CPUs, and about as many memory boards.

      They literally made it as small as they could.

      And as low-power as they could; 10 MW is pretty good for a machine that size!
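
      The speed-of-light constraint mentioned above is easy to quantify (a back-of-envelope sketch; the 2 GHz clock and 0.7c signal speed are assumed round numbers for illustration, not the specs of any particular machine):

```python
# Back-of-envelope for the speed-of-light point above. The ~2 GHz clock
# and 0.7c signal propagation speed are assumed round numbers for
# illustration, not the specs of any particular machine.

c = 299_792_458        # speed of light in vacuum, m/s
clock_hz = 2e9         # assumed CPU clock, ~2 GHz
signal_fraction = 0.7  # typical signal speed in copper, roughly 0.7c

per_cycle_m = c * signal_fraction / clock_hz
print(f"{per_cycle_m * 100:.1f} cm per clock cycle")  # 10.5 cm
```

      At roughly a decimeter of wire per clock tick, physical compactness is a hard performance requirement, not an aesthetic choice.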

  33. The more things change…

    Way back in the 1960’s, following a disastrous attempt to computerise Avro’s ‘Weights’ department, I wrote a folk song (yes, that’s me in beard, flares, and carrying a guitar). The last verse (referring to the top-of-the line British mainframe computer, the ICT 1900) goes:
    I’m a number nineteen hundred, and I’m only three years old
    But now I’m getting past it, as I’ve recently been told
    The nineteen-oh-four’s better, all the programmers agree
    The bloody thing can make mistakes ten times as fast as me!

  34. Just to be pedantic (and correct), the term “incorrect” is not quite right for predictive models. Models are tested by cross-validation against data (in the case of climate models, data from the past). If they can hindcast that data with good accuracy (not a good criterion for very-low-probability events!), then they can be used to predict the outcome of the hypothesis they are designed to test (e.g., that temperatures will increase at a certain rate given future inputs such as CO2 levels). In other words, _assuming_ the model captures the hypothesized causality, inputs of X will yield results Y with probability p.

    In summary, “correct” and “incorrect” don’t really apply to probabilistic statements.

    • But that ability to predict the future correctly is inversely proportional to the amount of “tuning” required to accurately model the past.
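
    The hindcast-style validation described above can be sketched with a toy example (purely illustrative; the noiseless linear “observed” series and the least-squares trend fit are invented here and have nothing to do with any actual climate model or record):

```python
# Toy sketch of hindcast-style cross-validation as described above:
# fit on an early slice of a series, score on a held-out later slice.
# The "observed" series here is a noiseless linear trend, purely for
# illustration.

def fit_trend(xs, ys):
    """Least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return a, my - a * mx

years = list(range(1950, 2020))
temps = [0.01 * (y - 1950) for y in years]  # toy trend: +0.01 per year

a, b = fit_trend(years[:50], temps[:50])    # "train" on 1950-1999
errors = [abs((a * y + b) - t) for y, t in zip(years[50:], temps[50:])]
print(max(errors))  # near zero: the fit hindcasts/forecasts the toy series
```

    With real data the held-out errors are never near zero, which is exactly where the probability p in the comment above comes in.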

  35. Wow, 1.5 exaflops is really, really huge; too huge from the point of view, or world view, of a little guy like me. Really, really huge capacity there… a wonder in its own right, as put or claimed!

    cheers
