Can You Vague That Up For Me?
Acknowledging that today's supercomputers lack the computational power to successfully model Earth's climate system, a climate modeler is suggesting that climate models would benefit from running on computers whose calculations are less exact. “In designing the next generation of supercomputers, we must embrace inexactness if that allows a more efficient use of energy and thereby increases the accuracy and reliability of our simulations,” says Tim Palmer, a Royal Society research professor of climate physics and co-director of the Oxford Martin Programme on Modelling and Predicting Climate at the University of Oxford, UK. This is nothing more than grasping for excuses to explain the dismal performance of the current crop of climate model simulations. There is an old saying: a poor craftsman blames his tools. Evidently climate modelers are not even close to being craftsmen.
Anyone who has been paying attention to the climate change follies for the past several decades could not help but notice that climate models suck. The computerized monstrosities are seemingly incapable of coming up with a temperature forecast that comes anywhere close to reality. Billions of dollars have been spent on developing these software chimeras, initially based on a combination of atmospheric and ocean circulation models. Since their inception, climate modelers have added more factors to the models like a student stuffing dirty clothes into a bag before returning home on holiday. As can be seen from the graph below, this has all been to no avail.
The frustration among those who work on such models seems to be running high these days and one such modeling maven, Dr. Tim Palmer, a physicist at the University of Oxford, has had a brainstorm – he wants to make computers less exact. “We must move beyond the idea of a computer as a fast but otherwise traditional 'Turing machine', churning through calculations bit by bit in a sequential, precise and reproducible manner,” he writes in a recent article in the journal Nature, “Modelling: Build imprecise supercomputers.” Dr. Palmer seems to have forgotten that people use computers to crunch numbers in a precise and reproducible manner primarily because humans can't do so. No matter, he seems to think what climate models need is a good dose of imprecision.
In particular, we should question whether all scientific computations need to be performed deterministically — that is, always producing the same output given the same input — and with the same high level of precision. I argue that for many applications they do not.
Energy-efficient hybrid supercomputers with a range of processor accuracies need to be developed. These would combine conventional energy-intensive processors with low-energy, non-deterministic processors, able to analyse data at variable levels of precision. The demand for such machines could be substantial, across diverse sectors of the scientific community.
“it is time for researchers to reconsider the basic concept of the computer,” he gushes. I hate to tell the doctor this, but imprecision leads to error propagation and error propagation leads to divergence in models. This is like saying “We are getting really crappy answers from our computer, I know, let's make it less precise! That will fix it!” Wrong.
As noted in the article, computer climate models predict Earth's future climate by solving partial differential equations (PDEs) for fluid flow in the atmosphere and oceans. In mathematical terms, a PDE is any equation involving a function of more than one independent variable and at least one partial derivative of that function. Given a function u = u(x,y), a PDE in u is classified as linear if all of the terms involving u and any of its derivatives can be expressed as a linear combination in which the coefficients of the terms are independent of u. In other words, in a linear PDE the coefficients can depend at most on the independent variables. Throw in some other terms or complications and the equation is called nonlinear. For example, the first equation below is linear, the second nonlinear:
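A standard textbook pair makes the distinction concrete (these are representative examples in the function u = u(x,y); Laplace's equation is linear, while tacking on a term whose coefficient involves u itself makes the second equation nonlinear):

```latex
% Linear: every coefficient is independent of u
\frac{\partial^2 u}{\partial x^2} + \frac{\partial^2 u}{\partial y^2} = 0

% Nonlinear: the u * (du/dx) term has a coefficient that depends on u
\frac{\partial^2 u}{\partial x^2} + \frac{\partial^2 u}{\partial y^2}
  + u\,\frac{\partial u}{\partial x} = 0
```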
Mathematical technicalities aside, the thing to note is that nonlinear equations are a real pain. While linear PDEs can often be solved exactly using techniques such as separation of variables, superposition, Fourier series, and Laplace, Fourier, or other integral transforms, not so with nonlinear PDEs. They generally require a numerical approach, and so it is with the equations in climate models. An overview of PDEs and how they are simulated by difference equations can be found in “A Review of Numerical Methods for Nonlinear Partial Differential Equations.”
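The numerical approach boils down to replacing derivatives with differences on a grid and marching forward in time. Here is a minimal Python sketch, using the one-dimensional heat equation u_t = u_xx as the example (a linear PDE, chosen purely for brevity — climate models solve far nastier coupled nonlinear systems) with an explicit finite-difference scheme:

```python
import math

def solve_heat(nx=21, nt=200, length=1.0, alpha=1.0):
    """Explicit finite-difference solution of u_t = alpha * u_xx on [0, length]
    with u = 0 at both boundaries and a sine-wave initial condition."""
    dx = length / (nx - 1)
    dt = 0.4 * dx * dx / alpha  # stability requires dt <= dx^2 / (2 * alpha)
    u = [math.sin(math.pi * i * dx / length) for i in range(nx)]
    for _ in range(nt):
        # u_xx approximated by the centered difference (u[i-1] - 2u[i] + u[i+1]) / dx^2
        u = [0.0] + [
            u[i] + alpha * dt / (dx * dx) * (u[i - 1] - 2 * u[i] + u[i + 1])
            for i in range(1, nx - 1)
        ] + [0.0]
    return u

# The exact solution decays like exp(-pi^2 * t), so the numerical peak should shrink.
print(max(solve_heat()))
```

Note the stability restriction on the time step: shrink dx and you are forced to shrink dt as well, a theme that returns below.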
Suffice it to say that the less accurate your computer's calculations the faster your answers diverge from reality. In a 3D simulation, like a climate model, this is dependent on not just the computer's internal representation of numbers but on the size of the grid that the world is chopped up into. The larger the chunk the less accurate the simulation, the smaller the chunk the more calculations needed for each time step. It's a tradeoff between resolution and speed. More simply put, with a big grid you miss the small things, like mountains and clouds, allowing you to more quickly compute the wrong answer.
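The divergence is easy to demonstrate without a supercomputer. A toy Python sketch using the logistic map — a standard stand-in for chaotic dynamics, not an actual climate model — rounds one trajectory to six decimal places each step and watches it part company with the full-precision one:

```python
def diverge_step(x0=0.3, r=4.0, steps=100, digits=6):
    """Iterate the chaotic logistic map x -> r*x*(1-x) twice: once in full
    double precision, once rounding to `digits` decimals after every step.
    Return the first step at which the trajectories differ by more than 0.1."""
    exact = rough = x0
    for n in range(1, steps + 1):
        exact = r * exact * (1.0 - exact)
        rough = round(r * rough * (1.0 - rough), digits)
        if abs(exact - rough) > 0.1:
            return n
    return None

# A rounding error of ~5e-7 per step is amplified roughly 2x per iteration,
# so the two runs disagree wildly within a few dozen steps.
print(diverge_step())
```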
Current climate simulators – typically with grid cells of 100 kilometers in width – can resolve large weather systems typical of mid-latitudes, but not individual clouds. Yet it has been discovered that modeling cloud systems accurately is crucial for reliable calculation of global temperature in the future (at least that is what modelers are currently blaming their lack of success on). Even then clouds on scales smaller than a grid cell will still have to be approximated, or parametrized, using simplified equations. “Errors introduced by such parametrizations proliferate and infect calculations on larger scales,” Palmer laments.
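The cost of shrinking the grid is brutal, which is why those 100 kilometer cells persist. A back-of-the-envelope Python sketch shows the idealized scaling (ignoring vertical resolution, parametrizations, and everything else that makes real models expensive): halving the horizontal spacing quadruples the cell count, and the CFL stability condition halves the allowable time step on top of that, for roughly 8x the work per halving.

```python
def relative_cost(coarse_km, fine_km):
    """Idealized cost ratio of refining horizontal grid spacing from coarse_km
    to fine_km: cell count grows with the square of the ratio (two horizontal
    dimensions) and the CFL time-step restriction adds another linear factor,
    giving cubic overall scaling."""
    ratio = coarse_km / fine_km
    return ratio ** 3

# Resolving ~1 km cloud systems instead of ~100 km weather systems:
print(relative_cost(100, 1))   # -> 1000000.0, a million times more arithmetic
```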
Since it is estimated that supercomputers won't become fast enough to get remotely accurate simulation results for another decade, it is unsurprising that modelers are casting about for something to do in the meantime. Palmer has hit upon the idea that dispensing with the need for all that pesky accuracy would make his job easier. To this end he envisions a new type of supercomputer:
Like current ones, such a machine would be massively parallel, comprising many millions of individual processing units. A fraction of these would enable conventional, energy-intensive and deterministic high-precision computation. Unlike conventional computers, the remaining processors would be designed to take on low-energy probabilistic computation with lower-precision arithmetic, the degree of imprecision and inexactness being variable.
Such a computer would be able to do a better job of nonlinear simulations and use less power to boot, the author claims. There are a few wrinkles in Dr. Palmer's scheme, however.
The first is that what he is calling noise has another name – error, and error is the bane of all computation. When he says he wishes to reduce the accuracy of the calculations in some areas he is simply asking the hardware to do something the model writers could do today, albeit with a massive rewrite of their code. This is akin to not liking the balance in your bank account and deciding to keep less accurate records in an attempt to solve the problem.
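Anyone can audition low-precision arithmetic today, no exotic hardware required. A Python sketch that emulates single-precision accumulation by round-tripping through the `struct` module (a crude software stand-in for the low-precision processors Palmer envisions): add 0.1 a million times and compare the drift against ordinary double precision.

```python
import struct

def to_f32(x):
    """Round a Python float (IEEE 754 double) to the nearest IEEE 754 float32."""
    return struct.unpack('f', struct.pack('f', x))[0]

def sum_f32(value, count):
    """Accumulate `count` copies of `value`, rounding to float32 after every add."""
    total = 0.0
    for _ in range(count):
        total = to_f32(total + to_f32(value))
    return total

high = sum(0.1 for _ in range(1_000_000))  # double-precision accumulation
low = sum_f32(0.1, 1_000_000)              # emulated single-precision accumulation

# The intended answer is 100000. The double-precision sum lands within a whisker
# of it; the single-precision sum drifts off by a conspicuous amount, because once
# the running total is large, each added 0.1 is rounded to a coarse float32 grid.
print(high, low)
```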
Second, his goal is to make the simulation results non-deterministic, which is to say, not reproducible. How do you know when you've gotten the magic right answer when the computation yields a different result each time? Considering that reproducibility is one of the founding principles of experimental science this idea is the height of lunacy.
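Reproducibility is already fragile even on deterministic hardware, because floating-point addition is not associative – a parallel machine that reduces a sum in whatever order its processors finish will hit this today. A minimal Python illustration, with numbers deliberately contrived to make the effect obvious:

```python
# The same multiset of numbers, summed in two different orders.
vals = [1e16] + [1.0] * 100 + [-1e16]

forward = 0.0
for v in vals:
    forward += v      # each +1.0 is swallowed: 1e16 + 1.0 rounds back to 1e16

ones_first = 0.0
for v in [1.0] * 100 + [1e16, -1e16]:
    ones_first += v   # the small terms accumulate before the giants cancel

print(forward, ones_first)   # -> 0.0 100.0 — same inputs, different answers
```

Now imagine the summands are grid-cell fluxes and the ordering changes on every run.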
Today, because the results of individual models and parameter sets are so undependable, climate modelers have adopted the convention of forming “ensembles.” These are collections of many model runs, averaged together in the vain hope that adding up several hundred wrong results will somehow yield a correct one (it doesn't).
I know from experience that building a custom computer, which is what such a variable accuracy massively parallel computer would be, takes four or five years from conception to realization. In that amount of time the next two generations of “normal” computer hardware will have been delivered. Palmer frets that coming supercomputers will use entirely too much power – rest assured the engineers are working on that. Before this pipe-dream could be realized the world will have moved on.
The bottom line on this nut-ball idea is that it is a ridiculous suggestion. This is simply an expression of exasperation by a researcher who finds his tools lacking. Palmer wishes for a magic imprecise, probabilistic miracle machine that would yield the accurate results that have eluded climate modelers by being less accurate. Things are evidently getting desperate out there in climate alarmist cuckoo land.
Be safe, enjoy the interglacial and stay accurate.