
Latest supercomputers run truer simulations of extreme weather

A high-resolution simulation of the global climate provides a much better representation of extreme weather events than previous lower-resolution models (Image: Department of Energy/Berkeley Lab)

High-resolution simulations of the global climate can now track actual observations much more closely, and they perform far better at reproducing extreme weather events, a new Berkeley Lab study has found. Lead author Michael Wehner heralds this as evidence of a golden age in climate modeling: not only did the simulation match reality more closely, it also took a fraction of the time it would have required until recently – just three months compared to several years.

"These kinds of calculations have gone from basically intractable to heroic to now doable," Wehner said. "I've literally waited my entire career to be able to do these simulations."

The researchers used a CRAY XE-6 supercomputer at the National Energy Research Supercomputing Center to conduct their analysis for the period 1979 to 2005 at three spatial resolutions – 25 km (15.5 mi), 100 km (62 mi), and 200 km (124 mi) – and then compared the results to both each other and to real-world observations. The highest resolution case required 7,680 cores, while the entire output of all three cases amounted to just over 100 terabytes of data.

The higher-resolution simulation, which you can see in action in the video below, gave a much more accurate representation of weather in areas with a lot of topography (i.e. mountainous regions), since altitude in the simulation grid is averaged over all terrain in each grid square – 25 km (15.5 mi) on a side for the higher-resolution run, and 100 and 200 km (62 and 124 mi), respectively, for the lower resolutions.
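The effect of that averaging can be illustrated with a minimal sketch: when elevations are averaged over coarser grid cells, sharp peaks are smeared out, which is why coarse models struggle with mountainous weather. The terrain numbers and the `cell_average` helper below are hypothetical, purely for illustration, and are not taken from the study's model.

```python
import numpy as np

# Hypothetical 1-D terrain profile sampled on a fine grid: a sharp 3,000 m
# peak flanked by lowlands (illustrative numbers, not real topography).
elevation = np.array([200, 300, 500, 3000, 600, 300, 250, 200], dtype=float)

def cell_average(profile, cells_per_block):
    """Average elevation over blocks of fine cells, as a coarser grid does."""
    return profile.reshape(-1, cells_per_block).mean(axis=1)

fine = cell_average(elevation, 1)    # fine grid: peak preserved at 3,000 m
coarse = cell_average(elevation, 4)  # 4x coarser grid: peak smeared out

print(fine.max())    # 3000.0
print(coarse.max())  # (200 + 300 + 500 + 3000) / 4 = 1000.0
```

In the coarse version, the 3,000 m peak survives only as a 1,000 m bump – the kind of smoothing that makes low-resolution models misrepresent storms driven by steep terrain.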

(Video: LBNL fvCAM5 25 km simulation)

The higher fidelity particularly produced stronger and more frequent storms in locations where mountains and hills are prevalent, in keeping with actual observations, and it provided significantly more realistic results for midlatitude winter and spring over land. But it was not always superior in its realism.

The simulations revealed that many deficiencies in the Community Atmosphere Model 5.1 they used become significantly worse at higher resolutions, such as a bias toward forming a "double intertropical convergence zone." More uniform regions were often better represented at the 100 km resolution, and extreme precipitation over land in the midlatitude summer was too high.

Even so, future modeling at high resolution like this will be a huge help in projecting long-term climate change. Wehner was a lead author on the chapter concerning this in the fifth Intergovernmental Panel on Climate Change assessment report, which concluded that the contrast between wet and dry seasons will increase with global mean temperatures – leading to more extreme (in both magnitude and frequency) precipitation events in the wetter regions.

"Knowing it will increase is one thing," Wehner said, "but having a confident statement about how much and where, as a function of location, requires the models do a better job of replicating observations than they have." And the key to this is more high-resolution modeling.

His next project is to run the 25 km-resolution model for a future-case scenario, though he argues that it's just a matter of time before scientists get to one-kilometer (0.6-mi) resolutions. Before they can, however, they must improve their understanding and modeling of cloud behavior.

"That will be a paradigm shift in climate modeling," Wehner said. "We’re at a shift now, but that is the next one coming."

A paper describing the research was published in the Journal of Advances in Modeling Earth Systems.

Source: Berkeley Lab

6 comments
Steve Jones
Why do all of the elements have to be the same size? It sounds like it would make more sense to have smaller elements in regions of highly variable elevation.
Catweazle
Anyone who claims that an effectively infinitely large, open-ended, non-linear, feedback-driven chaotic system (where we don't know all the feedbacks, and even among the ones we do know, we are unsure of the signs of some critical ones) – hence subject, inter alia, to extreme sensitivity to initial conditions – can be predicted over any significant time period is either a charlatan or a computer salesman.
Ironically, the first to point this out was Edward Lorenz, a climate scientist in his 1963 paper Deterministic Nonperiodic Flow in the Journal of the Atmospheric Sciences.
ivan4
Until such time as these computer models actually produce output that matches measured, observed readings of what the climate is doing, they are only good for making pop art pictures. They are totally useless for making any political decisions, and both those who make those decisions and those who supply the information should be the ones made to bear all costs involved. I would go further and say those producing the incorrect data should be out of a job.
Synchro
@Ivan4. ALL data is incorrect. It's a matter of error margin. For example, look at your watch: it's wrong, but you don't know by how much.
In reducing the computation time from years to months, this has allowed them to spot inadequacies in this particular model *which is the entire reason for doing it*. Models get better by iteration - but it's hard to iterate if it takes years to run simulations.
How do you possibly expect to get 'politically useful' information if you don't support efforts to make that data available in a relevant time frame, especially when that data affects everyone?
Your obstructionist suggestions belie your politics.
moreover
@Synchro - Very good point about getting better by iteration. I was curious about the Cray supercomputer (because my MIT roommate worked for them after college). Turns out that the Cray XE6 is already being replaced by a new model this month, and it runs on a flavor of Linux. Let's hope the big labs have the necessary funding to keep up. I can make do with an old computer, but global modeling really improves with higher speeds.
Matt Fletcher
There's a reason why weathermen and weatherwomen are called meteorologists, and that's because our weather mostly comes from space. We may be able to tell what storm fronts and what high- and low-pressure systems will do, but you don't know when the next X flare or large CME will be coming unless you can predict what the sun is going to do. Plus, add cosmic rays and solar winds, and I can safely say they will never be able to predict the weather accurately until they can foresee what's coming at us in space.