Virtually reality


This is part 2 of a series of introductory posts about the principles of climate modelling. Others in the series: 1.

The second question I want to discuss is this:

How can we do scientific experiments on our planet?

In other words, how do we even do climate science? Here is the great, charismatic physicist Richard Feynman, describing the scientific method in one minute:

http://www.youtube.com/watch?v=b240PGCMwV0

If you can’t watch this charming video, here’s my transcript:

“Now I’m going to discuss how we would look for a new law. In general, we look for a new law by the following process:

First, we guess it.

Then we — no, don’t laugh, that’s the real truth — then we compute the consequences of the guess to see what, if this is right, if this law that we guessed is right, we see what it would imply.

And then we compare the computation result to nature, or we say compare to experiment, or experience, compare it directly with observations to see if it works.

If it disagrees with experiment, it’s wrong. In that simple statement is the key to science. It doesn’t make a difference how beautiful your guess is, it doesn’t make a difference how smart you are, who made the guess, or what his name is — if it disagrees with experiment, it’s wrong. That’s all there is to it.”

What is the “experiment” in climate science? We don’t have a mini-Earth in a laboratory to play with. We are changing things on the Earth, by farming, building, and putting industrial emissions into the atmosphere, but it’s not done in a systematic and rigorous way. It’s not a controlled experiment. So we might justifiably wonder how we even do climate science.

Climate science is not the only science that can’t do controlled experiments of the whole system being studied. Astrophysics is another: we do not explode stars on a lab bench. Feynman said that we can compare with experience and observations. We would prefer to experience and observe things we can control, because it is much easier to draw conclusions from the results. Instead we can only watch as nature acts.

What is the “guess” in climate science? These are the climate models. A model is just a representation of a thing (I wrote more about this here). A climate model is a computer program that represents the whole planet, or part of it.* It’s not very different from a computer game like Civilization or SimCity, in which you have a world to play with: you can tear up forests and build cities. In a climate model we can do much the same: replace forests with cities, alter the greenhouse gas concentrations, let off volcanoes, change the energy reaching us from the sun, move the continents. The model produces a simulation of how the world responds to those changes: how they affect temperature, rainfall, ocean circulation, the ice in Antarctica, and so on.

How do they work? The general idea is to stuff as much science as possible into them without making them too slow to use. At the heart of them are basic laws of physics, like Newton’s laws of motion and the laws of thermodynamics. Over the past decades we’ve added more to them: not just physics but also chemistry, such as the reactions between gases in the atmosphere; biological processes, like photosynthesis; and geology, like volcanoes. The most complicated climate models are extremely slow. Even on supercomputers it can take many weeks or months to get the results.
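To make this concrete, here is a minimal sketch of the simplest possible “climate model”: a zero-dimensional energy-balance model that applies just one of those basic physical laws, the Stefan-Boltzmann law for thermal radiation, to the whole planet at once. The parameter values are standard textbook figures, and the model is purely illustrative; a real GCM solves fluid dynamics on a three-dimensional grid.

```python
# Toy zero-dimensional energy-balance model (illustrative only).
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0       # solar constant, W m^-2
ALBEDO = 0.3      # fraction of sunlight reflected back to space
C = 4.2e8         # heat capacity of a ~100 m ocean mixed layer, J m^-2 K^-1

def step(T, dt):
    """Advance global mean temperature T (kelvin) by dt seconds."""
    absorbed = S0 * (1 - ALBEDO) / 4   # globally averaged absorbed sunlight
    emitted = SIGMA * T**4             # outgoing thermal radiation
    return T + dt * (absorbed - emitted) / C

T = 288.0                # start near today's observed surface temperature
dt = 10 * 86400          # 10-day time step
for _ in range(int(200 * 365.25 / 10)):   # integrate for ~200 years
    T = step(T, dt)

print(round(T, 1))       # settles near 255 K
```

Even this toy shows the basic workflow: encode the physics, step forward in time, compare with observations. It settles near 255 K rather than the observed 288 K because it leaves out the greenhouse effect entirely, which is Feynman's test in action: it disagrees with observation, so something is missing.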

Here is a state-of-the-art simulation of the Earth by NASA.

The video shows the simulated patterns of air circulation, such as the northern hemisphere polar jet stream, then patterns of ocean circulation, such as the Gulf Stream. The atmosphere and ocean models used to make this simulation are high resolution: they have a lot of pixels so, just like in a digital camera, they show a lot of detail.

A horizontal slice through this atmosphere model has 360 x 540 pixels, or 0.2 megapixels. That’s about two thirds as many as a VGA display (introduced by IBM in 1987) or the earliest consumer digital camera (the Apple QuickTake from 1994). It’s also about the same resolution as my blog banner. The ocean model is a lot higher resolution: 1080 x 2160 pixels, or 2.3 megapixels, which is about the same as high definition TV. The video above has had some extra processing to smooth the pixels out and draw the arrows.
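The figures above are simple arithmetic and easy to check; a quick sketch (the display and camera resolutions are the standard published values):

```python
def megapixels(width, height):
    return width * height / 1e6

atmos = megapixels(540, 360)     # horizontal slice of the atmosphere model
ocean = megapixels(2160, 1080)   # horizontal slice of the ocean model
vga = megapixels(640, 480)       # IBM VGA display, 1987
hdtv = megapixels(1920, 1080)    # 1080p high-definition TV

print(round(atmos, 2), round(ocean, 2))   # 0.19 and 2.33 megapixels
print(round(atmos / vga, 2))              # ~0.63, i.e. about two thirds of VGA
```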

I think it’s quite beautiful. It also seems to be very realistic, a convincing argument that we can simulate the Earth successfully. But the important question is: how successfully? This is the subject of my next post:

Can we ever have a perfect “reality simulator”?

The clue’s in the name of the blog…

See you next time.

 

* I use the term climate model broadly here, covering any models that describe part of the planet. Many have more specific names, such as “ice sheet model” for Antarctica.

 

Virtually reality by All Models Are Wrong, unless otherwise expressly stated, is licensed under a Creative Commons Attribution 4.0 International License.

This entry was posted in climatemodels, definitions, introductory.

5 Responses to Virtually reality

  1. genemachine says:

Do these models reproduce any of Earth’s weather patterns and circulations?

    1) The Gulf Stream; mass, speed and temperature

    2) El Niño and the Pacific Decadal Oscillation; frequency and scale

    3) precipitation patterns; deserts and wet seasons in roughly the right places with similar precipitation timings and intensities

    4) Cloud cover; hours of sunshine in different places

  2. Hans Jelbring says:

    Dear Dr Edwards,

Since we have similar educations, I am glad to confirm that you are describing
a virtual reality when referring to the highly complex NASA model and the
General Circulation Models often mentioned in the IPCC reports. The real
question is whether there is any value at all in using them beyond very short
time scales.

To use them for “predicting” the future is like entering the world of Alice
in Wonderland. This is one reason why their output is called “projections”
among knowledgeable people, and not predictions. I will try to justify my
statement in a way that is easy to understand for anyone with a little
education in maths or gambling.

I think you would agree that any model rests on a number of assumptions,
where each one has a chance (p) of being correct, somewhere between 0 and 100%.
Assume that a GCM rests on 50 such assumptions. If any single one of them is
really bad and p = 0, the model is useless (at least for long-term
forecasts). The weakest link in the logical chain decides the quality of the output.

Let's make the claim that all of the basic assumptions and laws have a p = 99%
or alternatively 90% chance of being valid on average. The chance that a long-term prediction P will be correct is then approximately P = p^50.

Consider two cases:
I. p = 0.9 gives P = 0.9^50 = 0.0051. The model will be correct about
5 times out of 1000, which means that it is useless as a prediction tool.
II. p = 0.99 gives P = 0.99^50 = 0.60, meaning the end result is correct
about 60 times out of 100, which is not good enough. A probability of
0.95 ought to be achieved to reach a scientific standard.
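Taking the stated assumptions at face value (including independence between all 50 of them), the arithmetic is easy to reproduce:

```python
n = 50                       # number of assumed-independent assumptions
p_low, p_high = 0.90, 0.99   # per-assumption chance of being correct

P_low = p_low ** n           # joint chance if each is 90% likely to hold
P_high = p_high ** n         # joint chance if each is 99% likely to hold

print(round(P_low, 4))       # 0.0052
print(round(P_high, 3))      # 0.605
```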

However, there is no scientific support that the impact of anthropogenic carbon dioxide emissions
(the unproven greenhouse effect from carbon dioxide) has the effect claimed
by the GCMs. Let's be generous and estimate that the chance
of this assumption being correct is 10%. This value should actually be lower, IMO, since
no one has verified its impact in nature despite 20 years of hard work among
scientists.

Then, there are 49 more assumptions that can and should be discussed separately before putting any faith in long-term climate models. Consequently, IMO, the chance that GCMs have a predictive capacity is approximately zero. Any result that is claimed to have scientific value has to be verifiable. If that's not possible, or has been ignored, the result simply does not reach a scientific standard.

Colorful animations such as the one made by supercomputers at NASA don't change that fact. They can serve the purpose of general information at high-school level, but not more. It is sad when qualified scientists use them as more than guidelines and expect them to influence prominent decision makers in our societies.
    Best

    Hans Jelbring
    Ph.D. Climatologist

    • Alexander Harvey says:

      The probability of my writing this sentence by the random selection of 8 bit bytes was less than 1 in 10 to the power of 100; so I didn’t.

    • tallbloke says:

      Hi Hans,
      I wonder if there’s going to be anything but a ‘smart Alex’ reply to your cogent observations regarding probability. Tricky stuff, science communication.

  3. tallbloke says:

    Hi Tamsin, you said:
    “At the heart of them are basic laws of physics, like Newton’s laws of motion and the laws of thermodynamics”.

I have spoken to many climate scientists who seem to think that at equilibrium, the troposphere would be isothermal, in accordance with the Maxwell-Boltzmann distribution. They argue that there would be no lapse rate without radiation and convection. Do the models make this assumption too?

If they do, it seems like a grave error to me. The true situation isn’t like Maxwell’s ‘isolated column of gas’ for several reasons. Although it is bounded by Earth and space, its volume is not constrained. NASA has confirmed observationally that since the Sun went quiet in 2004, the thermosphere has shrunk by 30% and the average altitude of the cloud deck has dropped by 30m.

Gravity acting on the mass of the atmosphere produces a pressure gradient by pulling the gas against the solid Earth. Because air is compressible, that pressure gradient acts to produce a density gradient. Because the molecules nearer the surface are pushed closer together, there are more of them in a given volume. That makes it more likely that water vapour and CO2 molecules in the volume will intercept and absorb photons of incoming solar energy and photons of outgoing long wave radiation.

    Because the path length is short, these energised molecules soon share their extra energy with surrounding molecules of nitrogen and oxygen which make up the bulk of the atmosphere. The ensemble has a heat capacity. Therefore, the denser near surface air will get hotter than the less dense air at altitude.

    Naturally, in accordance with the gas laws, the warmer air nearer the surface will have been expanded by the higher temperature and its density thus lowered again, but this won’t fully offset the effect due to gravity acting on mass to raise pressure and density to form a gradient. (We should calculate the value).

    So even before we consider convection and radiation, there will be a ‘pre-existing’ lapse rate due to these simple thermodynamic-gravitational considerations.
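For reference, the textbook gravity-plus-heat-capacity calculation gives the dry adiabatic lapse rate, g/c_p. This is a standard value rather than a claim about what the models assume, and whether an atmosphere would sustain any such gradient without convection is of course the point under discussion:

```python
g = 9.81      # gravitational acceleration, m s^-2
cp = 1004.0   # specific heat of dry air at constant pressure, J kg^-1 K^-1

lapse_rate = g / cp                 # temperature drop in K per metre of altitude
print(round(lapse_rate * 1000, 1))  # ~9.8 K per kilometre
```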

    Is it included in the models?

