New Computer Fund

Monday, February 15, 2016

Layers of Debating the Greenhouse Effect

It is pretty funny how some parts of the debate never seem to get resolved, and they are often the simplest parts, until you think about what the misunderstanding really is: idealism versus realism.

Take the analogy of a black body versus a gray body, for example.  A black body is an ideal representation of a "perfect" radiant source, and since there is no such thing outside of a lab, a gray body is supposed to represent real-world conditions, but it isn't something that can be observed in nature either.  You can call something a gray body or a black body, but there are still arguments in physics about what each might be on a large scale.

In a lab, a black body is a tiny slit in the shell of some volume at constant temperature.  The interior of the volume is coated with something like lamp black, and for precise measurements there might be a tiny object opposite the slit to provide a stable energy source at the same temperature as the volume.  Since different frequencies of electromagnetic energy have different wavelengths, measuring a wavelength might require adjusting the thickness of the shell so that random, isotropic radiation appears to be polarized for the instrument used to measure the energy.  Since this slit in a shell of assumed negligible thickness is in "equilibrium", the radiant energy flowing into the slit balances the flow out of the slit.  So if you are measuring an energy flow of 250 Wm-2, the actual effective energy of the shell and the whole interior of the black body cavity would be 500 Wm-2.

What most people seem to forget is that a black body cavity is a reference, not a reality.  If you have a perfect radiant surface, aka shell, in equilibrium, the same amount of energy exiting will be entering, so the internal energy would have to be twice that of the exiting energy.  A negligibly thin surface, though, would have no mass to speak of, so it could not store energy or force the black body into equilibrium on its own.  This perfect shell would just be a feature of a system in equilibrium, not the cause of it.

Now for the simple Greenhouse effect explanation: that shell receives about 240 Wm-2 and radiates 240 Wm-2 at some point in the atmosphere.  That point is "assumed" to be at the surface and negligibly thin by default, since an ideal black body is used for reference.  That requires the interior of that shell to be at 2 x 240 Wm-2, or 480 Wm-2, if the system is in equilibrium like a true black body should be.

In the lab you have the luxury of forcing the body into equilibrium, adjusting the slit dimensions and the thickness of the shell to measure what you want.  You don't have that in the real world with a planet as a source, so you can either stick doggedly to "your" concept or play with a number of references.  For example, from the top of the atmosphere, the interior "should have" twice the energy, and from the "surface", the shell or TOA "should have" half the energy.

From the TOA the surface should be 480 Wm-2, and from a surface at 390 Wm-2 the shell, the TOA, should be 195 Wm-2.  The "surface" difference is 90 Wm-2 and the shell difference is 45 Wm-2, meaning that 45 Wm-2 is the black body reference effect, provided the "average" surface temperature is meaningful.  Since the real surface has other forms of energy, latent, sensible, kinetic and potential, that are internally never in "equilibrium" but instead in a sort of pseudo steady state, it is the more complex of the choices between the shell reference and the interior reference.  45 to 90 Wm-2 is a pretty big range for a reference used to determine a 4 to 8 Wm-2 impact.  It isn't a complete obstacle, and you can work some wicked statistical magic, but logically the "surface", in this case the "shell", with the least amount of uncertainty would be less of an obstacle.  Except for the little issue of that shell not being something that actually exists.  The shell is an average over turbulent fluid dynamics inside, through and around a theoretical "TOA".  You can estimate the energy of the "shell/TOA" but cannot actually define some surface as THE shell.
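To keep that bookkeeping straight, here is a minimal sketch in Python of the "surface" versus "shell" arithmetic, using only the flux values quoted above and the Stefan-Boltzmann constant; the variable names are mine, purely for illustration.

```python
# Sketch of the "surface" vs "shell" reference arithmetic described above.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def temperature(flux):
    """Equivalent black body temperature (K) for a radiant flux (W/m^2)."""
    return (flux / SIGMA) ** 0.25

toa = 240.0                       # absorbed/emitted at the "shell" (TOA)
surface = 390.0                   # surface radiant energy
interior_from_toa = 2 * toa       # 480 Wm-2 if the TOA is a black body shell
shell_from_surface = surface / 2  # 195 Wm-2 if the surface is the interior

print(interior_from_toa - surface)  # 90.0  -> the "surface" difference
print(toa - shell_from_surface)     # 45.0  -> the "shell" difference
print(round(temperature(toa)))      # ~255 K for 240 Wm-2
print(round(temperature(surface)))  # ~288 K for 390 Wm-2
```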

When you have what I call a fuzzy reference and then try to set definitive values of the energy flows to force some example of "equilibrium", you open the door to the Sky Dragons, who are just pointing out defects in your choice of reference.  Everyone should know there is a 45 to 90 Wm-2 "fuzzy" range and focus on the concept instead of trying to make believe it is a physical fact.

The infamous Earth energy budgets of Kiehl and Trenberth illustrated that there is about 390 Wm-2 of surface radiant energy, 17 Wm-2 of convective or sensible energy and about 88 Wm-2 of latent energy, and assumed everything else is negligible.  That produces a total surface energy of about 495 Wm-2, or about 15 Wm-2 more than what would be expected from a "shell" reference.
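A one-off check of that total, using the rounded flux values quoted above rather than the exact published Kiehl and Trenberth figures:

```python
# Rounded surface energy budget terms from the text, in W/m^2.
radiant, sensible, latent = 390.0, 17.0, 88.0
total = radiant + sensible + latent    # 495 Wm-2 total surface energy
shell_reference = 2 * 240.0            # 480 Wm-2 expected from a "shell"
print(total, total - shell_reference)  # 495.0 and the ~15 Wm-2 excess
```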

The climate scientists blow off the Sky Dragon efforts instead of explaining the uncertainties involved.  The climate scientists also downplay lower troposphere temperatures, which are effectively measuring a different shell that would have a different energy, while they use a hybrid "surface" of mixed temperature measurements averaged over so large a range that the meaning of the average is in dispute.

Average incoming radiant energy is also dependent on which "surface" is being used as a reference.  Since the "surfaces" are fluid and in motion, you really need an average for each layer of fluid, or a range, instead of a fixed number.  TSI/4 (ideal) versus TSI/pi (oceans) provides a range of about 93 Wm-2, remarkably similar to the 90 Wm-2 difference you would expect by comparing "surface" and "shell" references.
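The TSI arithmetic is easy to reproduce; a nominal TSI of about 1361 Wm-2 is assumed here, since the exact value drifts a little with the solar cycle:

```python
import math

TSI = 1361.0                  # nominal total solar irradiance, W/m^2
ideal = TSI / 4               # ~340 Wm-2, averaged over the whole sphere
oceans = TSI / math.pi        # ~433 Wm-2, a daylit-hemisphere style average
print(round(oceans - ideal))  # ~93 Wm-2 range between the two references
```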

None of this by itself will get you any greater certainty, but comparing the two does provide a useful range while eliminating most of the Sky Dragon arguments that are based on an overly simplistic understanding of the problem.  S. Manabe indicated that the range of the GHE could be more than 60 C degrees depending on your reference, but that sound science seems to have been lost in the political battle to sell concepts instead of testing them.

Because of the simplistic explanation there are plenty of non-issue issues.  Evaporation, for example, doesn't "cool" the surface; it transfers energy to another "surface", which makes it easier for the energy to be moved over the surface, particularly to higher-latitude land areas, decreasing heat loss on a larger scale.  Since this is latent or hidden energy in one location that becomes sensible and radiant in another, once again you have that pesky ~90 Wm-2 that is difficult to classify depending on your choice of "surface".  The latent flux is just plain difficult to accurately estimate, and once it becomes sensible energy the temperature change it produces can vary quite dramatically.  If you use a term like "irreducible imprecision", which is actually pretty accurate, you see the eyes of your audience glaze over.

However, energy transferred from the surface or absorbed in the atmosphere increases the effective temperature of the atmosphere, improving the insulation value of the atmosphere.  Until you reach a radiant-energy-only portion of the atmosphere, the plain old basic rules of thermodynamics are alive and well.  Decrease the temperature differential and you decrease heat flow.  This has actually been tried in a number of passive and active, or dynamic, insulation schemes for homes, but for the most part they cost more than they save.  With energy haters in charge, though, that may change.

Some statistical magic applied to the TOA and the ocean heat uptake data does allow you to reduce uncertainty to a point, but since the TOA flux varies by 10 Wm-2 or more, the best estimate is still coarse enough to drive perfectionists batty.

Applying similar statistical magic to the "surface" temperature anomaly creates a false sense of certainty, because defending the relevance of the global mean surface temperature anomaly in the face of the irreducible uncertainty just fuels the Sky Dragon fires.

In general, the great climate change debate in terms of actual physics is an exercise in futility, since the reality is much more complex than the ideology.  Pity, since the actual puzzle is interesting.  Unfortunately, all this is pretty basic stuff which the climate science Gurus are beyond discussing.  The neat thing about "good" physics, though, is that there is always more than one way to skin a cat.  Climate science currently requires just one series of assumptions, which isn't necessarily "good" physics.

end of rant

Friday, February 12, 2016

Interpreting the Data

Okay, you have a laboratory the size of a planet and very limited data to work with, in spite of everyone and their siblings jumping on the bandwagon.  So you might develop a tendency to cherry pick things that are closer to your wheelhouse and ignore stuff that doesn't quite make sense.  What you need are a few gut checks.

Since we live on a water world, warming means more water vapor in the atmosphere.  You are not really sure how much, but there pretty much has to be more somewhere, because warmer air can hold more water before reaching saturation.

Sea surface temperature averaged over the entire ocean area, which is the energy proxy for the majority of the surface area, doesn't provide any information on how much water should be in the atmosphere.  The land surface temperature average, (Tmax+Tmin)/2, isn't all that great either for a gut check.  If you stick to areas that should have moist air, you can use the difference between Tmax and Tmin, the diurnal temperature range (DTR), and as there is more moisture in the air the range should decrease.  Fantastic! BUT! You can also have the moist surface area expand so there isn't much change in DTR, just an increase in the "moist air envelope".  Got that?  As the Earth's surface warms, the area above about 0 C would expand and there would be more moisture in the air, but you have to consider both "features" of the thermodynamics.
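A toy example of the DTR gut check; the numbers here are made up purely to illustrate how moistening can narrow DTR while barely moving the (Tmax+Tmin)/2 average.

```python
# Hypothetical daily extremes (deg C) for a dry site and a moister site.
sites = {
    "dry":   {"tmax": 30.0, "tmin": 10.0},
    "moist": {"tmax": 29.0, "tmin": 13.0},  # moisture props up Tmin at night
}

for name, t in sites.items():
    dtr = t["tmax"] - t["tmin"]         # diurnal temperature range
    tave = (t["tmax"] + t["tmin"]) / 2  # the usual land average
    print(f"{name}: DTR = {dtr:.1f} C, Tave = {tave:.1f} C")
# dry:   DTR = 20.0 C, Tave = 20.0 C
# moist: DTR = 16.0 C, Tave = 21.0 C
```

If the moist envelope simply expands instead, each station's DTR can stay put while the area-weighted picture changes, which is the second "feature" mentioned above.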

When the Berkeley Earth Surface Temperature (BEST) project started, they added some of this information for the average guy on the street to see.  Additional Area Above Freezing is a nice thing to see.  It reinforces the obvious, that the Earth is warming, but it doesn't explain why.  It could be due to greenhouse gas warming/black carbon, just plain warming from the Little Ice Age (LIA), improvements in agriculture and snow removal technology, or a combination of everything.  The increased land area above freezing is about 4 million square kilometers, which is around 1% of the global surface area.  It would be nice to know the ocean area change, but we don't have that information.

BEST also had a graphic showing how DTR decreased until around 1985, when the new digital Maximum and Minimum Temperature System (MMTS) was installed, but I cannot find that on their site very easily anymore.  That shift to increasing DTR isn't kosher with the physics you should expect.  It turns out there are a few issues with the change from the old liquid-in-glass thermometers to the new digital thermometers.  For whatever reason, that issue isn't brought to people's attention very often.  What it means, though, is that there is some bias in the MMTS data, mainly related to the Tmin readings, which is part of the gut check.  Unexpected stuff is supposed to be the fun part of science, and when the unexpected stuff is exactly opposite of what you expect it is really fun.  Unfortunately, this unexpected stuff could be instrumentation-related error, which isn't as much fun.

The not-so-fun part of instrumentation error is the acknowledgement: "I don't know how much we should believe this data."  That means figuring out some ad hoc uncertainty margin for the FIIK issue.  Engineers are used to a fair amount of FIIK, so they have rules of thumb which are "proven" gut checks and safety margins.  Scientists, especially very vocal scientists that have already sold the public on their remarkable abilities, tend to have more issues with the occasional FIIK.

[Chart: Climate Research Unit land DTR, via KNMI Climate Explorer]

Luckily, the Climate Research Unit provides their DTR to KNMI Climate Explorer, so we can see the issue.  If you are a fan of "CO2 done it", the curve down from about 1950 to about 1985 is exactly what you expect to see: CO2 warming is causing water vapor feedback, which will amplify the CO2 warming.  However, 1985 to present kinda puts a hitch in your water vapor feedback giddy up.  The difference is only about 0.3 C in DTR, or about 0.15 C in Tave, so with land at roughly 30% of the globe it is only about 0.05 C "globally", but there could be more issues, couldn't there?  1900 to 1950 is also a bit different than what we should expect.
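The back-of-envelope dilution runs like this; the ~30% land fraction is the standard rounded figure, not something from the CRU data:

```python
dtr_shift = 0.3                    # apparent DTR hitch, deg C
tave_shift = dtr_shift / 2         # ~0.15 C, since Tave = (Tmax + Tmin) / 2
land_fraction = 0.3                # land is roughly 30% of the global surface
print(tave_shift * land_fraction)  # ~0.045, i.e. about 0.05 C "globally"
```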

The net impact of this isn't much, but then neither is the total warming that can be easily attributed to CO2 equivalent forcing.  The more interesting part is the pre-1950 period, which has the least accurate data.  Many of the earliest land surface temperatures were Tmax only, and often only for summer months above the Arctic Circle, which wouldn't show any DTR change.

This post is inspired by Greg Goodman's post on Climate Etc.  Greg basically noticed the difference in thermodynamic meaning between land and ocean temperatures.  As a rule of thumb, the oceans have about twice the sensible heat capacity of the "average" land area.  Adding energy to the oceans results in about half the temperature increase that the same amount of energy added to land would produce.
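That rule of thumb in a few lines; the 2:1 ratio is the rough figure from the text, not a measured constant:

```python
# Same energy input into reservoirs with different effective heat capacities.
energy = 100.0            # arbitrary energy units
c_land = 1.0              # land heat capacity (arbitrary units)
c_ocean = 2.0 * c_land    # rule of thumb: ~2x the sensible heat capacity
dT_land = energy / c_land
dT_ocean = energy / c_ocean
print(dT_ocean / dT_land) # 0.5: oceans warm about half as much
```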

Greg could have gone further and noted that increasing the cold, dry polar areal coverage would increase the difference in specific heat capacity even more, changing the ratio.  If the areal coverage remained the same, there would be a more consistent reference, which could allow the teasing out of more information.  But with adjusted temperatures and changing area, you have a moving reference that isn't very reliable.

You can see that the variance in the chart above gets smaller as you go back in time, which can lead to a false sense of precision.  Then, when the data keepers publish, they tend to emphasize the remarkable accuracy they have, based on using large numbers of measuring points that didn't exist early in the record.  That small 0.05 C could double the amount of uncertainty they claim based on their statistical treatments.  A small hitch, but large compared to the stated uncertainty.

The reality is that the difference in the heat capacity of the oceans and land mass is a feature that could be used to refine uncertainty ranges and allow a bit more accurate attribution, provided there is a standard reference.

I started looking into a more standard reference a few years ago but let it go because of the politics involved.  The oceans, with the higher heat capacity, and the tropical oceans, with the majority of the heat content, should make a more reliable reference, also known as a teleconnection.  Using that as a standard, you could compare regional data adjusted for expected amplification based on specific heat capacity.  That was the main part of my 2000 years of Climate post.

Since then there have been a number of minor adjustments to the different data sets, so I might do it over to see what impact they have had.  I had hoped that BEST would have an absolute temperature product by now, but I guess that is still a ways off.

Just remember that this quibbling over a "small" issue is a bit like quibbling over a gun sight being off a quarter of an inch at ten yards.  If the gun happens to be a sniper rifle, it could miss by a couple of feet at 1,000 meters.  You need extreme accuracy in references if you are going to project to the distant past.

Wednesday, February 3, 2016

Rehashing a Rehash of Basic Thermodynamics

Heat reservoir versus heat sink - A heat reservoir is bi-directional, while a heat sink is one-way.  Most heat sinks aren't even close to ideal, but they never reverse direction during normal operation.  For climate science, the oceans are a good reservoir, the atmosphere is a poor reservoir, and the poles are heat sinks.

Zeroth Law of Thermodynamics - A "thermodynamic temperature", source temperature or sink temperature, has to accurately describe the energy available in the source and sink.  The "average" temperature of a process is pretty much meaningless.  "The zeroth law of thermodynamics states that if two thermodynamic systems are each in thermal equilibrium with a third, then they are in thermal equilibrium with each other."  Since radiant energy varies with the fourth power of temperature, a "halfway" temperature would indicate different energy flows to source and sink, so it would not be in equilibrium or steady state relative to both source and sink.  It is an invalid frame of reference.  For example, 0 K (0 Wm-2) to 288 K (390 Wm-2) would have an average temperature of 144 K but an average energy of 195 Wm-2, which has an equivalent temperature of 242 K.  If you use 242 K as a frame of reference with respect to 288 K or 0 K, there is a huge error.  The 33 C "greenhouse effect" temperature and energy range is meaningless by itself.
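A minimal sketch of that mismatch, using the same Stefan-Boltzmann relation as above:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def flux(T):
    """Black body radiant flux (W/m^2) at temperature T (K)."""
    return SIGMA * T ** 4

T_hot, T_cold = 288.0, 0.0
avg_T = (T_hot + T_cold) / 2              # 144 K "halfway" temperature
avg_F = (flux(T_hot) + flux(T_cold)) / 2  # ~195 Wm-2 average energy
print(round(flux(avg_T)))                 # ~24 Wm-2, NOT 195 Wm-2
print(round((avg_F / SIGMA) ** 0.25))     # ~242 K, NOT 144 K
```

Averaging the temperatures and averaging the energies give answers that differ by nearly 100 K, which is the whole point about the frame of reference being invalid.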

Latent and convective heat in an open system is a bitch.  Latent heat and convection are inter-related (higher surface convection increases the rate of evaporation), the flow of the latent heat is driven by more than just temperature differential, and there is phase change in the stream that varies with turbulent mixing, which can change the flow rate.  You have a marvelously complex fluid dynamics problem without a reliable solution.  You may be able to approximate a range of possible solutions, but the problem is essentially chaotic.

All this makes for a wonderful puzzle with no perfectly correct solution.  You can "ASSUME" any number of sources and sinks, which is nothing more than varying your frames of reference.  You can pigeonhole each frame with some neat sciency-sounding name, like maximum entropy production, minimum entropy production, constructal theory, dissipation theory and probably a few dozen others, and it is pretty likely that no two independent methods will agree "exactly".  If the methods are done correctly and the data are valid, you will end up with a range of possible "answers" which should define a range or region of probability.

If you consider the thoughts of S. Manabe, THE greenhouse effect should produce about 60 C of "surface" warming.  Convection, which is intertwined with latent heat, should produce about -30 C of negative feedback to the GHE from that "surface" reference.  If the GHE and the convective feedback are not perfectly linear, which is very likely, you have the potential for regime changes.  The trick is to find the likely range of temperatures AND the offset that may be produced by man-caused CO2e "forcing".
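Spelled out, the arithmetic implied there is trivial; the 60 C and -30 C figures are this post's reading of Manabe, not exact published values:

```python
ghe_radiative = 60.0         # GHE-only "surface" warming, deg C
convective_feedback = -30.0  # convective/latent negative feedback, deg C
print(ghe_radiative + convective_feedback)  # ~30 C net, near the usual 33 C
```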

You can reduce all this down to a simplified partial differential equation that includes all of the inter-dependencies of the known variables, but that gets you right back to the fluid dynamics problem, which is essentially chaotic.  You can "complex model" the system various ways, but the results will always depend on initial conditions, resulting in, hopefully, the same range or region of probabilities you get with a suite of simple models.

The Climate Illuminati despise chaos but thrive on uncertainty, which is a bit bizarre.  A range is a range regardless of how you get there.  Since all you will ever get is a range, you need to embrace it and make decisions based on the highest probability, which is currently about 0.8 C more warming from our current conditions.