New Computer Fund

Wednesday, May 27, 2015

Lindzen's Iris back in vogue

Richard Lindzen's Iris Hypothesis is being discussed anew, with most of the same issues still in place.  I am not really a fan of the Iris Hypothesis because it is framed more in terms of radiant energy than plain vanilla thermodynamics.  Most of the radiant models require quite a few thermodynamic and fluid-dynamic assumptions that I just cannot accept.

Over the tropical ocean you have surface air that stays close to saturation most of the time.  For a given temperature you have a saturation vapor pressure, and as the temperature increases, the saturation vapor pressure increases and the dew point temperature also increases.  You have more potential water vapor and a larger temperature range to wring that water vapor out.  All things remaining equal, clouds should start forming at a lower altitude and persist longer.  That should be a pretty simple negative feedback to increased SST.
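That saturation vapor pressure argument can be sketched numerically.  The Magnus approximation below is a stand-in for the real thermodynamics, and the coefficients are the common Alduchov-Eskridge values, which are my assumption rather than anything from this post.

```python
import math

def e_sat(t_c):
    """Approximate saturation vapor pressure (hPa) over water at t_c deg C,
    using the Magnus formula (Alduchov-Eskridge coefficients, assumed)."""
    return 6.1094 * math.exp(17.625 * t_c / (t_c + 243.04))

# Near tropical SSTs the available water vapor grows roughly 6-7% per
# degree, which is the extra moisture there is to wring out.
for t_c in (26, 28, 30):
    print(f"{t_c} C -> {e_sat(t_c):.1f} hPa")
```

That works out to roughly 34, 38 and 42 hPa at 26, 28 and 30 C, which is the "more potential water vapor" part of the argument in numbers.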

Clouds forming at a higher temperature and lower altitude can create greater supersaturation levels and produce more supercooled water in the clouds, basically an increase in mixed-phase clouds.  Since the latent energy has to be released, and higher temperatures/more CO2 would reduce the rate of release, thicker clouds at lower levels with more water/water vapor in supersaturated or supercooled conditions would change the radiant properties of the clouds.

Lindzen's Iris just assumes that this will cause more efficient wringing out of the moisture, reducing water vapor entrainment to high-altitude cirrus clouds.  That leads to a SWin-versus-LWout issue in clear tropical sky.  While the lens lets more LW out, i.e. you can "see" a warmer surface, the SW in can "see" the surface as well, which would increase surface energy uptake.  You are back to a square-one radiant issue when the thermo indicates more interesting possibilities.

Part of those possibilities is that the mass of the atmosphere is pretty well fixed, and just adding water vapor reduces the density of a parcel of air, increasing convection.  That is only true for water vapor; once you get into supersaturated water vapor and supercooled water, you have increasing density of the parcel.  There are regulating thermodynamic features here that, in my opinion, aren't all that well considered in the simple radiant models.

Since the mass of the atmosphere is effectively fixed, convection and advection would have to change with increased temperature.  I liken it to a pot lid: more heat just makes the lid rattle more, and that rattling is a bit random.  The rattling, deep convection, is triggered at a temperature of around 27 C, which in the tropics effectively limits the maximum average SST to about 30 C.  There are somewhat isolated hotter pockets, often over shallower water, that can persist, but it appears unlikely that larger tropical areas can sustain greater than 30 C for very long.  There are a number of "oscillations" (MJO, QBO and ENSO) that are generated by, and help destroy, these hotter pockets.  So while the Iris Hypothesis is likely correct, all the mechanisms that would determine whether it is a negative or positive feedback are not so easy to figure out.

More mixed-phase clouds, though, are most likely a negative feedback, and liquid-layer-topped clouds are definitely a negative feedback.  This would indicate that tropical clouds are a regulating feedback, pretty much like pre-CAGW science had them pegged.  So until the complex mechanisms can be explained well enough, the Iris is likely to keep being debated.  That is the problem with simple explanations: most of the time they aren't.

Wednesday, May 20, 2015

My LinuxMint 17.1 Evaluation

First, it is not going to work for my Toshiba Satellite this go-around.  The main reason is that the WiFi connection isn't very stable.  That wouldn't be a fatal flaw except that the security camera kind of depends on a reasonably stable WiFi connection.  I ended up getting some router and camera passwords scrambled, which bombed out the ZoneMinder package.  Probably my fault, but I doubt it was completely my fault, since that seems to be related to some of the common ZoneMinder questions.

Second, there is a graphics issue with the Toshiba and Linux in general.  The Satellite model I have is one of the few not supported at all by Linux.  Since several closely related models are, I thought I could get it to work.  From what I have seen perusing the blogs, a few have Linux running on the same machine, but without some of the intranet functions I was planning to use to stream the security camera video.

Reverting back to Windows 7 wasn't too bad, but it could have been done with a bit more class.  LinuxMint created three Windows boot entries, and I had to experiment with all three before I hit a working combination.  I still have Mint on the machine and plan on giving it another shot, but not until I have learned a bit more about the Sricam options.

Other than a couple of screen freezes that required hard restarts, most of the package was pretty slick.  I was able to do a lot of things at once without putting half as much pressure on the CPU.  Firefox is the main browser and worked just fine.  The Sricam, though, is pretty much hard-wired for Internet Explorer if you want admin access.  I could bypass that, but I would nearly have to rewrite ZoneMinder, which recommends using IE for a number of camera admin functions anyway; see the first issue.  Other than a few quick patches, I am not really in the mood to learn another couple of languages and sort through a dozen different package builds to fix things.

Because of the various builds in use, getting quick and accurate help is almost impossible if you need something other than a reminder to check power.  This seems to be due to some of the problem-child extra features built into laptop CPUs/GPUs.  It seems my particular CPU has scalable speed for a reason.  I will need to track down that particular adjustment in the next Linux trial, since it is likely related to both of the main issues I was having.

On a brighter note, though, there are several laptops from a few generations back that various Linux groups seem to have focused on.  Plus I have a crap Dell desktop with one of the more favored AMD CPUs from the Vista era that might make a fair home network server if I 86 the crap HDD in favor of a 128 GB USB memory stick that has about twice the capacity of the crap HDD and costs less than 25 bucks.

Anywho, for now I am back up with Windows and have the Sricam running on iSpy with most of the basic features, and I have access to an easier-to-patch XML driver that I can fiddle around with.  There is even a facial recognition feature that I might get to work with a Bluetooth electronic deadbolt.  Most likely just another never-to-be-finished project, but it doesn't look all that difficult.

btw, without antivirus and malware scanners, internet surfing was a BLAST, and the numerous restarts I had to do took only about 30 seconds each.  So if there is a next time, I will likely become a complete Linux convert.

Sunday, May 17, 2015

Linuxmint experiment - Pantum 2010 installation

Since my old laptop is dragging butt, I thought I would clean things up.  I have been hearing good things about the new LinuxMint version 17.1, so I found a USB memory stick to boot off of for a few days and finally bit the bullet and installed it alongside Windows.

The reason I bit the bullet is that running off the stick I was losing the tweaks I was figuring out along the way.  Once I finally figured out how to install my cheap Chinese knockoff laser printer, I decided to make it official.

I believe one reason Linux has been a bit slow to take off is that they have some of the stranger geeks playing.  I looked through a few of the forums and found a few dozen folks asking how to install the exact same printer going back over two years, with not a single "easy" solution, and most of the "solutions" created more problems than they fixed.

There is a fairly simple way to install an unsupported printer, starting with downloading the "linux" version of the driver from the OEM website.  "linux" is in quotes because the driver is in .RPM format, which isn't cleanly supported by the Mint version of Linux.  There is a Red Hat branch that must have tickled the fancy of most manufacturers that does use the .RPM format.

There is a supported Linux application called Alien that will convert a .RPM package into a .DEB package.  Sounds great, right?  Nope. Mint wants a PostScript Printer Description (.PPD) file, and the .DEB is actually a PPD filter.  The printer install wizard, not so much a wizard, asks for a .PPD file, a URL, or a selection from the list of "supported" printers.  Not very obvious on the printer wizard is a search box labeled FILTER.  Once you create the .DEB file, just cut and paste it into that search box.  Ta-dah, the printer prints.

I wasted about 8 hours between the install and searching out how to install the printer.  That was the easy part.  The real reason I thought about Linux is that I installed one of those cheap Chinese knockoff security cameras with night vision, pan and tilt, zoom, motion detection, audio, a handful of other features, plus the absolute worst documentation in the world.  Streaming security camera video on an already slow Windows 7 laptop really was grinding things to a halt.

ZoneMinder is a "free" Linux security camera program with Geek^3 documentation.  There were so many alternates and alternate install procedures that I decided to print them out so I could make some sense of the mess.  Oops, there started the printer challenge.  Anywho, the LinuxMint 17.1 version happens to have a few unsupported files required for most of the ZoneMinder install "recommendations".  From the forum reviews of ZoneMinder I have seen, I may learn a few new cuss words before I get it up and running.

This post may seem a bit odd for a climate change related blog, but actually it is a perfect fit.  The "key" to solving climate change, as I read in one climate paper, is to decouple wealth from energy.  These cheap Chinese knockoffs are in many cases fairly well made and close to dirt cheap because you are not paying the 150% to 250% markup.  A 200% markup, as you know, means 3 times real cost for the OEM, aka the intellectual property holder, which means that way back when the warm and fuzzies were talking about the $100 laptop for the third-world masses, it already existed.  Just get rid of Microsoft, Intel and the big boxes and there you go: $35 for a tablet and around $99 for a laptop.  We just start decoupling that wealth from the warm and fuzzies that have all these grand schemes to save the world, and we will start making a dent in our carbon footprint.  I have a couple of those $35-including-shipping tablets on the way right now.

Alien, btw, was written by Joey Hess, and there is some information on Wikipedia.  Pay attention to the last bit: "... and using install scripts automatically converted from an Alien format may break the system."

Break might be a bit harsh, but do try and be careful now, ya hear?

Friday, May 15, 2015

Always Question Your Data or Respect Murphy's Law

This aggravates the hell out of the minions of the Great and Powerful Carbon.  The primary law of the human part of the universe is Murphy's.  People screw up.  So with the typical irrational comments on Climate Etc. concerning CO2 and "closing" the mass balance, here is a little illustration.

Here are two CO2 reconstructions from the Antarctic: the Vostok/Dome C composite and the high-resolution Law Dome.  The two agree very well over the common period, but the longer-term composite has an upward trend starting about the middle of the Holocene.  ~20 ppmv over almost 7000 years isn't much, but it is interesting.

It is interesting because until mankind started burning demon coal it should have been a downward trend.  Now I have removed the industrial part of the Law Dome data because I want to focus on this pre-industrial period.

This super-high-quality, nothing-but-the-finest science product from NASA indicates the peak Holocene temperature was about 7000 years ago, about the same time the composite CO2 trend shifted to positive.  That seems odd, since the Great and Powerful Carbon should be driving temperature, or at least following it.

Some suspect that this discrepancy is due to "natural" smoothing or diffusion of CO2 in the Antarctic ice and snow that creates the CO2 record.  That smoothing would reduce peak/valley amplitude and shift the CO2 record so that it lags temperature.  Real scientists know that you should check to make sure there isn't bird shit on your radio telescope antenna, and that your thermocouples are not self-heating, before you announce your unprecedented discoveries to the world: Murphy's Law.
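The amplitude-and-lag effect of that kind of smoothing is easy to demonstrate with a toy series.  The trailing moving average below is just a stand-in for firn diffusion, not a physical model, and the pulse shape and window length are made up for the illustration.

```python
import math

def trailing_mean(series, window):
    """Trailing moving average; early points just use a shorter window."""
    out = []
    for i in range(len(series)):
        chunk = series[max(0, i - window + 1):i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

# A single "warm pulse" of amplitude 1 centered at t = 50.
signal = [math.exp(-((t - 50) / 10) ** 2) for t in range(200)]
smoothed = trailing_mean(signal, 25)

# Smoothing knocks the peak amplitude down...
print(max(signal), max(smoothed))
# ...and pushes the peak later, i.e. the smoothed record lags the original.
print(signal.index(max(signal)), smoothed.index(max(smoothed)))
```

The smoothed peak comes out around two thirds the original height and lags by roughly half the window, which is exactly the reduced-amplitude, shifted-timing behavior suspected in the ice core record.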

This particular discrepancy may not amount to a hill of beans, or it might be something.  It could be useful for, say, pointing out other scientists who might have bird shit in their methods.

There are some young up-and-comers in the paleo ocean field we might want to check.

Nope, their work agrees pretty well, so no obvious bird shit there.

Heck, their work even agrees pretty well with some of the retro climate science.

Their work and the Law Dome CO2 record don't agree all that well with this guy's work, though.  I suspect bird shit.

I think Murphy's Law has bitten someone in the butt.

Saturday, May 9, 2015

Carbon Neutral, Mass Balance and other such stuff

If there is anything more confusing than "climate change" it is the carbon cycle.

The US EPA has a handy-dandy "Global" carbon emissions article.  Globally, emissions by sector look about like that.

Regionally, i.e. in the US, emissions by sector look like this; notice there is no Forestry, aka land use, piece of the pie.

Total Emissions in 2013 = 6,673 Million Metric Tons of CO2 equivalent 
* Land Use, Land-Use Change, and Forestry in the United States is a net sink and offsets approximately 13% of these greenhouse gas emissions.
All emission estimates from the Inventory of U.S. Greenhouse Gas Emissions and Sinks: 1990-2013

That is because in the US, forestry produces a net carbon sink.  So if by some odd chance the world changed its land use/forestry practices, "global" land would be a net carbon sink, reducing emissions by about 25%.  That would be eliminating the 17% emissions and producing an 8% sink.  It could actually be as high as a 35% net reduction, since agriculture could add another 5%.  So let's say that "world" land changed from a 17% net carbon source to a 13% net carbon sink.  That would mean that roughly 30% of the human-related carbon emissions would not cycle through the oceans.
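The bookkeeping above is just addition of percentage shares.  A minimal sketch, assuming the EPA figures can be treated as shares of total emissions:

```python
# Treating the EPA percentages as shares of total emissions (assumption).
land_source = 0.17   # global land use is currently a ~17% source
land_sink = 0.13     # US-style practices make land a ~13% net sink

# Flipping land from source to sink changes net emissions by both shares.
net_swing = land_source + land_sink
print(f"net reduction: {net_swing:.0%}")   # net reduction: 30%
```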

"Globally", "nature" is a net carbon sink.  What should be "natural", though, is a bit obscured by a few thousand years of human civilization.  Human "civilization" flourished in the "fertile crescent" of the Middle East.  Now that "fertile" crescent looks a lot like desert.  Ancient agriculture, which is still practiced to some extent in the rest of the world that has net land use emissions, was a bit rough on the land.  "Civilizations" died out due to "climate change", which could easily be related to deforestation for energy and agricultural expansion, resulting in desert regions that could have been lush tropical rain forests at some time in the past.  All that would not have changed "nature" from a carbon sink to a carbon source, but it would have changed the carbon cycle pathway.

Around 5000 years ago, which happens to be around the time of one of the major "fertile crescent" "civilization" collapses, atmospheric carbon did reverse from a slight downward trend to a slight upward trend.  That "fertile crescent" region could include most of India, mainly the Indus Valley, and changes in Indian monsoon patterns are a big deal as far as "global" climate goes.  This blip in the atmospheric carbon concentration is not one of the more common "climate change" talking points, and land use in general takes a back seat to the demon "fossil fuels".  That is probably because it is easy to account for fossil fuels and not so easy to account for land use change.

Land Use, Land-Use Change, and Forestry (17% of 2004 global greenhouse gas emissions) - Greenhouse gas emissions from this sector primarily include carbon dioxide (CO2) emissions from deforestation, land clearing for agriculture, and fires or decay of peat soils. This estimate does not include the CO2 that ecosystems remove from the atmosphere. The amount of CO2 that is removed is subject to large uncertainty, although recent estimates indicate that on a global scale, ecosystems on land remove about twice as much CO2 as is lost by deforestation. [2] 

From the EPA link above, the amount of CO2 removed by land use is subject to large uncertainty.  The minions of the Great and Powerful Carbon are not all that great with uncertainty.  They tend to think they have a handle on it and accuse the non-believers of using uncertainty to muddy the waters.  So they use things like the "carbon cycle mass balance CONSTRAINT" to impress their loyal followers with some creative BS.  The CONSTRAINT basically just indicates that "nature" is a net carbon sink, but it doesn't provide any indication of what "normal" sink efficiency should be.  Henry's law is a "LAW" that provides considerable information on the ocean part of the sink, but as far as land goes, we are pretty much shooting in the dark.
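The mass balance constraint itself is one line of arithmetic.  Here is a sketch with illustrative round numbers (the GtC/yr values are made up for the example, not measurements), which shows why the constraint says nature is a net sink while saying nothing about what "normal" sink efficiency would be.

```python
# Illustrative round numbers in GtC/yr -- not measurements.
fossil_emissions = 9.0
land_use_emissions = 1.0    # the piece with the large uncertainty
atmospheric_rise = 5.0

# If the atmosphere gains less than humans emit, the remainder
# must have gone into natural sinks (ocean + land combined).
natural_net_sink = fossil_emissions + land_use_emissions - atmospheric_rise
print(natural_net_sink)     # 5.0 GtC/yr absorbed somewhere, split unknown
```

Note the residual lumps ocean and land together, so it cannot tell you whether the land sink is behaving "normally", which is the attribution problem discussed here.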

From ocean pH, it is pretty obvious that altering the carbon cycle pathway, from stronger land sinks to relying more on the ocean sink, is having an impact.  Removing a couple of gigatons of carbon from the oceans each year would also have some impact.  So we are at a point where land and ocean use could have as large an impact, or a larger one, on atmospheric carbon than demon carbon from coal and such.  One of the reasons man switched to coal was that "natural" "sustainable" sources of energy were not sustainable and had a negative impact on the local nature.  Trying to go back to "sustainable" energy that involves land use will likely make things worse; it has in the past, right?


Since the biggest part of the mass balance debate is "what's natural", this is a look at "natural", assuming everything prior to 1750 had to be natural.  Fossil fuel wise, there wasn't much anthro going on pre-1750.  There was land use and deforestation going on, though.  Fire was used to clear land, and I imagine it got out of hand, plus wood was the primary fuel.  If you ignore that, this chart of the Indo-Pacific Warm Pool SST and Antarctic CO2 from the Law Dome ice core would be all natural variability.  If you consider that CO2 in ice cores is "naturally" smoothed over a fairly long time scale, and little plankton shells are also "naturally" smoothed over a different time scale, these two are a remarkably good match.

As I have shown before, the Oppo 2009 IPWP reconstruction also compares well to the Lamb climate reconstruction, which had the Medieval Warm Period and Little Ice Age.  The timing isn't perfect, but you should expect some shift with different smoothing time scales.  The selected "pre-industrial" baseline, 1750, happens to be close to the deepest part of the Little Ice Age anomaly.  "Natural" variability during this all-natural assumed period is about +/-0.75 C and +/-6.5 ppmv CO2.  6.5 ppmv is small compared to the current ACO2 impact, but 0.75 C is not small compared to the current temperature anomaly.

The simple mass balance calculation requires estimates of ACO2 to be very accurate, which requires inclusion of the land use impact, since it is estimated to be about one third of total emissions.  Land use "emissions" would also have an impact on the "natural" carbon sink.  For attribution you need to consider both land use emissions and the land use sink impact, which requires a baseline or "normal" sink efficiency.

If you use the Mann et al. 2015 version of the past, you create the impression that all temperature change is created by ACO2.  Climate isn't simple, though.  There can be multi-century lags, and the solar precessional cycle is about 21,000 years.  Ocean and ice core reconstructions operate on millennial time scales, meaning you have to consider how you smooth your instrumental data to avoid spurious eureka-moment spikes.  You may be able to slice paleo to annual resolution, but that will never account for the millennial-scale natural smoothing already a part of the proxy.

Saturday, May 2, 2015

New Temperature Versions

UAH has a new version 6.0 (beta), which includes some adjustments, of course.  It is being gone over with a fine-tooth comb by the usual suspects, since Spencer and Christy are notorious "skeptics".  "Surface" temperatures are mandatory because everything climate is based on "surface" temperature change.  "Surface" temperature will always have some issues because there is no real surface.  Since there is an elevation consideration over land and a lapse rate that is variable, there would need to be considerable altitude and specific heat capacity adjustments; latent heat is really "hidden" as far as temperature goes, and the ocean readings are mainly sub-surface rather than surface readings in many cases.  Satellites measuring the lower troposphere have to estimate a specific altitude, which appears to be around 2000 meters based on the RSS version on Climate Explorer that is available in kelvin.  So they are measuring a different "surface" with different latent heat considerations.
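To put a rough number on that "different surface": with the standard-atmosphere lapse rate (6.5 K/km, my assumption here; the actual tropical lapse rate varies with moisture), a 2000 m layer reads quite a bit cooler than the sea surface.

```python
lapse_rate = 6.5      # K per km, standard-atmosphere value (assumed)
altitude_km = 2.0     # rough altitude of the satellite-derived layer
sst = 300.0           # a typical tropical SST in kelvin (~27 C)

layer_temp = sst - lapse_rate * altitude_km
print(layer_temp)     # 287.0 -> the satellite "surface" is ~13 K cooler
```

And since the lapse rate itself shifts with moisture, the offset between the two "surfaces" is not even constant, which is part of why the data sets can diverge.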

Since satellites tend to have issues at the poles, as do "surface" stations, I tend to prefer looking at the tropics, specifically the tropical oceans, since they represent the lion's share of total energy.  The chart above compares the UAH beta version with the newest version of ERSST.  What I see is a pretty fair comparison considering all the issues involved with both products.  There is a very small difference in the two trends, and as usual the lower-specific-heat troposphere has more variation than the high-thermal-mass ocean surface/sub-surface.  I am of course a nobody, so I will leave it to the 'spurts to really screw this up.

For the "global" oceans there is a little bit bigger trend difference, which could be due to any number of real and calibration issues.  UAH is trying a different averaging method with the intent of improving regional temperatures.  ERSSTv4, I believe, is more focused on a "global" average, which would mean longer-range interpolation.  ERSSTv4 no longer uses the Reynolds OIv2 satellite temperature data for its interpolation, because it caused some "significant" cooling, which would most likely have improved the correlation between these two data sets.  The difference really doesn't amount to a hill of beans, but the hyper-precision junkies will find some flaws that they think are "significant".

Since land "surface" temperature is an average of Tmax and Tmin, and there are rumors that Tmin is suspect due to nocturnal atmospheric boundary layer variation, I am staying out of that mess.  What will likely be the case, though, is that the longer the interpolation range, the greater the discrepancy between UAH and whatever land "surface" data set.

Nick Stokes has a critique on his blog, and Roy Spencer has a pretty detailed explanation of the changes on his.