New Computer Fund

Friday, September 11, 2015

Cherry Picking your Science

With the explosion of the information age there is an explosion of "scientific" references.  Since everyone has an opinion and access to a computer, there is a weird blend of scientific fact and fiction floating around.  We are becoming a nation of cyberchondriacs and "natural" crusaders.  Sometimes all natural isn't what it's cracked up to be, and some "science" isn't all that scientific.

The Heimlich Maneuver for drowning is a great example of "hit" science.  The good doctor Heimlich created the maneuver to clear the airways of choking victims.  He then extended it to drowning, also to clear airways.  When he described the maneuver for drowning he misrepresented the workings a tad.  Typically, drowning victims do not inhale lungs full of water.  Instead there is a natural involuntary spasm that closes off the airway, so less than a teaspoon of water, especially salt water, can cause you to stop breathing.  An organization that provides lifeguards for water parks used the newly recommended maneuver and found it to be extremely effective, especially since it was simple enough to teach teenagers to use and didn't involve mouth-to-mouth resuscitation, which can cause vomiting, which is pretty gross for your average teenage first responder.

A son of the good doctor made a lot of wild claims about his father and inspired a medical researcher to "scientifically" determine whether the drowning version of the maneuver was effective or not.  Follow this link for medical fraud info. The "scientist" found one example where the maneuver could have caused harm to a victim, out of fewer than 30 cases where it was used.  That particular victim aspirated vomit.  Since there was a very small sample population, a bad result was "significant".  In normal science, the one failure in a small sample should have inspired the typical "more research is indicated".  Basically there was not enough data to reach a meaningful conclusion.  However, due to the lungs-full-of-water miscue, a lot of major organizations dropped the Heimlich Maneuver for "near" drowning situations.
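
Just to put a number on how little one bad outcome in a small sample can tell you, here is a minimal sketch assuming the post's rough figures of 1 aspiration case in about 30 in-water uses; the Wilson score interval is a standard way to bound a proportion estimated from a small sample.

```python
# Hedged sketch: 1 adverse event out of ~30 uses (figures taken from the post).
# The 95% Wilson score interval shows how wide the plausible range of the true
# adverse-event rate still is with so few cases.
from math import sqrt

def wilson_interval(events, trials, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = events / trials
    denom = 1 + z**2 / trials
    center = (p + z**2 / (2 * trials)) / denom
    half = z * sqrt(p * (1 - p) / trials + z**2 / (4 * trials**2)) / denom
    return center - half, center + half

low, high = wilson_interval(events=1, trials=30)
print(f"Adverse-event rate, 95% CI: {low:.1%} to {high:.1%}")
# Roughly 0.6% to 17% -- far too wide to conclude anything firm, which is why
# "more research is indicated" would have been the normal call.
```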

According to one water park lifeguard provider, the maneuver was used many thousands of times as "in water intervention" and potentially saved many hundreds of lives.  That study, though, merely indicated that the Heimlich Maneuver for "near" drowning "may" work but didn't appear to be harmful.  What should have been a signal for more research or no action resulted in the Heimlich Maneuver for "near" drowning victims being disapproved for basic rescue training. Follow this link for more on the controversy.

A second example is azodicarbonamide (ADA), an additive used in various food products, with Subway breads being the big media event, though Starbucks also has it as an ingredient in some products.  ADA is a flour conditioner that increases shelf life, increases elasticity and promotes rise.  Its main use is in plastic foams like yoga mats, but the US FDA allows its use in food products.  European countries have banned its use due to indications that heavy exposure to the powder appears to cause asthmatic health issues.  The additive has been found to harm laboratory animals in concentrations of 5% and greater, but is approved as a food additive in concentrations of 45 ppm or less; 45/1,000,000 versus 5/100 is a huge difference.  Without any real indication that ADA is actually harmful as a flour conditioner, Subway and others are removing the additive due to popular interpretation of "science".  Ironically, ADA is an additive in many "100% whole grain" products with "healthy" plastered on the packaging.
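
For what it's worth, the concentration comparison in that paragraph works out roughly like this; a back-of-the-envelope sketch using only the two numbers quoted above.

```python
# Hedged arithmetic using the post's two figures.
food_limit = 45 / 1_000_000   # 45 ppm allowed as a flour conditioner
animal_harm = 5 / 100         # 5% concentration where lab animals showed harm
print(f"Harmful test dose is roughly {animal_harm / food_limit:,.0f} times the food limit")
# ~1,111x -- the "huge difference" referred to above.
```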

A third example is Roundup brand herbicide.  The active ingredient in Roundup is glyphosate, but the formula for Roundup has a number of proprietary ingredients including a wetting agent, polyethoxylated tallow amine (POEA).  One study aimed at genetically modified crops (GMC) also included glyphosate in different concentrations in the drinking water of the Sprague-Dawley rats used in the study.  One of the groups of rats given glyphosate in their water actually had fewer symptoms than the control group rats.

All three of these cases represent current issues with the state of science.  When there is very little definitive evidence of harm, the noise or inaccuracy of specific testing methods can result in false positives that, once linearly extrapolated, appear to be more significant than they are.  The noise can be confounding factors, other real world impacts or just random chance.

The Sprague-Dawley rats, for example, have a genetic tendency toward tumors in long term (~2 year) studies.  So a control group having more or about the same tumor growth rate as the target groups is not all that unusual.  If there is a definite harmful positive in the target groups it would have to be much more obvious or "significant" to be believable.

A second issue is "fishing".  When an experiment is designed with several possible active elements and no predicted result, just a "let's see what happens" attitude, the results cannot ever be considered definitive or even valid.  They would only indicate the possible need for future, more specific testing.

Confounding factors can sometimes be so obvious after the fact that you cannot believe they were missed.  For example, a study found that rural cardiac cases resulted in death more often than urban cases.  The study allowed "fishing", so diet, lifestyle and genetic traits appeared to be significant causes.  Well, emergency response time is quicker in urban settings and takes a lot longer in rural settings.  The further you live from a hospital or emergency medical service, the more likely you are to die from various emergency conditions.

Media exposure amplifies these not quite significant findings into near catastrophic situations.  Every scientific study requires time to "prove" its worth.  There is no absolute "proof", but studies can prove to be useful.  They can also be proven to be bull crap.  Almost every scientist needs recognition for their work to get funding for more work, so they tend to enhance their press releases.  There is very little old-fashioned "science" for the sake of science because almost all scientists are not independently wealthy.  Some government and corporate grants allow for pure science, but since there is money coming from somewhere, some group can always accuse the scientist of being in the pocket of some entity they don't trust.

Add to that a small percentage of scientists who cheat to enhance their image or take shortcuts, and another percentage that are just plain nuts, just like the general population, and you end up with the old "let the buyer beware" caveat.

My personal favorite, though, are studies that estimate the health cost savings of a particular government regulation.  Take asthma for example.  Asthma isn't a single condition, it is a variety of conditions that can lead to wheezing and difficulty in breathing.  Smoking, second hand smoke, smog related to industry and transportation, ragweed, cleaning solutions, wood smoke, coal smoke, perfumes, artificial additives, natural ingredients, pine fresh scent both natural and artificial, dry air (low humidity), etc., all can cause or contribute to an asthmatic symptom.  Currently the government has a hard-on for coal, so more coal regulation will reduce asthma by so much.  Nice.  Asthma hot spots though are more closely related to diesel emissions near heavily used ports and rail yards used in transporting containerized goods.  No matter what is done, a small percentage of the population will always have asthmatic reactions to perfectly natural sources.  So regulations that impact asthma will not produce zero cases, just a reduction in overall cases.  If you don't know what the baseline or "normal" rate is, you cannot accurately predict how effective the mitigation will be.  If you added up every regulation ever produced to reduce asthma with their original projections of impact, we would be in negative asthma territory.

Flu vaccination is another good one.  Originally the vaccines were estimated to be close to 90% efficacy.  Those studies used very generic indicators of the "flu" and included the effect of the typical co-vaccination for pneumonia.  The pneumonia vaccine is very close to 90% effective, but depending on the type of flu strain, the "flu" vaccine ranges from 10% to 50% effective.  In most of the studies, less than 20% effective isn't statistically significant because of the limits of the types of studies.  So that flu vaccine might help or might not.  If you forgo the flu vaccine though you just might be labeled an "anti-vaxxer".  Knowing a little bit about science in this case makes you anti-science, thanks to a butt load of clueless "causers".

The best advice I can give is "everything in moderation".  Try to enjoy life while you can so you don't succumb to Pre-Traumatic Stress Disorder or any new excuse of the day. 

Monday, August 24, 2015

What if, you only had SST?

I haven't been putting much energy into the Global Warming/changing climate debate lately.  You can disappear from the debate for a year or two and pretty much not miss anything important.  The same old people are trying to use simple models to combat simple models and there are still "Global Average Surface Temperature Anomaly" food fights.

So imagine if you will, that the only observational information you had was Sea Surface Temperature from 30S to 30N and CO2 concentration change.  From that you need to model the rest of the world "surface" temperature and estimate the CO2 radiant forcing impact on "global climate".

Now if you have your pet "ideal" simple model you can use that, or you can sneak a peek at some of the other data if you like and use correlations in a statistical model of your choice.  However, you cannot assume any land use impacts other than ones you can support with your model, climate driven desertification or such.

The 30S-30N SST region, by the way, covers 52% of the ocean area and 37% of the global area.  Energy wise, it is in the ballpark of about 70% of the global energy.
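
Those band numbers are easy to sanity check.  The sketch below uses the standard spherical-area formula; the ~74% ocean coverage inside the band is an assumed figure chosen to be consistent with the 37% and 52% quoted above, and ~71% is the usual global ocean fraction.

```python
# Hedged sketch: area of the 30S-30N band and its ocean share.
from math import radians, sin

band_fraction = (sin(radians(30)) - sin(radians(-30))) / 2  # fraction of the globe between 30S and 30N
ocean_in_band = 0.74 * band_fraction   # assumed ~74% ocean coverage inside the band
global_ocean = 0.71                    # oceans as a fraction of Earth's surface

print(f"30S-30N band:            {band_fraction:.0%} of the globe")
print(f"30S-30N ocean:           {ocean_in_band:.0%} of the globe")
print(f"Share of all ocean area: {ocean_in_band / global_ocean:.0%}")
# ~50%, ~37% and ~52%, consistent with the figures quoted above.
```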

That is your "reliable" observation data from KNMI.  Now see how well you can "project" what happens in the rest of the world.

Friday, July 10, 2015

What is CAGW?

Catastrophic Anthropogenic Global Warming (CAGW) is used by a lot of skeptics instead of AGW, the Greenhouse Effect (GHE), Climate Change, Climate Disruption, Carbon Pollution and any other alternate terminology that may be in use for Global Warming (GW).  For some reason the real CAGWers don't, or pretend to not, understand the C part.

The C part is the sales hook.  Unless GW is bad, there is no reason to do anything about it that could be considered heroic.  Business as usual would be dealing with fuel efficiency, building more nuclear power plants, planting more trees, restoring wetlands, all the stuff that we were doing before the big scare.  In the US we agreed to less stuff and still ended up reducing emissions and increasing carbon storage with land use.  With the exception of a few (around 10%) of the oldest coal plants, which were probably kept in service longer thanks to impending regulation, most of our power plants are very clean by world standards.  The US has spent plenty on alternate energies with some hits and plenty of misses, and with the exception of California we have reasonable electric rates and overall good air quality.  That isn't good enough for the "believers", so the threat of catastrophe is used to push for more.

If you Google Scholar "Catastrophic Climate Change" you will discover several thousand "scholarly" papers that contain that exact phrase.  Quite a few are written by economists that reference "fat tail probability".  It is the fat tail, or low probability, catastrophic impact potential that is the real C in CAGW.  The "science" of fat tail probability seems to have originated with Blaise Pascal's Wager.  Pascal was a 17th century mathematician, philosopher and physicist who proposed that all humans should believe in God because the odds are in your favor that way.  If eternal damnation is the infinite catastrophe, any sacrifice for God is squat in comparison.

Since a large number of the CAGWers are atheists, and many are vocal advocates of atheism, it is pretty ironic that they find Pascal's Wager useful.  I guess there are no atheists in AGW foxholes. Scientifically, determining risk and then the cost and benefit of action to offset some portion of the risk should be "business as usual".

The CAGWers also take offense when a skeptic mentions they seem to approach CAGW more like a religion than a science.  Since their mitigation models are based on Pascal's-Wager-like logic, what exactly do they have to take offense about?  All they are doing is preaching fire and brimstone, then waving the mitigation "salvation" as a carrot.  There is no guarantee any mitigation strategy they have proposed will be successful, since they have no clue what degree of "catastrophe" might be forthcoming or how much benefit any mitigation might produce.  All they have is some unknown small possibility of any number of catastrophic events they can dream up.  When a skeptic attempts to build some realistic numbers around their guesstimated potential disaster, they cry foul and take their ClimateBall back home.

A large number of skeptics agree that mankind has some impact on climate, but that so far that impact has been beneficial.  People are living longer, eating better, breathing easier, enjoying abundant clean water in most cases, and surviving shorter droughts with less effort.  In fact most of the old scientists that started the GHE research believed it was not only beneficial but needed to avoid a new ice age.  The real science of radiant physics predicts only about 1 C of warming is likely due to a doubling of CO2, and that 1 C of warming is likely beneficial.  Speculative science has put the C in CAGW by including absolutely worst possible case amplification that still only nudges a potential C impact.  Because of that the "projections" are running much higher than the observations.

Pointing any of this out is the Climate Science equivalent of blasphemy.  Skeptical scientists have been shunned scientifically and to some extent excommunicated from peer reviewed high impact journals.  There is even email evidence of modifying the peer review process to exclude the non-believers, and at least one journal editor was forced out for journalistic heresy, publishing a skeptical point of view.  That is pretty weird shit in my opinion.

Just recently the editor of Science magazine managed to squeeze climate change, the Pope and Dante's levels of hell into one opinion piece.  Before long the US Supreme Court may have a chance to rule on the separation of Climate Change Church and State :)

Thursday, July 9, 2015

Pi or not Pi

Ideal models are references and, depending on just how much accuracy you need, can be excellent or crap.  If you are filling a dive tank, the ideal gas laws are fantastic.  You are repeating a process done millions of times and you are generally looking at a resolution of say 100 PSI.  Since the difference between gauge and absolute pressure is about 14 PSI, it is negligible.  The temperature range between the tank fill site and the tank use site rarely causes more than a handful of PSI change, so there really isn't much to worry about, as long as you stay in the normal range of expectations.  When you get near the limits of the "ideal" range of the "ideal" gas laws, then you can start seeing issues, provided you have gauges and such with enough precision.  You could put a high precision gauge on a dive tank and see all the fluctuations, but there really isn't any need for that precision.

This post is prompted by a new paper (PhD thesis) on CO2 forcing in the Antarctic.  That paper finds that more CO2 can cause the Antarctic to be a greater heat sink, i.e. increase the cooling rate.  This isn't anything new to me, but it is nice to see someone actually mention it in the "peer" reviewed literature.  Once you get to the extremes, about -70 C and below in this case, you start getting "other" factors coming into play.  The "ideal" curve starts turning in new directions.

Believe it or not, this isn't going to have a huge impact on the "all things remaining equal" CO2 role in the Greenhouse Effect.  It will have an effect on the boring other things.

One of the other things is the TSI/4 approximation.  Total Solar Irradiance (TSI) divided by 4 is a quick and dirty estimate of the energy available at a sphere from a distant point source like the sun.  It assumes you have a perfect sphere with no atmosphere or ocean to blur the lines of the spherical surface.  If Earth were a perfect sphere, 1361/4 would be the "average" energy applied, so Earth would have a radiant energy of about 340 Wm-2 and a temperature of about 4 C degrees.  You can also use TSI/pi as an "average" energy.  That would give you an average energy of about 433 Wm-2, and you can use TSI/2pi or 216 Wm-2 as another reference allowing for dark side cooling.  None of these are perfect, but they are useful references.
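
Here is a minimal sketch of those reference values, plus the albedo-corrected TSI/4 mentioned a couple of paragraphs down, each converted to an equivalent blackbody temperature with the Stefan-Boltzmann law.  It only uses the TSI value from the post; the ~5 C result for TSI/4 is the same ballpark as the "about 4 C" above.

```python
# Hedged sketch of the reference "averages" discussed above.
from math import pi

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
TSI = 1361.0       # total solar irradiance from the post, W m^-2

def sb_temp(flux):
    """Blackbody temperature (C) equivalent to a given radiant flux."""
    return (flux / SIGMA) ** 0.25 - 273.15

references = [
    ("TSI/4", TSI / 4),
    ("TSI/pi", TSI / pi),
    ("TSI/2pi", TSI / (2 * pi)),
    ("TSI/4 x (1 - 0.30 albedo)", TSI / 4 * 0.70),
]
for label, flux in references:
    print(f"{label:26s} {flux:6.0f} Wm-2  ->  {sb_temp(flux):6.1f} C")
# TSI/4 ~340 Wm-2 (~5 C), TSI/pi ~433 Wm-2, TSI/2pi ~217 Wm-2, and the
# albedo-corrected TSI/4 ~238 Wm-2, i.e. the familiar ~240 Wm-2 reference.
```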

The TSI/4 estimate can be modified by correcting for albedo or reflection, using about 0.30 or 30% reflection of solar for Earth.  To correct for reflection or albedo with TSI/pi you really should consider actual changes at the surface and not lump the whole planet together.  An example of why would be cloud cover.  Since clouds respond to surface energy, cloud cover tends to follow peak insolation.  TSI/pi already allows for incidence angle, so early morning and late evening clouds don't have much impact on the TSI/pi estimate.  Albedo close to the poles also doesn't have much impact on TSI/pi, because at angles greater than about 60 degrees there is little energy included anyway.

If you adjusted TSI/pi to allow for atmospheric refraction and TSI/4 to allow for atmospheric absorption, they would agree at a theoretical "effective" radiant layer (ERL).  So there should be an ERL of between 216 Wm-2 (day/night consideration) and 240 Wm-2 that is our "all things remaining equal" reference in the sky.  The TSI/4 fans think they have the "ideal" reference and the engineering crew thinks there is a range.  I am in the engineering crew.

The "effective" in the effective radiant layer tells the engineering crew that there is sub and supra "surface" absorption/radiation and other heat transfer that can impact the accuracy of the estimates.  Sub surface would be mainly energy absorbed by the oceans at some depth, which has a different residence time than energy absorbed in the atmosphere.  Supra surface would be the higher parts of the atmosphere above the theoretical ERL that have a strong radiant impact by changing the residence time of the energy in the lower atmosphere.

This is the point where theory and reality start to bite.  Both the theorist and the engineer know that there are issues, but the two camps have different ways of dealing with those issues.  The engineer should know there is just as real a possibility that he is wrong on the high side as the low side, while the theorist tends to ignore the low side.  The Antarctic cooling in response to additional CO2 is an example of an ignored low side error. Theoretical estimates tend to be skewed towards the "ideal" response, the high side, while engineers generally try to hedge their bets to a mid range or more Gaussian error potential.  For example, TSI/4 has a +0 and -7.5% normal error included in the Stefan-Boltzmann law.  Ideal is a limit, not a range.  So TSI/4 isn't really an average, it is an ideal maximum.  TSI/pi is a realistic average. Nothing wrong with using either or both as long as you "know their limitations."

I noticed, with TSI/pi and the sub-surface issue, that you could model the situation as a half-wave rectified input with a residual DC offset.  Energy absorbed in the oceans, i.e. the subsurface, would have a longer residence time, allowing energy accumulation not considered in an atmosphere-centric model.  Basically you have a greenhouse gas effect and a greenhouse liquid effect.  You can only guess at what the residual energy stored in the oceans would be sans atmosphere, but thanks to the unique maximum density of water at 4 C degrees, you know that 4 C is a good reference.  4 C has an S-B equivalent energy of about 334 Wm-2, which would be a "DC" offset of about 118 Wm-2.

That 118 Wm-2 is just another reference and would be due to a combination of sub-surface absorption and atmospheric insulation.  BTW, none of these references consider latent heat transfer, the primary method of cooling the ocean surface, or convection, which is enhanced by the latent energy impact on density.  Current estimates of latent and convective energy transfer are in the ballpark of 88 Wm-2 and 25 Wm-2 respectively, or about 113 Wm-2, roughly equal to the "DC" offset.  That doesn't "prove" anything but does indicate that the DC offset approach has some merit as a reference.
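
To make the bookkeeping explicit, here is a small sketch using only the reference numbers above: the 4 C maximum-density point, the TSI/2pi half-wave reference, and the ballpark latent and convective fluxes.

```python
# Hedged sketch of the "DC offset" bookkeeping described above.
from math import pi

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def sb_flux(temp_c):
    """Stefan-Boltzmann flux (Wm-2) for a temperature in C."""
    return SIGMA * (temp_c + 273.15) ** 4

ocean_ref = sb_flux(4.0)        # ~334 Wm-2 at the 4 C maximum-density point
half_wave = 1361.0 / (2 * pi)   # ~217 Wm-2, the TSI/2pi day/night reference
dc_offset = ocean_ref - half_wave

latent, convective = 88.0, 25.0  # ballpark fluxes quoted in the paragraph above
print(f"S-B flux at 4 C:      {ocean_ref:6.1f} Wm-2")
print(f"'DC' offset:          {dc_offset:6.1f} Wm-2")
print(f"Latent + convective:  {latent + convective:6.1f} Wm-2")
# ~118 Wm-2 versus ~113 Wm-2 -- roughly equal, which is the comparison being
# made above; a reference point, not a proof.
```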

The combination of latent and convective heat flux, driven in large part by the ocean subsurface heat retention, creates a lens of sorts that blurs what has to be assumed as an "ideal" spherical surface for either the TSI/4 or TSI/pi approximations.  Whichever approximation best allows for that less than ideal situation should be the better reference choice.  Since TSI/pi represents a "subsurface" energy and the DC offset basically allows for the latent/convective factor, I believe more engineers would appreciate that approach.

Neither approach is perfect, but TSI/pi appears to allow for more issues.  Just my two cents.

Wednesday, May 27, 2015

Lindzen's Iris back in vogue

Richard Lindzen's Iris Hypothesis is being discussed anew, with most of the same issues still in place.  I am not really a fan of the Iris Hypothesis because it is more radiant energy related than plain vanilla thermodynamics.  Most of the radiant models require quite a few thermo and fluid dynamic assumptions that I just cannot accept.

In the tropical ocean you have surface air that stays close to saturation most of the time.  For a given temperature you have a saturation vapor pressure, and as the temperature increases, the saturation vapor pressure increases and the dew point temperature also increases.  You have more potential water vapor and a larger temperature range to wring that water vapor out.  All things remaining equal, clouds should start forming at a lower altitude and persist longer.  That should be a pretty simple negative feedback to increased SST.
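
To put rough numbers on that, here is a small sketch using a common Magnus-type approximation for saturation vapor pressure over water; the coefficients are the standard textbook set, nothing specific to this post or the Iris papers.

```python
# Hedged sketch: saturation vapor pressure rises steeply with temperature.
from math import exp

def saturation_vapor_pressure(temp_c):
    """Approximate saturation vapor pressure (kPa) over liquid water."""
    return 0.6112 * exp(17.67 * temp_c / (temp_c + 243.5))

for t in (26, 28, 30, 32):
    print(f"{t} C  ->  {saturation_vapor_pressure(t):.2f} kPa")
# Roughly 6-7% more water vapor capacity per degree C near tropical SSTs,
# which is the "more potential water vapor" the paragraph refers to.
```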

Clouds forming at a higher temperature and lower altitude can create greater supersaturation levels and produce more supercooled water in the clouds, basically an increase in mixed phase clouds.  Since the latent energy has to be released, and higher temperatures/more CO2 would reduce the rate of release, thicker clouds at lower levels with more water/water vapor in supersaturated or supercooled conditions would change the radiant properties of the clouds.

Lindzen's Iris just assumes that this will cause more efficient wringing out of the moisture, reducing water vapor entrainment to high altitude cirrus clouds.  That leads to a SW-in versus LW-out issue in the clear tropical sky.  While the lens lets more LW out, i.e. you can "see" a warmer surface, the SW coming in can "see" the surface as well, which would increase surface energy uptake.  You are back to a square one radiant issue when the thermo indicates more interesting possibilities.

Part of those possibilities is that the mass of the atmosphere is pretty well fixed, and just adding water vapor reduces the mass of a parcel of air, increasing convection.  That is only true for water vapor; once you get into supersaturated water vapor and supercooled water you have increasing mass of the parcel.  There are regulating thermodynamic features included that aren't all that well considered, in my opinion, in the simple radiant models.

Since the mass of the atmosphere is effectively fixed, convection and advection would have to change with increased temperature.  I liken it to a pot lid: more heat just makes the pot lid rattle more, and that rattling is a bit random.  The rattling, deep convection, is triggered at a temperature of around 27 C, which in the tropics effectively limits the maximum average SST to about 30 C degrees. There are somewhat isolated hotter pockets, often over shallower water, that can persist, but it appears to be unlikely that larger tropical areas can sustain greater than 30 C for very long.  There are a number of "oscillations", MJO, QBO and ENSO, that are generated by and help destroy these hotter pockets.  So while the Iris Hypothesis is likely correct, all the mechanisms that would determine whether it is a negative or positive feedback are not so easy to figure out.

More mixed phase clouds, though, are most likely a negative feedback, and liquid layer topped clouds are definitely a negative feedback.  This would indicate that tropical clouds are a regulating feedback, pretty much like pre-CAGW science had them pegged.  So until the complex mechanisms can be explained well enough, the Iris is likely to keep being debated.  That is the problem with simple explanations, most of the time they aren't.

Wednesday, May 20, 2015

My LinuxMint 17.1 Evaluation

First, it is not going to work for my Toshiba Satellite this go around.  The main reason is the WiFi connection isn't very stable.  That wouldn't be a fatal flaw except that the security camera kinda depends on a reasonably stable WiFi connection.  I ended up getting some router and camera passwords scrambled, which bombed out the ZoneMinder package.  Probably my fault, but I doubt completely, since that seems to be related to some of the common ZoneMinder questions.

Second, there is a graphics issue with the Toshiba and Linux in general.  The Satellite model I have is one of the few not supported at all by Linux.  Since several closely related models are, I thought I could get it to work.  From what I have seen perusing the blogs, a few have Linux running on the same machine, but without some of the intranet functions I was planning to use to stream the security camera video.

Reverting back to Windows 7 wasn't too bad but could have been done with a bit more class.  LinuxMint created three Windows boot entries and I had to experiment with all three before I hit a working combination.  I still have Mint on the machine and may give it another shot, but not until I have learned a bit more about the Sricam options.

Other than a couple of screen freezes that required hard restarts, most of the package was pretty slick.  I was able to do a lot of things at once without putting half as much pressure on the CPU.  Firefox is the main browser and worked just fine.  The Sricam though is pretty much hard wired for Internet Explorer if you want admin access.  I could bypass that, but would nearly have to rewrite ZoneMinder, which recommends using IE for a number of camera admin functions anyway, see the first issue.  Other than a few quick patches I am not really in the mood to learn another couple of languages and sort through a dozen different package builds to fix things.

Because of the various builds in use, getting quick and accurate help is almost impossible if you need something other than a reminder to check power.  This seems to be due to some of the problem child extra features built into CPUs/GPUs for laptops.  It seems my particular CPU has scalable speed for a reason.  I will need to track down that particular adjustment in the next Linux trial, since that is likely related to both of the main issues I was having.

On a brighter note though, there are several laptops from a few generations back that various Linux groups seem to have focused on.  Plus I have a crap Dell desktop with one of the more favored AMD CPUs from the Vista era that might make a fair home network server if I 86 the crap HDD in favor of a 128 GB USB memory stick that has about twice the capacity of the crap HDD and costs less than 25 bucks.

Anywho, for now I am back up with Windows and have the Sricam running on iSpy with most of the basic features, and I have access to an easier to patch XML driver that I can fiddle around with.  There is even a facial recognition feature that I might get to work with a Bluetooth electronic deadbolt.  Most likely just another never to be finished project, but it doesn't look all that difficult.

btw, without antivirus and malware scanners, internet surfing was a BLAST, and the numerous restarts I had to do took only about 30 seconds each. So if there is a next time I will likely be a complete Linux convert.

Sunday, May 17, 2015

Linuxmint experiment - Pantum 2010 installation

Since my old laptop is dragging butt I thought I would clean things up.  I have been hearing good things about the new LinuxMint version 17.1, so I found a USB memory stick to boot off for a few days and finally bit the bullet and installed it alongside Windows.

The reason I bit the bullet is because running off the stick I was losing what I was figuring out along the way.  Once I finally figured out how to install my cheap Chinese knock off laser printer, I decided to make it official.

I believe one reason Linux has been a bit slow to take off is because they have some of the stranger geeks playing.  I looked through a few of the forums and found a few dozen folks asking how to install the exact same printer from over two years ago with not a single "easy" solution, and most of the "solutions" created more problems than they fixed.

There is a fairly simple way to install an unsupported printer, starting with downloading the "linux" version of the driver from the OEM website.  "linux" is in quotes because the driver is in .RPM format, which isn't directly supported by the Mint version of Linux.  There is a Red Hat version that must have tickled the fancy of most manufacturers that does use the .RPM format.

There is a supported Linux application called Alien that will convert a .RPM package into a .DEB package.  Sounds great, right?  Nope, Mint needs a PostScript Printer Description (.PPD) file, and the .DEB is actually a PPD filter.  The printer install wizard, not so much a wizard, asks for a .PPD, a URL, or you can select from the list of "supported" printers.  Not very obvious on the printer wizard apprentice is a search lens labeled FILTER.  Once you create the .DEB file just cut and paste it into the search box.  Tadah, the printer prints.

I wasted about 8 hours between the install and searching out how to install the printer.  That is the easy part.  The real reason I thought about Linux is because I installed one of those cheap Chinese knock off security cameras with night vision, pan and tilt, zoom, motion detector, audio and a handful of other features, plus the absolute worst documentation in the world.  Streaming security camera video on an already slow Windows 7 laptop really was grinding things to a halt.

ZoneMinder is a "free" Linux security camera program with Geek^3 documentation.  There were so many alternates and alternate install procedures that I decided to print them out so I could make some sense of the mess.  Oops, there started the printer challenge.  Anywho, the LinuxMint 17.1 version happens to have a few unsupported files required for most of the ZoneMinder install "recommendations".  From the forum reviews of ZoneMinder I have seen, I may learn a few new cuss words before I get that up and running.

This post may seem a bit odd for a climate change related blog, but actually it is a perfect fit.  The "key" for solving climate change, as I read in one climate paper, is to decouple wealth from energy.  These cheap Chinese knockoffs are in many cases fairly well made and close to dirt cheap because you are not paying the 150% to 250% markup.  A 200% markup, as you know, means 4 times real cost for the OEM, aka the intellectual property holder, which means way back when the warm and fuzzies were talking about the $100 laptop for the third world masses it already existed.  Just get rid of Microsoft, Intel and big boxes and there you go, $35 for a tablet and around $99 for a laptop.  We just start decoupling that wealth from the warm and fuzzies that have all these grand schemes to save the world and we will start making a dent in our carbon footprint. I have a couple of those $35 including shipping tablets on the way right now.

Alien, btw, was written by Joey Hess and has some information on Wikipedia.  Pay attention to the last bit, "..., and using install scripts automatically converted from an Alien format may break the system."

Break might be a bit harsh, but do try and be careful now, ya hear?

Friday, May 15, 2015

Always Question Your Data or Respect Murphy's Law

This aggravates the hell out of the minions of the Great and Powerful Carbon.  The primary law of the human part of the universe is Murphy's.  People screw up.  So with the typical irrational comments on Climate Etc. concerning CO2 and "closing" the Mass Balance, here is a little illustration.

Here are two CO2 reconstructions from the Antarctic, the Vostok/Dome C composite and the high resolution Law Dome.  The two agree very well over the common period, but the longer term composite has an upward trend starting in about the middle of the Holocene.  ~20 ppmv over almost 7000 years isn't much, but it is interesting.

It is interesting because until mankind started burning demon coal it should have been a downward trend.  Now I have removed the industrial part of the Law Dome data because I want to focus on this pre-industrial period.

This super high quality and nothing but the finest science product from NASA indicates the peak Holocene temperature was about 7000 years ago, about the same time as the Composite CO2 trend shifted to positive.  This seems odd since the Great and Powerful Carbon should be driving temperature or at least following it.

Some suspect that this discrepancy is due to "natural" smoothing or diffusion of CO2 in the Antarctic ice and snow that creates the CO2 record.  That smoothing would reduce peak/valley amplitude and shift the CO2 record so that it lags temperature.  Real scientists know that you should check to make sure there isn't bird shit on your radio telescope antenna and that your thermocouples are not self heating before you announce your unprecedented discoveries to the world, Murphy's Law.

This particular discrepancy may not amount to a hill of beans or it might be something.  It could be useful for say pointing out other scientists that might have bird shit in their methods.

There are some young up and comers in the paleo ocean field we might want to check.

Nope, their work agrees pretty well, so no obvious bird shit there.

Heck, their work even agrees pretty well with some of the retro climate science.

Their work and the Law Dome CO2 record don't agree all that well with this guy's work though.  I suspect bird shit.

I think Murphy's Law has bitten someone in the butt.

Saturday, May 9, 2015

Carbon Neutral, Mass Balance and other such stuff

If there is anything more confusing than "climate change" it is the carbon cycle.

The US EPA has a handy dandy "Global" carbon emissions article.  Globally, emissions by sector look about like that.

Regionally, i.e. for the US, emissions by sector look like this; notice there is no Forestry, aka land use, piece of the pie.

Total Emissions in 2013 = 6,673 Million Metric Tons of CO2 equivalent 
* Land Use, Land-Use Change, and Forestry in the United States is a net sink and offsets approximately 13% of these greenhouse gas emissions.
All emission estimates from the Inventory of U.S. Greenhouse Gas Emissions and Sinks: 1990-2013

That is because in the US forestry produces a net carbon sink. So if by some odd chance the world changed its land use/forestry practices, "global" land would be a net carbon sink, reducing emissions by about 25%.  That would be eliminating the 17% emissions and producing an 8% sink.  That could actually be as high as a 35% net reduction, since agriculture could add another 5%.  So let's say that the "world" land changed from a 17% net carbon source to a 13% net carbon sink.  That would mean that roughly 30% of the human related carbon emissions would not cycle through the oceans.
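
A rough bookkeeping sketch of that swing, using the post's own round numbers; these are illustrative shares, not inventory values.

```python
# Hedged arithmetic for the land-use scenario sketched above.
land_use_source = 0.17   # post's figure: land use is ~17% of current global emissions
land_use_sink = -0.13    # hypothetical: worldwide US-style practice, a 13% sink
swing = land_use_source - land_use_sink
print(f"Net swing in the carbon budget: {swing:.0%} of human-related emissions")
# A ~30% swing -- the share that, in this scenario, would no longer have to be
# taken up by the oceans.
```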

"Globally" "nature" is a net carbon sink.  What should be "natural" though is a bit obscured by a few thousand years of human civilization.  Human "civilization" flourished in the "fertile crescent" which is the middle East.  Now that "fertile" crescent looks a lot like desert. Ancient agriculture which is still practiced to some extent in the ROW that has net land use emissions was a bit rough on the land.  'Civilizations" died out due to "climate change" which could easily be related to deforestation for energy and agricultural expansion resulting in the desert regions which could have been lush tropical rain forests at some time in the past.  All that would not have changed "nature" from a carbon sink to a carbon source, but it would have changed the carbon cycle path way.

Around 5000 years ago, which happens to be around the time of one of the major "fertile crescent" "civilization" collapses, atmospheric carbon did reverse from a slight downward trend to a slight upward trend.  That "fertile crescent" region could include most of India, mainly the Indus Valley, and changes in Indian Monsoon patterns are a big deal as far as "global" climate goes.  This blip in the atmospheric carbon concentration is not one of the more common "climate change" talking points, and land use in general takes a back seat to the demon "fossil fuels".  That is probably because it is easy to account for fossil fuels and not so easy to account for land use change.

Land Use, Land-Use Change, and Forestry (17% of 2004 global greenhouse gas emissions) - Greenhouse gas emissions from this sector primarily include carbon dioxide (CO2) emissions from deforestation, land clearing for agriculture, and fires or decay of peat soils. This estimate does not include the CO2 that ecosystems remove from the atmosphere. The amount of CO2 that is removed is subject to large uncertainty, although recent estimates indicate that on a global scale, ecosystems on land remove about twice as much CO2 as is lost by deforestation. [2] 

From the EPA link above, the amount of CO2 removed by land use is subject to large uncertainty.  The minions of the Great and Powerful Carbon are not all that great with uncertainty.  They tend to think they have a handle on it and accuse the non-believers of using uncertainty to muddy the waters.  So they use things like the "carbon cycle mass balance CONSTRAINT" to impress their loyal followers with some creative BS.  The CONSTRAINT basically just indicates that "nature" is a net carbon sink, but doesn't provide any indication of what "normal" sink efficiency should be.  Henry's law is a "LAW" that provides considerable information on the ocean part of the sink, but as far as land goes we are pretty much shooting in the dark.

From ocean pH, it is pretty obvious that altering the carbon cycle pathway from stronger land sinks to relying more on the ocean sink is having an impact.  Removing a couple of gigatons of carbon from the oceans each year would also have some impact.  So we are at a point where land and ocean use could have as large or larger an impact on atmospheric carbon than demon carbon from coal and such.  One of the reasons man switched to coal was because "natural" "sustainable" sources of energy were not sustainable and had a negative impact on the local nature.  Trying to go back to "sustainable" energy that involves land use will likely make things worse, it has in the past, right?

update:

Since the biggest part of the Mass Balance debate is "what's natural", this is a look at "natural" assuming everything prior to 1750 had to be natural.  Fossil fuel wise there wasn't much anthro going on pre 1750.  There was land use and deforestation going on though.  Fire was used to clear land and got out of hand, I imagine, plus wood was the primary fuel.  If you ignore that, this chart of the Indo-Pacific Warm Pool SST and Antarctic CO2 from the Law Dome ice core would be all natural variability.  If you consider that CO2 in ice cores is "naturally" smoothed over a fairly long time scale and little plankton shells are also "naturally" smoothed over a different time scale, these two are a remarkably good match.

As I have shown before, the Oppo 2009 IPWP also compares well to the Lamb climate reconstruction, which had the Medieval Warm Period and Little Ice Age periods.  The timing isn't perfect, but you should expect some shift with different smoothing time scales.  The selected "pre-industrial" period, 1750, happens to be close to the deepest part of the Little Ice Age anomaly.  "Natural" variability during this all natural assumed period is about +/- 0.75 C and +/- 6.5 ppmv CO2.  6.5 ppmv is small compared to the current ACO2 impact, but 0.75 C is not small compared to the current temperature anomaly.

The simple Mass Balance calculation requires estimates of ACO2 to be very accurate, which requires inclusion of the Land Use impact since it is estimated to be about one third of total emissions.  Land Use "emissions" would also have an impact on the "natural" carbon sink.  For attribution you need to consider both Land Use emissions and the Land Use sink impact, which requires a baseline or "normal" sink efficiency.




If you use the Mann et al. 2015 version of the past you create the impression that all temperature change is created by ACO2.  Climate isn't simple though.  There can be multi-century lags, and the solar precessional cycle is about 21,000 years.  Ocean and ice core reconstructions operate on millennial time scales, meaning you have to consider how you smooth your instrumental data to avoid spurious eureka moment spikes.  You may be able to slice paleo to annual resolution, but that will never account for the millennial scale natural smoothing already a part of the proxy.

Saturday, May 2, 2015

New Temperature Versions

UAH has a new version 6.0 (beta) which includes some adjustments, of course.  It is being gone over with a fine tooth comb by the usual suspects, since Spencer and Christy are notorious "skeptics".  "Surface" temperatures are mandatory because everything climate is based on "surface" temperature change.  "Surface" temperature will always have some issues because there is no real surface.  Since there is an elevation consideration over land and a lapse rate that is variable, there would need to be considerable altitude and specific heat capacity adjustments; latent heat is really "hidden" as far as temperature goes, and the ocean readings are mainly sub-surface rather than surface readings in many cases.  Satellites measuring the lower troposphere have to estimate a specific altitude, which appears to be around 2000 meters based on the RSS version on Climate Explorer that is available in kelvin.  So they are measuring a different "surface" with different latent heat considerations.

Since satellites tend to have issues at the poles, as do "surface" stations, I tend to prefer looking at the tropics, specifically tropical oceans, since they represent the lion's share of total energy.  The chart above compares the UAH beta version with the newest version of ERSST.  What I see is a pretty fair comparison considering all the issues involved with both products.  There is a very small difference in the two trends, and as usual the lower specific heat troposphere has more variation than the high thermal mass ocean surface/sub-surface.  I am of course nobody, so I will leave it to the 'spurts to really screw this up.

For the "global" oceans there is a little bit bigger trend difference, which could be due to any number of real and calibration issues.  UAH is trying a different averaging method with the intent of improving regional temperatures.  ERSSTv4, I believe, is more focused on a "global" average, which would mean longer range interpolation.  ERSSTv4 no longer uses the Reynolds OIv2 satellite temperature data for its interpolation, because it caused some "significant" cooling, which would most likely have improved the correlation between these two data sets.  The difference really doesn't amount to a hill of beans, but the hyper-precision junkies will find some flaws that they think are "significant".

Since land "surface" temperature is an average of Tmax and Tmin and there are rumors that Tmin is suspect due to nocturnal atmospheric boundary layer variation, I am staying out of that mess.  What will likely be the case though is the longer the interpolation range, the greater the discrepancy between UAH and whatever land "surface" data set.

Nick Stokes has his critique on his blog and Roy Spencer has his pretty detailed explanation of the changes on his blog.

Thursday, April 30, 2015

Particulate Matter - the next "Climate Change" battle

Particulate matter (aerosols) is going to be a challenge.  The majority, as in over half, of the particulate matter in the atmosphere appears to be natural.  Salt spray, pollen, dust and pine fresh scent are all forms of particulate matter.  Just like climate change, particulate matter is given a fairly vague set of categories: PM2.5, or particles 2.5 microns and less, and PM10, or particles 10 microns down to about 2.5 microns.  Since everything from the aroma of pine trees to radioactive fallout can be in either of the two big categories, you are not likely to get all the information you need to figure out how you should stand on this soon to be controversial issue.

Here in the Keys our two big PM issues are hydrogen sulfate or sulfide and coral dust.  The rotten egg odor is associated with decomposing bay grasses, and the coral dust is because we have a lot of traffic, foot and vehicle wise, on coral.  We don't invest much money in maintaining manicured lawns, which would reduce the coral dust but increase our water usage.

Two other main sources outside of our little slice of paradise are Saharan dust and smoke from wild fires on the mainland.  I had my first asthma like attack thanks to wild fire smoke from fires around Lake City.  Not sure it was really asthma, but it led to a chain of medical events that I would have been better off not experiencing.  The Saharan dust has caused a lot of problems, including asthma cases, in most of the Caribbean.  We also have lots of salt spray, but that appears to not be a problem.

These natural and man made aerosols provide an important function in the atmosphere: they can become cloud condensation nuclei.  Clouds will still form without nuclei, but at a much lower temperature.  Not to worry though, there seems to be plenty of both natural and not so natural aerosols around to create clouds.

In the US, man made aerosols were pretty obvious since they helped create smog.  That led to the Clean Air Act, and over the past 40 plus years the US has done a fair job of cleaning up the air.  More can be done of course, but there is a point where it costs a great deal for very little improvement.

Aerosols due to too many people occupying the same area are nothing new.  London for example had to require people to use hard coal rather than wood or smokier coals.  Eventually, thanks to electricity, most of the burning was centralized so it was further away from people, and then as people moved closer to these power sources and population expanded, processes were designed and installed to scrub the emissions.  A relatively modern coal power plant has wet scrubbers and even particle arrestors (filters) which remove most of the aerosol emissions.

The rest of the world, as in the developing and undeveloped nations, don't have our "advanced" 50 plus year old technology or cannot afford to use the technology they have, because cleaning emissions costs money.  China for example has built a lot of close to state of the art coal power plants complete with emission controls, but doesn't use the emission controls very much.  Since the electrical grids are limited, there are also plenty of the rural population using coal, wood and even dried dung for cooking and heating.  This creates a great deal of indoor air pollution, which leads to premature deaths, but exactly how many is a difficult thing to estimate.

Thanks to the green movement and fuel prices, the US has seen an increase in wood, often wood pellets, being used to replace oil fired boilers and space heaters.  Most of these are very efficient and produce little smoke, but everything needs to be maintained to keep that high efficiency, so there is a growing aerosol issue thanks to "sustainable" bio-fuel, aka wood.

U.S. Air Quality is a website that keeps track of such things.  Here is their April 27, 2015 post;

"DUST AND REMNANT SMOKE FROM CENTRAL AMERICA AFFECTING SOUTHERN US
The south is experiencing a mix of smoke and dust--the smoke still coming from the weeks old agricultural burnings in Central America. Moderate to USG AQI's were recorded in southern Texas, Louisiana, and much of Florida today (EPA AirNow Combined Loop, top left). Smoke (light to medium in density) can be seen over the Mexican Gulf from the prescribed fires in Mexico and other northern Central American countries (NOAA HMS, top right). The plume off the coast of Florida is believed to be remnant smoke from these fires as well. MODIS Terra imagery (bottom left) shows a correspondence of high AOD with the presence of smoke over the coast of Florida as well as in Texas. The NAAPS Aerosol Model (bottom right) predicts not only smoke but dust to be a factor in the elevated AOD over western Texas, believed to have reached a surface concentration of 5.12 mg/m3, with smoke hitting a surface concentration of 64 µg/m3. The dust is presumably domestic, originally being kicked up in or around El Paso."

Agricultural burning is a big source, but then agriculture in general is a big source.  Changes to "conservation farming", which generally requires genetically modified crops and "Round-Up", help reduce agricultural emissions, but then the warm and fuzzies are not very supportive of GMC and Monsanto in general.

On the power generation side of things, other than some very old coal power plants (~10% of all US plants), US power generation, including demon coal, is pretty low in aerosol emissions.

The basic Green solution to aerosols is a revenue neutral carbon tax (RNCT).  With the RNCT, supposedly the money will be taken from the rich evil abusers of the atmosphere and given to the poor folks that are suffering because of our excesses.  Other than that, you are not going to get very many details of how much money is involved, where the money really goes and how much good this redistribution of money is going to do.  You will get the same song and dance about how good taxation is going to save the world.  On top of that you will probably be fined pretty severely should you do something stupid like build a camp fire or use that inefficient fireplace that helped sell you on that house you bought.

Since the Olympics, btw, China has determined they might have to start using their emissions controls, and I believe the US has promised to help finance that.  All the while China will be using its Asian Infrastructure Investment Bank to help other nations build coal fired power plants so they can blackmail the developed world for some of that RNCT money.  The RNCT money by itself has too many strings attached, like not using cheap coal for example, for those developing nations to use for developing.

While all this is going on, Germany, thanks to its Green fear of the atom, is building more coal plants to fill in for all its "Alternate Energy Sources", which tend to be a bit unreliable at following electrical loads.

Germany is the Green poster child for the warm and fuzzies.  Since they are paying about four times the current US average electrical rate, it is obvious that if you throw enough money at a problem you will get Green approval, even when you use demon coal.

An energy mix, Coal, NatGas, Nuclear, Hydro and yes those "alternate energies", is a good way to go, but until some fairly major technological breakthrough happens, the "alternate energies" have limits of about 20% maximum of the mix in most of the "developed" nations.

Since the Greenies have hard-ons for Coal, Nuclear, GMC, Fracking and common sense, it is pretty unlikely that the RNCT "solution" is going to do much other than cost several times what it accomplishes.

Not to worry though, using the standard creative statistics they will show that if the RNCT reduces PM2.5 by x percent it will save y number of lives and be "invaluable".  Then once the RNCT is pushed through they will find another needed tax, until they get at least half of whatever you might hope to make in your lifetime.  It takes a lot of money to keep saving the world.

Thursday, April 23, 2015

Signatures

I have enjoyed reading some older explanations of "signatures" of global warming as related to greenhouse gas increases.  "Signatures" of anthropogenic caused warming, or in other words warming that could only be caused by CO2 equivalent gases, would be nice.  That would give the world a real test of Climate Change, as in human related climate change, so we could move on.

Stratospheric Cooling with Tropospheric Warming (SCTW) should be a signature of well mixed greenhouse gases.  The well mixed CO2 equivalent gases would basically blanket the lower troposphere, reducing the flow of energy through the stratosphere.  Realclimate had a hilarious post on this subject some time ago.  There is also the Tropical Troposphere Hot Spot (TTHS), which is a "signature" of GHG warming but is not a "unique" signature.  If warming is caused by GHGs then there would be a TTHS, but if the warming is caused by something else then there would still be a TTHS.  If there is no TTHS then that would be evidence of no warming, at least in the tropics.

The fact of the matter is neither of these are "unique" signatures.  If there is any warming, since there are already GHGs in the atmosphere, the troposphere would warm and the stratosphere cool.  However, you can have variations in stratospheric ozone and water vapor that can change the temperature of the stratosphere, which would change the troposphere temperature.  You can also have changes in ocean surface temperature, which would cause changes in troposphere temperatures and moisture, with water being another of those greenhouse gases.  If the atmosphere were perfectly stable, then you could easily figure out what caused what, but thanks to dynamics it is not so easy.

You can get a better idea by assuming CO2 equivalent gases have a 1 C per 3.7 Wm-2 impact on lower troposphere temperatures and then teasing out an estimate for lower stratospheric cooling related to that change.  Since any warming would also have an impact, you would need to estimate the equivalent change in energy flux due to "other" warming.  The two are very similar, making attribution difficult.
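
As a worked example of that bookkeeping, here is a minimal sketch that pairs the post's assumed 1 C per 3.7 Wm-2 sensitivity with the commonly used simplified CO2 forcing fit, 5.35*ln(C/C0); the 280 ppm baseline and the concentrations shown are illustrative.

```python
# Hedged sketch: converting a CO2 forcing to a temperature change using the
# post's assumed sensitivity.
from math import log

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified CO2 radiative forcing (Wm-2), 5.35 * ln(C/C0)."""
    return 5.35 * log(c_ppm / c0_ppm)

sensitivity = 1.0 / 3.7   # C per Wm-2, the post's assumed value
for c in (400, 560):      # illustrative concentrations, ppm
    f = co2_forcing(c)
    print(f"{c} ppm: forcing {f:.2f} Wm-2 -> ~{f * sensitivity:.2f} C")
# A doubling to 560 ppm gives ~3.7 Wm-2 and ~1 C "all things remaining equal";
# attributing any observed stratospheric cooling to that 1 C is the hard part.
```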

The tropical oceans for example, while still impacted by AGW, have more latent and convective heat flux change and much less radiant heat flux change.  The TTHS is supposed to be "driven" by the latent flux increase, provided of course the convective flux changes at some predictable rate so it doesn't eliminate the TTHS.  Richard Lindzen hypothesized an Iris Effect where increased SST would reduce tropical cirrus cloud impact, offsetting a great deal of the tropical warming.  That basically would erase most of the TTHS.  I am not particularly sure that Dr. Lindzen's Iris analogy is all that great, but changes in deep convection related to increased SST would change cloud cover distribution, stratospheric water vapor, stratospheric ozone and vary the Brewer-Dobson circulation.  Notice that the tropical SST is about 0.5 C warmer than "average" and about the same as in the early 1940s.

My focus has been on the Brewer-Dobson circulation changes, which impact the planet's real heat sinks, the poles, by changing poleward advection of stratospheric water vapor and ozone, which changes the intensity of Sudden Stratospheric Warming events and "Arctic Winter Warming".

That is a big mouthful of not very often discussed climate mechanisms.  A simple way to think of this is that variation is inefficient and stable is efficient as far as engine performance goes.  We have polar winter vortexes that, if stable, tend to reduce heat loss and, when unstable, allow more heat loss.  In the news the "POLAR VORTEX" is the new villain in the winters-are-colder-than-usual explanation toolbox. What they are referring to is a breakdown in the vortex related to large changes in the high northern jet stream.  A stable jet stream/polar vortex would relate to a milder winter; a milder winter allows the surface to retain more energy, which is really what global warming is about, retaining more energy.  If there isn't an increase in energy retention then there is no TTHS or SCTW.

Well, if you talk to any true believer in AGW/CLIMATE CHANGE, you know that the oceans are gaining energy, which is energy retention, therefore we should have the signature TTHS and SCTW.  Well, that isn't a given.  Right now the Southern Hemisphere is doing the heat uptake thing and the Northern Hemisphere is doing the heat rejection thing.  A few years ago I estimated the heat rejected by a large SSW event to be of the same order of magnitude as the energy imbalance.  There are now papers that tend to confirm that my estimate was pretty good.

There are several interesting things about the hemispheric imbalance.  The current stage of the solar precessional cycle is a biggie.  There is more solar energy available in the southern hemisphere summer along with a larger percentage of ocean surface area.  Without any consideration of change in GHG concentration, there should be more ocean heat uptake in the southern hemisphere.  In order for there to be an "equilibrium", the northern hemisphere heat rejection would have to increase OR, yep that is a big or, more energy must be stored in the form of glacial ice.  The part after the big or would mean Earth is primed for a shift into some degree of glacial period.  Fortunately or not, depending on your point of view, glacial mass requires land area and there aren't many people/nations willing to donate to the glacial cause.  With the current state of technology it is much easier to melt snow than it is to grow glacial mass.

This low probability of glacial mass growth leads to the second interesting point.  Since the land area required for glacial growth isn't really symmetrical, any significant growth would produce an orbital wobble.  A wobble is variability, which implies inefficiency for our heat engine, which could trigger a more rapid rate of heat rejection should someone donate enough land for the global glacier international park.

This potential wobble influence leads to the third interesting point: where is there any "equilibrium" involved in this climate problem?  We have shifting pseudo-steady states with pretty unpredictable time frames.  However, if you talk with the true believers again, there is no evidence of past variability or any reason to consider hemispheric imbalances.  Besides, that would complicate a perfectly "obvious" theory with several "signatures" that will likely pop up any decade now, provided of course we have an extremely stable atmosphere that allows us to use "equilibrium" assumptions validly.

Kind of reminds me of those Boy's Life cartoons.  Instead of circular logic it is more like a circle jerk.





Monday, April 20, 2015

Carbon Balance

There will continue to be a debate over the carbon balance because it is complex.  This carbon cycle graphic from NASA Earth Observatory is about average.  Human emissions via fossil fuels are about 9 gigatons per year, and about half remains in the atmosphere, with land and ocean sinks removing a little over half.  The annual exchange between the land/ocean and the atmosphere is roughly 200 gigatons, or about 22 times greater than the annual human emissions.  Mankind does more than burn fossil fuels; we also impact the land and ocean carbon sinks.
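For scale, the back-of-the-envelope arithmetic with those round numbers looks something like this (a sketch using the figures quoted above, not a budget of record):

```python
# Back-of-the-envelope carbon bookkeeping using the round numbers quoted above.
fossil_emissions = 9.0      # GtC per year, human fossil fuel emissions
airborne_fraction = 0.5     # roughly half remains in the atmosphere
natural_exchange = 200.0    # GtC per year cycled between land/ocean and atmosphere

atmospheric_gain = fossil_emissions * airborne_fraction   # ~4.5 GtC/yr stays airborne
sink_uptake = fossil_emissions - atmospheric_gain          # ~4.5 GtC/yr removed by sinks
exchange_ratio = natural_exchange / fossil_emissions       # ~22x human emissions

# A shift of only a couple percent in the ~200 GtC natural exchange is the
# same size as the entire net atmospheric gain, which is the point being
# made about changes in the land and ocean sinks.
print(atmospheric_gain, sink_uptake, round(exchange_ratio, 1))
```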

Since the oceans provide a great deal of food for humans and their livestock, some portion of that carbon is cycled more rapidly than if it were undisturbed.  Same with land, where food sources are cycled more rapidly thanks to a few billion people and their livestock.  If it were not for mankind, some portion of that carbon would still be cycled, but what is "normal" isn't all that easy to determine.

Obviously, burning the very long term sequestered carbon in fossil fuels is part of the change in the carbon cycle, but land and ocean abuse also play a significant role.  Since a very small change in the land and ocean carbon sinks can completely offset fossil fuel emissions, it should also be obvious that changes in agriculture, ocean harvesting and general construction have a large impact.

Estimates of land use impacts, specifically soil carbon, are wild-ass guesses.  For example, "pre-industrial" is vague and soil carbon estimates are limited to at most the top meter of soil.  Replacing deep rooted native plants with shallow rooted crops could have twice the impact of current estimates.  In the southeastern US, former cotton and tobacco fields have been replaced with tree farms and orchards.  Today the yield per acre is a factor of ten or more greater than it was at the turn of the 20th century, so more of these crops can be grown on less acreage, freeing considerable land for soil "conservation" uses.  Most likely because of that transition, the US is now a net carbon sink, meaning that US land area is taking up more carbon than it is releasing.

Globally, the land carbon balance is negative, meaning land is a source rather than a sink and is contributing about a third of the atmospheric carbon increase.  If that land area were managed to produce an equal carbon sink, the atmospheric carbon increase would be reduced by roughly two thirds.  "Science" tends to select when it wants to use gross values and when net values in a haphazard manner, greatly complicating understanding of the situation.  From what I have been able to determine so far, land use has roughly twice the estimated impact, but that really depends on what "pre-industrial" condition is selected.

The potential damage of increased fossil fuel use appears to be overestimated by a factor of two, which again depends on what "normal" is selected, invoking "pre-industrial" definitions.  If you pick 1700 to 1918 as "pre-industrial" you have potentially more warming than if you select 1000 to 1200 as "pre-industrial".  There is a growing "war", if you will, in paleo-science, with some factions claiming a "pre-industrial" nirvana that never changes, the hockey stick crowd, and a new guard of ocean paleo-climatologists indicating considerable variability in past climate.

So despite the claims of "consensus" there are large areas where the climate science is far from being resolved.  Politically, the advantage in this uncertainty goes to the precautionary principle crowd.  It is really to their advantage to keep the uncertainties large all the while playing the Merchants of Doubt card, belittling the realists with rational questions.  Actually solving some parts of the climate problem reduces the precautionary urgency, so don't expect many simple, cost effective mitigation attempts to be very well publicized.

Conservation agriculture, for example, has a positive impact on soil carbon and water retention but requires evil Monsanto products like "Roundup" brand weed killer.  Like any product, Roundup can be overused, and since it isn't "natural" it doesn't have much "green" support.  Pesticides, which also make life bearable, can be overused and aren't on the "green" happy face list, along with antibiotics, genetically modified crops, radiation and a surprisingly large number of scientifically developed "solutions" for feeding the world and making it more bearable.  Use of these scary science developments can reduce land abuse, increasing the acreage that can be set aside for "conservation", which is really a much longer time scale version of crop rotation.

To add insult to injury, the Green Police have started alienating themselves from the third world countries they profess to care for.  Coal is one of the least expensive energy sources for many nations that are not allowed to dabble in nuclear, and the new Asian Bank initiative is providing funding for projects the Green Police cannot stomach.  Every one of these developing nations will run into the same environmental problems faced by the developed nations and will turn to the same "solutions" used in the developed world, with the appropriate 10 to 30 year lag time.  Instead of trying to force "solutions" on the third world, the developed world is better off actually solving problems by losing the not-in-my-backyard mentality, knowing that those solutions will be copied, likely without appropriate attribution of intellectual property, by the poorer nations of the world.

All this basically means that global "de-carbonization" is an incredible myth for at least the next 30 years, so focus on the "low hanging fruit", land use improvement, is about the only viable option until some of the energy-of-the-future technologies arrive and are then predictably pirated by the third world.

We live in a hand-me-down world folks.  Leading by example is more than a cliche, it is business as usual.  If we use it properly, they will too.  You cannot force "solutions" on a cost conscious world.


Saturday, April 11, 2015

Are Human Influences on Climate Really Small?

"Are Human Influences on Climate Really Small?" is the title of a post by Steven Koonin on Climate Etc.  The magnitude of the radiant gas forcing, CO2 and equivalent "greenhouse" gases, can be relative to your perspective.  That really isn't the way it should be.  Increasing one force on a stable system should be fairly easy to figure out.

For a "global" impact though you can pick a number of frames of reference.  Surface air temperature aka lower troposphere temperature, as I have mentioned is the worst possible choice of frames.  From the "surface" you have radiant, latent and convection energy flows that has different impacts on the "surface" temperature.  You can use an "effective" surfaces energy and temperature, for example 395 Wm-2 radiant, 90 Wm-2 latent plus 25 Wm-2 convective would produce an effective surface energy of 510 Wm-2 which would be roughly and effective temperature of 308 K degrees.  With a Top of the Atmosphere energy value of 240 Wm-2 which would have an effective temperature of 255 K degrees.  So the combined "greenhouse" effect from the "surface" would be 510-240=270 Wm-2 per 308-255=53 K degrees.  That produces a 5.1Wm-2/K forcing ratio.  There is of course some uncertainty involve so the "standard" of 5.35 Wm-2/K is close enough.

So if you are going to determine "small" for a doubling of CO2, which happens to use the "standard" dF=5.35ln(CO2f/CO2i), you should stick with an apples to apples comparison.  If you neglect latent and convective heat flux, you get 395-240=155 Wm-2 per 289-255=34 K, producing a 4.56 Wm-2/K ratio.

With a doubling of CO2 expected to add 3.7 Wm-2 of "forcing", everything else remaining equal, you would have a 3.7/270, roughly 1.4%, to 3.7/155, roughly 2.4%, impact on "surface" energy/temperature, provided temperature and energy have a linear relationship.  They don't have a linear relationship of course, but for small changes you get a small error with that assumption.
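A small sketch of the frame-of-reference arithmetic above; the flux values are the round estimates used in the text, and the only thing added is the Stefan-Boltzmann conversion to effective temperature:

```python
# Frame-of-reference arithmetic: effective surface energy vs radiant-only
# surface energy, the resulting Wm-2/K ratios, and the fractional impact
# of a 3.7 Wm-2 doubling in each frame.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m-2 K-4

def eff_temp(flux_wm2):
    """Effective blackbody temperature (K) for a given flux."""
    return (flux_wm2 / SIGMA) ** 0.25

surface_radiant, latent, convective = 395.0, 90.0, 25.0
toa = 240.0

eff_surface = surface_radiant + latent + convective          # 510 Wm-2
T_eff, T_rad, T_toa = eff_temp(eff_surface), eff_temp(surface_radiant), eff_temp(toa)

ratio_full = (eff_surface - toa) / (T_eff - T_toa)            # ~5.1 Wm-2/K
ratio_radiant = (surface_radiant - toa) / (T_rad - T_toa)     # ~4.6 Wm-2/K

print(round(T_eff), round(T_rad), round(T_toa))               # ~308, ~289, ~255 K
print(round(ratio_full, 2), round(ratio_radiant, 2))
print(round(100 * 3.7 / (eff_surface - toa), 1), "%",         # ~1.4 % of 270 Wm-2
      round(100 * 3.7 / (surface_radiant - toa), 1), "%")     # ~2.4 % of 155 Wm-2
```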

A simpler "guestimate" is to use the estimated forcing versus the estimated "force" or Down Welling Longwave Radiation (DWLR) estimated at about 340 Wm-2.  That gives you a touch over 1% impact, all things remaining equal, and the DWLR value should include all feedbacks for your small change and hopefully small estimation error.  I prefer that simplification since DWLR energy is roughly equal to the average ocean energy at an average temperature of ~4C degrees.  That produces an all things remaining equal estimate of  almost one C degrees per CO2 equivalent doubling or 3.7 Wm-2 of anthropogentic forcing.

So if you pick the worst thermodynamic frame of reference and assume away the reasons that it is the worst frame of reference, you get a bigger impact than if you pick a simpler frame of reference.  Imagine that?

Following Koonin's post the "let's play thermodynamics" games begin.  The simple minded warmists pick the "global" surface temperature sans complications, the coldists pick their cherry, but if you consider dF~5.35ln(CO2f/CO2i) you should pick effective surface temperature and energy, which includes latent and convective heat loss estimates.

Then you could blow that off and use the "subsurface" temperatures, which have less latent and convective flux to worry about.  I have been saying this for some time, but I noticed some others are moving toward a similar train of thought.

Isaac Held has a post on "Addicted to Global Mean Surface Temperature" and Roy Spencer has an anti-skydragon post on soil temperatures.  Neither quite gets to the real simplicity of "sub-surface" energy, which is the actual best possible reference for a change in atmospheric forcing.  It will be interesting though to see just how complex and convoluted the extreme factions of the debate will get to preserve their personal ideologies.


Saturday, March 28, 2015

Ted Cruz, NASA and NOAA

Ted Cruz is a Texas senator and the chairman of the Senate Subcommittee on Space, Science, and Competitiveness.  There is a bit of a to-do over a confrontation between Senator Cruz and NASA administrator Charles Bolden over NASA's "core mission".  What is interesting is how this has been overplayed by the warm and fuzzy minions.

Between NASA and NOAA we have two huge agencies tasked with space and inner-space research.  There is some overlap of responsibilities and needed joint cooperation, but there is considerable redundancy that is not only not cost effective but counterproductive.  NASA GISS, for example, has a surface temperature product and a climate science division, as do NOAA (NCDC) and the GFDL.  The NASA GISS head at the time, James Hansen, predicted that CO2 related anthropogenic climate change would be extremely hazardous, potentially 4 C of warming, while Syukuro Manabe with the GFDL predicted less impact, about 2 C.  Current observations tend to indicate that Manabe, with the GFDL, tasked with the inner space duties, knew more about the inner space climate than Hansen, tasked with the outer space duties (NASA).  So we have a huge group of scientists on the public payroll doing their job apparently pretty well, and a few scientists on the public payroll venturing into areas outside of their agency's purview, not doing all that great of a job.

Senator Cruz appears to have mentioned that perhaps NASA should try to focus more on its real mission instead of competing, somewhat poorly, with other agencies doing their job.  That should be a fairly common sense type of concern for a senator supposedly watching out for your tax dollars.

From thinkprogress, “Our core mission from the very beginning has been to investigate, explore space and the Earth environment, and to help us make this place a better place,” Bolden said.

Well, that's fine.  NASA was founded in 1958 at the beginning of the cold war space race and had a noble mission statement at that time.  NOAA was founded in 1970 to focus on the inner environment, the oceans and atmosphere, sort of a spin-off of NASA.  Bolden appears not to have gotten that memo.

According to media and blogs like ThinkProgress, Senator Cruz is an anti-science imbecile because he questions NASA's competition with NOAA and says things like, “Almost any American would agree that the core function of NASA is to explore space,” he said. “That’s what inspires little boys and little girls across this country … and you know that I am concerned that NASA in the current environment has lost its full focus on that core mission.”

How about some NASA history?  
To understand and protect our home planet,
To explore the universe and search for life,
To inspire the next generation of explorers,
… as only NASA can.


I find it hard to find fault with Senator Cruz's paraphrase there.

If NOAA or its National Weather Service needs new weather or climate science platforms, NASA's mission would be to assist in the design and placement of those space vehicles.  If communications satellites are needed, NASA's mission is to assist in the design and placement of those space vehicles.  That doesn't mean that NASA is in the television industry or the climate science industry; it is in the space industry.  If NASA wants to do the job of NOAA, then eliminate NOAA, or how about everyone just sticking to their specialty instead of freelancing?


Monday, March 23, 2015

New papers getting a look

Bjorn Stevens and crew have been busy.  I stumbled on a Sherwood et al. paper that included Bjorn Stevens as one of the als.  The paper concerns some issues with the "classic" radiant forcing versus surface temperature approach and the "adjustments" that should be considered.  It is reviewed and ready for publication but hasn't hit the presses yet.  A biggie in the paper concerns the "fungibility" of dTs, which I have harped on quite a bit invoking the zeroth law of thermodynamics.  "Surface" temperature, where the surface is a bit vague, isn't all that helpful.  Unfortunately, surface temperature is about all we have, so there needs to be some way to work around the issues.  Tropical SST is about the best workaround, but that really hasn't caught on.

In my post How Solvable is a Problem I went over some of the approximations and their limitations.  I am pretty sure the problem isn't as "solvable" as many would like, but it looks to be more solvable than I initially thought.

Since Dr. Stevens also has a recent paper on aerosol indirect effects, I thought I would review the solar TSI and lower stratosphere correlation.

I got pretty lazy with this chart, but it shows the rough correlation, which is about 37% if you lag solar by about a year.  It is better when volcanic sulfates are present in the tropics, but I can safely say most of the stratospheric cooling is due to a combination of volcanic aerosols and solar variability.  The combinations, or non-linearly coupled relationships, are a large part of the limit to "solution".  When you have three or more inter-relationships you get into the n-body type problems that are beyond my pay grade.  You can call it chaos, or cheat and use larger error margins.  I am in the cheat camp on this one.
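For what it's worth, the lag-and-correlate step behind a chart like that is simple to sketch; the two series below are made-up placeholders, not the data in the chart, so swap in the actual annual TSI and lower-stratosphere anomalies:

```python
# Sketch of the lag-and-correlate step: shift the solar series by about a
# year and compute a simple Pearson correlation against the lower-stratosphere
# anomaly.  The arrays here are hypothetical placeholders for illustration.
import numpy as np

def lagged_corr(solar, strat, lag_years=1):
    """Correlate strat[t] with solar[t - lag_years]."""
    if lag_years > 0:
        solar, strat = solar[:-lag_years], strat[lag_years:]
    return np.corrcoef(solar, strat)[0, 1]

tsi_anom = np.array([0.3, 0.5, 0.2, -0.1, -0.4, -0.5, -0.2, 0.1, 0.4, 0.5])
tls_anom = np.array([0.1, 0.2, 0.3, 0.1, -0.1, -0.3, -0.3, -0.1, 0.0, 0.2])

print(round(lagged_corr(tsi_anom, tls_anom, lag_years=1), 2))
```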

The cheat camp would post up charts like this.

We are on a simple linear regression path and about to intersect another "normal range", so surface temperatures in the tropical oceans should crab sideways in the "normal" range with an offset due to CO2 and other influences.  Not a very sexy prediction, but likely pretty accurate.  "Global" SST with light smoothing should vary by about +/- 0.3 C and with heavy smoothing possibly +/- 0.2 C.
Plus or minus 0.3 C is a lot better than +/- 1.25 C, but with one sigma as the error margin there is still roughly a one-in-three chance of landing outside the window.  So technically, I should change my handle to +/- 0.3 C to indicate an overall uncertainty instead of the +/- 0.2 C, CO2-only claim.  That would indicate that I "project" about 0.8 C per 3.7 Wm-2 from the satellite baseline with +/- 0.3 C of uncertainty.  The limit of course is water vapor and aerosols, which tend to regulate the upper end.  Global mean surface temperature still sucks, but with the oceans as a reference, it sucks less.  This hinges on a better understanding of interacting solar and volcanic "adjustments", which is looking more likely.

If there are more ocean, especially tropical ocean, papers that attempt to estimate "sensitivity" sans the flaky land surface temperature 30%, my estimate should start making more sense.  It is heartening to see clouds being viewed as a regulating feedback rather than a doomsday positive feedback, which took a lot longer than I expected.  I am still a bit surprised it took so long for the dTs "fungibility" issue to be acknowledged, since that was about my first incoherent blog post topic.  "Energy is fungible, the work it does is not" was the bottom line of that post.  Since we don't have a "global" mean surface energy anomaly or a good way to create one, adjusting dTs is the next best route.  Then we may discover a better metric along the way.  Getting that accepted will be a challenge.

With more of the sharper tacks being paid attention to, Stephens, Stevens, Schwartz, Webster etc., this could turn into the fun puzzle solving effort I envisioned when I first started following this otherwise colossal waste of time.


Wednesday, March 18, 2015

How solvable is a problem?

If you happen upon this blog you will find a lot of posts that don't resemble theoretical physics.  There is a theoretical basis for most of my posts, but it isn't your "standard" physical approach.  There are hundreds of approaches that can be used, and you really never know which approach is best until you determine how solvable a problem is.

One of the first approaches I used with climate change was based on Kimoto's 2009 paper "On the confusion of Planck feedback parameters".  Kimoto used a derivation of the change in temperature with respect to "forcing", dT=dF/4, which has some limits.  Since F is actual energy flux, not forcing, you have to consider types of energy flux that have more or less impact on temperature.  Less would be latent and convective "cooling", which is actual energy transfer to another layer of the problem, or temperatures well above or below "normal".  The dF/4 implies a temperature at which the flux changes by exactly 4 Wm-2 per degree.  Depending on your required accuracy, T needs to be in a range such that the dF/4 doesn't create more inaccuracy than you can accept.  So the problem is "solvable" only within a certain temperature/energy range that depends on your required accuracy.  If you have a larger range, you need to adjust your uncertainty requirements or pick another method.
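For reference, a quick check on where that 4 Wm-2 per degree actually holds, using the plain blackbody Stefan-Boltzmann slope with no emissivity or atmospheric adjustments (itself a simplifying assumption):

```python
# Where does "4 Wm-2 per degree" actually hold?  For a blackbody the slope is
# dF/dT = 4*sigma*T**3, which is only exactly 4 Wm-2/K near 260 K; it is
# closer to 3.8 at 255 K and 5.4 at 288 K, which is the range issue above.
SIGMA = 5.67e-8

def planck_slope(T):
    """dF/dT for a blackbody at temperature T (Wm-2 per K)."""
    return 4.0 * SIGMA * T ** 3

for T in (255.0, 260.0, 288.0):
    print(T, round(planck_slope(T), 2))

# Temperature where the slope is exactly 4 Wm-2/K:
T_4 = (4.0 / (4.0 * SIGMA)) ** (1.0 / 3.0)
print(round(T_4, 1))   # ~260.3 K
```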

You can modify the simple derivation to dT=(a1*dF1 + a2*dF2 + ... + an*dFn)/4, which is what Kimoto did to compare the state-of-the-science estimates of radiant, latent and sensible energy flux at the time.  You can do that because energy is fungible, but you will always have an unknown uncertainty factor because while energy is fungible, the work that it does is not.  In a non-linear dissipative system, some of that work can be used to store energy that reappears on a different time scale.  You need to determine the relevant time scales required to meet the accuracy you require so you can have some measure of confidence in your results.

Ironically, Kimoto's paper was criticized for making some of the same simplifying assumptions that climate science uses to begin with.  The assumptions are only valid over a small range, and you cannot be sure how small that range needs to be without determining the relevant time scales.

In reality there are no constants in the Kimoto equation.  Each assumed constant is a function of the other assumed constants.  You have a pretty wicked partial differential equation.  With three or more variables it becomes a version of the n-body problem, which should have a Nobel Prize attached to its solution.  I have absolutely no fantasies about solving such a problem, so I took the how-solvable-is-it approach.


The zeroth law of thermodynamics and the definition of climate sensitivity come into conflict when you try to do that.  The range of temperatures in the lower atmosphere is so large in comparison to the range where dT=dF/4 holds that you automatically have +/- 0.35 C of irreducible uncertainty.  That means you can have a super accurate "surface" temperature, but the energy associated with that temperature can vary by more than one Wm-2.  If you use Sea Surface Temperature, which has a smaller range, you can reduce that uncertainty, but then you have 30% of the Earth not being considered, resulting in about the same uncertainty margin.  If you would like to check this, pick some random temperatures in a range from -80 C to 50 C and convert them to energy using the Stefan-Boltzmann law, then average both the temperatures and the energies and reconvert to compare.  Since -80 C has an S-B energy of 79 Wm-2 versus 618 Wm-2 for 50 C, neglecting any latent energy, you can have a large error.  In fact the very basic greenhouse gas effect is based on a 15 C (~390 Wm-2) surface temperature versus 240 Wm-2 (~-18 C) effective outgoing radiant energy, along with the assumption that there is no significant error in this apples to pears comparison.  That by itself carries roughly a +/- 2 C and 10 Wm-2 uncertainty on its own.  That in no way implies there is no greenhouse effect, just that most of the simple explanations do little to highlight the actual complexity.
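The averaging check suggested above takes only a few lines; the -80 C and 50 C endpoints are the ones already quoted, and the blackbody conversion is the only thing added:

```python
# The averaging check: average two temperatures, average the corresponding
# fluxes, and compare.  Averaging T and converting is not the same as
# converting and averaging F, which is the zeroth-law/fungibility issue.
SIGMA = 5.67e-8

def flux(T_c):
    """Blackbody flux (Wm-2) for a Celsius temperature."""
    return SIGMA * (T_c + 273.15) ** 4

def temp(F):
    """Blackbody temperature (C) for a flux in Wm-2."""
    return (F / SIGMA) ** 0.25 - 273.15

cold, hot = -80.0, 50.0
f_cold, f_hot = flux(cold), flux(hot)          # ~79 and ~618 Wm-2

avg_T = (cold + hot) / 2.0                     # -15 C
avg_F = (f_cold + f_hot) / 2.0                 # ~349 Wm-2
print(round(f_cold), round(f_hot))
print(round(flux(avg_T)), "vs", round(avg_F))            # ~252 Wm-2 vs ~349 Wm-2
print(round(avg_T, 1), "vs", round(temp(avg_F), 1))      # -15 C vs ~6.9 C
```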

Determining that the problem likely cannot be solved to better than +/-0.35C of accuracy using these methods is very valid theoretical physics and should have been a priority from the beginning.


If you look at the TOA imbalance you will see +/- 0.4 Wm-2, and due to the zeroth law issue that could just as easily be in C as well.  The surface imbalance uncertainty is larger, +/- 17 Wm-2, but that is due more to poor approaches than to physical limits.  The actual physical uncertainty should be closer to +/- 8 Wm-2, which is due to the range of water vapor phase change temperatures.  Lower cloud bases with more cloud condensation nuclei can have a lower freezing point.  Changing salinity changes freezing points.  When you consider both you have about +/- 8 Wm-2 of "normal" range.

Since that +/- 8 Wm-2 is "global", you can consider combined surface flux, 396 radiant, 98 latent and 30 sensible which total 524 Wm-2 which is about half of the incident solar energy available.  I used my own estimate of latent and sensible based on Chou et al 2004 btw.  If there had not been gross underestimations in the past, the Stephens et al. budget would reflect that.  This is a part of the "scientific" inertia problem.  Old estimates don't go gracefully into the scientific good night.

On the relevant time scale you have solar to consider.  A very small reduction in solar TSI of about 1 Wm-2 for a long period of time can result in an imbalance of 0.25 to 0.5 Wm-2, depending on how you approach the problem.  With an ocean approach, which has a long lag time, the imbalance would be closer to 0.5 Wm-2, and with an atmospheric approach with little lag it would be closer to 0.25 Wm-2.  In either case that is a significant portion of the 0.6 +/- 0.4 Wm-2, isn't it?

Ein=Eout is perfectly valid as an approximation even in a non-equilibrium system, provided you have a reasonable time scale and some inkling of realistic uncertainty in mind.  That time scale could be 20,000 years, which makes a couple hundred years of observation a bit lacking.  If you use paleo to extend your observations, you run into the same +/- 0.35 C minimum uncertainty, and if you use mainly land based proxies you can reach that +/- 8 Wm-2 uncertainty because trees benefit from the latent heat loss in the form of precipitation.  Let's face it, periods of prolonged drought do tend to be warmer.  Paleo, though, has its own cadre of over-simplifiers.  When you combine paleo reconstructions from areas that have a large range of temperatures, the zeroth law still has to be considered.  For this reason, paleo reconstructions of ocean temperatures, where there is less variation in temperature, would tend to have an advantage, but most of the "unprecedented" reconstructions involve high latitude, higher altitude regions with the greatest thermal noise, which represent the smallest areas of the surface.  Tropical reconstructions, which represent the majority of the energy and at least half of the surface area of the Earth, paint an entirely different story.  Obviously, on a planet with glacial and interglacial periods the interglacial would be warmer, and if the general trend in glacial extent is downward, there would be warming.  The question though is how much warming and how much energy is required for that warming.

If this weren't a global climate problem, you could control conditions to reduce uncertainty and do some amazing stuff, like ultra-large-scale integrated circuits.  With a planet, though, you will most likely have a larger than you like uncertainty range and you have to be smart enough to accept that.  Then you can nibble away at some of the edges with combinations of different methods that have different causes of uncertainty.  Lots of simple models can be more productive than one complex model if they use different frames of reference.

One model so simple it hurts is "average" ocean energy versus "estimated" Downwelling Long Wave Radiation (DWLR).  The approximate average effective energy of the oceans is 334.5 Wm-2 at 4 C, and the average estimated DWLR is about 334.5 Wm-2.  If the oceans are sea ice free, the "global" impact of the average ocean energy is 0.71*334.5=237.5 Wm-2, roughly the value of the effective radiant layer of the atmosphere.  There is a reason for the 4 C to be stable, thanks to the maximum density temperature of fresh water at 4 C.  Adding salt varies the depth of that 4 C layer, but not its value, and that layer tends to regulate average energy on much longer time scales since the majority of the ocean is below the 4 C layer.  Sea ice extent varies and the depth of the 4 C layer changes, so there is a range of values you can expect, but 4 C provides a simple, reliable frame of reference.  Based on this reference, a 3.7 Wm-2 increase in DWLR should result in a 3.7 Wm-2 increase in the "average" energy of the oceans, which is about 0.7 C of temperature increase, "all things remaining equal".
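A sketch of that simple model, assuming the 0.71 ice-free ocean fraction and the 4 C density maximum as stated, with a linearized Stefan-Boltzmann step standing in for the "all things remaining equal" temperature conversion:

```python
# The "so simple it hurts" check: the S-B energy of water at its 4 C density
# maximum, scaled by an ice-free ocean fraction of ~0.71, lands close to the
# ~240 Wm-2 effective radiant layer; a 3.7 Wm-2 bump at that energy is ~0.7-0.8 C.
SIGMA = 5.67e-8

t4 = 273.15 + 4.0
f4 = SIGMA * t4 ** 4                  # ~334.5 Wm-2 at the 4 C density maximum
ocean_fraction = 0.71                 # assumed ice-free ocean share of the surface

print(round(f4, 1), round(ocean_fraction * f4, 1))   # ~334.5 and ~237.5 Wm-2
print(round(t4 * 3.7 / (4.0 * f4), 2))               # ~0.77 C per 3.7 Wm-2
```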

Perhaps that is too simple or elegant to be considered theoretical physics?  Don't know, but most of the problem is setting up the problem so it can be solved to some useful uncertainty interval.  Using just the "all things remaining equal" estimates you have a range of 0.7 to 1.2 C per 3.7 Wm-2 increase in atmospheric resistance to heat loss.  The unequal part is the water vapor response, which based on more recent and hopefully more accurate estimates is close to the limit of positive feedback and in the upper end of its regulating feedback range.  This should make higher than 2.5 C "equilibrium" very unlikely and reduce the likely range to 1.2 to 2.0 C per 3.7 Wm-2 of "forcing".  Energy model estimates are converging on this lower range, and they still don't consider the longer time frames required for recovery from prolonged solar or volcanic "forcing".

If this were a "normal" problem it would be fun trying various methods to nibble at the uncertainty margins, but this is a "post-normal" as in abnormal problem.  There is a great deal of fearful over confidence involved that has turned to advocacy.  I have never been one to follow the panic stricken as it is generally the cooler heads that win the day, but I must be an exception.  We live in a glass half empty society that tends to focus on the negatives instead of appreciating the positives.  When the glass half empties "solve" a problem that has never been properly posed, you end up were we are today.  If Climate Change is worse than they thought, there is nothing we can do about it.  If Climate Change is not as bad as they thought, then there are rational steps that should be taken.  The panic stricken are typically not rational.