New Computer Fund

Monday, December 29, 2014

Clouds, ENSO, Adjustments and Feedback

I love it when the conversation turns to clouds and whether they are a positive or negative feedback.  Clouds are both.  Whether they are a "net" positive or negative feedback depends on where they change.  Since they are not "fixed", I doubt there is any definitive proof they are anything other than a regulating variable, since they can swing both ways.

At higher latitudes clouds are most likely positive feedbacks.  Winter clouds tend to keep the cloud-covered region from getting as cold.  In the tropics, clouds tend to keep areas from getting too warm.  The battleground should be the mid latitudes.

Since most of the energy absorbed is in the tropics, knowing how well the tropics correlate with "global" temperature would be useful, so you can get a feel for how much "weight" to place on the "regional" responses.  With the exception of the 21st century, "global" temperatures have followed tropical temperatures extremely closely.  Now it would be nice to just compare changes in cloud cover with changes in tropical temperature, right?

Nope, the cloud fraction data sucks the big one.  Since that data sucks so bad you are stuck with "modeling".  You can use a super-duper state of the art climate model or you can infer cloud response based on logic.  Both are models, since you are only going to get some inferred answer; the logic just costs less.

Logic: Since warmer air can hold more moisture than colder air, all things remaining equal, there would be an increase in clouds with ocean warming.  If clouds were "only" a positive feedback to surface warming, the system would be unstable and run away.  Simple, right?  You can infer clouds must have some regulating effect on tropical climate or there would be no tropical oceans.  They would have boiled away or be frozen.

That logic obviously has limits.  There is more to the globe than the tropics, and there is and has been warming in the tropics.  Could it be we have reached a tipping point that will change millions of years' worth of semi-stable temperatures?

This is where paleo data could come in handy.

I use this Oppo et al. Indo-Pacific Warm Pool reconstruction quite often because it agrees well with the tropical oceans, which agree well with "global" temperatures, and is backed by the simple logic that the oceans wag the tail.  So provided the IPWP does "teleconnect" with "global" climate, things look pretty "normal".  There was a cooler period that Earth is recovering from, and now she is back in her happy zone.  There are absolutely no guarantees that is the case, just evidence that it could be.

Since "global" climate follows tropical oceans so closely except for that little 21st century glitch,  there probably isn't much more to worry about in the tropics.

If I use the not so elaborately interpolated GISS data, the 250km version instead of the 1200km version, there isn't any glitch to be concerned about.  Climate suddenly gets a lot simpler.  The tropical oceans have been warming for the past 300 years to recover from the Little Ice Age, and now things are back to "normal" for the past 2000 years.  CO2 has some impact, but likely not anywhere near as much as advertised.  That would imply clouds/water vapor are not as strong a positive feedback as estimated.

Well, simple, really doesn't cut it in climate science.  There are a lot of positions to fill and mouths to feed so complexity leads to full employment and happy productive climate scientists.

That means "adjustments" will be in order.  Steven Sherwood, the great "adjuster", has a paper that explains what is required to keep climate scary enough for climate scientists to help save the world from itself.  Part of the "adjustments" is removing or managing ENSO variability and volcanic "semi-direct" effects so that the monstrous CO2 signal can be teased out of the "noise".  Then, clouds will be a net positive feedback and water vapor will drive climate higher and we will finally see the ominous TROPICAL TROPOSPHERIC HOT SPOT!  Then finally, it will be "worse than we thought".

Think just for a second.  If the tropical oceans "drive" global climate, and ENSO is a feature of the tropical oceans, why in the hell would you want to remove the climate driver?  Because it is inconvenient.  Natural variability (read ENSO) cannot POSSIBLY be more than a tiny fraction of climate because my complex models told me so.  According to a few trees in California and Russia, that Oppo et al. reconstruction has to be total crap because climate never varied one bit until this past half century, once we got the surface stations tuned and adjusted to fit the models that are tuned and adjusted to match our theory that CO2 is tuning and adjusting climate.  Can't you thick-headed Neanderthal deniers get it that we are "climate scientists" and know what we are doing!!

Frankly, no.

Added:  In case you are wondering, there is a 94.2% correlation between annual ERSSTv3b tropical SST and GISS 250km "global" temperatures.  The tropical oceans are about 44% of the global area.  The 11 yr average difference between the two is +/-0.1 and close to +0.2/-2.8 annually.  While long range interpolation does produce a more accurate "global" average temperature, it doesn't produce a more accurate "global" average energy, due to very low temperature anomalies having the same "weight" as much higher energy tropical anomalies.
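That weighting point can be sketched in a few lines.  This is a minimal illustration, assuming simple blackbody fluxes; the base temperatures are just illustrative round numbers, not data from the post:

```python
# Sketch: a 1 K anomaly at a warm tropical base temperature represents
# more energy (W/m^2) than a 1 K anomaly at a cold high-latitude base,
# because dF/dT = 4*sigma*T^3 grows with T.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def flux_change(base_k, anomaly_k):
    """Blackbody flux change for an anomaly about a base temperature."""
    return SIGMA * ((base_k + anomaly_k) ** 4 - base_k ** 4)

tropics = flux_change(300.0, 1.0)  # ~6.2 W/m^2 per 1 K near 300 K
arctic = flux_change(250.0, 1.0)   # ~3.6 W/m^2 per 1 K near 250 K
print(round(tropics, 2), round(arctic, 2))
```

So averaging anomalies in degrees weights a cold-region degree the same as a tropical degree, even though the tropical degree carries roughly 70% more energy.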

Friday, December 26, 2014

From the Basics- the 33 Degree "Discrepancy"

The 33 degree discrepancy is the starting point for most discussion on the Greenhouse Gas Effect.  It basically goes like this:

The Earth receives 240 Wm-2 of solar energy, which by the Stefan-Boltzmann Law corresponds to a temperature of 255 K.  The surface of the Earth is 288 K, which by the Stefan-Boltzmann Law corresponds to an energy of 390 Wm-2.  The differences, 33 C(K) and 150 Wm-2, are the Greenhouse Gas Effect.
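The arithmetic above is easy to check.  A minimal sketch, assuming an ideal blackbody (emissivity exactly 1, which is one of the hidden assumptions discussed below):

```python
# The standard 33 K "discrepancy" arithmetic via the Stefan-Boltzmann
# law, F = sigma * T**4 (ideal blackbody, emissivity = 1).
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def temp_from_flux(flux_wm2):
    """Blackbody temperature (K) that emits the given flux."""
    return (flux_wm2 / SIGMA) ** 0.25

def flux_from_temp(temp_k):
    """Blackbody flux (W m^-2) emitted at the given temperature."""
    return SIGMA * temp_k ** 4

t_effective = temp_from_flux(240.0)  # ~255 K from 240 W/m^2 absorbed solar
f_surface = flux_from_temp(288.0)    # ~390 W/m^2 from the 288 K surface

print("GHE:", round(288 - t_effective, 1), "K and",
      round(f_surface - 240, 1), "Wm-2")
```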

Sounds very "sciency" and authoritative, like it is carved in stone.  It is actually a logical statement: if this and if that, then this.  So there are several hidden assumptions.


First, the Stefan-Boltzmann law has an uncertainty factor of 0.924, which is a low-side uncertainty rather than a plus-or-minus type.  That is important to remember because a lot of statistics assume a "normal" distribution, +/- the same amount around a mean, to be accurate.  The S-B correction is related to efficiency, which cannot exceed 100%, so 0.924 implies you can "expect" around 92.4% efficiency.  I am sure many can argue that point, but that is my understanding.

So using that uncertainty "properly", both sides of the 33 C discrepancy should have -7.6% error bars included in there somewhere.  I have never seen that displayed on any AGW site.  So for the solar energy received you should have 240 Wm-2 to 222 Wm-2, or about 18 Wm-2 of slop.
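The low-side arithmetic, assuming the 0.924 efficiency factor described above (a one-sided derating, not a symmetric error bar):

```python
# One-sided "slop" if the S-B ideal is derated by a 0.924 efficiency
# factor.  The error only goes one way because efficiency cannot
# exceed 100%.
ideal = 240.0            # W/m^2, ideal blackbody value
derated = ideal * 0.924  # ~221.8 W/m^2 at 92.4% efficiency
slop = ideal - derated   # ~18.2 W/m^2 of one-sided uncertainty
print(round(derated, 1), round(slop, 1))
```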

That "slop" may be due to a number of factors, but since the 33 C discrepancy assumes a constant reflection, there is a good chance that the two are inextricably interrelated.  There is no "perfect" black body, and low angle reflection is likely the reason.  We can get flat surfaces to behave like 99.9999... percent black bodies, but most objects are not perfectly flat.

Now this large error range was actually included in one of the more recent Earth Energy Budgets by Stephens et al.

Right at the bottom by Surface Imbalance is a +/-17 Wm-2.  Now that error range is based on a large sampling of guestimates, not the basic S-B law, but it is pretty amazing to me how well the old guys included uncertainty.  Now you could jump in and say, "but that is a +/- error range!"  True, but it is based on guestimates.  You could rework the budget so that you could reduce that range to about half or around +/-8.5 Wm-2, but you are going to find it hard to get below that range.


The Mysterious Land Amplification Issue


I noticed the odd land amplification in the Northern Hemisphere between latitudes 30 and 90 North a long time ago.  The Berkeley Earth Surface Temperature project picked up on it, but by and large it is pretty much ignored.  Originally, I was pretty certain it had to be due to land use and/or surface station impacts due to land use, but there isn't really any way that I have found to make a convincing case.  So it is still a mystery to me.

In the chart above I tried to highlight the situation.  Excluding the poles, since the SH coverage really sucks, all of the globe pretty much follows the same course.  The highly variable region from 30N-60N really makes a jump with the 1998 El Nino.  It looks like it tried to jump prior to Pinatubo, but got knocked back in line for a few years.  Southern Hemisphere and equatorial land temperatures don't even come close to making the same leap.


The RSS lower troposphere data makes a similar leap but with about half the amplitude.


Berkeley Earth Tmax makes the big leap.


Berkeley Earth Tmin makes about the same leap, though Climate Explorer picked a different scale.  That should reduce the chances that it is a UHI issue.  Even though there are some issues with Tmin, a land use cause should have more Tmax influence than Tmin if it is albedo related, in any case.  That could also put Chinese aerosols out of the running.



Playing around with the Climate Explorer correlation option, the amplification seems to correlate with the growing season, so it could be related to agricultural land use or the more variable vegetation-related CO2 swings.  DJF has the worst correlation, which for this time frame, 1979 to present, would be consistent with the change in Sudden Stratospheric Warming events and greater Arctic winter warming.  Climate Explorer has a limited set of series for its correlation option and I haven't attempted to upload any of my own yet.


So there is still a mystery there for me.  This post is just to remind me of a few things I have looked at, so perhaps I will stop re-covering the same ground, and possibly inspire some insomniac to join in on the puzzle.

Monday, December 15, 2014

The Never Ending Gravito-Thermal Effect

I have my issues with "ideal" models; they are useful but rarely "perfect".  So when an "ideal" model/concept is used in a number of steps during a derivation, I always look for how wrong that can be.  I may be a bit of a pessimist, but that is something I consider "normal" because virtually nothing is absolutely 100% efficient.  The Gravito-Thermal "Effect" seems to boil down to interpretation of an ideal kinetic temperature concept used to formulate the ideal gas laws.

Hyper-Physics has a very nice explanation of Kinetic Theory.

The force exerted by any number of gas molecules can be simplified by assuming "perfectly" elastic collisions with the walls of a container, ignoring collisions with other molecules, so that the average kinetic energy is *exactly* proportional to temperature: KEavg = (3/2)kT.
If you include gravity, the force exerted upward would always be less than the force exerted downward.  When you have a lot of molecules and a small distance between top and bottom, the difference is negligible.  The greater the distance, the more likely there will be some significant difference in the force applied to the top and bottom of the "container", which is the Gravito-Thermal effect.  If you have a scuba tank, don't expect to see any temperature difference.  If you have a very tall tank with very few molecules, there is a likelihood there will be some "kinetic temperature" difference between the top and bottom.  Kinetic temperature is in quotes because what it really is is an average applied force.  That is the very first assumption made in the derivation.
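A rough scale check of the scuba-tank-versus-tall-column point.  This is a sketch comparing gravitational potential per molecule (m·g·h) to thermal energy (k·T); the tank height and column height are illustrative numbers, not from the post:

```python
# Compare gravitational potential energy per molecule (m*g*h) to
# thermal energy (k*T) for an N2 molecule.  For a scuba tank the ratio
# is negligible; for a ~10 km column it is of order one, which is why
# container height matters to the argument.
K_B = 1.380649e-23  # Boltzmann constant, J/K
M_N2 = 4.65e-26     # mass of an N2 molecule, kg
G = 9.81            # gravitational acceleration, m/s^2

def gravity_to_thermal_ratio(height_m, temp_k=300.0):
    """Ratio of m*g*h to k*T for one N2 molecule."""
    return (M_N2 * G * height_m) / (K_B * temp_k)

print(gravity_to_thermal_ratio(0.6))     # scuba tank: ~6.6e-5
print(gravity_to_thermal_ratio(10_000))  # 10 km column: ~1.1
```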

So if you install a pressure-sensing plate, acting like a thermocouple, at the bottom and top of a very tall container, they would measure different applied pressures even though everything else in the container remains constant.

Remember though that kinetic or translational temperature doesn't include anything but three directional degrees of motion or freedom.  Real molecules have more tricks up their sleeve that can produce radiant energy transfer and stored energy, as in potential or enthalpy.  The ideal gas concept promotes conduction, which is related to diffusion in a gas, to the grandest of poobahs of heat transfer.  In the real world, conduction in gases is pretty slow and not all that efficient.

Most folks would drop the subject about now and admit that there would be a "real" temperature difference in some cases, but there are so many other "real" issues that have to be considered that any further discussion of the Gravito-Thermal effect based on an ideal gas is about the largest waste of time imaginable.  Not so in the world of academia where every nit is a potential battle ground.

Key points for those wishing to waste their time would be whether gravity is an "ideal" containment, since gravity doesn't produce heat; it is the sudden stops that produce the heat.  If the gas has the "potential" to whack something and doesn't, it doesn't transfer any force, so it is not producing the heat which would be the temperature in the case of an ideal gas.

Some of the more creative seem to think this "ideal" case will result in a fantastic free energy source that will save the world.  I am not sure why "saving the world" always seems to boil down to the more hare-brained of concepts, but that does appear to be the tendency.  A neutron busting the hell out of a nucleus produces much more energy, which has a proven track record of providing usable energy when properly contained and not embellished with Hollywood fantasy super powers.  But as always, one person's dream is another's nightmare.  Even if all the fantasy inventions worked, there would still be a need for someone to save the world from perfection.

During the last "debate" I mentioned that for a rotating planet, the maximum velocity of the molecules in the upper atmosphere would be limited by the escape velocity.  So even ignoring all the other minor details, gravity has containment limits, which would be temperature limits.  Earth for example loses around 3 kilograms per second of hydrogen in spite of having a geomagnetic shield that helps reduce erosion of the atmosphere, and there are molecules somewhat suspended in pseudo-orbits of various duration depending on their centrifugal force versus gravitational force.  Centrifugal and gravitational forces again don't produce heat until that energy is transferred.  So a cluster of molecules could be traveling along at near light speed, minding their own business, having a "local" temperature that would tend to change abruptly if the cluster whacks another cluster.  Potential energy is not something to be ignored.
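The containment-limit point can be sketched with the standard kinetic relation v_rms = sqrt(3kT/m).  A minimal illustration; note that because of the Maxwell speed distribution's high-velocity tail, light gases actually escape at temperatures well below these break-even values:

```python
# Kinetic temperature at which the RMS thermal speed of a molecule
# equals Earth's escape speed: T = m * v_esc**2 / (3 * k).
K_B = 1.380649e-23  # Boltzmann constant, J/K
V_ESC = 11_186.0    # Earth escape speed, m/s
M_H = 1.674e-27     # hydrogen atom mass, kg
M_N2 = 4.65e-26     # N2 molecule mass, kg

def temp_at_escape(mass_kg):
    """Temperature (K) where v_rms equals Earth's escape speed."""
    return mass_kg * V_ESC ** 2 / (3 * K_B)

print(round(temp_at_escape(M_H)))   # hydrogen: ~5,000 K
print(round(temp_at_escape(M_N2)))  # nitrogen: ~140,000 K
```

The huge gap between hydrogen and nitrogen is why Earth leaks hydrogen but keeps its bulk atmosphere.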

Speaking of potential energy, during the last very long discussion, the Virial Theorem made a showing and I was chastised for mentioning that the VT produces a reasonable estimate.  This led to another *exact* discussion where, if you force the universe to match the "ideal" assumptions required, mathematically the "solution" is *exact*.  Perfection doesn't really exist, sports fans.  Every rule has its exception, which is what makes physics phun.  In "ideal" cases those constants of integration are really constants, but in the real world they are more likely complex functions.  More often than not, assuming the constant is really constant is close enough for government work, but a smart engineer always allows for a bit of "slop", or inefficiency if you prefer.  Some scientists tend to forget that, so IMHO it is always nice to have an engineer on hand to provide reality checks.

What was interesting to me about the whole discussion was how the universe tends to prefer certain versions of randomness more than others.  For the Virial Theorem, 2*KE = -PE, or the magnitude of the potential energy is equal to 2 times the kinetic energy.  So total energy never settles to a perfectly isothermal, maximum entropy state.  Since the universe is supposed to be moving towards an ultimate heat death, or true maximum entropy, some billions and billions of years in the future, potential energy should slowly be reducing over time.  That would make the Virial Theorem a good estimate for the way things are now, which should be close enough for a few billion generations.  So for now, potential is about 2/3 of the total, so the things physical in the universe should prefer a ratio in the ballpark of 1.5 to 2.
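The ratios quoted above can be checked with the standard virial relation for a gravitationally bound system, 2·KE = -PE (time averages).  A minimal sketch in arbitrary units:

```python
# Virial theorem ratios for a gravitationally bound system:
# 2*KE = -PE (time averages), so |PE| = 2*KE.
ke = 1.0         # kinetic energy, arbitrary units
pe = -2.0 * ke   # potential energy from the virial theorem
total = ke + pe  # = -ke: total energy is negative, i.e. bound

# Potential's share of the total energy budget (by magnitude):
pe_fraction = abs(pe) / (ke + abs(pe))  # = 2/3
print(total, pe_fraction)
```

So the potential energy is 2/3 of the combined magnitude, matching the "about 2/3 of total" figure, and |PE|/KE = 2 brackets the 1.5-to-2 ballpark.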

If you have read some of my older posts, V. M. Selvam likes to use the Golden Ratio of ~1.618... in her Self Organizing Criticality analysis, and Tsallis among others finds similar common ratios for "stable" systems.  Nothing is required to be "stable" in dynamics forever, so "preferred state" is probably a better term than "stable state".  When things get close to "equilibrium", 2nd and 3rd order influences can tend to ruin that "equilibrium" concept, which is joined at the hip with the entropy concepts.

Boltzmann's concept of entropy would then be a bit too ideal, which is what started the whole Gravito-Thermal debate to begin with.  Gibbs, Tsallis and many others have functions, intentional or not, included in their definitions of entropy to allow for the "strangeness" of nature.  Nature probably isn't strange at all; our ideal concepts are likely the strangeness, which is apparent in any debate over the Gravito-Thermal Effect.


Update:  Since I started this mess I may as well link to John Baez and his Can Gravity Decrease Entropy post.  He goes into more detail on the Virial Theorem in another post, in case you are curious.  The main point, IMHO, is that a gravitationally bound system cannot ever really be in "equilibrium" with its surroundings, and the basic requirement for an isothermal or even an adiabatic system is the need for some "real" equilibrium.  Boltzmann's entropy is an attempt to maximize "within a volume", that f=ma issue, and a system bounded by gravity is trying to increase entropy by decreasing potential energy, i.e. compressing everything to create heat/kinetic energy.  A gravitationally bound system will either completely collapse or portions will boil off.  The ideal kinetic model maximizes entropy by not allowing anything to boil off.
