New Computer Fund

Monday, December 15, 2014

The Never Ending Gravito-Thermal Effect

I have my issues with "ideal" models; they are useful but rarely "perfect."  So when an "ideal" model/concept is used in a number of steps during a derivation, I always look for how wrong that can be.  I may be a bit of a pessimist, but I consider that "normal" because virtually nothing is absolutely 100% efficient.  The Gravito-Thermal "Effect" seems to boil down to interpretation of the ideal kinetic temperature concept used to formulate the ideal gas laws.

HyperPhysics has a very nice explanation of Kinetic Theory.

The force exerted by any number of gas molecules can be simplified by assuming "perfectly" elastic collisions with the walls of a container and ignoring collisions with other molecules, so that the average translational kinetic energy per molecule is *exactly* (3/2)kT, which is the definition of kinetic temperature.
If you include gravity, the force exerted upward would always be less than the force exerted downward.  When you have a lot of molecules and a small distance between top and bottom, the difference is negligible.  The greater the distance, the more likely there will be some significant difference in the force applied at the top and bottom of the "container," which is the Gravito-Thermal effect.  If you have a scuba tank, don't expect to see any temperature difference.  If you have a very tall tank with very few molecules, there is a likelihood of some "kinetic temperature" difference between the top and bottom.  Kinetic temperature is in quotes because what it really measures is average applied force.  That is the very first assumption made in the derivation.

So if you install pressure-sensing plates, like thermocouples, at the bottom and top of a very tall container, they would measure different applied pressures even though everything else in the container remains constant.
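Just to put a number on it, here is a minimal sketch using the isothermal barometric formula; the gas, the temperature and the container heights are my own illustrative picks, not anything from HyperPhysics or the gravito-thermal crowd.

    import math

    # Minimal sketch: isothermal barometric formula, P(h) = P0 * exp(-m*g*h/(k*T)).
    # The numbers (N2, 288 K, the heights) are illustrative assumptions.
    k = 1.380649e-23        # Boltzmann constant, J/K
    g = 9.81                # gravitational acceleration, m/s^2
    m = 28.0 * 1.66054e-27  # mass of an N2 molecule, kg
    T = 288.0               # temperature, K

    for h in (1.0, 100.0, 10000.0):   # scuba tank, tall tank, "atmosphere"
        ratio = math.exp(-m * g * h / (k * T))
        print(f"{h:7.0f} m tall: top/bottom pressure ratio = {ratio:.6f}")

For a one meter tank the difference shows up in the fifth decimal place, which is why nobody frets about it in a scuba tank.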

Remember though that kinetic or translational temperature doesn't include anything but the three directional degrees of freedom of motion.  Real molecules have more tricks up their sleeve that can produce radiant energy transfer and stored energy, as in potential or enthalpy.  The ideal gas concept promotes conduction, which is related to diffusion in a gas, to the grandest poobah of heat transfer.  In the real world, conduction in gases is pretty slow and not all that efficient.

Most folks would drop the subject about now and admit that there would be a "real" temperature difference in some cases, but there are so many other "real" issues that have to be considered that any further discussion of the Gravito-Thermal effect based on an ideal gas is about the largest waste of time imaginable.  Not so in the world of academia where every nit is a potential battle ground.

A key point for those wishing to waste their time would be whether gravity is an "ideal" containment, since gravity doesn't produce heat; it is the sudden stops that produce the heat.  If the gas has the "potential" to whack something and doesn't, it doesn't transfer any force, so it is not producing the heat which would be the temperature in the case of an ideal gas.

Some of the more creative seem to think this "ideal" case will result in a fantastic free energy source that will save the world.  I am not sure why "saving the world" always seems to boil down to the more harebrained of concepts, but that does appear to be the tendency.  A neutron busting the hell out of a nucleus produces much more energy, and that has a proven track record of providing usable energy when properly contained and not embellished with Hollywood fantasy super powers.  But as always, one person's dream is another's nightmare.  Even if all the fantasy inventions worked, there would still be a need for someone to save the world from perfection.

During the last "debate" I mentioned that for a rotating planet, the maximum velocity of the molecules in the upper atmosphere would be limited by the escape velocity or speed.  So even ignoring all the other minor details, gravity has containment limits, which would be temperature limits.  Earth, for example, loses around 3 kilograms per second of hydrogen in spite of having a geomagnetic shield that helps reduce erosion of the atmosphere, and there are molecules somewhat suspended in pseudo-orbits of various duration depending on their centrifugal force versus gravitational force.  Centrifugal and gravitational forces, again, don't produce heat until that energy is transferred.  So a cluster of molecules could be traveling along at near light speed, minding their own business, having a "local" temperature that would tend to change abruptly if the cluster whacks another cluster.  Potential energy is not something to be ignored.
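For what it's worth, a minimal sketch of that containment limit, comparing Earth's escape speed near the exobase with the rms thermal speed of atomic hydrogen; the ~1000 K exospheric temperature and the ~500 km altitude are rough assumptions on my part.

    import math

    # Minimal sketch: escape speed vs thermal speed for hydrogen high in the atmosphere.
    G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
    M = 5.972e24         # mass of Earth, kg
    r = 6.871e6          # radius at ~500 km altitude, m (rough exobase height)
    k = 1.380649e-23     # Boltzmann constant, J/K
    m_H = 1.674e-27      # mass of a hydrogen atom, kg
    T = 1000.0           # assumed exospheric temperature, K

    v_esc = math.sqrt(2 * G * M / r)      # escape speed, ~10.8 km/s
    v_rms = math.sqrt(3 * k * T / m_H)    # rms thermal speed of hydrogen, ~5 km/s
    print(f"escape ~{v_esc / 1000:.1f} km/s, hydrogen rms ~{v_rms / 1000:.1f} km/s")
    # Even with the rms speed well below escape, the fast tail of the distribution
    # leaks away (Jeans escape), which is part of that hydrogen loss.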

Speaking of potential energy, during the last very long discussion the Virial Theorem made a showing and I was chastised for mentioning that the VT produces a reasonable estimate.  This led to another *exact* discussion where, if you force the universe to match the "ideal" assumptions required, mathematically the "solution" is *exact*.  Perfection doesn't really exist, sports fans.  Every rule has its exception, which is what makes physics phun.  In "ideal" cases those constants of integration are really constants, but in the real world they are more likely complex functions.  More often than not, assuming the constant is really constant is close enough for government work, but a smart engineer always allows for a bit of "slop," or inefficiency if you prefer.  Some scientists tend to forget that, so IMHO it is always nice to have an engineer on hand to provide reality checks.

What was interesting to me about the whole discussion was how the universe tends to prefer certain versions of randomness more than others.  For the Virial Theorem, 2T = -Tp, or the magnitude of the potential energy is twice the kinetic energy.  So the total energy never corresponds to a perfectly isothermal, maximum entropy state.  Since the universe is supposed to be moving towards an ultimate heat death, or true maximum entropy, some billions and billions of years in the future, potential energy should slowly be reducing over time.  That would make the Virial Theorem a good estimate for the way things are now, which should be close enough for a few billion generations.  So for now, potential is about 2/3 of the total (kinetic plus potential, in magnitude), so the things physical in the universe should prefer a ratio in the ballpark of 1.5 to 2.
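Just to keep the bookkeeping straight, a minimal sketch of that split in arbitrary units; the numbers are mine and mean nothing beyond the ratios.

    # Minimal sketch: virial bookkeeping for a bound, inverse-square system, 2*KE + PE = 0.
    KE = 1.0                # kinetic energy, arbitrary units
    PE = -2.0 * KE          # potential energy from the virial relation
    E_total = KE + PE       # = -KE, negative, so the system stays bound
    print(abs(PE) / (KE + abs(PE)))   # 0.666..., potential is ~2/3 of the magnitudes
    print(abs(PE) / KE)               # 2.0, in that ballpark of 1.5 to 2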

If you have read some of my older posts, A. M. Selvam likes to use the Golden Ratio of ~1.618... in her self-organized criticality analysis, and Tsallis among others finds similar common ratios for "stable" systems.  Nothing is required to be "stable" in dynamics forever, so "preferred state" is probably a better term than "stable state."  When things get close to "equilibrium," 2nd and 3rd order influences can tend to ruin that "equilibrium" concept, which is joined at the hip with the entropy concepts.

Boltzmann's concept of entropy would then be a bit too ideal, which started the whole Gravito-Thermal debate to begin with.  Gibbs, Tsallis and many others have functions, intentional or not, included in their definitions of entropy to allow for the "strangeness" of nature.  Nature probably isn't strange at all; our ideal concepts are likely the strangeness, which is apparent in any debate over the Gravito-Thermal Effect.


Update:  Since I started this mess I may as well link to John Baez and his Can Gravity Decrease Entropy post.  He goes into more detail on the Virial Theorem in another post in case you are curious.  The main point, IMHO, is that a gravitationally bound system cannot ever really be in "equilibrium" with its surroundings, and the basic requirement for an isothermal or even an adiabatic system is some "real" equilibrium.  Boltzmann's entropy is an attempt to maximize entropy "within a volume," that f=ma issue, and a system bounded by gravity is trying to increase entropy by decreasing potential energy, i.e. compressing everything to create heat/kinetic energy.  A gravitationally bound system will either completely collapse or portions will boil off.  The ideal kinetic model maximizes entropy by not allowing anything to boil off.






Wednesday, November 26, 2014

Why do the "Alarmists" Love Marcott et al.?

Because it looks like what they want.  The Marcott et al. "hockey" stick is spurious, a result of a cheesy method, of not including data that was readily available, and of a few minor date "correcting" errors.  The authors admit that the past couple of hundred years are "not robust," but even highly respected institutions like NOAA include the "non-robust" portion as part of a comedic ad campaign.



I don't think that self-deception is actionable since that is just part of human nature, but when the likes of NOAA join the moonbeams, it can become more than comical.  Unlike the Mann hockey stick, where inconvenient (diverging) data was cut off, Marcott et al. just didn't dig a bit deeper to find data that didn't "prove" their point.  A great example is the Indo-Pacific Warm Pool (IPWP).  Oppo et al. published a 2,000-year reconstruction in 2009 and Mohtadi et al. a Holocene reconstruction of the IPWP in 2010.  If you compare the two, this is what you "see".



The sparse data points in Mohtadi 2010 pick up the basic rough trend, but Oppo 2009 indicates what is likely "normal" variability.  When you leave out that "normal" variability and then compare to "normal" instrumental variability, the instrumental data looks "unprecedented".

Since the lower resolution reconstructions have large uncertainty, on the order of +/- 1 C, and the cheesy method ignores the inherent uncertainty of the individual time series used, you end up with an illusion instead of a reconstruction.
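To see why that matters, here is a minimal sketch with made-up proxy series, each assigned +/- 1 C uncertainty; nothing below is Marcott's actual data or code, it just shows that the stack's propagated uncertainty doesn't shrink to instrumental levels.

    import numpy as np

    rng = np.random.default_rng(0)
    n_series, n_bins, sigma = 10, 100, 1.0     # made-up ensemble, +/- 1 C per series

    # Fake "true" proxies: a common signal plus independent local variability.
    signal = np.sin(np.linspace(0, 2 * np.pi, n_bins))
    proxies = signal + rng.normal(0, 0.5, (n_series, n_bins))

    naive_stack = proxies.mean(axis=0)   # the "cheesy" average, uncertainty ignored

    # Monte Carlo: perturb each series within its stated uncertainty and restack.
    stacks = [(proxies + rng.normal(0, sigma, proxies.shape)).mean(axis=0)
              for _ in range(1000)]
    per_bin_spread = np.std(stacks, axis=0)

    print(f"variability in the naive stack: {naive_stack.std():.2f} C")
    print(f"propagated per-bin uncertainty: ~{per_bin_spread.mean():.2f} C")

With ten series at +/- 1 C, the stack still carries roughly +/- 0.3 C per bin, which is a big chunk of the variability being "resolved".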

The funny part is that normally intelligent folks will defend the cheesy method to the death instead of looking at the limits of the method.  The end result is once-trusted institutions jumping on the groupthink bandwagon.

The data used by Marcott et al. is available online in xls format for the curious, and NOAA paleo has most of the data in text or xls formats, so it is not that difficult to verify things fer yerself.  Just do it!
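If you want to "just do it" in something other than a spreadsheet, here is a minimal sketch with pandas; the file name and sheet are placeholders for whatever you actually downloaded.

    import pandas as pd

    # Minimal sketch: peek at a saved copy of the Marcott supplement or a NOAA paleo file.
    # "marcott_sm_database.xlsx" is a placeholder name for the spreadsheet you saved.
    df = pd.read_excel("marcott_sm_database.xlsx", sheet_name=0)
    print(df.columns.tolist())   # find the age and temperature columns before anything else
    # NOAA paleo text files usually read with something like:
    # pd.read_csv("some_noaa_file.txt", sep=r"\s+", comment="#")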

My "just do it" effort so far has the tropical ocean temperatures looking like this.



The tropical oceans, which btw are the majority of the oceans, tend to follow the boring old precessional orbital cycle with few "excursions" related to other climate-influencing events like ice dams building/breaking, volcanoes spouting off and the occasional visit of asteroids wanting a new home.  That reconstruction ends in 1960 with some "real" data and some last known values, so there is not so much of a "non-robust" uptick at the end.  It only includes "tropical" reconstructions, and there are a few more that I might include as I find time and AD reconstructions to "finish" individual time series.

Saturday, November 22, 2014

The Problem with Changing your Frame of Reference from "Surface" Temperature to Ocean Heat Content

The few of you that follow my blog know that I love to screw with the "geniuses" aka minions of the Great and Powerful Carbon.  Thermodynamics gives you an option to select various frames of reference, which is great if you do so very carefully.  Not so great if you flip-flop between frames.  The minions picked up the basics of the greenhouse effect fine, but with the current pause/hiatus/slowdown of "surface" warming, they have jumped on the ocean heat content bandwagon without considering the differences that come with the switch.

When one theorizes about the Ice Ages or Glacial periods, when the maximum solar insolation is felt at 65 north latitude, the greater energy would help melt snow and ice stored on land in the higher northern latitudes.  (Update:  I must add that even the shift to maximum 65 north insolation is not always enough to end an "ice age".)  Well, there is more land mass in the northern hemisphere, so with more land benefiting from greater solar, what happens to the oceans that are now getting less solar?  That is right, sports fans, less ocean heat uptake.  There is a northern to southern hemisphere "seesaw" because of the variation in the land to ocean ratio between the hemispheres.

So the Minions break out something like this Holocene temperature cartoon.



Then they wax all physics-acal about What's the Hottest Temperature the Earth's Been Lately, moving into how the "unprecedented" rate of Ocean Heat Uptake is directly caused by their master, the Great Carbon.  Earth came from the word earth: dirt, soil, land, etc.  The oceans store energy a lot better than dirt.

If you want to use ocean heat content, then you need to try and reconstruct past ocean heat content.  The tropical oceans are a pretty good proxy for ocean heat content, so I put together this reconstruction using data from the NOAA Paleoclimate website.  Based on this quick and dirty reconstruction, the oceans have been warming during the Holocene, and now that the maximum solar insolation is in the southern hemisphere, which has lots more ocean area, the warming of the oceans should be reaching its upper limit.  Over the next 11,000 years or so the situation will switch to minimum ocean heat uptake due to the solar precessional cycle.  Pretty basic stuff.

With this reconstruction, instead of trying to split hairs, I just used the Mohtadi, M., et al. 2011 Indo-Pacific Warm Pool 40 kyr SST and d18Osw reconstruction, which only has about 22 Holocene data points, to create bins for the average of the other reconstructions: the Marchitto, T.M., et al. 2010 Baja California Holocene Mg/Ca SST reconstruction, Stott 2004 western tropical Pacific, and the two Weldeab et al. 2005 and 2006 equatorial eastern and western Atlantic reconstructions.  There are plenty more to choose from, so if you don't like my quick and dirty, go for it, do yer own.  I did throw it together kinda quick, so there may be a mistake or two; try to replicate.
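For anyone replicating, here is a minimal sketch of that binning step as I would code it rather than spreadsheet it; the "age"/"temp" column names and the dict of recons are my own placeholders, not anything standardized.

    import numpy as np
    import pandas as pd

    def bin_to_reference(recons, ref_ages):
        """Average several recons into bins centered on the reference (Mohtadi) sample ages.
        recons: dict of DataFrames with 'age' (years BP) and 'temp' columns.
        ref_ages: sorted 1-D array of the ~22 reference ages."""
        edges = np.concatenate(([ref_ages[0] - 1.0],
                                (ref_ages[:-1] + ref_ages[1:]) / 2.0,
                                [ref_ages[-1] + 1.0]))
        out = pd.DataFrame(index=ref_ages)
        for name, df in recons.items():
            cut = pd.cut(df["age"], edges, labels=ref_ages)
            out[name] = df.groupby(cut, observed=False)["temp"].mean().values
        out["stack"] = out.mean(axis=1)   # unweighted average of whatever lands in each bin
        return out

Using the coarsest series to set the bin edges is the "instead of splitting hairs" part; everything else just gets averaged into those bins.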

I have been waiting for a while for a real scientist to do this a bit more "scientifically", but since it is raining outside, what the heck, might as well poke at a few of the minions.

Update:  When Marcott et al. published their reconstruction they done good by providing a spreadsheet with all the data.  So this next phase is going to include more of the reconstructions used in Marcott et al., but with a twist.  Since I am focusing on the tropical oceans, Mg/Ca (G. ruber) proxies are like the greatest thing since sliced bread.  Unfortunately, not all of the reconstructions used extend back to the beginning of the Holocene.  The ones that don't will need to be augmented with a similar recon in a similar area if possible, or they are going to get the boot.  So far these are the (G. ruber) reconstructions I have on the spreadsheet.


As you can start to see, the Holocene doesn't look quite the same in the tropics.  There isn't much temperature change, and some parts of the ocean are warming throughout, or almost, and some are cooling.  The shorter reconstructions would tend to bump the end of the Holocene up, which might not be the case.  That is my reason for giving them the boot if there are others to help take them a bit further back in time.  As for binning, I am going to try and shoot for 50 years if that doesn't require too much interpolation.  Too much is going to be up to my available time and how well my spreadsheet wants to play.  With 50-year binning I might be able to do 30 reconstructions without going freaking insane waiting for Open Office to save every time I change something.  I know, there are much better ways to do things, but I am a programming dinosaur and proud of it.


Update: After double-checking the spreadsheet, a few of the shorter reconstructions had been cut off due to the number of points in my lookup table.  After fixing that, the shortest series starts 8,600 years before present (1950 datum).


That is still a bit shorter than I want, but better.  The periods where there aren't enough data points tend to produce hockey sticks, upright or inverted, which tends to defeat the purpose.  So until I locate enough "cap" reconstructions (shorter, top-layer recons that can bring data closer to "present") and lower frequency recons to take data points back to before the Holocene starting point, I am trying Last Known Value instead of any fancy interpolation or curve fitting.  That just carries the last available data value to the present/past so that the averaging is less screwed up.  So don't freak; as I find better extensions I can replace the LKV with actual data.  This is what the first shot looks like.
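For anyone doing this outside of Open Office, a minimal sketch of that Last Known Value fill; it assumes each recon has already been averaged into 50-year bins and is held as a pandas Series indexed by years BP, which is my framing, not a copy of my spreadsheet.

    import pandas as pd

    def lkv_fill(series, bin_index):
        """Last Known Value fill of an already-binned recon onto the full bin range."""
        s = series.reindex(bin_index)   # align to the common bins, NaN where the recon has no data
        # With ages in years BP ascending (0 = present, 1950 datum), ffill() carries the
        # oldest real value back toward 11,000 BP and bfill() carries the youngest real
        # value up to the present, so short series stop dropping in and out of the average.
        return s.ffill().bfill()

    bins = pd.RangeIndex(0, 11050, 50)   # 0 to 11,000 years BP in 50-year bins
    # filled = {name: lkv_fill(s, bins) for name, s in recons.items()}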




Remember that ~600 AD to present and 6600 BC and before have fill values, but from 6500 BC to 600 AD the average shown above should be pretty close to what was actually there.  The "effective" smoothing is in the ballpark of 300 years, so the variance/standard deviation is small.  Based on a rough approximation, a decade bin with real data should have a variance of around +/- 1 C.  Also, when comparing SST to "surface" air temperature, land amplifies tropical temperature variations.  I haven't figured out any weighting so far that would not be questioned, but weighting the higher frequency reconstructions a bit more would increase the variation.  In any case, there is a bit of a MWP indication and possibly a bigger little ice age around 200-300 AD.

Correction: +/- 1 C variance is a bit too rough; it is closer to +/- 0.5 C for decade smoothing (standard deviation of ~0.21 C) in the tropics, 20S-20N, that I am using.  For the reconstruction so far, the standard deviation from 0 to 1950, which has LKV filling, is 0.13 C.  So instead of 1 SD uncertainty, I think 2 SD would be a more appropriate estimate of uncertainty.  I am not at that point yet, but here is a preview.
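For anyone who wants to check those numbers, a minimal sketch of the decade binning and the 2 SD band against an instrumental tropical (20S-20N) SST anomaly series; the monthly series 'sst' and its DatetimeIndex are assumed, not provided here.

    import pandas as pd

    # Minimal sketch: decade bins and a 2 SD band for a monthly tropical SST anomaly series.
    # 'sst' is assumed to be a pandas Series with a DatetimeIndex; it is not defined here.
    def decade_stats(sst):
        decades = sst.groupby((sst.index.year // 10) * 10).mean()   # decade-bin averages
        sd = decades.std()
        return decades, sd, 2.0 * sd

    # decades, sd1, sd2 = decade_stats(sst)
    # print(f"decade-bin SD ~{sd1:.2f} C, 2 SD band ~{sd2:.2f} C")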



A splice of observation with decadal smoothing onto the recon so far looks like that.  It's a mini-me hockey stickette, about 2/3rds the size of the NOAA cartoon.  The Marcott "non-robust" stick is mainly due to the limited number of reconstructions making it to the 20th century, which LKV filling removes.

Now I am working on replacing more of the LKV fill with "real" data.  One of the reconstructions that I have both low and high frequency versions of is the Tierney et al. TEX86 for Lake Tanganyika, which has a splicing choice.  The 1,500-year recon, it appears, is calibrated to a different temperature than the 60 ka recon.  Since this is a Holocene reconstruction, I am "adjusting" the 1,500-year recon to match the longer one over their short overlap period.  That may not be the right way, but that is how I am going to do it.  There are a few shorter, 250 to 2,000 year regional reconstructions that I can use to extend a few Holocene reconstructions, but it looks like I will have to pitch a few that don't have enough overlap for rough splicing.  Here is an example of some of the issues.
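A minimal sketch of that overlap adjustment, assuming both recons are pandas Series indexed by age, already sorted, with the overlap window picked out by hand; this is just the approach, not Tierney's calibration.

    import pandas as pd

    def splice_with_offset(long_recon, short_recon, overlap_start, overlap_end):
        """Shift the short, high-frequency recon so its mean matches the long recon
        over their overlap, then splice, letting the short recon win where both exist."""
        offset = (long_recon.loc[overlap_start:overlap_end].mean()
                  - short_recon.loc[overlap_start:overlap_end].mean())
        adjusted = short_recon + offset
        return adjusted.combine_first(long_recon).sort_index()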


This is the Oppo et al. 2009 recon of the IPWP, which I use very often because it correlates extremely well with local temperatures, combined with the lower resolution Mohtadi 2010 recon of the same region.  They overlap from 0 AD to 1950, but there is very little correlation.  Assuming both authors knew what they were doing, there must be an issue with the natural smoothing and/or dating.  Since both are in degrees C, there would be about +/- 1 C uncertainty and up to around +/- 300 years of dating issues.  If I wiggle and jiggle to get a "better" fit, who knows if it is really better?  If I base my uncertainty estimate on the lower frequency recon of unknown natural smoothing, I basically have nice-looking crap.  So instead I will work under the assumption that the original authors knew what they were doing and just go with the flow, keeping in mind that the original recon uncertainty has to be included in the end.
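And a minimal sketch of that overlap check itself, again with my own variable names and the assumption that both series are indexed by calendar year:

    import pandas as pd

    def overlap_correlation(recon_a, recon_b, start, end, bin_years=50):
        """Correlation of two recons over an overlap window, after putting both
        on common bins so the higher-resolution series doesn't dominate."""
        def to_bins(s):
            window = s.loc[start:end]
            return window.groupby((window.index // bin_years) * bin_years).mean()
        joined = pd.concat([to_bins(recon_a), to_bins(recon_b)],
                           axis=1, keys=["a", "b"]).dropna()
        return joined["a"].corr(joined["b"])

    # print(overlap_correlation(oppo_2009, mohtadi_2010, 0, 1950))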

With most of the reconstructions that ended prior to 1800 "capped" with shorter duration reconstructions from the same area, often by the same authors, things start looking a bit more interesting.


Instead of a rapid early peak and steady decline, there is more of a half sine wave pattern that looks like the precessional solar cycle peaking around 4000 BC and then starting a gradual decline.  Some of the abrupt changes, though not huge amplitude changes in the tropics, appear around where they were when I was in school.  There is a distinct Medieval Warm Period and an obvious Little Ice Age.  I am kind of surprised the original authors of the studies have left the big media reconstructions to the newbies instead of doing it themselves.

I have updated the references and noticed one blemish: Rühlemann et al. 1999 uses an alkenone proxy with the UK'37 calibration.  Some of the "caps" are corals, since there were not that many to choose from.  Since the corals are high resolution, I had to smooth some to decade bins to work in the spreadsheet.  I am sure I have missed a reference here or there, so I will keep looking for them and for any spreadsheet miscues that may remain.  Stay tuned.

And


Since there is a revised version, here is how it compares to tropical temperatures.  I used the actual temperatures with two scales to show the offset.  The recon and observations are about 0.4 C different, and of course the recon is over-smoothed compared to the decade-smoothed observation.  Still a mini-me hockey stick at the splice, but not as bad as most reconstructions.

references: No. / Author (Marcott No.)

 5  Benway et al., 2006 (9)
31  Kubota et al., 2010 (32)
36  Lea et al., 2003 (36)
38  Levi et al., 2007 (39)
40  Linsley et al., 2010 (40)
41  Linsley et al., 2011 (40)
45  Mohtadi et al., 2010 (43)
60  Steinke et al., 2008 (56)
62  Stott et al., 2007 (58)
63  Stott et al., 2007 (58)
64  Sun et al., 2005 (59)
69  Weldeab et al., 2007 (65)
70  Weldeab et al., 2006 (66)
71  Weldeab et al., 2005 (67)
72  Xu et al., 2008 (68)
73  Ziegler et al., 2008 (69)

Not in the original Marcott:
10(1)  with Oppo et al., 2009
36(1)  with Black et al., 2007
38(1)  with Newton et al., 2006
74     Rühlemann et al., 1999
75(1)  Lea, D.W., et al., 2003, with Goni, M.A.; Thunell, R.C.; Woodworth, M.P.; Mueller-Karger, F.E., 2006