If you vary the nominal load, the battery's "sensitivity" to discharge changes. Under a high load, variation in discharge "sensitivity" is hard to notice: the battery's normal discharge rate is so fast that any change gets lost in the noise. With a cellphone battery that has a very low nominal load, you can tell when a charge lasts only a few days instead of a week; longer time frames make the changes more obvious. If you increase the precision of the high-load test you can pick up the slight changes, but what are a few milliamperes among friends?
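A quick back-of-the-envelope sketch of that point, with entirely hypothetical numbers: the same 10% capacity loss costs a fraction of an hour under a heavy load but nearly a full day under a standby load, which is why the low-load case is the one you actually notice.

```python
# Back-of-the-envelope: how the load changes how noticeable a small
# capacity loss is. All numbers are hypothetical, for illustration only.

capacity_mAh = 2000.0   # nominal battery capacity
degraded_mAh = 1800.0   # the same battery after a 10% capacity loss

for load_mA, label in [(500.0, "high load"), (10.0, "low standby load")]:
    t_new = capacity_mAh / load_mA   # runtime in hours, fresh battery
    t_old = degraded_mAh / load_mA   # runtime in hours, degraded battery
    print(f"{label}: {t_new:.1f} h -> {t_old:.1f} h "
          f"(difference {t_new - t_old:.1f} h)")
```

The fractional loss is identical in both cases; it is the absolute difference in runtime, 0.4 hours versus 20 hours, that makes the low-load change obvious.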
The main flaw in the Climate Science "for the public" definition of "Sensitivity" is that it doesn't separate discharge and charge sensitivity impacts. If you have a steady load, the energy flow to that load varies with the charge state of the battery. As the battery (the oceans) charges, the rate of change in the energy flow decreases in a predictable manner, provided you know the load and the capacity. Since the Sun is the charger, with a fairly constant output, normal solar charge/discharge cycles would be easily predictable at or near the full-charge state and would vary with the charging state. Each "wiggle" in the battery voltage would indicate the state of charge.
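A minimal sketch of that charging behavior, assuming the simplest capacitor-style (RC) charging model; the time constant tau stands in for the load-and-capacity combination and is purely hypothetical:

```python
import math

# Constant-source charger filling a capacitor-like store: the state of
# charge approaches full as 1 - exp(-t/tau), so the charge rate (the
# "energy flow") tapers off predictably. tau is a hypothetical time
# constant standing in for the load and the capacity.
tau = 5.0
for t in range(0, 26, 5):
    soc = 1.0 - math.exp(-t / tau)     # state of charge, 0 to 1
    rate = math.exp(-t / tau) / tau    # rate of change of the charge state
    print(f"t={t:2d}  charge={soc:6.1%}  rate={rate:.4f}")
```

Near full charge the rate flattens out, so each small "wiggle" in the curve maps to a definite state of charge, which is the point of the analogy.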
Since the paleo Mg/Ca-derived reconstruction, plus every other "Global" metric, agrees with a nice steady recharge cycle and not some wild and crazy anthropogenically inspired catastrophe, it might be that something is amiss in the Climate Science instrumentation manipulation, a.k.a. re-analysis, procedures.