Don’t cash those chips in yet.

From the NSIDC:

“Sea ice extent has fallen below the 2005 minimum, previously the second-lowest extent recorded since the dawn of the satellite era. Will 2008 also break the standing record low, set in 2007? We will know in the next several weeks, when the melt season comes to a close. The bottom line, however, is that the strong negative trend in summertime ice extent characterizing the past decade continues.”

Note, 2007 has pulled further into the lead in the last week of August. However the bet comes out, the serious story is that this year has, in practical terms, essentially equalled last year’s unprecedented decline. Note in the chart above the comparison with the previous ice extent minimum in 2005.
Deltoid has some fun with the denialist camp on this one, starting with some recent unfortunate foolishness on the Register. Comments off, discussion referred to Deltoid please.

The Two-Tailed Beast

Somebody with an interest in economics, one Tyler Cowen, a.k.a. “Marginal Revolution”, gets this much right:

Make of climate models what you may, there is lots of evidence that a) biodiversity is being hammered, and b) climate change will bring desertification, drought, and possibly coastal flooding to many parts of the world, among other dilemmas. I don’t have a lot of faith in the exact predictive powers of climate models, or for that matter economic models, but uncertainty about outcomes should make us worry more not less. Uncertainty usually has two tails, not just one.

Interesting comments, too. Comments off here; please comment on the referenced article.

With a tip o’ the ol’ fedora to John Fleck.


The Hits Just Keep On Coming!

Here are Statcounter hits 99,997 through 100,003.

Alas, hit number 100,000 was just the kind I like the least: somebody looking for gold, Gold, GOLD! The hit was sort of ironic, though. They were googling for “Gold Frames”…

Here’s the log, with IPs trimmed in case anybody doesn’t want it published that they were on In It during work time. The winning hit came in over the noon hour, central daylight time, today.

Any of the above can have a researched article of their choice at any time, and a dinner any time we are within 50 miles of each other. The Montreal-based user (my only reader in my hometown, as far as I know) could collect very soon!

Pardon the self-indulgence. Back to our regularly scheduled programming soon!

Oh, I really do like to meet my readers. Had a great dinner in Pasadena last week with a reader (thanks Erik!). I would love to hear from anyone passing through Austin who wants to join me for coffee. Get the occasion in before I’m actually famous and start avoiding people!

An Idea for Breaking the Carbon Logjam

Hans Gersbach has an article in Economists’ Voice suggesting a viable way to entice global participation in a carbon reduction regime. It’s an interesting example of economic thinking in some ways. While I fear it does a better job of explaining why the outcome is likely to be self-destructive, if not suicidal, than of identifying a practical way around it, at this point I am for considering anything.

Here’s the teaser:

The problems with the Kyoto accord are clear to all. The developing world and the United States did not agree to join in CO2 emissions reductions and as to the signatories, what will make them achieve their reduction targets? It is easier either not to join or not to comply and let others do all the work of emissions reduction. That is the fundamental free-rider problem. Greenhouse gases disperse around the globe and burden everyone. One country’s reductions burden it alone but benefit everyone. That makes it quite a trick to get largely selfish states to reduce emissions.

The long-term nature of addressing climate change compounds the free-rider problem. Even if a long-term reduction path is chosen at the get-go, each year presents a new opportunity for each nation to renounce its responsibilities and free-ride on the reductions of others. It is as if the monumental act of coordination (which could not be fully achieved even once at Kyoto) really needs to be reenacted each and every year when countries come to actually implement reductions. Without a way around this problem, any agreement, even if entered with the best of intentions, will soon become a hollow shell.

Alas, that makes the kind of sense that the modern world makes.

Here’s the link. Gersbach proposes a kind of workaround that is interesting. You may have to jump through some hoops on the website. It wants you to identify a university with which you are affiliated but it doesn’t really enforce it. Try your alma mater.

In short, he proposes a global fund to collect carbon taxes and reward carbon reductions. In a way it reminds me of the McKitrick solution. It essentially requires an objective mechanism for measuring carbon emissions per nation, which will not be easy to implement in practice. There are also some ethically dubious grandfathering effects here that won’t fool the less developed countries for a second.

Pielke, Part of the Problem

This question of “speaking up” can cut the other way, though.

I received a bit of denialist drivel in email, pointing among other things to the infamous bogus CO2 record, to give you an idea of the quality of the correspondence.

However, while judging someone by the company they keep may have some value, it is a mistake to judge someone by the company that keeps them. So, in the same message, there is a pointer to a recent article on Prometheus linking to a recent summary, on a libertarian blog, of the hockey stick story. This item deserves some attention, if not especially on the basis of merit, as it seems that the Bishop Hill article (and its catchy appellation of a “Jesus Paper”) will have some resonance among the opponents of timely action on global change issues.

A particularly intransigent comment to Pielke’s above-referenced Prometheus article, from Willis Eschenbach, reads as follows:

The problem is not the behavior of the few. A few people will always do wrong. The problem is that the behavior of the community as a whole has been just what you said. They have not stood up to oppose the bad science done in their name. They have not clamored for an investigation into the bad science. They have, in large part, done absolutely nothing in response to this abysmal situation. Nothing. No public statements. No behind-the-scenes maneuvers. Nothing. Zip. Zero.

Instead, by and large, they have in your words “stayed out of the limelight” … and now you are claiming that they are the victims in this case?

In my opinion, they have no one but themselves to blame for the fact that they are being tarred with the same brush as the miscreants.

First of all, this assumes the existence of “miscreants”, which goes even further than the excessive Wegman report does (a report whose own excesses raise some uncomfortable questions about the conduct of modern science). Second, it places an onus on a “community” to police itself, based on an unrealistic model of that community. The IPCC, even constrained to WGI, summarizes the positions of a wide range of loosely connected communities, among whom dendrochronology forms but a tiny corner.

One wonders who is expected to “speak up”, and when. And how, in the light of limited resources and competitive funding, is one expected to find the time to work out the details? The notion that some oceanographer or satellite engineer or ichthyologist has an obligation to “get into the limelight” about something as narrow as tree rings doesn’t ring true to anyone actually working in the field.

There are real issues with the conduct of science, but the question at hand is how important they are. It is absolutely crucial to note that no responsible party, neither Wegman nor McIntyre himself, denies that the millennial temperature curve will likely turn out hockey-stick-like once enough data is collected and analyzed. In fact, a contrary result would be quite surprising!

They argue that the statistical methodology for obtaining these results is inadequate, and stake out a position of defending the integrity of science. It is hard for me not to sympathize with these claims. Few close to modern science will deny that the process has important flaws, but fewer still are in much of a position to address them.

The problem of the conduct of science pales in importance next to the problem of bringing the human impact on the global environment into stability, though. The tragedy is that these quibbles are inflated into accusations of spectacular dishonesty specifically in order to color the policy debate.

Regarding the hockey stick, I have to line up with Pielke in shrugging and saying I haven’t spent the effort to figure out how the science shakes out and I don’t really plan to. It seems likely to me that the temperature really did follow a hockey stick pattern (as so many things do nowadays). The scientific question is only the extent to which the data confirms that expectation, not whether in fact the hypothesis has been shown invalid. And in the grand scheme of science this particular question has very modest importance, despite the political weight placed upon it.

The denialists are not especially interested in the question as to whether the hockey stick is real, it turns out. They are mostly interested in what it reveals about the IPCC process. And here, it is hard to say they don’t have something.

The accusation, stripped of acrimonious ranting, is 1) that the IPCC knows in advance what result it is delivering, and 2) that the process for delivering its report is too informal and that papers are rushed into print in order to meet IPCC deadlines. On the first point, this amounts to an accusation only in the event that there is no consensus. Since, in fact, Pat Michaels notwithstanding, there is one, there is little basis for worry on the first point; it’s simply tautological. If you are asked to report on a matter on which you are convinced and your reader is not, obtaining the result you hold to be true is not in itself evidence of bias.

On the second point, one can make a case that papers are rushed into print specifically in order to be referenced by IPCC. Since “getting the runs in time for IPCC” is the driving force of climate modeling these days, and this distorts the software engineering process, I can actually state confidently that there is some truth to the complaint.

As usual, the forces of truth and justice are caught between a rock and a hard place, though. In demanding a formal process to justify the nontrivial changes in social structure and international relations required by the state of things, people resisting such changes have a solid point. However, they proceed further by also resisting the massive expansion in the scale and scope of the earth and environmental sciences that such a process would require.

Does it matter, though? That depends on whether the “conspiracy” is drummed up or real. It is usually possible to reinforce ideas of conspiracy when there is a segment of the public inclined to believe in one. Whatever error may or may not be involved in selecting certain trees for inclusion in a dendrochronology may constitute malfeasance if one is in a particularly judgmental frame of mind. What, then, is the moral status of quibbling about tree rings while the radiative balance of the atmosphere is being forced at a rate without remote precedent in the entire history of mammals?

In the end, science is an imperfect instrument, and we must nevertheless make decisions based on what we know. By stressing the former and not the latter fact, by fertilizing the ground where others are happy to plant wild conspiracy theories, McIntyre and now Pielke do an enormous disservice.

As such, they are ironically part of the very problem they identify, placing more attention to the advancement of their own reputations and positions than on the advancement of knowledge and governance.

It’s literally tragic that they are recycling this endless quibbling about bristlecone pines rather than stepping back and looking at the balance of evidence. There is simply no way to formalize the process entirely. Human judgment is easily derailed, but we will have to collectively judge this issue and come to difficult and necessarily imperfect decisions of major consequence, soon.

If somebody wants to talk about “malefactors”, let’s talk about the people who are working so hard to skew this matter away from the big picture. It’s not about publication records and tenure cases. It’s about survival. It’s about whether or not to extract so much value from the world that the world itself becomes valueless.

Bristlecone pines or not, the carbon has to stay in the ground.

Climate Models – Is There a Better Way?

What is the simplest possible CGCM (coupled general circulation model, i.e. a ‘climate model’ with 3D fluid dynamics)?

I don’t want the best possible match to observed climate. Such a thing serves little purpose, as I explain briefly below. I would like to see the simplest useful climate model with full 3D primitive equation dynamics, moist physics and radiative transfer. Such a thing would be very informative.
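For contrast, here is about the simplest “climate model” there is: a zero-dimensional energy balance sketch. It is nowhere near the 3D primitive-equation model I have in mind (no dynamics at all), but it shows the spirit of starting from the simplest thing that could possibly be informative. The albedo and effective emissivity are illustrative textbook values, not tuned output:

```python
# A zero-dimensional energy-balance "toy": the planet is a single point
# where absorbed sunlight balances emitted infrared.
SIGMA = 5.670e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0         # solar constant, W m^-2
ALBEDO = 0.3        # planetary albedo (illustrative)
EMISSIVITY = 0.612  # effective emissivity standing in for the greenhouse effect

def equilibrium_temperature(albedo=ALBEDO, emissivity=EMISSIVITY):
    """Surface temperature at which absorbed solar flux balances emitted IR."""
    absorbed = S0 * (1.0 - albedo) / 4.0  # incoming flux averaged over the sphere
    return (absorbed / (emissivity * SIGMA)) ** 0.25

print(round(equilibrium_temperature(), 1))  # roughly 288 K
```

One free parameter (the emissivity) per one output: exactly the kind of underconstrained tuning knob that a real model must replace with physics.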

At PyCon 2007 in Dallas, just as I was moving to Texas, I had the pleasure of being a party to a hallway conversation with one of the keynote speakers, Robert “r0ml” Lefkowitz, who had just given an amazing keynote about source code as communication medium.

Now, I grew up exposed to several symbol systems (the Roman alphabet in its English, French, Czech and Hungarian variants; the Hebrew alphabet in its Hebrew and Yiddish variants; musical notation; numbers), and I have always found arrays of symbols, and what each symbology could represent, a point of deep fascination. So he was talking to me.

In the hallway, I ended up explaining how there would probably never be an Einstein of climate, the system being too messy and contingent to allow for blazing reorganizing insights. r0ml suggested that there might, nevertheless, be a Mozart of climate modeling. Appealing to my grandiosity is generally successful, and so I have been unable to entirely shake the idea since.

I doubt that I am Mozart, but I would like to pave the way for him or her. In short, I would like to create a climate model that you would like to read.

Climate models are not devices intended to explain the past trajectory of global mean temperature or predict its future. Climate models are efforts to unify all knowledge about the atmosphere/ocean/cryosphere system. The basic equations and boundary conditions are put in; elaborate circulations with many close resemblances to the actual circulation come out. While there are free parameters in the models, they are far fewer than the degrees of freedom in the resulting dynamics. The fidelity of the models thus represents a real and striking success.

However, the sensitivity of the models (the simple relationship between greenhouse forcing and temperature) has only a handful of degrees of freedom. The tuning of the models has been informal. It is conceivable that the models have been inadvertently tuned to cluster about the sensitivity that other arguments indicate, but that’s not necessarily a bad thing. The sensitivity is very likely in the neighborhood of 2.5 – 3 C / doubling. Perhaps in the absence of other evidence the spread would have been a bit larger, but it’s difficult to see how it could have been dramatically different. Surely, no amount of further modeling is likely to say anything different, and any pretense that obtaining the global greenhouse sensitivity is the purpose of the effort has always been an oversimplification which should long ago have been abandoned.
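To make the arithmetic concrete, here is a back-of-envelope sketch, assuming the standard simplified logarithmic forcing expression and a nominal 3 C per doubling sensitivity. The numbers are illustrative, not model output:

```python
import math

def co2_forcing(c, c0=280.0):
    """Simplified radiative forcing from CO2, in W m^-2: dF = 5.35 ln(C/C0)."""
    return 5.35 * math.log(c / c0)

# A sensitivity of 3 C per doubling implies
# lambda = 3 / (5.35 ln 2) kelvin per (W m^-2) of forcing.
lam = 3.0 / co2_forcing(560.0)

print(round(co2_forcing(560.0), 2))        # forcing per doubling, ~3.71 W m^-2
print(round(lam * co2_forcing(385.0), 2))  # equilibrium warming at ~385 ppm
```

Note that this is exactly the “handful of degrees of freedom” referred to above: the whole sensitivity question collapses to one number, lambda.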

Most effort at present goes into making models more complex. This is in response to the intense desire for a closed climate/carbon system, as well as to address other geochemical and ecological questions. It is my opinion that these efforts are ill-advised in the extreme; they will have very limited intellectual or predictive value because they are vastly underconstrained. What’s more, they will add a vast range of tunable parameters. The new Earth System Models stand a good chance of becoming what current GCMs are accused of being, i.e., models which can be tuned to yield any desired result.

Models serve many purposes, in research, training and public communication. The lack of a model that is easily understood, modified and run is increasingly unnecessary and inappropriate. Such a model would be dramatically simpler than existing codes, and possibly somewhat lower in simulated days per floating point operation. Developing it would be diametrically opposite to contemporary trends. In addition to opening the field to investigation by amateurs, it would resolve some important questions in the course of its development. Specifically, in seeking the simplest coupled GCM, one identifies which phenomena are actually important under present day circumstances.

We should also seek to create in this context a model which is applicable to other observable and imaginable planets, thereby facilitating investigations into the theory of geophysical fluid dynamics, and allowing for the widest possible range of algorithms.

Efforts to increase model fidelity by increasing resolution are compatible with the approach of radical simplification. In fact, investigation of specific phenomenology is compatible as well. The objective should be a model that is not only actually readable and actually read, but reliably modifiable and actually modified, testable and tested, validatable and validated.

Efforts like PRISM and ESMF are well-intended but fail to move in the right direction. Contemporary software development techniques must be imported from the private sector.

The resource base for this could be as small as twenty person-years, say five developers, a manager and one support staff over three years. I doubt it could be a hobby, though the first step could be a hobbyist effort: I’d like to see a clean, readable implementation of the MIT “cubed sphere” grid to kick things off.
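For the curious, the heart of a cubed-sphere grid is a gnomonic-style projection from the six faces of a cube onto the sphere. Here is a minimal sketch for one face, my own illustration rather than anything from the MIT code:

```python
import math

def cube_face_to_sphere(xi, eta):
    """Map equiangular coordinates (xi, eta in [-pi/4, pi/4]) on the z = 1
    face of a cube to (lon, lat) in degrees on the unit sphere, by radial
    projection. The other five faces follow by rotating this one."""
    x, y, z = math.tan(xi), math.tan(eta), 1.0
    r = math.sqrt(x * x + y * y + z * z)
    x, y, z = x / r, y / r, z / r  # project the face point onto the sphere
    lat = math.degrees(math.asin(z))
    lon = math.degrees(math.atan2(y, x))
    return lon, lat

print(cube_face_to_sphere(0.0, 0.0))  # face center lands at the face's pole
```

The payoff of the cubed sphere is that each face carries a nearly uniform quasi-Cartesian mesh with no polar singularity; a readable implementation of exactly this mapping, plus the metric terms, is the hobbyist-sized first step I mean.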

I’d be happy if someone else took this on. If you want me involved it requires some close variant of Python. It also requires some support, by which I mean money. I am not the sort of coder who can write scientific code all day and volunteer scientific code all night. That all grumbled, you actually really need this thing.

Yesterday’s SciPy talk on Cython was very encouraging, by the way. Alternatively I understand there are Python bindings to PETSc, (the NSF proposal stresses that approach) but that might not serve the purpose of maximum accessibility.

Knocking on Your Golden Door

OK, so here I am in Pasadena for the SciPy meeting. The other time I managed to attend, I stayed at a marginal motel at the edge of town, probably not entirely safe. Now I am staying at one of those places where they practically charge you for even breathing. ($7 to park, $9/day for internet, $5.50 for a bowl of oatmeal, which somehow the plumpness of the raisins fails to justify…) I’m not sure which I dislike more. Readers who are fussy about their tax dollars will be relieved to know that it’s private grant money (and not climate) that’s funding this trip. But the presentations are wonderful.

Anyway, the scientific Python community continues to make amazing advances, and pretty soon there won’t be a need for a conventional language in high performance codes at all. So when do I get around to pulling the Torvalds maneuver: “I am writing a climate model as an intellectual exercise; I invite participation”? The trouble is I’m not energetic/nerdy enough to work on what other scientists tell me to do all day and on what I want all evening. The soul rebels. The clarinet gets rusty, the cookware gets dusty, the main rationale for living in Austin (honky tonks!) goes unattended. The long shot grant from NSF to pay me to do what I want is still pending, but in retrospect I’m not sure I made the case all that convincingly.

Google is funding some of the work I saw presented today, but I can’t blame them for not funding mine, on account of its being complete vapor so far. I almost have to make a living as a writer, if only so I can get some actual science done in my spare time! So keep them clicks coming in!

Anyway, a couple of articles on climate models showed up this week. Here’s an official summary of climate modeling from the DOE’s CCPP, lead author Dave “Darth” Bader, and here’s Oak Ridge senior scientist John Drake’s argument. I’ve met both, and they are very smart and decent fellows. That said, they can both be relied upon to give a DOE-friendly report. (It’s also reassuring to see Isaac Held on the author list of the CCPP report. I am confident that the report will not be stretching the truth too far on that account.) What do you think?

I think that past achievements are remarkable but I have my doubts about the current direction. Is there room for another approach? Does another useful approach exist? Well, I actually think so, but I’m damned if I can figure out how to get anyone who can afford to give it a try to do that, with me in the loop or otherwise. Maybe I’m just a little cracked. It’s been known to happen, but I still think I have a shot at doing something important left in me. There are worse fates than just chugging away doing applications coding, I suppose, and having an interesting intellectual life in some disembodied community meanwhile.

That all said, those of you who criticize climate models without much basis in experience would do well to read the Drake article and the Bader report.

Meanwhile, is there any In It reader in LA who’d care to join me over beer this Friday evening? Let me know. Perhaps you can pry my trade secrets out of me.

De Nile

Everybody’s favorite river seems to flow in every corner of the world.

For instance, consider how quickly Floridians stop worrying about hurricanes.

Meanwhile, Dot Earth reports that US science agencies shy away from the question of how to deal with the fractured communications between science and the public. I have to say that when I first heard about this issue I had some doubts about Glantz’s association with NCAR but he makes a very cogent case.



This peculiar figure is still up at a NASA site. The cooling rates are monstrously high (per annum!) and the boundary between land and sea is too sharp and there is altogether a misleading amount of detail. I seem to recall William Connolley warning me that this map was broken. But there it sits.

Meanwhile Robert Rohde’s wonderful GlobalWarmingArt site finds the evidence equivocal:

Here the rates are per decade, and while still large, are not stunningly large. The time series are longer and hence perhaps less noisy, and the trends are far less uniform. (And as usual, Robert has made a visually beautiful image. Quite a few of those many hits to this site have been people coming by to admire my closeup of one of his sea level rise maps.)
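For reference, a “per decade” rate on such a map is just an ordinary least-squares slope at each grid point, scaled by ten. A minimal sketch with synthetic data (not Robert’s actual data or method):

```python
def trend_per_decade(years, temps):
    """Ordinary least-squares slope of temps vs. years, in degrees per decade."""
    n = len(years)
    mx = sum(years) / n
    my = sum(temps) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(years, temps))
             / sum((x - mx) ** 2 for x in years))
    return slope * 10.0  # per year -> per decade

# Perfectly linear synthetic series: 0.02 C/year, i.e. 0.2 C/decade.
years = list(range(1979, 2009))
temps = [0.02 * (y - 1979) for y in years]
print(round(trend_per_decade(years, temps), 3))  # 0.2
```

The longer the series, the smaller the influence of year-to-year noise on the fitted slope, which is why the longer records tend to look less dramatic.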

Anyway, the NASA site with the peculiarly shaded map has a link to:

Comiso, J. C. (2000), Variability and trends in the Antarctic surface temperatures from in situ and satellite infrared measurements, J. Climate, 13(10), 1674–1696.

Kwok, R., and J. C. Comiso (2002), Spatial patterns of variability in Antarctic surface temperature: Connections to the Southern Hemisphere Annular Mode and the Southern Oscillation, Geophys. Res. Lett., 29(14), doi:10.1029/2002GL015415.

And therein we find this:

Although the NASA web page references the article, and while it has some obvious features in common, this figure doesn’t perfectly match their fancy shaded map (check the area by the Ross Sea). That said it does show relatively steep gradients at the shore, and very high rates of change. Notice that the warming signal in the surrounding seas is far more pronounced than the cooling in the interior. Notice especially the intense warming near the Amundsen embayment, (a bit west of South America) which is spectacularly not where you want it.

So what’s going on?

Wikipedia (and thereby, William, no doubt) refers us to

Thompson and Solomon 2002, Interpretation of Recent Southern Hemisphere Climate Change, Science, v 296 pp 895 ff.

They in turn make a strong case for a correlation between cold Antarctic interiors and a tightening of the “SAM”, the anomalously strong phase of the Antarctic polar vortex, a mode which appears to be strengthening, and which can be dynamically attributed to the sharp decline in ozone over the period of record. Ozone, of course, heats the stratosphere, so its decline leads to anomalously cold stratospheric temperatures. Then you need to invoke the thermal wind law and (hmm, skipping a few steps) voilà! A tightened Antarctic vortex, and tightened temperature gradients around the Antarctic rim.
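For those who want one of the skipped steps: the thermal wind relation links a meridional temperature gradient to vertical shear of the zonal wind, du/dz = -(g / fT) dT/dy. A rough scaling sketch, with illustrative numbers rather than anything from the paper:

```python
import math

# Thermal-wind scaling: a stronger equatorward temperature gradient around a
# cold Antarctic interior implies stronger westerly shear with height,
# i.e. a tighter polar vortex.
g = 9.81                # gravity, m s^-2
OMEGA = 7.292e-5        # Earth's rotation rate, s^-1
lat = -70.0             # around the Antarctic rim
f = 2 * OMEGA * math.sin(math.radians(lat))  # Coriolis parameter (negative in SH)

T = 250.0               # mean temperature, K
dT_dy = 1.0e-5          # K m^-1, ~1 K per 100 km, warmer toward the equator

du_dz = -g / (f * T) * dT_dy  # thermal wind relation: du/dz = -(g / fT) dT/dy
print(round(du_dz * 1000, 2)) # westerly shear in (m/s) per km of height
```

With these illustrative numbers the shear comes out positive (westerlies strengthening with height) at a few meters per second per kilometer, which is the sense of a tightened vortex.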

Of course for every person worried about the retreat of Arctic sea ice there is somebody willing to celebrate the advance of Antarctic ice. The map shows that ice is advancing through the relatively limited areas of cooler water, but that doesn’t do much to separate cause and effect. Any ideas out there?

Anyway, the short version of the story, at least as plausibly argued, is this: Antarctica seems to be special because of ANOTHER human impact on the global environment. As the ozone depletion subsides, this explanation will be tested, since the anticipated forcings will then both push towards warming in the Antarctic interior.

Update: I see Atmoz has taken this on in plenty of detail. The info I wanted from William is there too, along with many comments. And he says the shiny map is “probably the work of a PR droid” and points to this, via NASA, from Wikipedia:

Go figure.

I think there is actually something to complain about here in a McIntyrean way: how are these drastically different results from a single agency supposed to be reconciled? I note that the web publication of the later image refers to the earlier one without explaining the dramatic differences.

And while I haven’t heard a cogent explanation for the advancing Antarctic sea ice, I have heard a cogent explanation for interior cooling, along with, now, data that shows it isn’t happening…

None of which changes the fact that so far all evidence seems to agree that warm water is being delivered to the structural weak point of the West Antarctic Ice Sheet.