Chartjunk? Slate’s Kaus vs. NYT’s David Leonhardt on hospital beds and better health care


An example of chartjunk: several colors and 26,000+ pixels to describe five hard-to-read numbers

David Leonhardt put forth a gee-that’s-unexpected-but-possible thesis in the Dec. 30 New York Times: Limiting hospital beds forces hospitals to provide more efficient care; allowing limitless beds pressures hospitals to fill them with patients undergoing unnecessary, costly treatments (Economic Scene: Health Cuts With Little Effect on Care):

Since 1996, the Richmond area has lost more than 600 of its hospital beds, mostly because of state regulations on capacity. Several hospitals have closed, and others have shrunk. In 1996, the region had 4.8 hospital beds for every 1,000 residents. Today, it has about three. Hospital care has been, in a word, rationed.

The quality of care in Richmond is better than in most American metropolitan areas, according to various measures, and it continues to improve. Medicare data, for example, shows that Richmond hospitals do a better-than-average job of treating heart attacks, heart failure and pneumonia.

…when it comes to health care costs, Richmond’s rationing has made a clear difference. In 1992, it spent somewhat less than average, per capita, on Medicare — 126th lowest out of 305 metropolitan areas nationwide. Since then, though, costs have risen at a significantly slower pace than they have elsewhere. As a result, Richmond had the 39th lowest costs in 2006.

A pretty tantalizing concept, and one that bolsters the school of thought that doctors should face some restriction on how much “Cadillac” health care they can provide a patient. But even if Leonhardt’s thesis is correct, Slate’s Mickey Kaus is right to point out that Leonhardt provides virtually no useful evidence to prove it:

1. In Richmond the number of beds per 1000 residents fell from 4.8 in 1996 to “about three.” You would now expect Leonhardt to unleash a string of stats showing that medical care in Richmond has gotten better despite these limits. You would be wrong. Care in Richmond is “better than in most American metropolitan areas,” says Leonhardt. OK, but what was it like before? Maybe it was better than nearly every metro area before. Richmond hospitals do a “better-than-average job of treating heart attacks,” Leonhardt says. OK, but were they much-better-than-average before? Anyway, that’s just heart attacks. … Oh, and a patient named Janet Binns–actually, a patient’s daughter–feels there is “nothing cheap about the care.” Well, all right then!

This goes on for six more points, with Kaus accusing Leonhardt of being spoon-fed statistics by the White House’s own partisans.

Kaus himself doesn’t provide enough beef to make the accusation a clear verdict, but he’s right that Leonhardt’s statistical reasoning is shallow. Leonhardt commits the most basic fallacy of “correlation does not imply causation” (what other reforms or demographic changes would’ve affected hospital Medicare data over the same period?). He just plain omits, Kaus points out, any substantial data (what are the “various measures” in which Richmond performs well? In the on-the-other-hand statement, “Some of its hospitals do poorly on Medicare’s metrics,” what are those metrics, and how statistically significant are they compared to the ones Leonhardt uses to support his thesis?). And there are assumptions that seem like common sense…but on second thought…need more explanation (why is Richmond’s performance on heart attacks, heart failure and pneumonia the only quality-care metric worth mentioning?).

This thin-on-numbers piece is a bit unexpected considering that Leonhardt, according to his bio, studied applied mathematics at Yale. But maybe it’s not so much a failure of the reporting as of the storytelling form. Leonhardt may have all the numbers, but lacks the column inches to describe them all.

So I’d consider it a textbook example of how traditional narrative can fall flat. Leonhardt’s claim just begs to be illustrated with a few charts and graphs. A line chart showing Richmond’s heart-attack-treatment metric from 1992 to 2006 would concisely refute Kaus’s point (at least one aspect of it) in far less space than 80 words of prose. As it is, Leonhardt’s article is effectively a textual example of what infographic-guru Tufte calls “chartjunk”: an unnecessary amount of ink that, at best, clumsily supports an important theory, or, at worst, hides the skimpiness of the actual data.
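For what it’s worth, here’s a minimal sketch of what that chart might take to produce. The file name and column names are hypothetical stand-ins (no such dataset is cited in the article), but the point stands: a handful of lines of code and a few square inches of newsprint could carry the evidence that 80 words of prose can’t.

```python
# Sketch of the line chart suggested above, assuming a hypothetical CSV
# "richmond_medicare.csv" with columns: year, richmond_score, national_avg_score
# (the Medicare heart-attack-treatment metric for 1992-2006).
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("richmond_medicare.csv")  # hypothetical file, for illustration only

fig, ax = plt.subplots(figsize=(6, 3.5))
ax.plot(df["year"], df["richmond_score"], label="Richmond", linewidth=2)
ax.plot(df["year"], df["national_avg_score"], label="National average",
        linewidth=1, linestyle="--", color="gray")

# Keep the ink to a minimum, per Tufte: two lines, two axes, a legend, nothing else.
ax.set_xlabel("Year")
ax.set_ylabel("Heart-attack treatment score")
ax.set_title("Richmond vs. national average, 1992-2006")
ax.legend(frameon=False)
for side in ("top", "right"):
    ax.spines[side].set_visible(False)

plt.tight_layout()
plt.savefig("richmond_heart_attack.png", dpi=150)
```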

The Times still leads the way in alternative forms of storytelling. You’ve probably already seen their amazing infographic plotting Netflix rentals by geography. 3,000 words (some of which would be devoted to pithy, but generally unhelpful, cherry-picked quotes from your average-Joe-Netflix-user, to illustrate why ‘Milk’ was so popular in Chelsea) would barely have covered the trends in a single metro area, never mind a dozen.