A Debit to Fortune for repeating an old canard of the business press: that an increase in the personal savings rate means Americans are becoming thriftier.
The problem is that the savings rate, as calculated by the government, bears a deeply uncertain relationship to the amount of money Americans are actually saving. And so, leaving the true meaning of the savings rate unexamined, Fortune offers us an eye-catching but unsupported assertion: that the savings rate is the “bright spot in a dark economy.”
Before we explain why, here is Fortune’s argument:
But as dark as the next three or four quarters could be, the U.S. economy appears to be undergoing a more lasting, and ultimately uplifting, shift.
Americans who for decades have spent an increasing share of their incomes and taken on more and more debt are now, for the first time in years, saving instead.
And then the details:
The personal savings rate, which measures the amount of disposable personal income that isn’t spent, ticked up to almost 3% in the second quarter of 2008, after almost four years below 1%.
The first problem here is that Fortune hasn’t actually reported what the savings rate is, because it doesn’t define “disposable personal income.” Given that the piece presents savings as a matter of choice, a reasonable reader might assume that disposable income is what those of a certain generation (or Jim Cramer) call “mad money”—the kind that doesn’t need to go to groceries, rent, tuition, etc. In other words, extra cash that can either be thrown around or saved.
But the reasonable reader would be wrong.
The amount of money left over after all the necessities have been covered is actually called “discretionary income.” “Disposable income” is all after-tax income, which is to say the pool of money available for necessities and luxuries alike.
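To make the distinction concrete, here is a minimal sketch with hypothetical household figures (these numbers are illustrative, not BEA data) showing how the savings rate is measured against disposable income, and how different that denominator is from what a family actually has left over after necessities:

```python
# Hypothetical household figures, for illustration only.
gross_income = 60_000   # annual pre-tax income
taxes = 12_000          # income and payroll taxes
necessities = 44_000    # rent, groceries, utilities, medicine, etc.
saving = 2_000          # what actually goes unspent

# "Disposable income" is ALL after-tax income...
disposable_income = gross_income - taxes
# ...while "discretionary income" is what remains after necessities.
discretionary_income = disposable_income - necessities

# The personal savings rate is saving as a share of DISPOSABLE income.
savings_rate = saving / disposable_income * 100

print(f"Disposable income:    ${disposable_income:,}")
print(f"Discretionary income: ${discretionary_income:,}")
print(f"Savings rate:         {savings_rate:.1f}%")
```

In this sketch the household saves about 4% of its disposable income, yet its discretionary cushion is only $4,000 a year—which is the point: a modest-looking savings rate can coexist with very little real room to cut back.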
Given this definition of disposable income, the implications of changes in the personal savings rate are not so clear. Certainly, non-discretionary spending is somewhat flexible: eat beans and rice instead of meat and vegetables, say, or pay cheaper rent after you’re foreclosed out of your option-ARM house and there is a glut of empty homes on the market.
But what Fortune doesn’t get here is that many Americans who have to spend less are cutting into bone, not fat (or even muscle). They don’t have discretionary money to spend (see below). Cutting back for them means not taking prescription medicines, or juggling which payment can be delayed this month and which can’t.
This may be required under the circumstances, but it’s not a clear good. If people are willing—or are forced—to make these kinds of sacrifices, they are pretty scared. After all, the middle and working classes weren’t doing so well even before the crisis. It’s not like everyone was living high on the hog, with Midtown lunches at Michael’s and a brownstone in a nice part of Brooklyn.
It’s simply wrong, and somewhat glib, to see the rise in the savings rate as a come-to-Jesus moment for the American public.
And stepping back from, well, reality a moment, we note that even on its own terms the importance of Fortune’s stat is unclear: the rise in the savings rate is thus far a single-quarter aberration—and there have been others over the past eight years.
Furthermore, even when there was something of a trend, it didn’t necessarily last very long. The savings rate went even higher than the most recent figure in 2001, for example, but then slumped down into negative territory by 2005. Only time will tell whether the most recent rise is the start of a trend, and if so what the nature of that trend is.
But we return to the more substantive problem here.
The fact is, as we have pointed out in the past, there is no single American public when it comes to economic issues. And stats on discretionary income clearly demonstrate what is pretty much self-evident to anyone over the age of five: that some households have far more choice about how to spend and how to save than do others.