A Debit to Fortune for repeating an old canard of the business press: that an increase in the personal savings rate means Americans are becoming thriftier.
The problem is that the savings rate, as calculated by the government, bears a deeply uncertain relationship to the amount of money Americans are actually saving. And so, leaving the true meaning of the savings rate unexamined, Fortune offers us an eye-catching but unsupported assertion: that the savings rate is the “bright spot in a dark economy.”
Before we explain why, here is Fortune’s argument:
But as dark as the next three or four quarters could be, the U.S. economy appears to be undergoing a more lasting, and ultimately uplifting, shift.
Americans who for decades have spent an increasing share of their incomes and taken on more and more debt are now, for the first time in years, saving instead.
And then the details:
The personal savings rate, which measures the amount of disposable personal income that isn’t spent, ticked up to almost 3% in the second quarter of 2008, after almost four years below 1%.
The first problem here is that Fortune hasn’t actually reported what the savings rate is, because it doesn’t define “disposable personal income.” Given that the piece presents savings as a matter of choice, a reasonable reader might assume that disposable income is what those of a certain generation (or Jim Cramer) call “mad money”—the kind that doesn’t need to go to groceries, rent, tuition, etc. In other words, extra cash that can either be thrown around or saved.
But the reasonable reader would be wrong.
The amount of money left over after all the necessities have been covered is actually called “discretionary income.” “Disposable income” is all after-tax income, which is to say the pool of money available for necessities and luxuries alike.
Given this definition of disposable income, the implications of changes in the personal savings rate are not so clear. Certainly, non-discretionary spending is somewhat flexible: eat beans and rice instead of meats and vegetables, say, or pay cheaper rent after you’re foreclosed out of your option-ARM house and there is a glut of empty homes on the market.
But what Fortune doesn’t get here is that many Americans who have to spend less are cutting into bone, not fat (or even muscle). They don’t have discretionary money to spend (see below). Cutting back, for them, means skipping prescription medicines or juggling which payment can be delayed this month and which can’t.
This may be required under the circumstances, but it’s not a clear good. If people are willing—or are forced—to make these kinds of sacrifices, they are pretty scared. After all, the middle and working classes weren’t doing so well even before the crisis. It’s not like everyone was living high on the hog, with Midtown lunches at Michael’s and a brownstone in a nice part of Brooklyn.
It’s simply wrong, and somewhat glib, to see the rise in the savings rate as a come-to-Jesus moment for the American public.
And stepping back from, well, reality a moment, we note that even on its own terms the importance of Fortune’s stat is unclear: the rise in the savings rate is thus far a single-quarter aberration—of which there have been others over the past eight years.
Furthermore, even when there was something of a trend, it didn’t necessarily last very long. The savings rate went even higher than the most recent figure in 2001, for example, but then slumped down into negative territory by 2005. Only time will tell whether the most recent rise is the start of a trend, and if so what the nature of that trend is.
But we return to the more substantive problem here.
The fact is, as we have pointed out in the past, there is no single American public when it comes to economic issues. And stats on discretionary income clearly demonstrate what is pretty much self-evident to anyone over the age of five: that some households have far more choice about how to spend and how to save than do others.
According to a 2007 study released by The Conference Board, while the number of households with discretionary income has increased in recent years, actual discretionary spending power is concentrated among the wealthiest.
About 73 million U.S. households now have discretionary income, up from about 57 million in 2002, according to a report by The Conference Board. The percent of the U.S. population with discretionary income has increased to nearly 64 percent, up from 52 percent in 2002.
Now the flip side of this good news is that more than a third of U.S. households had no discretionary income, and that was before the current crisis. So if these households spend less, they are presumably cutting into necessities.
And there is another problem here. The Conference Board again:
‘While the percentage of households with discretionary income has risen over the past several years, purchasing power remains concentrated in the wallets of the affluent,’ says Lynn Franco, Director of The Conference Board Consumer Research Center. ‘More than three out of every four discretionary dollars flows to householders earning $100,000 or more. And their average discretionary income is more than 2.5 times above average.’
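The Conference Board’s two figures imply a third, which is worth making explicit. A back-of-the-envelope check (our arithmetic, not theirs; the 75% and 2.5× inputs are taken from the quote above):

```python
# If households earning $100,000+ hold about 75% of discretionary
# dollars, and their average discretionary income is 2.5 times the
# overall average, then their share of households follows directly:
# (share of dollars) = (share of households) * (average multiple),
# so share of households = 0.75 / 2.5.
share_of_dollars = 0.75   # "more than three out of every four discretionary dollars"
avg_multiple = 2.5        # "more than 2.5 times above average"

implied_household_share = share_of_dollars / avg_multiple
print(f"{implied_household_share:.0%}")  # prints "30%"
```

In other words, by the Board’s own numbers, roughly 30 percent of discretionary-income households are collecting three-quarters of the discretionary dollars—concentration indeed.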
If we apply this knowledge to the savings rate issue, we can generate two different, although not necessarily mutually exclusive, scenarios. One, households across the board are spending less, and the ones at the bottom of hurting because of it. Two, the increase in savings is concentrated in the wealthiest households, who have the bulk of the discretionary income and therefore the most choice of whether to spend or save.
To complicate the matter further, the definition of the savings rate is strangely narrow. Here is John Steele Gordon, on the American Heritage blog, defending Americans against charges of wild spending:
The trouble here is not with Americans; it is with the definition of the savings rate, which is hopelessly out of date. That definition is the percentage of after-tax income that is not spent. If a family takes home $50,000 and spends $47,500, its savings rate is 5 percent. If it takes home $50,000 and spends $50,500, its savings rate is -1 percent.
That definition was not a bad one in 1932 and 1933, when few American families owned their own homes, were the beneficiaries of company pension plans, held any substantial financial assets, or paid any income taxes. Today it is a meaningless statistic.
Just consider. Every time a family sends a mortgage check to the bank, part of that money increases their equity in the house. But that doesn’t count as savings. Contributions, by employer and employee, to a 401(k) or other retirement plan don’t count because they are made with pre-tax income. Unrealized capital gains in stocks, bonds, or real estate don’t count. Social Security taxes don’t count either, even though they amount to 12.5 percent of total income from salaries or wages up to a little less than $100,000.
And then Gordon offers some useful context:
It might be noted that the savings rate peaked in 1977, when it was over 11 percent, and has been in near continuous decline ever since. Why? In 1978 the 401(k) revolution began, and more and more Americans, more than happy to have Uncle Sam help out, began saving out of pre-tax income rather than after-tax income. So the statistic called the personal savings rate declined while the amount of wealth being saved in the real world began to increase sharply.
And here is another defense, from late 2006—a time when the savings rate was in the basement, but banks were “bulging at the seams with record amounts of cash socked away by everyday consumers.” Michael Giusti, of Bankrate.com, explains these two seemingly contradictory facts by bringing the idea of wealth into the equation:
By many measures, even with a falling savings rate, U.S. consumers are wealthier than they have ever been.
How wealthy? Well in 2006,
Federal Deposit Insurance Corp., or FDIC, records show that American banks have more cash in their vaults than at any other point in recent history—with $6.4 trillion deposited in the domestic offices of U.S. banks as of June.
Huh. Judging from the situation today, it looks like banks have gone on a bit of a spending spree themselves. We’re starting to wonder whether the problem is less that Americans have been overspending and more that they have been stowing their money in the wrong places.
Brian Lawler, writing for The Motley Fool, notes that
when the news comes out that the average U.S. citizen has a negative savings rate, everyone tends to bemoan consumers’ overspending, undersaving ways. Yet in reality, people in the United States do a great job of saving for the future—if you measure by the more appropriate metric of economic wealth, which accounts for rising asset values even before they are sold.
The problem, of course, with these kinds of investments, as both Giusti and Lawler point out, is that if the housing or stock markets plummet, people’s savings can be wiped out. As Giusti puts it:
Unfortunately, stagnating home values, a dip in stock prices or some unforeseen calamity could cause the whole equation to shift radically out of balance.
And it is worth pointing out that by this logic, the low savings rate indicates not profligacy but rather a general trust in the stability of the economic system—after all, aren’t we always told to trust the markets? Well, Americans did, apparently, and look where it got them.
As for that issue of choice and consumer spending, a CNN piece from the middle of this year makes the following point:
The fact that consumers continue to borrow against their homes, even as they decline in value, shows how troubled Americans are.
‘It signals how consumers are struggling to get cash,’ [senior director of consumer economics at Moody’s Economy.com Scott] Hoyt said.
You don’t need a PhD in economics to know that anyone borrowing against their home in early 2008 was in serious trouble. The business press needs to move beyond the idea that luxury spending by the general public got us into this mess, and that scrimping by the average American will go very far toward righting the Good Ship Economy.
Sometimes, Dear Reader, things really are as bad as they look.