- Discuss the trend in the U.S. savings rate.
- Define a subprime loan and explain the difference between a fixed-rate mortgage and an adjustable-rate mortgage.
- Discuss what can go wrong with a subprime loan at an adjustable rate. Discuss what can go wrong with hundreds of thousands of subprime loans at adjustable rates.
- Define risk and explain some of the risks entailed by personal financial transactions.
Joe isn’t old enough to qualify, but if his grandfather had deposited $1,000 in an account paying 7 percent interest in 1945, it would now be worth about $64,000. That’s because money invested at 7 percent compounded doubles roughly every ten years. Now, $64,000 may or may not seem like a significant return over sixty years, but after all, the money did all the heavy lifting, and given the miracle of compound interest, it’s surprising that Americans don’t take greater advantage of the opportunity to multiply their wealth by saving more of it, even in modest, interest-bearing accounts. Ironically, with $790 billion in credit card debt, it’s obvious that a lot of American families are experiencing the effects of compound interest—but in reverse (Frank, 2005).
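The arithmetic behind the grandfather's hypothetical windfall can be sketched in a few lines of Python. The figures simply restate the example above ($1,000 at 7 percent, compounded annually from 1945 to 2005); note that the $64,000 in the text comes from the "doubles every ten years" shortcut, while exact compounding gives a somewhat smaller number:

```python
# Compound interest: future value = principal * (1 + rate) ** years
def future_value(principal, rate, years):
    return principal * (1 + rate) ** years

# Rule-of-72 shortcut: money at r% doubles roughly every 72/r years.
# At 7 percent that's about every 10 years, so 60 years ~ 6 doublings.
exact = future_value(1_000, 0.07, 60)   # exact annual compounding
shortcut = 1_000 * 2 ** (60 // 10)      # six doublings -> $64,000
print(f"Exact: ${exact:,.0f}   Doubling rule: ${shortcut:,.0f}")
```

Exact compounding yields roughly $58,000, so the textbook's $64,000 is the doubling-rule approximation rather than the precise figure; the order of magnitude, which is the point of the example, is the same either way.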
As a matter of fact, though Joe College appears to be on the right track when it comes to saving, many people aren’t. A lot of Americans, it seems, do indeed set savings goals, but in one recent survey, nearly 70 percent of the respondents reported that they fell short of their monthly goals because their money was needed elsewhere. About one-third of Americans say that they’re putting away something but not enough, and another third aren’t saving anything at all. Almost one-fifth of all Americans have net worth of zero—or less (Taylor, 2007; Frank, 2005).
As we indicated in the opening section of this chapter, this shortage of savings goes hand in hand with a surplus in spending. “My parents,” says one otherwise gainfully employed American knowledge worker, “are appalled at the way I justify my spending. I think, ‘Why work and make money unless you’re going to enjoy it?’ That’s a fine theory,” she adds, “until you’re sixty, homeless, and with no money in the bank” (Gardner, 2008). And indeed, if she doesn’t intend to alter her personal-finance philosophy, she has good reason to worry about her “older adult” years. Sixty percent of Americans over the age of sixty-five have less than $100,000 in savings, and only 30 percent of this group have more than $25,000; 45 percent have less than $15,000. As for income, 75 percent of people over age sixty-five generate less than $35,000 annually, and 30 percent are in the “poverty to near-poverty” range of $10,000 to $20,000 (as compared to 12 percent of the under-sixty-five population) (Rubin et al., 2000).
Disposing of Savings
Figure 14.11 “U.S. Savings Rate” shows the U.S. savings rate—the percentage of disposable income devoted to savings—for the period 1960 to 2010. As you can see, it declined steeply from 1980 to 2005 and remained at that negligible level until it began moving up in 2008. Even with this recent increase, however, the rate remains below its long-term average of 7 percent (Economic Research, 2008; Dickson, 2007).
Now, a widespread tendency on the part of Americans to spend rather than save doesn’t account entirely for the downward shift in the savings rate. In late 2005, the Federal Reserve cited at least two other (closely related) factors in the decline of savings (Federal Reserve Bank of San Francisco, 2005):
- An increase in the ratio of stock-market wealth to disposable income
- An increase in the ratio of residential-property wealth to disposable income
Assume, for example, that, in addition to your personal savings, you own some stock and have a mortgage on a home. Both your stock and your home are (supposedly) appreciating assets—their value used to go up over time. (In fact, if you had taken out your mortgage in 2000, by the end of 2005 your home would have appreciated at double the rate of your disposable personal income.) The decline in the personal savings rate during the mid-2000s, suggested the Fed, resulted in part from people’s response to “long-lived bull markets in stocks and housing”; in other words, a lot of people had come to rely on the appreciation of such assets as stocks and residential property as “a substitute for the practice of saving out of wage income.”
Subprime Rates and Adjustable-Rate Mortgages
Let’s assume that you weren’t ready to take advantage of the boom in mortgage loans in 2000 but did set your sights on 2005. You may not have been ready to buy a house in 2005 either, but there’s a good chance that you got a loan anyway. In particular, some lender might have offered you a so-called subprime mortgage loan. Subprime loans are made to borrowers who don’t qualify for market-set interest rates because of one or more risk factors—income level, employment status, credit history, ability to make only a very low down payment. As of March 2007, U.S. lenders had written $1.3 trillion in mortgages like yours (Associated Press, 2007).
Granted, your terms might not have been very good. For one thing, interest rates on subprime loans may run from 8 percent to 10 percent and higher (consumeraffairs.com, 2005). In addition, you probably had to settle for an adjustable-rate mortgage (ARM)—one whose rate rises or falls with certain interest rates that your lender has to pay. When you signed your mortgage papers, you knew that if those rates went up, your mortgage rate—and your monthly payments—would go up, too. Fortunately, however, you had a plan B: with the value of your new asset appreciating even as you enjoyed living in it, it wouldn’t be long before you could refinance it at a more manageable and more predictable rate.
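To see concretely why an ARM reset hurts, consider the standard fixed-payment amortization formula. The loan amount and term below are made-up illustrative numbers (the text gives only the 8–10 percent subprime rate range, not a specific loan), so this is a sketch of the mechanics rather than a figure from the chapter:

```python
# Monthly payment on an amortizing mortgage:
#   payment = P * r / (1 - (1 + r) ** -n)
# where P = principal, r = monthly interest rate, n = number of payments.
def monthly_payment(principal, annual_rate, years):
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

# Hypothetical $200,000, 30-year loan: an ARM resetting from a 6 percent
# introductory rate into the 8-10 percent subprime range raises the
# required payment substantially.
for rate in (0.06, 0.08, 0.10):
    print(f"{rate:.0%}: ${monthly_payment(200_000, rate, 30):,.2f}/month")
```

On these assumed numbers, the jump from 6 percent to 10 percent adds several hundred dollars to each monthly payment, which is exactly the squeeze that made plan B (refinancing on the strength of a rising home price) so attractive—and so risky once prices stopped rising.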