Unit 6

Superpower: A Tale of Two Eras, 1945-present

In 1945 World War II ends with the U.S. not only victorious, but uniquely positioned in the world for economic growth for decades afterwards.  But in 1945-46, Americans were not confident of that growth. Fears about the economy topped polls as Americans’ greatest concern.  Although the War had brought a few years of full employment, the memory of the Great Depression was fresh. Prices (and profits) rose rapidly in 1945-46 as corporations were released from war-time price controls.  Unions, newly empowered and expanded thanks to late-1930’s legislation that reversed the anti-union efforts of the 1920’s, struck in key industries to try to regain the purchasing power lost to the price increases.  In addition, the War proved to be very costly with over 400,000 American deaths.

Fears of slowdown and a return to unemployment proved largely groundless. Instead of a return to the Great Depression, the U.S. began several decades of strong economic growth. Forming the foundation of this growth were five fundamentals:

  1. While the U.S. had lost over 400,000 people to the War, relative to population U.S. losses were the smallest of any major power, at only 0.32% of 1939 population. (see Wiki) Others suffered much worse: the Soviet Union lost over 14% of its population and Germany 8-10%. The U.S. still had a largely intact and experienced workforce available.
  2. Other major nations suffered further losses after the war from starvation and disease, as it took time to restore governments, currency, and banking in the defeated and formerly occupied nations.
  3. The war had been fought elsewhere.  The U.S. escaped the kind of total destruction of industrial and civilian infrastructure that the war brought to Europe, China, and Japan. For the coming decade, the U.S. would effectively be the last producer still standing.  This meant U.S. firms could export and sell to huge overseas demand with little competition until Europe and Japan rebuilt.  Europe and Japan would not become effective industrial competitors in world markets until the 1960’s.
  4. The U.S. had enormous pent-up demand for consumer goods at home, particularly consumer durables. Because of the Great Depression (no income) and the War (production not allowed), U.S. production of consumer durables such as cars and homes had not kept up with population growth since 1929. The existing housing and automotive stock was severely depleted. Domestic sales would be high for many years as both population and income expanded and the old, depleted stock was replaced and updated.
  5. The world as a whole, or at least the non-communist nations, coordinated their economic policies to promote trade, growth, and a stable global money system based on the U.S. dollar instead of gold.

Macroeconomic Performance and Policy: Really Two Eras

The post-World War II period divides economically into two distinct eras separated by periods of transition. The primary distinction between the two eras is the dominant approach of government to the economy and the resulting performance. The first period of transition is short: 1945-1948, as the nation experiences a short-lived burst of inflation followed by a recession while demobilizing from the war effort. The adjustment, though, is mild compared to what happened following the end of World War I. The nation’s fears of a return to the Great Depression disappear in 1948-1949 as the economy rapidly snaps back from recession and begins a long-run growth trend.

Era One: The Keynesian “Golden Age”: 1950’s and 1960’s

The first extended era consisted of the 1950’s and 1960’s. There were several recessions, but all of them were very mild and very short relative to anything that had happened before. Inflation remained virtually non-existent at 1-2% over the era, although it began to creep up towards the end of the 1960’s. Unemployment rose sharply in each recession, but never reached the double-digit range. Employment growth snapped back quickly at the end of each recession. More significant than the apparent taming of the business cycle, though, was the long-run growth rate. Real GDP grew at an average of approximately 4% per year for well over two decades. During these years the federal government, under both parties, pursued strongly Keynesian-inspired economic policies. Fiscal policy, particularly deficit spending, was used to help manage the economy and recover from recessions, and the financial industry was tightly regulated.

Finally at the end of the 1960’s and the early 1970’s pressures begin. The efforts of the government to increase spending on the War in Vietnam in the late 1960’s while the economy was at full employment begins to lead to slowly rising inflation. The government temporarily increases taxes in 1968-69 to engineer a recession in order to slow inflation. The recession is short, but inflation is unchanged. In fact, it begins to rise further. The transition decade of the 1970’s has begun.

In the 1970’s the economy see-saws between sharp recessions and bursts of growth. The growth, however, is too weak to absorb the large influx of new entrants into the labor pool: baby boomers graduating from school and former stay-at-home mothers who begin to seek paid employment. Despite the recessions, inflation continues to rise slowly throughout the decade. And under pressure from the entering baby boom, the unemployment rate seems stuck at a higher level. Two international oil price shocks (1973 and 1978) trigger concerns about energy and further slow the economy. The 1970’s are described as an era of “stagflation” – a combination of stagnant growth and inflation.

Era Two: Neoliberalism and the “Reagan Revolution”

As the stagflation of the 1970’s continued, both inflation and unemployment reached double-digit levels in the recession of 1980. Voters and policy makers lost faith in Keynesian policy recommendations, and the country turned to a different approach to economic policy. Policy makers turned away from Keynesian-style government management of the macro economy through fiscal and monetary policy with the goal of achieving full employment with stable inflation. Instead, policymakers followed the mainstream of the economics profession as it changed its views. The new dominant economic theories were a mix of pre-Keynesian laissez-faire thinking and revived theories of monetary policy. The new mix of theories and policies is known in the economics profession and in the rest of the world as “neo-liberalism” (as in a return to the 19th century “liberalism” of laissez-faire). In U.S. politics, however, the theories became known as monetarism, supply-side economics, and conservatism. Key features of this thinking are that the private economy will achieve optimal outcomes if left alone and deregulated. Banking and finance in particular are to be deregulated. Government is viewed as “the problem” and should balance its budget and avoid any fiscal policy actions except to cut taxes. As for managing the macro economy towards goals of growth, full employment, and low inflation, this approach relies on tax cuts to motivate growth and central bank monetary policy to manage inflation. Full employment, it was assumed, would naturally follow or, if not, could only be addressed by retraining efforts.

President Carter actually began the shift in this direction in 1978 by deregulating the airline, trucking, and rail industries. He also began some small deregulation of banking by eliminating controls on the interest rates banks could pay. However, despite Carter’s efforts, the movement to a new economic policy regime was most associated with the 1980 election of Ronald Reagan. It became known as the Reagan Revolution. It had an inauspicious birth. The nation plunged into a severe double-dip recession in the early 1980’s as the Federal Reserve raised interest rates to double-digit levels to bring an end to inflation. The recession proved to be the most severe since the Great Depression. It also proved to be an inconclusive test of the new economic policies. The recession indeed ended, but the combination of tax cuts implemented by Reagan and the increase in spending (mostly on the military) was similar to what Keynes would have recommended in such a deep recession. Only time would tell.

As the 1980’s turned to the 1990’s and then the early 2000’s, it became clear a different policy regime existed. Both parties followed similar policies despite their rhetoric otherwise. The results were also clearly different from the Keynesian “Golden Age”. Recessions were rarer: only two between 1982 and 2007. The two recessions were also milder. Inflation gradually decreased throughout the period. But real GDP growth also moderated. Despite the lack of severe recessions, real GDP growth rates averaged significantly lower than in the 1950’s-1960’s. Employment was painfully slow to recover from the two recessions despite their mildness. Employment took 2 1/2 years to recover from a short 8-month recession in 1990, and would not recover from a similarly short recession in 2001 for 4 years. Inflation was clearly under control, the economy grew steadily but slowly, and yet full employment proved elusive.

The neoliberal era of slower but steady growth with low inflation and slow employment growth came to be known as “The Great Moderation”. It came to an end in 2008 when a mild recession that started in the financial sector suddenly exploded into the great, global financial crisis of 2008-09. The financial crisis created a recession much more severe than anything seen since the Great Depression. For a few months at the end of 2008 and early 2009, the economy was shrinking so fast and losing employment so quickly that some wondered if we would see “Great Depression v.2.0”. The crisis bottomed out in spring 2009, but the economy has been slow to recover. It is fair to say that we are in another transition period today, in which both economic theory and policy are changing.

Looking Back at the Modern Era

The U.S. in 1947

Economic growth and improved standards of living have increased so much since 1947 that it is worth spending a moment to focus on what was the “typical” lifestyle of the “typical” American just after World War II.

By 1940, the “typical” family’s income had recovered to 1929 levels after the Great Depression. By 1946, after the war, the typical family income was 1/3 higher than in 1929. But this higher level of gross income masked the true state of well-being. Taxes were much higher as a result of the war and the expansion of the income tax. Saving was higher, too, as most families tried to recover the losses in wealth incurred in the Great Depression bank failures and market crash. In 1946, the typical family spent the same amount of money on consumption as it had in 1929 – 17 years with no real increase in consumption. So while technically their incomes were higher, actual living standards, in the sense of goods and services purchased and consumed, were unchanged from 17 years earlier.

The lack of production of consumer durables, houses, automobiles, and appliances, also contributed. Life was “spartan” compared to today. There was only 1 car for every 3 adults (we have 3 cars for every 4 adults today). Housing was scarce. Fewer than 44% of families owned their own home (over 66% do today). There was substantial “doubling up” of families. One family in 14 lived with another family in the same residence. Seventy percent of unmarried adults lived with other adults or families. One in four senior citizens (over 65) lived in their children’s households.

More startling is a peek at those homes themselves. Frank Levy in “New Dollars and Dreams” recounts: “One third [of all homes] had no running water, two-fifths had no flush toilets, three-fifths had no central heat, and four-fifths were heated by coal or wood. About half did not have electric refrigerators, while one-seventh did not have radios. Television and air conditioning were unknown.”

Geographically, economic activity was heavily concentrated in the New England, Middle Atlantic, and Great Lakes states. California, Oregon, and Washington, while growing very rapidly, were relatively small. The South was still poorer than other regions and still largely agricultural. Central cities were thriving and suburbs were struggling, largely because the lack of both highways (Interstate highways and freeways wouldn’t arrive until 1956) and automobiles made living in the city easier and more sensible. That’s where the manufacturing jobs were.

The stage was set for growth.

The 1950’s

The “1950’s” can really be thought of as the period of 1949-1962. The ideas of “Keynesian Economics” began to guide government economic policy. The government and military, although downsizing significantly from the peaks of WWII, would never return to the small size that characterized even the Great Depression, let alone the 1920’s or earlier eras. With a larger government also came higher taxes. Initially, corporations carried a significant part of the burden of income taxes. But starting in the 1950’s the burden was increasingly transferred to individuals via the individual income tax. This trend has continued to the present.

Prior to World War II virtually all federal taxes were paid by the wealthiest members of society. Less than half of American workers paid any income tax at all prior to WWII. In order to fund the war income taxes were increased and the majority of workers had to pay income taxes for the first time, but even after the war federal taxes on average Americans remained very low. As federal tax rates on the wealthy were decreased starting in the 1960s, the tax burden on middle and low-income Americans began to grow.

While most people are aware of the rates of the various income tax brackets, many people don’t realize that you actually pay a significantly lower portion of your income to taxes than is indicated by the income tax bracket that you fall into. [from RationalRevolution.net]


As shown in the above graph, top marginal tax rates from the start of World War II through 1980 were high, ranging from 70 to 90%. Yet, despite these high marginal rates, GDP growth rates were substantially higher on average than later in the century. Contrary to popular political rhetoric, high marginal tax rates are not correlated with slow GDP growth; in fact, the opposite is true. Now a few words of caution are in order here. First, the pictured rates are marginal tax rates. A marginal rate is the tax rate applied to the next dollar earned above a certain threshold, and the rates pictured here are for the highest tax bracket. Only the very highest income individuals paid these rates, and then only on the amounts above the tax bracket threshold. For example, in 1940 the highest tax bracket started at $1,000,000 of income per year, and there were fewer than a dozen individuals in the country with incomes in that bracket. The average tax rate (or “effective” tax rate) is always much lower than the marginal bracket rate. In fact, throughout the entire period virtually nobody, regardless of tax bracket, paid more than 30% of their total income as federal taxes. Finally, while the highest marginal tax rate was high during the 1950’s and 1960’s, the threshold of the highest tax bracket was similarly high, often requiring a high six-figure or even million dollar income to reach the highest bracket and highest marginal rate. In 1978 and 1982, when marginal tax rates were lowered dramatically, the threshold for reaching the highest tax bracket was also lowered to close to $100,000, putting vastly more people into the highest tax bracket.
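The distinction between marginal and effective rates can be made concrete with a small sketch. The bracket schedule below is purely hypothetical (illustrative numbers, not any actual historical U.S. tax law), but the mechanics of applying each rate only to income above its threshold are the same:

```python
# Hypothetical progressive bracket schedule: (lower threshold, marginal rate).
# These numbers are illustrative only, not actual historical U.S. tax rates.
BRACKETS = [(0, 0.10), (50_000, 0.25), (1_000_000, 0.90)]

def tax_owed(income):
    """Each marginal rate applies only to the slice of income above its own threshold."""
    tax = 0.0
    for i, (lower, rate) in enumerate(BRACKETS):
        # The slice for this bracket ends where the next bracket begins.
        upper = BRACKETS[i + 1][0] if i + 1 < len(BRACKETS) else float("inf")
        if income > lower:
            tax += (min(income, upper) - lower) * rate
    return tax

def effective_rate(income):
    """Total tax as a share of total income -- always below the top marginal rate paid."""
    return tax_owed(income) / income

# A $1,200,000 earner faces a 90% marginal rate but pays it only on the
# last $200,000, so the effective rate on total income is far lower.
print(f"{effective_rate(1_200_000):.1%}")  # prints 35.2%
```

With this schedule, the $1.2M earner owes $5,000 + $237,500 + $180,000 = $422,500, an effective rate of about 35% despite a 90% top marginal rate, which illustrates why the marginal rates in the graph overstate what anyone actually paid.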

But taxes were not the focus of popular attention in the 1950’s and early 1960’s. Instead, it was about new technologies, growth, families, and “Progress”. One of the most significant economic influences to emerge from this era was the baby boom. Birth rates were understandably very low in the Great Depression and World War II: during the Depression because families couldn’t afford children, and during the War because prime-age men were overseas fighting. Starting in 1946, though, couples began making up for lost time making babies. Lots of them. (trust me, I’m one of those babies). The birth rate remained very elevated until approximately 1965. The large number of children born during these two decades became known as the “baby boom”, and they would dominate the economy for much of their lives. One of the first effects of the baby boom was a need for more schools, colleges, and universities. One of the few positive effects of the Great Depression had been an increase in average years of education. Prior to the Great Depression, only a minority of children finished high school and a very tiny minority of those H.S. graduates completed university. But without jobs to get in the Depression, young people stayed in school in increasing numbers to finish high school. The trend continued post-war and accelerated to include university and college educations. The G.I. Bill put returning veterans in college classrooms. The combined effect of students completing more years of schooling on average and the much larger numbers of young people due to the baby boom meant a huge expansion in schools, colleges, and universities. Most community colleges were founded during this period.

The 1950’s is really the era when a national market began to emerge. Interstate highways and increased production of autos put Americans on the road. New housing was built to satisfy the pent-up demand, most of it in new suburbs of large cities. The middle and working classes, especially whites, began to move out of large cities and into suburbs. Central cities continued to attract migrants for some time, but instead of poor European immigrants, these migrants were Americans: former farmers, blacks from the South, and Hispanics. These migrations, minorities and the poor into cities and middle-class whites into suburbs, would eventually lead to other problems.

Television dawned and rapidly penetrated into nearly all households. Since television at the time was largely limited to the offerings of 3-4 national networks, it had a two-fold effect on the economy. First, it helped form a true single, national culture, making it easier to make products that appealed to all parts of the country. Second, it provided a cost-effective channel for advertising, marketing, and promotion. It gave national producers and stores an advantage over local or regional rivals, adding to further concentration in business.

Productivity grew significantly across the whole economy. Exports were significant.  Business thrived, but so did workers. Unions reached their peak of membership as a percent of workforce in the 1950’s. Middle-class wages and incomes rose. Family incomes rose significantly as large numbers of formerly poor or working-class families rose to join the middle class. There was some narrowing of income inequality. There were 3 recessions and also some periods of inflation, but overall they were short and mild compared to the Great Depression and depressions before 1929.  It appeared that the new “Keynesian economic policies” were working.  Social security began to provide for most Americans over age 65.  Indeed, the idea of “retirement” became a popular concept.  In 1947, over half of all white males over age 65 continued to work. By the mid-1960’s, though, retirement was the expectation of most people over 65.

The 1960’s: Turbulence and Signs that All is Not Well

The Sixties continued the economic well-being of the 1950’s, but socially and politically ended in turbulence. Growth continued and with it rising incomes of typical families. Emboldened by studies (later disproved) that seemed to show a precise trade-off between unemployment and inflation, and by the birth of large computers and statistical modeling, economists began to try not just to ease recessions, but to “fine-tune” the economy precisely. By the end of the decade such “fine-tuning” efforts would fail. But social change and unrest were happening as the nation struggled with both a strong military-industrial complex that demanded increased funding and a realization that economic opportunity wasn’t necessarily widely shared.

In the years 1962-1965, the nation was shocked repeatedly. First, the cold war nearly became an active nuclear war during the Cuban missile crisis. By this point the “cold war” between the U.S. and the U.S.S.R. was in full swing. Normally the two superpowers fought each other through “proxies” – smaller nations that were funded to fight each other so that the U.S. and U.S.S.R. wouldn’t directly fight each other. Direct conflict was risky since both sides could easily annihilate the world with the relatively new nuclear missiles. In 1961 President Eisenhower, a former WWII general (think D-Day), warned of a growing “military-industrial complex” in Washington that pushed constantly for increased military spending. The Korean War had reached an armistice by 1953. Since then the U.S. had only engaged in covert and limited military actions overseas. By the early 1960’s, though, a war of colonial independence in Vietnam, in Southeast Asia, was becoming a civil war between U.S.-backed Vietnamese generals in the South and Soviet-backed leaders in the North. The war was initially a guerilla civil war. But on August 2, 1964, U.S. ships and planes engaged the North Vietnamese navy in the Gulf of Tonkin off Vietnam, sinking 3 North Vietnamese ships and killing 6. Two days later the U.S. claimed that North Vietnam attacked U.S. ships again, although later declassified U.S. documents show there was no attack at all. Nonetheless, the claimed attack led to Congress passing the Gulf of Tonkin resolution allowing the President to commence war. The Vietnamese guerilla civil war now became a major conventional war with U.S. involvement. The U.S. continued to escalate throughout the rest of the decade until abandoning the conflict in 1974. The war proved extremely costly. In human terms, over 58,000 Americans were killed, a number dwarfed by the nearly two million Vietnamese. Over 3.5 million American servicemen served in the war in Southeast Asia. Another 5.5 million were in uniform elsewhere. Most of these soldiers were draftees, not volunteers: the draft, never ended after the Second World War, was not finally abolished until 1973. The war was largely unpopular at home, particularly with younger people who were, after all, the targets of the draft. Economically, the war would prove very, very expensive for the government, forcing the military to compete with social needs at home for spending.

Perhaps the single most shocking event of the decade was the assassination of a popular sitting President, John F. Kennedy, in 1963. Kennedy was younger than previous presidents and was very charismatic. He seemed to epitomize the confidence in growth and “progress” that had grown in the 1950’s.

A series of network TV specials, led by CBS’s 1960 “CBS Reports: Harvest of Shame”, focused attention on parts of the nation that weren’t benefiting from the nation’s growth and where poverty and hunger persisted. Graphic images of poverty and hunger, brought right into people’s living rooms for the first time thanks to the new medium of television, shocked the nation. The new medium also provided live coverage of the growing civil rights movement aimed at ending segregation in the South and restoring voting rights to all Americans. A particularly compelling moment came with live TV coverage of the March on Washington led by Dr. Martin Luther King, when he gave his famous “I Have A Dream” speech. The nation responded to both the civil rights marches and the Kennedy assassination with a blitz of social legislation that would change the economic landscape. The Civil Rights Act of 1964 established, among other things, that racial discrimination in employment is illegal, and established the Equal Employment Opportunity Commission for enforcement. Later laws would also outlaw age discrimination (1967) and discrimination against the disabled (1973 and 1990). Social Security benefits were strengthened to help support seniors. Efforts to create universal healthcare for Americans had been made since 1948; although the European nations all had universal healthcare, it still proved impossible to pass for all Americans. Instead, Medicare was created, providing government-paid socialized medical care for all senior citizens (over 65). Numerous welfare programs were established, including Aid to Families with Dependent Children, Urban Renewal, Medicaid, and the Job Corps. Many of these programs were promoted under President Johnson’s banner of a “War on Poverty”.

The latter part of the sixties proved a paradox. Economically, the country continued to grow and incomes rose for all segments. Inflation remained relatively low, but was rising incrementally. Unemployment was near record lows, at times as low as 3%. Yet civil unrest grew and cultural norms changed rapidly. The growth of the suburbs, the South, and the West, combined with white flight, led businesses, particularly manufacturing, to move out of the central cities of the North and Northeast. Behind them, the cities had to cope with increased needs, a deteriorating infrastructure, and a weakened tax base. The nation’s growth was not reaching minorities in the cities. Frustration among minorities with the oppressive tactics used by predominantly white police forces led to riots and destruction in many major cities, including LA, Newark, and Detroit, in the mid- to late-sixties. Some cities would never fully recover from the damage of the riots.

Separately, the baby boomers were now moving onto college campuses in record numbers, many of them with the threat of the draft hanging over them. The baby boom generation gave birth to rock-n-roll and took over the popular culture. Unrest with established ways was common. By 1968, the civil rights movement of the early 60’s had merged with young people protesting the war, and the urban riots of the sixties were joined by rebellion on college campuses. College students were largely rebelling against the Vietnam War and the draft, which had been instituted in 1941 for World War II and never ended. The increasing demands of the Vietnam War for soldiers triggered protests and at times riots on numerous college campuses.

Yet despite the civil unrest and political divisions over the war, civil rights, and cultural issues, the U.S. economy continued to prosper. Productivity continued to grow. Typical family incomes continued to grow. Income inequality continued to decline. There were, however, signs of coming economic problems. Inflation began to creep upward slowly but persistently. The government proved unable to fund the Vietnam War, the Cold War military expenditures, and the growing War on Poverty without either increasing taxes or borrowing. More significant was the decline of American export power. European nations and Japan had now recovered from World War II and were competing with Americans, often with newer factories.

Transition Away From Keynesian Golden Age: The 1970’s and Stagflation

The deterioration of U.S. exports in the 1960’s continued. At the same time, imports grew as the country attempted to continue its growth in consumption and also fund the Vietnam War. Inflation rose. The U.S. dollar had been the only currency still backed by gold after World War II. In the Bretton Woods agreement at the end of the war, which tried to rebuild a world trade system (a task sorely missed after WWI!), most exchange rates were fixed. That is, the U.S. dollar was pegged to a fixed price of gold and most other currencies were fixed at an exchange rate into dollars. As trade flows ran against the U.S. and inflation rose, the U.S. faced a currency crisis. Gold reserves were depleting. In 1971, President Nixon ordered the U.S. off the gold standard. Ever since, the U.S. dollar has been a “fiat currency”.

In 1973, though, the real shock occurs. In response to the U.S. decision to re-arm the Israeli army following the Yom Kippur War, the Organization of Petroleum Exporting Countries (OPEC) embargoes oil shipments to the U.S. The price of oil increases dramatically, as you can see in this graph of oil prices (real and nominal) from 1861-2008. The hundred-year era of cheap oil was over.
[Graph: Oil Prices, real and nominal, 1861-2008]

Later in the decade, the price would again double. The rise in the real price of oil from less than $20 per barrel (2008 dollars) to over $100 per barrel in less than a year was only one of several fundamental shocks to the economy that occurred in the 1970’s.  While the fundamentals in the late 1940’s favored growth, the fundamentals in the 1970’s favored stagnation.

The 1970’s brought several recessions. But unlike earlier recessions, inflation would stubbornly refuse to fall very much during each recession. Instead, after the recession it would resume a slow but steady climb, from a higher base. By the end of the decade, inflation would reach higher than 15%. Unemployment, too, appeared resistant to government efforts to bring it down after each recession. To many economists and government policymakers, it appeared that Keynesian economic policy had failed. In reality, while economic growth had slowed compared to the 1960’s (largely due to oil price shocks), the economy was still growing, particularly compared to the growth rates experienced between 1990 and 2010. But the economy could not grow fast enough to create jobs for all those who wanted them at the time.

Besides oil, the 1970’s saw two major demographic shifts that would drastically alter the makeup of the workforce. First, the baby boomers, the result of the 1946-1963 elevated birth rates, began to graduate in large numbers and enter the workforce. Second, a major change in cultural values resulted in women, particularly married women, entering the workforce. In 1960 and earlier, it was the norm for unmarried women to hold jobs, but the typical woman quit the workforce when she got married, never to return unless she divorced. By 1980, the norm in society was that women should have their own “careers”, or at least their own employment, even if married; most married women were now in the workforce. These two trends created a huge new influx of workers looking for work. Creating new jobs for all these new workers would be a formidable task even for a rapidly growing economy. For an economy where the cost of energy had just doubled, it was too much. Despite several years of rapid growth in real GDP and total output, the unemployment rate continued to rise as the labor force swelled from new entrants. The economy suffered a severe recession in 1974-75 from which employment recovered very slowly. President Carter’s term of 1977-80 proved to have above-average growth in real GDP (over 4% per year), but it wasn’t enough to keep the unemployment rate from rising given the onslaught of new would-be workers.

During this decade inflation also grew into a major problem.  Behind the scenes, one of the fundamental problems was that productivity had stagnated.  Along with it, incomes of typical families also began to stagnate.  The nearly 30-year rise in family incomes was over.  The 1970’s became known as the decade of “stagflation”.  The problem was further compounded when The Federal Reserve began a tight money policy in 1979 to attempt to stop inflation. By 1980, interest rates to buy a home had reached 14%. The combination of a hostage crisis in Iran, double-digit inflation, and near double-digit unemployment doomed Jimmy Carter’s bid for re-election in 1980.

The Neo-Liberal Era Dawns: 1980’s and 1990’s

Prior to 1978, there was a clear political consensus on Keynesian policies and a recognition that while markets and private enterprise may produce startling new products and wealth, there were clear roles and responsibilities for government in a modern economy. Government was seen as the provider of social insurance such as Social Security, Medicare, and welfare; the provider of equal opportunity in the form of public education and universities; and the provider of macroeconomic policies that would encourage full employment. But the problems of the 1970’s (inflation, stubbornly high unemployment, and the oil shocks) challenged this consensus. Add to this mix a group of economists, led by Milton Friedman at the University of Chicago, who set out to challenge and “disprove” Keynesian theory. These economists began to win dominance in academic economic research in the 1970’s. Many of their students and followers then went into politics. In 1979 a devout follower of the new theories, now called “neo-liberalism,” Margaret Thatcher, won election in Great Britain. In the U.S., President Jimmy Carter actually began much of the work recommended by these anti-Keynesians; in particular, he began a program of deregulation and urged the Federal Reserve to fight inflation. But Carter would not be the name associated with the movement. That mantle fell to Ronald Reagan. His landslide election in 1980 firmly cemented a change in direction in U.S. government with respect to the economy and business. After 1980, politicians in both parties generally viewed government as the “villain” in any economic story. Deregulation was favored, and macroeconomic policy was to be handled by The Federal Reserve, the central bank, with an eye toward stabilizing inflation as much as employment. The era of Neo-Liberalism had arrived.

Deregulation and New Philosophy

In an attempt to “jump-start” the economy and promote improved productivity, President Carter began a program of deregulation. Under his direction, the airline, trucking, and railroad industries were deregulated and competition promoted. He also pursued anti-trust lawsuits against AT&T (the monopoly telephone company, both landline and long-distance) and IBM. Eventually these lawsuits settled, resulting in the break-up of AT&T and keeping IBM from becoming a monopoly in the newly created personal computer business. Carter also appointed a new chairman of the Federal Reserve Board, Paul Volcker, and told him to do whatever was needed to bring inflation down (hence the high interest rates). But despite these efforts, Carter was stuck with the “stagflation” label. With inflation over 10%, unemployment near double digits, and interest rates running from 15-20%, the nation responded to Ronald Reagan’s appeals for “less government” in 1980. Reagan swept into office with a landslide.

Note on Labels: I am referring to the dominant economic-political philosophy of this post-1980 era as Neo-liberal. This is how the philosophy and theories are described in the economics field. The name comes from the fact that the pre-Keynesian theories advocating laissez-faire, small government, and unregulated markets (the theories that dominated before Keynes wrote in the 1930’s) are called Classical Liberal theories. The “liberal” part means that the theories advocated free (or “libre” in French) markets. In the rest of the world today, these Reagan-style ideas of free markets, deregulation, and small government are often referred to as “neo-liberal.” However, in the U.S. since the late 1960’s, the word “liberal” as used in everyday political discussions and campaigns has reversed meaning from its historical usage and from the usage in much of the rest of the world. Today, in the U.S., “liberals” in political debates are usually those people who advocate some degree of Keynesian government involvement in the economy. And those people in political discussions who call themselves “conservatives,” “right-wing,” or “libertarian” are actually the advocates of what everybody else calls neo-liberalism. Confusing, right?

Reagan led the shift in American economic policies. Reagan and his successor, G.H.W. Bush, defeated more traditional Keynesian-style Democrats in 1984 and 1988. By 1992, when William Clinton ran and won the White House for the Democrats, the shift was complete. Clinton and the Democrats now shared the same deregulation, free-market, privatization agenda as the Republicans. The post-World War II consensus of Keynesian policies, social insurance, and narrowing income inequality was over. The nation made a determined turn towards more deregulation and a reliance on markets without government interference. There were some differences between the two parties, but the differences were relatively minor and greatly over-blown by political rhetoric. No longer was the U.S. government committed to using economic policy first and foremost to maintain full employment, create jobs, and increase middle-class incomes. Instead, the dominant focus of policy became controlling inflation, maintaining rising prices in stock and housing markets, and promoting globalization. Policies regarding jobs and employment shifted from making sure there were enough jobs in total to trying to better train the unemployed to get what jobs were available. “Full employment” as a goal gave way to “full employability” of workers. With this shift came a subtle social and cultural shift. In the 1960’s, the existence of unemployed workers was seen socially as a failure of government to follow policies that would create enough jobs. By the 1990’s, the existence of unemployed workers was seen as evidence that the unemployed were themselves to blame: they were lazy, unskilled, or uneducated.

Taxes and Deficits

Reagan had campaigned on a promise to lower tax rates. And he did it. (See the graph above in the discussion of the 1950’s.) Reagan managed to pass a major income tax rate cut and restructuring of income tax rates in 1981 and again in 1986. The reforms also substantially reduced the number of tax brackets, down to three. Reagan and his supporters had originally claimed that by reducing tax rates, people would be motivated to work more and earn more, and thus the total dollars collected in taxes would actually rise. This did not happen. Instead, tax dollars collected declined relative to GDP and the budget. The original Reagan plan also called for substantial cutbacks in social programs and social spending, but the plan encountered resistance in Congress. The compromise was that the growth of social spending was limited. What was not limited, though, was military spending. Military spending, particularly for equipment and technology, rose dramatically as the Cold War intensified.

The Federal Reserve, meanwhile, continued its push to keep money tight and interest rates high as a way to fight inflation. Gradually, inflation declined from then until now, when we face less than 1% inflation in 2010. But back in 1980-1983, the tight money policy plunged the economy into two recessions nearly back-to-back. The second, the recession of 1981-82, was at the time the worst since the Great Depression. In late 1983 and 1984, as the tax cuts took effect and the government increased its spending in many areas (particularly military), the economy bounced back quickly. Although political leaders claimed this was a validation of the wisdom of the new economic policy ideas, in reality it was an inconclusive test. Keynesian policy also would have called for tax cuts and spending increases in the same situation.

One thing the tax cuts did for certain, though, was to reverse a 35-year trend of declining government debt as a % of GDP. Tax cuts combined with increased military spending and a severe recession in 1982 resulted in large deficits. In the 1990’s the deficit relative to the economy would decrease again under President Bill Clinton, but this was largely the result of Clinton being fortunate enough to have 8 years without a recession (so GDP never shrank) and relative peace, so that the growth in military spending could be restrained. In 2001, George W. Bush (G.H.W. Bush’s son) took office. Bush immediately cut taxes dramatically, particularly for people in upper income brackets. In the same year, the attacks on the World Trade Center (9/11) took place, leading the nation to war in Afghanistan and the present “Global War on Terror.” This was followed in 2003 by the war in Iraq. Both wars and the creation of the Dept. of Homeland Security led to dramatic increases in government spending. Bush’s administration also opened with a mild recession in 2001 from which jobs would not recover until 2005. These combined effects (tax cuts, two wars, and a recession) meant that government deficits skyrocketed during the Bush administration. In nominal terms (not accounting for inflation), the national debt doubled during the Bush administration.
National Debt to GDP
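The claim that the debt “doubled” can be translated into an annual rate with simple compound-growth arithmetic. The short sketch below is illustrative only: the doubling factor over eight years comes from the paragraph above, and the resulting per-year figure is an implied rate, not an official statistic.

```python
# Implied compound annual growth rate when a quantity grows
# by a given factor over a given number of years.
def implied_annual_growth(factor: float, years: int) -> float:
    return factor ** (1 / years) - 1

# A nominal doubling (factor of 2) over the eight Bush years
# implies roughly 9% growth per year.
rate = implied_annual_growth(2.0, 8)
print(f"{rate:.1%}")  # about 9.1% per year
```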

Banking and Deregulation

One of the areas where Reagan and his successor, George H.W. Bush (the elder), made major deregulation efforts was banking and finance. Since the 1930’s and the Great Depression, banking and Wall Street had been strongly regulated. Following the 1929 stock market crash and the widespread bank failures of 1932-33, Congress led an investigation of the banking industry. What came from that investigation, and from public outrage at bankers in the Great Depression, was a set of laws that severely regulated banking. Commercial banks that take customer deposits were prohibited from engaging in stock market and investment banking activities. Banks that made primarily mortgage loans were very strictly regulated. Investment banks (Wall St.) were kept separate from commercial banks (where you keep your checking account and do your banking). Banks were limited in size. As a result, from the 1930’s through the 1970’s, banking was considered a safe, boring business with mild but assured profits.

That began to change in the 1980’s. New laws were passed starting in 1978 and continuing all the way until 2007 that gradually repealed all of the Great Depression-era financial regulation. As each sector of the banking and finance industries became deregulated, it began to engage in high-risk behavior in order to raise profits even higher, knowing the government insured against loss. “Savings and loans” were banks that were limited to making real estate loans and holding longer-term “savings” deposits of customers. In the early 1980’s regulation of these institutions was greatly reduced. Despite the prevailing political view that “the market” would regulate and discipline managements, it didn’t work out that way. Deregulation combined with dramatically increased risk-taking by S&L executives (and fraud in some cases) to produce a wave of failures that overwhelmed the S&L deposit insurance agency (FSLIC) in the 1989 Savings and Loan Crisis. The federal government had to step in with a nearly $200 billion rescue package. After that, S&L’s were regulated as banks. This unprecedented (at the time) bailout of the banks exacerbated the government deficits.

But despite the failure of the S&L’s, the trend toward deregulation of banking and finance continued. Banks got larger. In 1999 the prohibition on commercial banks engaging in investment banking disappeared. To compound the problem, banks got more creative and invented new types of financial contracts called derivatives. Derivatives typically involved “bets” on what would happen to some other financial asset or price. For example, one type of derivative is called a “credit default swap.” It is essentially a bet between two parties about whether some other party will default on a loan or bond. A huge volume of derivatives revolved around mortgages, which encouraged banks to sell increasing numbers of home mortgages from 1995 through 2006 in order to earn increasing fees and profits. Banking was no longer dull and boring.

In the late 1990’s, much of the excitement in the financial markets centered on new startup companies related to the Internet issuing new stock. This led to a major investment bubble called the dot.com bubble. At one point, a simple company called “Pets.com” was begun with venture capital money. The firm simply offered to sell pet food over the Web. It had a memorable TV advertising campaign featuring a sock puppet. The firm never made a profit. It did not even project making a profit for many years. Yet it went public in a stock offering, and at one point the firm was valued at several hundred million dollars in total. It filed for bankruptcy a couple of years later. When the dot.com investment bubble burst in 2000, the NASDAQ (a stock exchange for newer, startup companies) collapsed and dropped in value by over 60%. It has never recovered to its previous levels. Banking was definitely not boring. But the most exciting decade was yet to come.

Once the dot.com bubble burst, Wall St. moved on to its next profit-maker and investment bubble: home mortgages. As Wall Street invented new ways of packaging mortgage loans, it also developed a new way of financing home mortgages. Historically, home mortgage loans were made by banks (or S&L’s when they existed). The bank kept the loan on its books and collected the payments from homeowners for the 20-30 year life of the mortgage. In the past, the bank had a vested interest in assessing the credit-worthiness of the borrower. After all, if the homeowner didn’t or couldn’t pay, it was the bank’s money that was lost. Now, however, banks became mere conduits that collected fees. They found new home buyers or convinced existing homeowners to re-finance. Then the bank would sell ownership of the loan to a group of investors in a package of mortgage loans called a “mortgage-backed security” (MBS). The bank made profits by collecting fees both on making the original loan and on selling the loan. With this shift in doing business, banks no longer cared about the credit-worthiness of the borrower or how secure the mortgage was. If the loan went bad, that was the investors’ problem later, not the bank’s. What mattered to banks was generating an increasing volume of new mortgages. The public was sold the idea that homes were the best possible investment – that “house prices never go down.” Historically, that claim is false. Many times in the last century house prices have declined. Yet a mania developed. Mortgages with zero down payment were offered. Mortgages were made to people with no income. House prices kept rising. Construction boomed. A bubble developed. At one point, there were more than 1.2 houses for each household in the country. TV shows explained how to get rich by “flipping” a house: buying it with no money down, making some cosmetic changes, and then selling at a profit. Then, in 2005-06, the bubble burst.
House prices stopped rising and began declining. Loans went bad. Foreclosures skyrocketed. Unlike the dot.com bubble burst of 2000, this couldn’t be fixed by easy money policy from The Federal Reserve.

In the period between 1984 and 2008, the banking and financial industries grew from accounting for only 4% of GDP and 16% of all corporate profits in the U.S. to over 8% of GDP and 41% of all corporate profits. In 2008, though, it all appeared to come to a crashing halt.

Income Distribution and Incomes for Most People

The boom in banking proved extremely profitable for bankers. Their incomes soared. Senior banking executives soon were awarding themselves annual bonuses in the tens and hundreds of millions of dollars per executive per year. Even the average financial industry worker benefited, with incomes doubling over the 30 years of the neo-liberal era. The hundred-plus million workers in the rest of the economy, though, were not so fortunate in this era.

Reagan and Bush represented a shift in government policies towards more business-friendly, deregulated approaches. The idea was that markets and private businesses could do more than government, and that “the market” would discipline bad managers instead of regulators. Government and unions were blamed for the stagnation of the 1970’s. Indeed, Reagan set a new tone in policies towards unions in 1981 when he decertified and broke PATCO, the Professional Air Traffic Controllers Organization. Union membership, which had peaked in the 1970’s, began to decline significantly. The new, tougher stance against unions, combined with revisions to the income tax code (1981 and 1986) and payroll tax increases for Social Security, made capital investments relatively more profitable than wage earnings. Indeed, a new long-term trend emerged: declining real wages for workers.

Growth after 1984, though, was different from that of the 40’s, 50’s, 60’s, and even the 70’s. Though widely touted as a high-growth era, it in fact featured quite modest growth rates. But while growth was slower, it was more stable. Recessions were few (only two between 1984 and 2007) and mild. The 1990-91 and 2001 recessions were themselves different from the pre-1984 recessions, as can be seen in this graph of relative job losses in each post-war recession:

Percent Job Losses in Post-WWII Recessions

Earlier recessions had been V-shaped: sharp, quick, deep losses in jobs followed by equally robust recoveries. But the 1990-91 and 2001 recessions changed this pattern. The total job loss was milder, but the recovery was much slower and took longer, leading the recoveries to be described as “jobless recoveries.” Prior to 1984, most recessions were cyclical. Typically, manufacturing workers would be laid off temporarily by their employer, and as the economy recovered, workers were recalled to their old jobs. But with the 1990-91 recession a new pattern emerged. Increasingly, the workers laid off in a recession found their jobs gone permanently due to technology, international trade, outsourcing, or simple obsolescence. Often these workers found that when the recession ended, the job they used to have was now located overseas, and they had to find a new job with a new employer – often in a different industry, location, or occupation.

In the decades leading up to 1929, income inequality widened in the U.S. In good times, the rich got richer faster than the poor. In bad times, the poor lost more. In the 1920’s, the richest 5 percent of all families received 30 percent of all family income. But from the Great Depression until the late 1970’s, the trend reversed: income inequality narrowed. The period from the Great Depression until 1973 is sometimes called the “Great Compression” because, while all income classes saw their incomes rise, poor and middle class incomes rose faster than those of the rich. The rich continued to get richer, but the lower-income quintiles gained somewhat faster, and the gap between rich and poor began to “compress.”

By 1947 the share of the richest 5 percent of families had shrunk to 17.5 percent, and it eventually fell to 15.6 percent by 1969, leaving that much more for the other 95 percent of all families.
Income Inequality

Prior to the mid-1970’s, all quintiles were experiencing rising real incomes. But after the mid-1970’s, the lower quintile incomes stagnated, particularly those of the lower 3 quintiles. As the decades since 1980 have passed, the gap between rich, middle class, and poor has not merely widened, but widened at an increasing pace. Since 1980 the rich, and the very rich in particular, have seen their incomes continue to rise. Meanwhile, real incomes for the middle class have stagnated and incomes for the poor have actually declined. By 1996, the share of income the richest 5% of families earned had risen to 20.3 percent of all income – more than in 1947. In 1996, 560,000 families out of the nation’s 116,000,000 (a mere 0.5% of all families) accounted for 11% of all income reported to the IRS. The trend has continued since 1996. Today, income inequality has widened again and is roughly equal to that of 1929.
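The 1996 concentration figure implies a striking multiple: if roughly half a percent of families received 11% of all income, that group’s average income was more than 20 times the national average. A quick arithmetic check, using only the numbers quoted above:

```python
# Check the 1996 income-concentration figures quoted above.
top_families = 560_000
all_families = 116_000_000
income_share = 0.11  # this group's share of all reported income

population_share = top_families / all_families  # about 0.0048
# Average income of the group relative to the overall average:
multiple = income_share / population_share

print(f"share of families: {population_share:.2%}")  # about 0.48%
print(f"income multiple:   {multiple:.0f}x")         # about 23x
```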

Declining Wages and Rising Capital Gains

The stagnation of incomes for middle and working class families in the above graph, though, is deceptive in some ways. First, the graph is based upon “family incomes.” During this period there was a major change in the number of income earners per family. As married women increasingly joined the workforce, middle, lower, and working class families were increasingly supported by two wage-earner incomes. This entry of married women into the workforce in large numbers coincides with the period when these family incomes begin to stagnate. Underlying both is the fact that the share of national income devoted to wages declined during this period. In other words, real wages, which have always varied with the business cycle, began a long-term declining trend in 1982. Married women joined the workforce in large numbers partly for social and cultural reasons, but clearly economic pressure to maintain family incomes also forced large numbers of married women into the workforce.

Share of NonFarm Output

From the close of World War II through 1973, average wages, adjusted for inflation, grew at 2 to 3 percent per year. As the nation gained from increases in productivity, the gains were equally shared between workers (labor getting higher real wages) and the owners of capital (growing profits). This was the basis of the great upward mobility that many families experienced. After the oil embargo of 1973, wage growth slowed dramatically and never recovered. As the above graph shows, a clear shift occurred in the recession of 1982. From 1982 onward, labor’s share of the national income declined and capital’s share increased. The result was stagnant and declining incomes for worker and poor households, while the very wealthy, most of whom get income from profits and capital gains rather than labor income, got richer.
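To see what 2 to 3 percent per year means over a generation, compound it over the roughly 28 years from 1945 to 1973: even the low end raises real wages by about three-quarters, and the midpoint roughly doubles them. The rates are the ones quoted above; the compounding is just an arithmetic illustration.

```python
# Compound a steady annual real-wage growth rate over 28 years
# (roughly 1945 through 1973).
def compound(rate: float, years: int = 28) -> float:
    return (1 + rate) ** years

print(f"2.0%/yr -> {compound(0.02):.2f}x")   # about 1.74x
print(f"2.5%/yr -> {compound(0.025):.2f}x")  # about 2.00x
print(f"3.0%/yr -> {compound(0.03):.2f}x")   # about 2.29x
```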

This decline in middle-class and poor incomes over the 1982-present era was an important component leading into the financial and banking crisis of 2008. As real incomes declined or stagnated, most American middle class and poor households began to take on increasing amounts of debt in order to maintain their existing lifestyles and cover their expenses, particularly healthcare expenses. The prime vehicles for such borrowing were the house (through mortgages and home equity loans), credit cards, and student loans. American households gradually became more and more indebted to the banks to make up for the failure of incomes to rise. But there is a limit to private debt. And we hit those limits in 2008.

The American Economy: Stay Tuned

The Start of the Great Recession

The collapse of house prices starting in 2006 led to increasing foreclosures and concerns over banks’ viability. The failures or near-failures of some high-profile banks and finance-related institutions in 2008 (Bear Stearns, AIG, FannieMae/FreddieMac, Merrill Lynch, IndyMac Bank, WaMu Bank, Countrywide Mortgage, and Lehman Brothers) brought a world-wide banking crisis and stock market crash in the fall of 2008. Policymakers failed to implement policies to keep the financial crisis and credit crunch from affecting the real economy of goods and services. A slow, modest recession that had begun in December 2007 turned into a full Great Depression-style collapse. Indeed, the declines in employment, real GDP, and incomes from September 2008 until April 2009 matched or exceeded any similar period in the Great Depression. For 6-8 months, it appeared the U.S. and the developed world were headed for Great Depression 2.0. Two large automakers, GM and Chrysler, went bankrupt. Despite a $350 billion bailout by the Bush administration in October 2008, the banks continued to struggle and look unhealthy. A new President arrived in January 2009, but the slide continued. The nation was losing over half a million jobs every month. Unemployment rose to approximately 10%. The Federal Reserve took interest rates to virtually zero with little impact or stimulus effect.

Then in February 2009, the U.S. government resorted temporarily to old-style Keynesian economics. It passed and began to implement a $787 billion stimulus bill. The bill and increased federal spending had the effect of halting the decline in the economy. Accounting changes and more bank bailout money turned the finances of the big banks around (2009 would be a near-record profit year for banks despite the bad economy). But the stimulus bill was too small to re-start the economy. Its stimulus effect of increased spending was largely offset by spending cuts at state and local governments. The fall was stopped, but the response was too little and too slow to re-start strong growth. Politically, a populist, neo-liberal backlash combined with bank lobbying stopped attempts to re-regulate the banks or to increase the stimulus. 2009 proved to be the end of the decline, but brought no real growth. As 2010 wore on, the economy continued to show technically positive growth, but at a rate so slow (less than 2%) that unemployment would not improve. By 2010, political leaders had tired of trying to stimulate the economy and were turning their attention to reducing government spending and the budget deficit.

From American History to Your Story

As the U.S. economy enters the second decade of the 21st century, it faces several challenges. It has suffered the worst recession since the Great Depression in both length and depth. It has not recovered from that recession, and growth appears so slow that it will be years, if not a decade, before full employment returns. Political consensus doesn’t exist on what to do. In the short term, the U.S. government continues to be able to borrow at rates (1-2%) more favorable than virtually any other nation in the world except Japan, yet the government has shifted to trying to cut spending and investment. Income inequality has again widened. In the past, when income inequality has reached such levels, it has meant either political revolt at the polls or economic catastrophe such as the Great Depression. Job creation has been weak for a long time and shows little sign of improving. The short-run future, while not as bleak as it was in December 2008, is still grey, gloomy, and uncertain.


But one lesson we can learn from U.S. economic history is that there have been no straight, linear, uninterrupted paths. Often the unexpected has changed the nation’s course. At times, the nation has been rescued by the innovations and creativity of its people. At other times, crisis has brought people and politicians to consensus and a move forward. I will end the story here. We are now clearly in the present, yet this is a “history” course. History, properly told, often requires some distance. From here on, the story of the American economy is yours to help create and to live.