The Reporter
It is an honor to be here today; I owe my love of economics to the Bureau as well as my many friends and colleagues. Marty [Martin] Feldstein was one of the people who made it such a special place. I enjoyed seeing him around the Bureau, learning public finance from him, and briefly serving as his research assistant. I’d sit in his office, in awe of his incredible intellect and economic insights, and be completely distracted by the hilarious cartoons he had framed in his office. My favorite was the one in which Marty is depicted rowing in the wrong direction in a skiff while President Reagan yells “Feldstein!” They all reflected his steadfast willingness to speak his mind, to “speak truth to power,” even to the president of the United States.
While I would never presume to compare myself to Marty as chair of the Council of Economic Advisers (CEA), I did share his view of the role of the CEA in an administration. As my staff knows all too well, I said (perhaps more often than we’d like to admit) that I did not believe anyone should say something that would require them to “give back their PhD.” Our integrity matters, and I believe that any decision-maker — the president of the United States included — benefits from hearing what his or her staff actually thinks.
Marty took over as chair of the CEA as the economy was recovering from a period of high inflation and a subsequent recession. I was appointed during a different crisis. To truly understand the nature of the challenges we faced, it is useful to think back to where we were in 2021. First, we were still in the midst of a pandemic: thousands of Americans were dying each week. The first death was recorded in the US at the end of February of 2020, and by the time President Biden took office in January 2021 that number had reached 460,000. By then, the first of the vaccines were available, but we did not know how effective they would be at containing the spread, how long immunity would last, or how quickly they could be distributed and administered. While there was hope, the end of the pandemic was not yet clearly in sight.
Given that we did not understand the nature of the virus (Do we or don’t we need to wear masks? Do we need to wash our groceries?), that we had no natural immunity, and that we had no medical response, we were asked to limit contact with other people and stay home if possible. In March and April of 2020, the number of Americans living under stay-at-home orders reached more than 300 million.
Despite the shutdowns, stock markets quickly recovered. And given that the pandemic restrictions were basically about face-to-face interactions, people swapped their services consumption for durables. A switch of this magnitude from services to goods had never happened before over such a short period of time. And because those “things” had to be produced and shipped, we ended up with massive supply chain disruptions. The New York Fed’s Global Supply Chain Pressure Index, which attempts to measure the presence of supply constraints in the economy, spiked, reaching its highest value on record in December 2021. Shortages of microchips and semiconductors due to COVID restrictions, the total shutdown of many supply chains cutting through China, and disruptions in international shipping all played a key role. The cost of shipping a container from China to the West Coast of the United States increased from about $1,300 per container in February of 2020 to about $20,000 in September of 2021.1
Supply chains are just one example; other novel issues that emerged ranged from childcare and school closures to declining commercial real estate values in cities (particularly offices), many of which continue to have repercussions in our society today. The key point is that this crisis had a distinct cause and consequences, and we can learn something new from it.
Which leads to the heart of my talk: What lessons can we learn from the pandemic and our responses to it? We will be learning from this crisis and evaluating the response for years to come; for now, I will focus specifically on three lessons I learned during my tenure at the CEA. And I will discuss them with a focus on three areas: fiscal responses, unemployment insurance (UI), and labor markets.
The first lesson: In a crisis, policymakers can’t let the perfect be the enemy of the good.
First, I want to take us back to 2021 so that we can remember the potential crisis we were facing. Weekly initial UI claims tell the story well. As shown in Figure 1, at the beginning of March 2020, weekly claims were about 207,000; just two weeks later, they were ten times that, and at the beginning of April, claims reached a high of 6,137,000. This was nearly ten times the peak of weekly claims during the 2008 financial crisis.
Figure 1
In response, in 2020, Congress passed and then-President Trump signed two bills: the Families First Coronavirus Response Act on March 18, 2020 (providing $192 billion for COVID research, enhanced UI, and health funding), and the Coronavirus Aid, Relief, and Economic Security Act (the CARES Act) less than 10 days later (providing more than $2.2 trillion in economic stimulus). CARES alone was the largest stimulus package in American history. These were followed by the Coronavirus Response and Relief Supplemental Appropriations Act of 2021, which was signed in December 2020, providing $900 billion in additional funding and stimulus. And then, in 2021, Congress passed and President Biden signed the American Rescue Plan, which added yet another $1.9 trillion in stimulus and recovery funding. In total, this was more than $4.5 trillion in stimulus, compared to just over $2 trillion throughout the 2008 global financial crisis (both in 2022 dollars).
So why did we go so big? A concern of policymakers and economists at the time was that extended job loss is associated with long-term costs for individuals and the economy. Many were focused on data such as those in Figure 2, which show the employment-to-population ratio (indexed to 100 at the peak of the business cycle) for the last four recessions. The time it takes for employment to return to its previous peak approximates the length of the labor market recovery from the recession.
Even the relatively mild recessions in 1990 and 2001 had long-lasting effects on employment — it took 30 months and 46 months, respectively, for employment to return to its pre-recession levels. And the effects of the recession in 2008 on the labor market were even longer lasting: it took 77 months, or more than six years, for employment to fully recover. A slow recovery can have lasting repercussions: an extensive literature shows the lasting effects of recessions on labor markets, ranging from the cost of entering a poor labor market for young people (who face lower wages and lower employment rates that persist for years after recovery) to scarring for prime-age and older workers (some of whom exit entirely, leading to lasting declines in employment and growth).2
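The recovery-length measure behind these figures is simple to state: count the months until the employment-to-population index first returns to its value at the business-cycle peak. A minimal sketch of that calculation, using an illustrative series rather than the actual BLS data:

```python
def months_to_recovery(emp_index):
    """Return the number of months until an employment index,
    set to 100 at the business-cycle peak (month 0), first
    returns to its pre-recession level; None if it never does."""
    for month, value in enumerate(emp_index[1:], start=1):
        if value >= emp_index[0]:
            return month
    return None  # not yet recovered within the sample

# Illustrative series: a dip followed by a slow climb back.
illustrative = [100, 98.5, 97.0, 96.2, 96.8, 98.1, 99.3, 100.2, 101.0]
print(months_to_recovery(illustrative))  # -> 7
```

Applied to the actual series, this is the calculation that yields 30 months for 1990, 46 for 2001, and 77 for 2008.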
Further, there was concern that the response in 2008 had not been large enough. To be clear, in 2008, the federal government spent a historic sum at the time, more than $2 trillion in 2022 dollars. But in retrospect, many economists agree that it did not go far enough, leaving us with a large and lasting demand shortfall that extended the recession and contributed to the slow labor market recovery. These concerns were at the top of policymakers’ minds in 2020 and early 2021 — they did not want a repeat of the slow 2008 recovery, and this was a new and scary pandemic of unknown duration. Moreover, in early 2021, there were political economy concerns — many were not confident Congress would pass another stimulus bill should it be necessary. As a result, the federal government went big. The total spending on the pandemic crisis was more than double that of the financial crisis in real terms, not including the support the Fed provided to financial markets to keep credit flowing.
Was it worth it? On the affirmative side, the labor market recovery from the COVID pandemic was faster than after any other major recession since World War II (see Figure 2). Further, the US recovery in terms of GDP was much faster than that of virtually every other major economy. Figure 3, an extension of a report by my colleague Gian Maria Milesi-Ferretti, shows that by the fourth quarter of 2021, US GDP was above its pre-pandemic trend by more than half a percentage point, compared to declines of more than 2 percent in the UK, Germany, and Canada.3
Figure 2
Figure 3
That said, the recovery was not without its costs. As shown in Figure 4, inflation spiked, and economists are still trying to understand the reasons for its rise. Broadly, the two major explanations are that it was due to the massive federal support of the economy and the supply constraints discussed earlier. Of course, these are not mutually exclusive. My read of the literature to date is that both likely contributed. It is too early to assess whether the pandemic response was “irresponsible” or “misguided”: we will need a few more years to fully assess the costs and benefits of economic policymaking during the pandemic. But for now, the benefits appear to have outweighed the costs. To date, the worst fears have not come true, and inflation in the US has largely been in line with other developed countries that passed much smaller stimulus packages.
Figure 4
Was this perfect economic policymaking? Probably not, but for the moment, it looks as though it was “good.” My second lesson highlights why aiming for the perfect would very likely have been the enemy of the good in this case.
The second lesson: Better calibrated economic policymaking will require much deeper investment in data and infrastructure.
This lesson is based on the fact that federal data, computer, and human resource infrastructures were — and still are — not up to the task of delivering surgical and speedy support for the economy. Components of the CARES Act highlight this reality well. For example, the Paycheck Protection Program (PPP) provided uncollateralized and forgivable loans to small businesses (generally, those with fewer than 500 employees). These loans could officially be used only to retain workers (with several safe harbor provisions), meet payroll and health insurance costs, or make mortgage, lease, and utility payments. If these conditions were met and firms met their employment targets, the loans would be entirely forgiven after the pandemic. The Economic Injury Disaster Loan (EIDL) program provided low-interest-rate loans of up to $2 million, payable over up to 30 years. Loans also included the option to defer all payments during the first two years while businesses and nonprofits got back on their feet after the pandemic. And finally, the coverage and generosity of UI were expanded dramatically. Benefits were increased by $600 per week, and those not typically covered, such as gig workers and contractors, were made temporarily eligible.
While it may have been “good enough,” it was sloppy. On the one hand, nearly 1 million firms received PPP loans (worth $150,000 to $10 million), and 3.9 million received EIDL loans. On the other hand, this assistance was rather inefficiently delivered. Waste and poor targeting were a problem. David Autor and his coauthors estimate that PPP loans cost between $169,000 and $258,000 per job-year saved, which is more than twice the average salary of these workers. They also estimate that more than two-thirds of the total outlays on the program accrued to business owners and shareholders rather than employees.4
Outright fraud was also a major issue. The Government Accountability Office (GAO) estimates that PPP fraud totaled about $64 billion out of a total of nearly $800 billion in loans — that is, about 8 percent of all PPP loans may have been fraudulent. Under EIDL, some borrowers claimed loans using falsified names or business details and often simply ran off with the cash. In the end, the GAO and the Small Business Administration estimate that EIDL fraud was even more pervasive than PPP fraud, in dollar terms — more than $136 billion. UI fraud also skyrocketed during the pandemic; the GAO estimates that fraud may have cost anywhere from $55 to $135 billion.5
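The 8 percent figure is just the ratio of the two GAO totals; as a quick check (amounts in billions, rounded as in the text):

```python
# Rough share of PPP lending that the GAO flags as potentially
# fraudulent: about $64 billion out of nearly $800 billion.
ppp_fraud_billions = 64
ppp_total_billions = 800
fraud_share = ppp_fraud_billions / ppp_total_billions
print(f"{fraud_share:.0%}")  # -> 8%
```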
Why did the federal government fail to verify the identities and creditworthiness of borrowers? Part of the answer is speed: it wanted to get money out to small businesses as quickly as possible to ensure they wouldn’t fold during the crisis. The usual procedures for background checks and verifying application details were shortened or removed altogether.
But another, more structural issue was state capacity. Beyond old technology, agencies also faced chronic underfunding, which led to a shortage of skilled employees who could administer the programs and detect fraud in them. These issues were compounded by the US’s focus on privacy protections, which has created regulations that have the side effect of limiting our capacity to implement programs.
It is worth acknowledging that fraud and misallocation were not unique to the US. There were also headline cases of fraud in other developed countries, especially regarding the allocation of European Union (EU) pandemic funding. But, according to the European Public Prosecutor’s Office, the entire EU area (with a larger population than the US and a similar GDP) likely experienced COVID-related fraud on the order of, at most, tens of billions of dollars, an order of magnitude smaller than the hundreds of billions of dollars of fraud in the US.6 We were first in class.
In short, it didn’t have to be this bad: The speed of program implementation was inevitably going to lead to some fraud, but not necessarily as much as we had in the US. To develop this point, I’d like to focus on our UI system.
What challenges did UI face leading into the pandemic? First, funding: The federal unemployment tax, intended to fund UI, is applied to annual wages below a federally determined cap. Back in 1937, the full earnings of about 97 percent of covered workers were subject to the tax. But the nominal wage cap has not been adjusted to keep pace with inflation — it is currently just $7,000. As a result, just over 25 percent of wages are now subject to the tax. This has had the downstream effect of gradually restricting real funding to state unemployment agencies, especially for states that don’t impose their own higher tax on employers.7
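The mechanics of the shrinking tax base can be sketched directly: the federal UI tax applies only to the first $7,000 of each worker’s annual wages, so the share of total wages that is taxable falls as wages grow past the fixed nominal cap. A stylized example with a hypothetical three-worker economy (the wage figures are illustrative, not data):

```python
def taxable_wage_share(annual_wages, cap=7_000):
    """Share of total wages subject to the federal UI tax,
    which applies only to the first `cap` dollars earned by
    each worker in a year."""
    taxable = sum(min(wage, cap) for wage in annual_wages)
    return taxable / sum(annual_wages)

# Hypothetical workforce: as wages rise while the cap stays
# fixed, the taxable share shrinks.
print(round(taxable_wage_share([7_000, 40_000, 90_000]), 3))  # -> 0.153
```

Holding the cap at its 1983 nominal level while wages grow is precisely what has pushed the economy-wide taxable share down toward the roughly 25 percent cited above.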
As of 2020, less than half of the states had modernized their UI systems. Some state systems still run on COBOL; it is almost impossible to submit an application on a mobile device in most states, and workers in some states must still be physically mailed a password to log in to their UI account.8 In part because of these challenges, by the end of May 2020, only about 57 percent of unemployment claims had been paid nationwide.9 This created a double crisis, where overworked employees didn’t have the resources they needed to rigorously verify claims, leading to more fraud, while genuinely eligible workers had to wait weeks or months to get their benefits.
But outdated tech and low funding weren’t the only issues. Other challenges have to do with modernizing the UI system to meet the needs of the modern labor market. Beyond the many state-level differences in minimum levels of earnings and time worked required for eligibility, there are entire groups of workers who are totally ineligible under current law, including workers who quit or were fired for cause; students who work in addition to getting their education; and self-employed workers, gig workers, and contract workers.
Figure 5 shows the unemployment recipiency rate over time, effectively the percentage of total unemployed workers receiving UI. Recipiency rates have fallen dramatically since the 1950s, with a particularly large drop in the late 1970s and early 1980s as states tightened their requirements in response to the fiscal challenges created by a falling real cap on the federal unemployment tax. The decline after the 2008 financial crisis is less well understood but may reflect a combination of further tightening of state-level unemployment programs, workers remaining unemployed beyond the maximum duration for which UI benefits can be received, and one other detail — the rise of “alternative work arrangements” that are not covered by our current UI system.
Figure 5
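The recipiency rate plotted in Figure 5 is a simple ratio of UI recipients to total unemployed workers; a minimal sketch with hypothetical monthly numbers (not actual data):

```python
def recipiency_rate(ui_recipients, total_unemployed):
    """UI recipiency rate: share of all unemployed workers
    who actually receive unemployment benefits."""
    return ui_recipients / total_unemployed

# Hypothetical month: 2.1 million recipients out of 6.0 million
# unemployed workers.
print(f"{recipiency_rate(2.1e6, 6.0e6):.1%}")  # -> 35.0%
```

Tighter eligibility rules shrink the numerator while leaving the denominator (all unemployed workers) unchanged, which is why the series falls even when unemployment itself does not.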
Figure 6, based on data from a 2019 paper by Lawrence Katz and Alan Krueger,10 shows the rise of alternative work arrangements (AWAs) over two decades. From 1995 to 2005, all forms of AWAs rose only slightly, ticking up from 10 to 10.7 percent of the employed workforce. But between 2005 and 2015, the percentage rose dramatically to more than 15 percent of all workers in America. Some categories saw an even faster rise — for instance, the percentage of workers provided by contract firms more than doubled. More recent survey evidence suggests the percentage of workers in AWAs may be even higher.11
Figure 6
This is particularly important since most workers in an AWA are ineligible for UI. Independent contractors and other “1099 employees” cannot receive UI in any state; most of the workers described here are also ineligible under most states’ rules concerning time worked, minimum earnings, and other qualifications.
And since 2015, a new sort of AWA has come to the fore: the gig economy. Andrew Garin, Emilie Jackson, Dmitri Koustas, and Alicia Miller estimate that in 2012, the number of Americans with any payments for platform work listed on their tax return was essentially zero. By 2016, it had already risen to 2 million. Even after the reporting threshold was increased in 2017, the amount of gig work reported on tax returns continued to increase, and the number of gig workers rose to almost 5 million by 2021 — about 3 percent of the workforce.12 Because of the rising minimum earnings level for reporting gig work on tax forms, the true proportion of workers who participate in the gig economy is likely higher. Data from the Pew Research Center American Trends Panel suggest that the actual share of Americans who currently rely or recently relied on gig work as an important source of income is about 5 percent. Moreover, of Americans who engaged in gig work in 2021, 31 percent considered it their main job, while 58 percent of them considered it essential or important for meeting their basic needs.13
Putting it together, challenges with our UI response were not only due to a lack of administrative capacity but also to the expansion of the UI pandemic program to these non-traditionally covered workers. Even in states with well-funded and up-to-date UI systems, the sudden expansion of the program to gig workers and the self-employed made it challenging to ensure that only people who really qualified were getting benefits. And more generally, our UI system is falling behind in providing the kind of consumption-smoothing support it was designed to provide during job transitions because eligibility has not kept up with our evolving economy.
The third lesson: Crises such as that spurred by the pandemic can help us better understand our economy.
The final lesson stems from the fact that the economy has performed in unexpected ways since the start of the pandemic, puzzling many economists (including myself). By attempting to solve puzzles created by crises, we can better understand how our economy works. One example is the evolution of the wage structure over the past few years.
Figure 7 shows the change in average real wages, indexed to one in January 2020, for workers split into three occupational wage terciles (low, medium, and high-wage occupations).14 The three occupational groups tracked each other relatively closely from 2015 to 2020; low-wage workers caught up slightly, with wage gains of about a percentage point more than medium- and high-wage occupations over that time, but there were no major distributional changes. But in late 2020 through 2023, as the initial composition effects from pandemic unemployment began to recede, the pattern changed dramatically: Low-wage occupations experienced meaningful real wage gains while wage growth for medium-wage occupations was essentially flat relative to 2020 and high-wage occupations experienced real wage declines. Similar patterns emerge when splitting workers based on wages or education. Why might this have occurred?
Figure 7
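The series in Figure 7 are built by deflating nominal wages with a price index and then indexing the result to its January 2020 value. A minimal sketch of that construction, using hypothetical numbers rather than the underlying data:

```python
def index_real_wages(nominal_wages, price_index, base=0):
    """Deflate a nominal wage series by a price index, then
    index the real series to 1.0 in the base period
    (here, period 0, standing in for January 2020)."""
    real = [w / p for w, p in zip(nominal_wages, price_index)]
    return [r / real[base] for r in real]

# Illustrative low-wage series: nominal gains outpacing inflation,
# so the indexed real wage rises above 1.0.
wages = [15.00, 15.90, 17.20]  # hypothetical average hourly wage
cpi = [100.0, 104.0, 110.0]    # hypothetical price index
print([round(x, 3) for x in index_real_wages(wages, cpi)])  # -> [1.0, 1.019, 1.042]
```

A high-wage series whose nominal growth lags the price index would, by the same construction, drift below 1.0 — the pattern Figure 7 shows for high-wage occupations.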
Let’s first consider unlikely explanations. It is unlikely to have occurred due to improved human capital, education, or skills given that postsecondary enrollment fell sharply between 2019 and 2020 and has not yet recovered. Similarly, minimum wage increases likely do not explain these patterns: Minimum wages have either decreased in real terms or simply been indexed to inflation in most states. In addition, while new union elections have increased over the past couple of years, the union membership rate has continued its gradual decline since the early 1980s. That said, union threat effects — in which firms elect to increase wages and benefits out of fear that workers would otherwise unionize — may have contributed.15
If it wasn’t human capital, minimum wages, or unions, what happened? A few stylized facts may point us towards an answer, although, of course, more causal research will be needed to make a more conclusive determination. First, the pandemic recession was much shorter than most of us expected, lasting only two months. Moreover, as highlighted earlier, the labor market recovery was exceptionally fast — unemployment was below 5 percent by September 2021. Second, pandemic support for households was quite generous, especially as a share of pre-pandemic wages for low-income households. This influx of income led to higher savings, especially since many were stuck at home and couldn’t spend it. Further, enhanced UI payments were well above a 100 percent replacement rate for more than 70 percent of workers.16 This increase in resources may have sharply increased reservation wages at the bottom of the income distribution.
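The replacement-rate point can be made concrete with a stylized calculation. Assuming, purely for illustration, a state benefit equal to half of prior weekly earnings plus the flat $600 federal supplement (actual state formulas vary and are capped):

```python
def replacement_rate(prior_weekly_wage, supplement=600, state_fraction=0.5):
    """Total UI benefit as a share of the prior weekly wage,
    under a stylized state benefit of half of prior earnings
    plus the flat pandemic-era federal supplement."""
    benefit = state_fraction * prior_weekly_wage + supplement
    return benefit / prior_weekly_wage

# A worker previously earning $800/week: 0.5 * 800 + 600 = $1,000,
# a 125% replacement rate.
print(f"{replacement_rate(800):.0%}")  # -> 125%
```

Because the $600 supplement was flat, replacement rates were highest for the lowest earners, which is consistent with reservation wages rising most at the bottom of the distribution.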
These facts alone might have affected the wage distribution, but their impact was magnified by a few other trends and policy decisions. As already discussed, the US approach to unemployment during the crisis was essentially to expand and bolster our existing UI system. States issued almost $800 billion in unemployment benefits, including additional federal funding, between March 2020 and July 2021 — more than three times as much money as they issued in 2009, even after adjusting for inflation. But there were no requirements that workers return to their previous jobs after the pandemic ended. In contrast, most other developed countries relied on “job retention schemes” which essentially paid employers to keep their employees on the payroll for the duration of the crisis. Countries like New Zealand, France, and Great Britain each kept more than a third of their entire workforce on the payroll through these schemes, and almost every country used them for at least a subset of their workforce.17 But in the US, despite federal funding and backing, an equivalent policy known as short-time compensation never caught on. Instead, in the US, workers were separated from their jobs, got expanded unemployment coverage, and then were required to search for a new job once the pandemic had receded.
In part for this reason, vacancy and quit rates skyrocketed after the pandemic began to recede. For several months in a row between 2021 and 2023, the number of unemployed Americans per job opening was at or below 0.6 — that is, there were nearly two job openings for every person looking for work. And at the same time, workers already in their jobs were leaving in search of better options — the quit rate reached 3 percent for the first time ever in the Job Openings and Labor Turnover Survey data.18 Workers typically quit when they believe they will be able to secure a better job — about two-thirds of total quits in the Current Population Survey (CPS), on average, are direct employer-to-employer transitions. Moreover, CPS data analyzed by Ryan Michaels at the Philadelphia Fed suggest that younger, nonwhite, and non-college-educated workers experienced a sharper increase in quits between 2020 and 2022, which could help explain part of why those workers gained the most from the “Great Resignation.”19
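The tightness figure above is just an inversion: 0.6 unemployed workers per job opening means each unemployed worker faced roughly 1.67 openings.

```python
# Invert the unemployed-per-opening ratio to get openings per
# unemployed worker, the "nearly two openings" statement above.
unemployed_per_opening = 0.6
openings_per_unemployed = 1 / unemployed_per_opening
print(round(openings_per_unemployed, 2))  # -> 1.67
```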
How would these trends explain wage patterns since the pandemic? The simplest explanation is a model of labor market tightness akin to the one proposed by Arthur Okun.20 The Okun hypothesis suggests that when markets approach full employment, workers throughout the income distribution are able to move into higher-quality jobs and the lowest-paid workers benefit the most from this process since they move from marginal employment to more steady and productive jobs. During the pandemic, these factors almost certainly played a part in the wage compression, particularly given that labor demand returned relatively quickly and workers, especially low-income workers, who were separated from their jobs could quickly find new ones at a higher wage.
But we also know more about labor markets today than we did when Okun made his contributions. The last two decades of research have revealed the profound importance of labor market imperfections for understanding wage patterns across the economy; two areas I want to highlight are the role of search frictions and market power.
Frictions are a more established area of research, but one with implications for wage patterns that I don’t think have been fully explored. Okun’s model of labor markets included frictional unemployment, but the Diamond-Mortensen-Pissarides model showed us that frictions matter not just for creating unemployment but also for wages themselves, introducing a possible range of indeterminacy in equilibrium wages and perhaps even throwing the entire concept of an equilibrium wage into question.21 And the work of Alan Manning and others on dynamic, or “modern” monopsony — the market power created for employers through search frictions and imperfect matches — also highlights some of the direct consequences of frictions for our labor markets.22 This new generation of research, some of which was highlighted in David Card’s 2022 address to the American Economic Association,23 has helped establish the importance of employer and worker power for wage determination.
What, exactly, the source of this power is remains an active matter of debate. One hypothesis is that monopsony power, whereby employers are able to decrease worker wages because of a combination of static concentration and dynamic frictions, is the most important channel. Another interpretation, highlighted by Krueger and Larry Summers decades ago,24 and more recently by Summers and Anna Stansbury,25 suggests it is imperfect product markets, rents, and worker bargaining over a share of those rents that matter most. In practice, both models could yield labor market patterns like the ones in Figure 7, and both demand much more research on their implications for the wider economy.
So, there you have it, three lessons: In a crisis, policymakers can’t let the perfect be the enemy of the good. We need more data and public infrastructure if we want better economic policymaking. We can learn lots of things about our economy because of a crisis, and many important questions about the pandemic-induced crisis remain to be addressed. Which leads to a bonus lesson: Crises can always be counted upon to provide full employment for economists.