Tuesday, June 21, 2016

Attitude Adjustment

For this post, note the disclaimer at the top of the page. I'm just speaking for myself here, and my views do not necessarily reflect those of the St. Louis Fed, the Federal Reserve System, or the Board of Governors.

This is a reply to Narayana's recent Bloomberg post, which is a comment on this St. Louis Fed memo.

First, Narayana says that Jim Bullard thinks that
... the economy is so weak that a mere quarter-percentage-point increase would be enough for the foreseeable future.
I don't think the memo actually characterizes the economy as "weak" - it's not a pessimistic view of the world as, for example, Larry Summers or Robert Gordon might see it. As I noted in this post, one would not characterize the labor market as "weak." It's in fact tight, by conventional measures that we can trust. The view in the St. Louis Fed memo is that growth in real GDP, at 2% per annum, is likely to remain lower than the pre-financial crisis trend for the foreseeable future - i.e. "weaker" than we've been accustomed to. But "so weak" is language that is too pessimistic. And there remains the possibility that this will turn around.

Second, Narayana says:
Bullard’s rationale focuses on productivity...
That's not correct. The memo mentions low productivity growth, but a key part of the argument is in terms of low real rates of interest. According to conventional asset pricing and growth theory, low productivity growth leads to low consumption growth, which leads to low real rates of interest. But that effect alone does not seem to be strong enough to explain the worldwide fall in real interest rates that has occurred over roughly the last 30 years. There is another effect that we could characterize as a liquidity premium effect, which could arise, for example, from a shortage of safe assets. I've studied that in some of my own work, for example in this paper with David Andolfatto. In recent history, the financial crisis, sovereign debt problems, and changes in banking regulation have contributed to the safe asset shortage, which increases the prices of safe assets and lowers their yields. This effect is particularly strong for U.S. government debt. A key point is that a low return on government debt need not coexist with low returns on capital - see the work by Gomme, Ravikumar, and Rupert cited in the memo.
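To make the two channels concrete, here is a minimal deterministic sketch of the standard theory (CRRA preferences assumed; this is textbook notation, not the memo's):

```latex
% Deterministic consumption Euler equation, CRRA utility with risk
% aversion \gamma and discount factor \beta = 1/(1+\rho):
1 + r = \frac{1}{\beta}\left(\frac{c_{t+1}}{c_t}\right)^{\gamma}
\quad\Longrightarrow\quad
r \approx \rho + \gamma g,
% where g is the growth rate of consumption. Low productivity growth
% lowers g, and hence the real rate r. A safe asset shortage adds a
% liquidity premium \ell, so the yield on safe government debt is
% roughly r - \ell, which can be low even when the measured return
% on capital is not.
```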

Third, Narayana thinks that:
Bullard uses a somewhat obscure measure of inflation developed by the Dallas Fed, rather than the Fed’s preferred measure, which is well below 2 percent and is expected to remain there for the next two to three years.
"Obscure," of course, is in the eye of the beholder. Let's look at some inflation measures:
The first measure is raw PCE inflation - that's the Fed's preferred measure, as specified here. The second is PCE inflation after stripping out food and energy prices - that's a standard "core" measure. The third is the Dallas Fed's trimmed mean measure. Trimmed mean inflation doesn't take an a priori stand on which prices are most volatile: it strips out the most volatile price changes as determined by the data - it "trims" and then takes the mean - and the inflation rate is then calculated as the rate of growth of the resulting index. One can of course argue about the wisdom of stripping volatile prices out of inflation measures - there are smart people who come down on different sides of this issue. One could, for example, make a case that core measures of inflation give us some notion of where raw PCE inflation is going. For example, in mid-2014, before oil prices fell dramatically, all three measures in the chart were about the same, i.e. about 1.7%. So, by Fisherian logic, if the real interest rate persists at its level in mid-2014, then an increase in the nominal interest rate of 50 basis points would make inflation about right - perhaps even above target. Personally, I think we don't use Fisherian logic enough.
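To fix ideas, here is a minimal sketch of the trimming mechanics in Python. The component price changes and weights are made up, and the Dallas Fed's actual procedure uses asymmetric, expenditure-weighted trims, so treat this as an illustration only:

```python
import numpy as np

def trimmed_mean_inflation(price_changes, weights, trim=0.2):
    """Illustrative trimmed-mean inflation: sort component price
    changes, drop the extreme tails by expenditure weight, and take
    the weighted mean of what remains. The Dallas Fed's actual
    procedure uses asymmetric expenditure-weighted trims; this
    symmetric version is a sketch of the idea."""
    order = np.argsort(price_changes)
    pc = np.asarray(price_changes)[order]
    w = np.asarray(weights)[order]
    cum = np.cumsum(w) / np.sum(w)           # cumulative weight, sorted by price change
    keep = (cum > trim) & (cum <= 1 - trim)  # drop the most extreme tails
    return np.average(pc[keep], weights=w[keep])

# Hypothetical component inflation rates (annualized %) and weights,
# with energy-like outliers at both ends:
changes = [-12.0, -1.0, 0.5, 1.5, 1.7, 2.0, 2.2, 9.0]
weights = [0.05, 0.10, 0.15, 0.2, 0.2, 0.15, 0.1, 0.05]
print(round(trimmed_mean_inflation(changes, weights), 2))  # the volatile tails don't move this
```

And the Fisherian logic in the last step is just the Fisher relation i = r + πᵉ: if the real rate persists at its mid-2014 level, and all three measures were running at about 1.7%, then a nominal rate 50 basis points higher is consistent, in the long run, with inflation of about 1.7 + 0.5 = 2.2%, at or slightly above the 2% target.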

Finally, Narayana says:
...the risk of excess inflation is relatively manageable.
That's a point made in the memo. The forecast reflects a view that Phillips curve effects are unimportant, and thus an excessive burst in inflation is not anticipated.

Here's a question for Narayana: Why, if a goal is to have "capacity to lower rates" in the event of "say, global financial instability," does he want rates reduced now?

Should We Think of Confidence as Exogenous?

I don't always agree with Roger Farmer, but I admire his independence. Roger doesn't like to be bound by the constraints of particular research groups, and typically won't accept the assumptions decreed by some New Keynesians, Monetarists, New Fisherites, or whoever. Farmer is a Farmerite. But Roger falls into a habit common to others who call themselves Keynesians, which is to describe what he does in terms of some older paradigm. The first time I saw Roger do this was in 1994, when he gave this paper at a Carnegie-Rochester conference. The paper was about quantitative work on a class of models which were one step removed from neoclassical growth models. Such models, with a unique equilibrium and exogenous stochastic productivity shocks, had been used extensively by real business cycle (RBC) proponents, but Roger's work (and that of other people, including Jess Benhabib) was aimed at studying indeterminacy and endogenous fluctuations. The indeterminacy in Roger's work came from increasing returns to scale in aggregate production. Sufficient increasing returns, he showed, permitted sunspot equilibria, and those equilibria could look much like the stochastic equilibria in RBC models. That seemed promising, and potentially opened up a role for economic policy aimed at dealing with indeterminacy. Old Keynesian economics says we should offset exogenous shocks with fiscal and monetary policy; baseline RBC theory says such stabilization policy is a waste of time. But with indeterminacy, policy is much more complicated - theoretically, we can construct policies that eliminate particular equilibria through off-equilibrium promises. In equilibrium, we wouldn't actually observe how the policymaker was doing his or her job. While promising, this approach introduced some challenges. How do we deal econometrically with indeterminacy? How would we know if real-world policymakers had actually figured out this problem and were solving it?

Though teaching and entertaining ourselves have a lot to recommend them, most economists are interested in persuading other people of the usefulness of their ideas. Though I haven't had a lot of experience with dissemination of ideas in other professions, I think economists are probably extreme in terms of how we work out ideas in public. Seminars and conferences can be combative. We have fun arguing with each other, to the point where the uninitiated find us scary. And all economists know it's an uphill battle to get people to understand what we're doing, let alone to have them think that we've come up with the greatest thing since indoor plumbing. There's an art to convincing people that there are elements of things they know in our ideas. That's intuition - making the idea self-evident, without making it seem trivial, and hence unpublishable (horrors).

So, what does this have to do with Roger, indeterminacy, and 1994? In the talk I heard at CMU in 1994, to make his paper understandable Roger used words like "demand and supply shocks," "labor supply and demand curves," and, particularly, "animal spirits." Given that language, one would think that the elements of the model came from the General Theory and textbook AS/AD models. But that was certainly not the case. The elements of the model were: (i) the neoclassical growth model, which most of the people in the room would have understood; (ii) increasing returns to scale, which, again, were common currency for most in the room; (iii) sunspot equilibria, which were first studied in the late 1970s by Cass and Shell. This particular conference was in part about indeterminacy, so there were people there - Russ Cooper, Mike Woodford, Rao Aiyagari, for example - who understood the concept well, and could construct sunspot equilibria if you asked them to. But there were other people in the room - Alan Meltzer, for example - who would have had no clue. And telling the non-initiated that the paper was actually about AS/AD and animal spirits would not help anyone understand what Roger was doing. If Roger had just delivered his indeterminacy paper in unadulterated form, no undergraduate versed in IS-LM/AS-AD would have drawn any connection, and if Keynes had been in the room he would not have seen any similarity between his work and Roger's ideas. But once Roger said "animal spirits," Keynes would have thought, "Oh, now I get it." He would have left the conference with the impression that Roger was just validating the General Theory in a more technical context. And he would have been seriously misled.

Roger was hardly the first macroeconomist to use language from the General Theory, Hicksian IS-LM, or post-Hicksian static AS-AD to provide intuition for ideas he thought might appeal to people schooled in those traditions. Peter Diamond did it in 1982 – "aggregate demand" was in the title of the paper in which Diamond constructed a model with search and increasing returns in the matching function. That model could give rise to multiple steady states – equilibria with high output and low "unemployment" could coexist with equilibria with low output and high unemployment. If you knew one-sided search models or the Phelps volume, or had seen work by Dale Mortensen and Chris Pissarides on two-sided search, you could get it. People like Peter Howitt, Ken Burdett, and John Kennan could get it, because they were Northwestern students and had been in contact with Mortensen. But an IS-LM Keynesian wouldn't get it. For those people, the words "aggregate demand" are a dog whistle – a message that everything is OK. "Don't worry, we're not doing anything that you would object to."

New Keynesians took some of these lessons in presentation to heart, and went far beyond dog whistles. A New Keynesian model is basically a neoclassical growth model with exogenous aggregate shocks, and with sticky prices in the context of price-setting monopolistically-competitive firms - and with something we could think of as monetary policy. Again, Keynes would not have the foggiest idea what this was about, but in some incarnations (the three-equation reduced form), it was dressed up in a language that had been taught to undergraduates for about thirty years prior to the advent of New Keynesian frameworks in the late 1990s – the language of "aggregate demand," "IS curves," and "Phillips curves."
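For readers keeping score, that three-equation reduced form looks roughly as follows (notation and details vary across presentations; this is one common textbook version, not any particular author's):

```latex
% x_t: output gap, \pi_t: inflation, i_t: nominal interest rate,
% r_t^n: natural real rate.
x_t   = \mathbb{E}_t x_{t+1} - \sigma\,(i_t - \mathbb{E}_t \pi_{t+1} - r_t^n)  % "IS curve" (linearized Euler equation)
\pi_t = \beta\,\mathbb{E}_t \pi_{t+1} + \kappa\, x_t                           % NK "Phillips curve" (from Calvo pricing)
i_t   = \phi_\pi \pi_t + \phi_x x_t + \varepsilon_t                            % policy rule (e.g., a Taylor rule)
```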

New Keynesian economics was no less radical than what Lucas, Prescott, and others were up to in the 1970s and 1980s, but Lucas and Prescott were very in-your-face about what they did. That’s honest, and refreshing, but getting in the faces of powerful people can get you in trouble. I think Mike Woodford learned from that. Better to calm the powerful people who might have a hard time understanding you – get them on your side, and give them the impression that they get it. If Woodford had been in-your-face like Lucas and Prescott, he would probably have the reputation that Lucas and Prescott, perhaps surprisingly, still have among some Cambridge (MA)-educated people of my generation. For some, Lucas and Prescott are put in a class with the low life of society – Ponzi schemers, used car salespeople, and other hucksters. Not by the Nobel committee, fortunately.

But there’s a downside to being non-confrontational. Woodford’s work, and the work of people who extended it and did quantitative work in that paradigm, is technical – no less technical than the work of Lucas, Sargent, Wallace, Prescott, etc., from which it came. Not everyone is going to be able to do it, and not everyone will get it if it is presented in all its glory. But the dog whistles, and other more explicit appeals to defunct paradigms - or ones that should be - make some people think that they get it. And when they think they get it, they think that the defunct paradigms are actually OK. And if the person who thinks he or she gets it is making policy decisions, we’re all in trouble.

Why are we in trouble? Here’s an example. I could know a lot more math and econometrics than I do, and I’ve got plenty of limitations, as we all do. But I’ve had a lot of opportunities to learn firsthand from some of the best people in the profession – Rao Aiyagari, Mark Gertler, Art Goldberger, John Geweke, Chuck Wilson, Mike Rothschild, Bob Lucas, Ed Prescott, Larry Christiano, Narayana Kocherlakota, etc., etc. But I couldn’t get NK models when I first saw them. What’s this monetary model with no money in it? Where’s that Phillips curve come from? What the heck is that central bank doing without any assets and liabilities? I had to read Woodford’s book (and we know that Woodford isn’t stingy with words), listen to a lot of presentations, read some more papers, and work stuff out for myself, before I could come close to thinking I was getting it. So, trust me, if you hear the words “IS curve,” “Phillips curve,” “aggregate demand,” and “central bank,” and think you’ve got NK, you’re way off.

Way off? How? In this post, I wrote about a simplified NK model, and its implications. Some people seem to think that NK models with rational expectations tell us that, if a central bank increases its nominal interest rate target, then inflation will go down. But, in my post, I showed that there are several ways in which that is false. NK models in fact have Fisherian properties – or Neo-Fisherian properties, if you like. Fortunately, there are some people who agree with me, including John Cochrane and Rupert and Sustek. But, in spite of the fact that you can demonstrate how conventional macroeconomic models have Neo-Fisherian properties – analytically and quantitatively – and cite empirical evidence to back it up, the majority of people who work in the NK tradition don’t believe it, and neither do most policymakers. Part of this has to do with the fact that there indeed exists a model from which one could conclude that an increase in the central bank’s nominal interest rate target will decrease inflation. That model is a static IS-LM model with a Phillips curve and fixed (i.e. exogenous) inflation expectations. That’s the model that many (indeed likely the majority) of central bankers understand. And you can forgive them for thinking that’s roughly the same thing as a full-blown NK model, because that’s what they were told by the NK people. Now you can see the danger of non-confrontation – the policymakers with the power may not get it, though they are under the illusion that they do.
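The simplest way to see the Fisherian property is to impose a steady state on a reduced form like the one sketched earlier; a minimal derivation:

```latex
% With x_t = x, \pi_t = \pi, i_t = i for all t, the IS equation
% x = x - \sigma(i - \pi - r^n) collapses to the Fisher relation:
i = r^n + \pi \quad\Longrightarrow\quad \pi = i - r^n.
% Any steady state with a permanently higher nominal interest rate
% target is a steady state with permanently higher inflation.
```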

I know I’m taking a circuitous route to discussing Roger’s new paper, but we’re getting there. A few years ago, when Roger started thinking about these ideas and putting the ideas in blog posts, I wrote down a little model to help me understand what he was doing. Not wanting to let that effort go to waste, I expanded on it to the point where I could argue I was doing something new, and submitted it to a journal. AEJ-Macro rejected it (an unjust decision, as I’m sure all your rejections are too), but I managed to convince the JMCB to take it. [And now I'm recognizing some of my errors - note that "Keynesian" is in the title.] Here’s the idea. In his earlier work, Roger had studied a type of macroeconomic indeterminacy that is very different from the multiple-equilibrium models most of us are used to. In search and matching models, we typically have to deal with situations in which two economic agents must divide the surplus from exchange. There is abundant theory to bring to bear here - generalized Nash bargaining, Kalai bargaining, Rubinstein bargaining, etc. - but if we're to be honest with ourselves, we have to admit that we really don't know much about how people will divide the surplus in exchange. That idea has been exploited in monetary theory - for example by Hu, Kennan, and Wallace. Once we accept the idea that there is indeterminacy in how the surplus from exchange is split, we can think about artificial worlds with multiple equilibria. In my paper, I first showed a simple version of Roger's idea. Output is produced by workers and producers, and there is a population of people who can choose to be either, but not both. Each individual in this world chooses an occupation (worker or producer) and then goes through a matching process in which workers are matched with producers (there's a matching function). Some get matched, some do not, and when there is a match, output gets produced and the worker and producer split the proceeds and consume. In equilibrium there are always some unmatched workers (unemployment) and unmatched producers (unfilled vacancies). There is a continuum of equilibria indexed by the wage in a match. A high wage is associated with a high unemployment rate. That's because, in equilibrium, everyone has to be indifferent between becoming a producer and becoming a worker. If the wage is high, an individual receives high surplus as a worker and low surplus as a producer. Therefore, it must be easier in equilibrium to find a match as a producer than as a worker - the unemployment rate must be high and the vacancy rate low.
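To see where the continuum comes from, here is the indifference condition in a minimal static sketch, in my notation rather than the paper's:

```latex
% Measure u of workers, v of producers, matches M(u,v); a match
% produces output y, split as w to the worker and y - w to the
% producer. Occupational indifference requires equal expected surplus:
\frac{M(u,v)}{u}\, w \;=\; \frac{M(u,v)}{v}\,(y - w)
\quad\Longrightarrow\quad
\frac{v}{u} = \frac{y - w}{w}.
% Any wage w in (0, y) is consistent with equilibrium: a higher wage
% means lower tightness v/u, so workers match with lower probability
% (high unemployment) and producers with higher probability (low
% vacancy rate).
```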

What I did was to extend the idea by working this out in a monetary economy - for me, a Lagos-Wright economy where money was necessary to purchase goods. Then, I could think about monetary (and fiscal) policy, and how policymakers could achieve optimality. As in the indeterminacy literature, this required thinking about how policy rules could kill off bad equilibria.

On to Roger's new paper. He also wants to flesh out his ideas in a monetary economy, and there's a lot in there, including quantitative work. As in Roger's previous work, and my interpretation of it, there are multiple steady states, including high-wage/high-unemployment ones. As it's a monetary economy (overlapping generations), there are also multiple dynamic equilibria, and Roger explores that. So, that all seems interesting. But I'm having trouble with two things. The first is Roger's "belief function." In Roger's words:
To close our model, we assume that equilibrium is selected by ‘animal spirits’ and we model that idea by introducing a belief function as in Farmer (1993, 2002, 2012b). We treat the belief function as a fundamental with the same methodological status as preferences and endowments and we study the implications of that assumption for the ability of monetary policy to influence inflation, output and unemployment.
So, a lot of people have done work on indeterminacy, and I have never run across a "belief function" that someone wants me to think is going to deliver beliefs exogenously. In Roger's model, the belief function is actually an equilibrium selection device, imposed by the modeler. The model tells us there are multiple equilibria, and that's all it has to say. "Beliefs," as we typically understand them, are in fact endogenous in Roger's model. And calling them exogenous does not accomplish anything, as far as I can tell, other than to get people confused, or cause them to raise objections, as I'm doing now.

Second complaint: This goes back to my lengthy discussion above. Roger's paper has "animal spirits" in the title, it cites the General Theory, and the words "aggregate demand" show up 7 times in the paper. Roger also sometimes comes up with passages like this:
Our model provides a microfoundation for the textbook Keynesian cross, in which the equilibrium level of output is determined by aggregate demand. Our labor market structure explains why firms are willing to produce any quantity of goods demanded, and our assumption that beliefs are fundamental determines aggregate demand.
And this:
Although our work is superficially similar to the IS-LM model and its modern New Keynesian variants; there are significant differences. By grounding the aggregate supply function in the theory of search and, more importantly, by dropping the Nash bargaining assumption, we arrive at a theory where preferences, technology and endowments are not sufficient to uniquely select an equilibrium.
In how many ways are these statements silly? This model is related to the Keynesian Cross and IS-LM as chickens are related to bears. The genesis of Roger's framework is Paul Samuelson's overlapping generations model, work on indeterminacy in monetary versions of that model (some of which you can find in the Minneapolis conference volume), and the search and matching literature. NK models are not "variants" of IS-LM models - they are entirely different beasts. And it's not "aggregate demand" that is determining anything in Roger's model - there are multiple equilibria, and that's all.

Maybe you think this is all harmless, but it gets in the way of understanding, and I think Roger's goal is to be understood. Describe a bear as if it's a chicken, and you're going to confuse and mislead people. And they may make bad policy decisions as a result. Better to get in our faces with your ideas, and bear the consequences.

Friday, June 17, 2016

Dazed and Confused?

In October 2015, after a September payroll employment estimate of 142,000 new jobs, described as "grim" and "dismal" in the media, I wrote this blog post, arguing that we might well see less employment growth in the future. That conclusion came from simple labor force arithmetic. With the working-age population (ages 15-64) growing at a low rate of about 0.5%, if the labor force participation rate failed to increase and the unemployment rate stopped falling, payroll employment could grow at most by 60,000 per month, as I saw it last October.
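That arithmetic is simple enough to write down. A minimal sketch, using rough mid-2016 magnitudes (the levels are my approximations, not figures from the post):

```python
# Back-of-the-envelope payroll arithmetic, as in the post: if labor
# force participation is flat and the unemployment rate stops falling,
# employment can grow no faster than the working-age population.
employment = 144e6   # payroll employment, approximate level
pop_growth = 0.005   # working-age population growth, 0.5% per year

monthly_gain = employment * pop_growth / 12
print(f"maximum sustainable gain: {monthly_gain:,.0f} jobs per month")  # ~60,000
```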

After the last employment report, which included an estimate of a monthly increase of 38,000 in payroll employment, some people were "shocked," apparently. Let's take a look at a wider array of labor market data, and see whether they should be panicking.

If you have been following employment reports in the United States for a while, you might wonder why the establishment survey numbers are always reported in terms of the monthly change in seasonally-adjusted employment. After all, we typically like to report inflation as year-over-year percentage changes in the price level, or real GDP as quarterly percentage changes in a number that has been converted to an annual rate. So, suppose we look at year-over-year percentage changes in payroll employment:
That wouldn't quite make your cat climb the curtains. Employment growth rates were above 2% for a short time in early 2015, and the growth rate has fallen, but we're back to growth rates close to what we saw in 2013-2014.

What's happening with unemployment and vacancies?
The unemployment rate is currently at 4.7%, only 0.3 percentage points higher than its most recent cyclical low of 4.4% in May 2007, and the vacancy rate (the JOLTS job openings rate) has never been higher in the more than 15 years since JOLTS came into being. Thus, by the standard measure we would use in labor search models (the ratio of vacancies to unemployment), this job market is very tight.

If we break down the unemployment rate by duration of unemployment, we get more information:
In this chart, I've taken the number of unemployed at each duration and expressed it as a percentage of the labor force, so if you add the four quantities, you get the total unemployment rate. Here, it's useful again to compare the May 2016 numbers with May 2007. In May 2007, the unemployment rates for less than 5 weeks, 5 to 14 weeks, 15-26 weeks, and 27 weeks and over were 1.6%, 1.4%, 0.7%, and 0.7%, respectively. In May 2016 they were 1.4%, 1.4%, 0.7%, and 1.2%, respectively (summing to the total unemployment rate of 4.7%). So, middle-duration unemployment currently looks the same as in May 2007, but there are fewer very-short-term unemployed, and more long-term unemployed. But long-term unemployment continues to fall, with a significant decline in the last report.

Some people have looked at the low employment/population ratio and falling participation rate, and argued that this reflects a persistent inefficiency:
So, for example, if you thought that a large number of "involuntarily" unemployed had dropped out of the labor force and were only waiting for the right job openings to materialize, you might have thought that increases in labor force participation earlier this year were consistent with such a phenomenon. But the best description of the data now seems to be that labor force participation leveled off as of mid-2015. Given the behavior of unemployment and vacancies in the previous two charts, and the fact that labor force participation has not been cyclically sensitive historically, the drop in labor force participation appears to be a secular phenomenon, and it is highly unlikely that this process will reverse itself. Thus, it seems wrongheaded to argue that some persistent wage and price stickiness is responsible for the low employment/population ratio and low participation rate. To be sure, there is something to explain in the last chart (for example, Canada and Great Britain, with similar demographics, have not experienced the same decline in labor force participation), and this may have some connection to policies in the fiscal realm, but it's hard to make the case that there is some alternative monetary policy that can make labor force participation go up.

Another key piece of labor market information comes from the CPS measures of flows among the three labor force states - employed (E), unemployed (U), and not in the labor force (N). We'll plot these as percentage rates, relative to the stock of people in the source state. For the E state:
The rate of transition from E to U is close to its lowest value since 1990, but the transition rate from E to N is relatively high. This is consistent with the view that the decline in labor force participation is a long-run phenomenon. People are not leaving E, suffering a period of U, and then going to N - they're going directly from E to N. Next, the U state:
In this chart, the total rate at which people are exiting the U state is lower than average and, while before the last recession the exit rate to E was higher than the exit rate to N, these rates are currently about the same. This seems consistent with the fact that the unemployment pool currently tilts more toward the long-term unemployed, who have a higher probability than the rest of the unemployed of going to state N rather than E. Finally, for state N:
Here, the rates at which people are leaving state N for both states E and U are relatively low. Thus, labor force participation has declined both because of a high inflow (from both E and U) and a low outflow. But the low outflow rate from N to U (in fact, the lowest since 1990) also reflects the tight labor market, in that a person leaving state N is much more likely to end up in state E than in state U (though no more likely, apparently, than was the case historically, on average).
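For concreteness, the rates plotted in these charts are gross monthly flows divided by the stock in the source state. A sketch of that calculation, with hypothetical magnitudes rather than actual CPS figures:

```python
# Flow rates as percent of the source-state stock per month.
# Stocks and gross flows below are hypothetical (millions),
# chosen only to illustrate the calculation.
stocks = {"E": 151.0, "U": 7.4, "N": 94.0}
flows = {("E", "U"): 1.5, ("E", "N"): 4.5,
         ("U", "E"): 1.5, ("U", "N"): 1.5,
         ("N", "E"): 4.0, ("N", "U"): 1.5}

for (src, dst), f in flows.items():
    rate = 100 * f / stocks[src]  # percent of source-state stock per month
    print(f"{src} -> {dst}: {rate:.1f}% per month")
```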

The last thing we should look at is productivity. In this context, a useful measure is the ratio of real GDP to payroll employment, which looks like this:
By this measure, average labor productivity took a large jump during 2009, but since early 2010 it has been roughly flat. There has been some discussion as to whether productivity growth measures are biased downward. Chad Syverson, for example, argues that there is no evidence of bias in measures of output per hour worked. So, if we take the productivity growth measures at face value, this is indeed something to be shocked and concerned about.

Conclusions

1. The recent month's slowdown in payroll employment growth should not be taken as a sign of an upcoming recession. The labor market, by conventional measures, is very tight.
2. The best forecast seems to be that, barring some unanticipated aggregate shock, labor force participation will stay where it is for the next year, while the unemployment rate could move lower, to the 4.2%-4.5% range, given that the fraction of long-term unemployed in the unemployment pool is still relatively high.
3. Given an annual growth rate of about 0.5% in the working-age population, and supposing a drop of 0.2-0.5 percentage points in the unemployment rate over the next year, with half the reduction in unemployment involving transitions to employment, payroll employment can only grow at about 80,000 per month over the next year, assuming a stable labor force participation rate (the arithmetic is sketched after this list). Thus, if we add the striking Verizon workers (about 35,000) to the current increase in payroll employment, that's about what we'll be seeing for the next year. Don't be shocked and concerned. It is what it is.
4. Given recent productivity growth, and the prospects for employment growth, output growth is going to be low. I'll say 1.0%-2.0%. And that's if nothing extraordinary happens.
5. Though we can expect poor performance - low output and employment growth - relative to post-WWII time series for the United States, there is nothing currently in sight that represents an inefficiency that monetary policy could correct. That is, we should expect the labor market to remain tight, by conventional measures.
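Here is the arithmetic behind point 3, using approximate mid-2016 levels (payroll employment of roughly 144 million and a labor force of roughly 159 million are my approximations):

```latex
% Growth from population alone (participation and unemployment flat):
\Delta E_{\text{pop}} \approx \frac{0.005 \times 144 \times 10^6}{12}
\approx 60{,}000 \text{ per month}.
% Add unemployment falling 0.35 percentage points (the midpoint of
% 0.2-0.5) over the year, half of it flowing into employment:
\Delta E_{U} \approx \frac{0.5 \times 0.0035 \times 159 \times 10^6}{12}
\approx 23{,}000 \text{ per month}.
% Total: roughly 80{,}000 net new payroll jobs per month.
```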

Tuesday, June 14, 2016

Dave Backus

Dave Backus has passed away. Dave was the Heinz Riehl Professor in the Stern School at NYU, and had previous positions at Queen's University, UBC, and the Minneapolis Fed. Dave leaves behind a solid body of work in macroeconomics, and many sad colleagues, students, friends, and family. Dave and I crossed paths in Kingston, Ontario, in Minneapolis, and on the editorial board of the JME. He was always straightforward, helpful, a dedicated scientist, and one of our honorary Canadians. Dave is interviewed here.