The need for pluralism in economics


For decades, mainstream economists have reacted to criticism of their methodology mainly by dismissing it, rather than engaging with it. And the customary form that dismissal has taken is to argue that critics and purveyors of alternative approaches to economics simply aren't capable of understanding the mathematics the mainstream uses. The latest instalment of this slant on non-mainstream economic theory appeared in Noah Smith's column in Bloomberg View: "Economics Without Math Is Trendy, But It Doesn't Add Up".

Figure 1: Noah's tweet announcing his blog post

While Noah's column made some valid points (and there's been some good off-line discussion between us too), its core message presented five conflicting fallacies as fact:

  • The first (proclaimed in the title supplied by the Bloomberg sub-editor rather than by Noah) is that non-mainstream (or "heterodox") economics is not mathematical;
  • The second is that the heterodox mathematical models that do exist can't be used to make forecasts;
  • The third, that there are some that can make forecasts, but these have so many parameters that they are easily "over-fitted" to existing data and therefore useless at prediction (and they lack sufficient attention to human behaviour);
  • The fourth, that though heterodox economists make a song and dance about developing "stock-flow consistent" models, mainstream models are stock-flow consistent too; and
  • The fifth, that agent-based non-mainstream approaches haven't produced decent results as yet, and may never do so.

I'll consider each of these assertions one by one, because they certainly can't be addressed together.

Non-mathematical?

There is indeed a wing of heterodox economics that is anti-mathematical. Known as "Critical Realism" and centred on the work of Tony Lawson at Cambridge UK, it attributes the failings of economics to the use of mathematics itself. Noah has been less than complimentary about this particular subset of heterodox economics in the past; see Figure 2.

Figure 2: Noah's reaction to critical realism

What Noah might not know is that many heterodox economists are critical of this approach as well. In response to a paper by Lawson that effectively defined "Neoclassical" economics as any economics that made use of mathematics (which would define me as a Neoclassical!), Jamie Morgan edited a book of replies to Lawson entitled What is Neoclassical Economics? (including a chapter by me). While the authors agreed with Lawson's primary point that economics has suffered from favouring apparent mathematical elegance above realism, several of us asserted that mathematical analysis is needed in economics, if only for the reason that Noah gave in his article:

At the end of the day, policymakers and investors need to make quantitative decisions — how much to raise or lower interest rates, how big of a deficit to run, or how much wealth to allocate to Treasury bonds. (Noah Smith, August 8 2016)

The difference between mainstream and heterodox economists therefore isn't primarily that the former is mathematical while the latter is verbal. It's that heterodox mathematical economists accept Tony Lawson's key point that mathematical models must be grounded in realism; we just reject, to varying degrees, Tony's argument that mathematics inherently makes models unrealistic.

In contrast, the development of mainstream modelling has largely followed Milton Friedman's edict that the realism of a model isn't important; all that matters is that it generates realistic predictions:

Truly important and significant hypotheses will be found to have "assumptions" that are wildly inaccurate descriptive representations of reality, and, in general, the more significant the theory, the more unrealistic the assumptions (in this sense)… the relevant question to ask about the "assumptions" of a theory is not whether they are descriptively "realistic," for they never are, but whether they are sufficiently good approximations for the purpose in hand. And this question can be answered only by seeing whether the theory works, which means whether it yields sufficiently accurate predictions. (Friedman 1966, The Methodology of Positive Economics; emphasis added)

Even on this criterion, mainstream macroeconomics is a failure, given the occurrence of a crisis that it believed could not happen. But this criterion alone isn't sufficient: realism does matter.

If Friedman's "only predictive accuracy matters, not realism" criterion had been applied in astronomy, we would still be using Ptolemy's model that put the Earth at the centre of the Universe with the Sun, Moon, planets and stars orbiting it, because the model yielded quite accurate predictions of where celestial objects would appear to be in the sky centuries into the future. Its predictions were in fact more accurate than the initial predictions from Galileo's heliocentric model, even though Galileo's core concept (that the Sun was the centre of the solar system, not the Earth) was true, while Ptolemy's Earth-centric paradigm was false.

Friedman's argument was simply bad methodology, and it's led to bad mainstream mathematical models that make screamingly unrealistic assumptions in order to reach desired results.

The pivotal unrealistic assumption of mainstream economics prior to the crisis was that "economic agents" have "rational expectations". It sounds reasonable as a sound bite (who wants to be accused of having "irrational expectations"?), but it actually means assuming (a) that people have an accurate model of the economy in their heads that guides their behaviour today, and (b) that this model happens to be the same as the one the Neoclassical author has dreamed up in his (it's rarely her, on either side of economics) paper. And there are many, many other unrealistic assumptions.

Noah's argument that heterodox economics is less mathematical than the mainstream had more truth to it some decades ago, but today, with so many physicists and mathematicians in the "heterodox" camp, it's a very dated defence of the mainstream.

The standard riposte to critics of mainstream economics used to be that they are critical simply because they lack the mathematical skills to understand Neoclassical models, and (the argument Noah repeats here) that their papers were just verbal hand-waving that couldn't be given precise mathematical form, and therefore couldn't be tested:

Also, vague ideas can't easily be tested against data and rejected. The heart of science is throwing away models that don't work. One of mainstream macro's biggest failings is that theories that don't fit the data continue to be regarded as good and useful models. But ideas like Minsky's, with no equations or quantitative predictions, are almost impossible to reject — if they seem not to fit with events, they can simply be reinterpreted. People will forever argue about what Minsky meant, or John Maynard Keynes, or Friedrich Hayek. (Noah Smith, 8th August 2016)

"Ideas like Minsky's, with no equations"? If it's equations and Minsky you want, try this macroeconomics paper, "Destabilizing a stable crisis: Employment persistence and government intervention in macroeconomics" (Costa Lima, Grasselli, Wang & Wu 2014). And I defy any Neoclassical to tell the authors (including mathematician Matheus Grasselli, whose PhD was entitled "Classical and Quantum Information Geometry") that they lack the mathematical ability to understand Neoclassical models.

The mathematics used in heterodox papers like this one is in fact harder than that used by the mainstream, because it rejects a crucial "simplifying assumption" that mainstreamers routinely use to make their models easier to handle: imposing linearity on unstable nonlinear systems.

Imposing linearity on a nonlinear system is a valid procedure if, and only if, the equilibrium around which the model is linearized is stable. But the canonical model from which DSGE models were derived, Ramsey's 1928 optimal savings model, has an unstable equilibrium shaped like a horse's saddle. Imagine trying to drop a ball onto a saddle so that it doesn't slide off. Impossible, no?

Not if you're a "representative agent" with "rational expectations"! Neoclassical modelers assume that the "representative agents" in their models are in effect clever enough to be able to drop a ball onto the economic saddle and have it remain on it, rather than sliding off (they call it imposing a "transversality condition").

The mathematically more valid approach is to accept that, if your model's equilibria are unstable, then your model will display far-from-equilibrium dynamics, rather than oscillating about and converging on an equilibrium. This requires you to understand and apply techniques from complex systems analysis, which is much more sophisticated than the mathematics Neoclassical modelers use (see the wonderful free ChaosBook, http://www.chaosbook.org/, for details). The upside of this effort, though, is that since the real world is nonlinear, you are much closer to capturing it with fundamentally nonlinear techniques than you are by pretending to model it as if it is linear.
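
For readers who want to see what that stability test looks like in practice, here is a minimal sketch of my own in Python (the helper function and the toy two-variable system are illustrative inventions, not Ramsey's model): evaluate the Jacobian of the system at its equilibrium and inspect the eigenvalues. Mixed signs mean a saddle, so almost every trajectory eventually leaves the neighbourhood and a linearised version of the model tells you nothing about where the system actually goes.

```python
# Minimal sketch (illustrative toy system, not Ramsey's model): test whether an
# equilibrium of a nonlinear system is stable by checking the eigenvalues of the
# Jacobian evaluated at that equilibrium.
import numpy as np

def jacobian_at(f, x, eps=1e-6):
    """Numerical (forward-difference) Jacobian of f: R^n -> R^n at the point x."""
    n = len(x)
    J = np.zeros((n, n))
    fx = f(x)
    for j in range(n):
        dx = np.zeros(n)
        dx[j] = eps
        J[:, j] = (f(x + dx) - fx) / eps
    return J

# Toy nonlinear system with an equilibrium at the origin.
def f(x):
    return np.array([x[0] - x[0] * x[1],   # near the origin this component grows
                     -x[1] + x[0] ** 2])   # and this one decays

eigvals = np.linalg.eigvals(jacobian_at(f, np.array([0.0, 0.0])))
print("Eigenvalues at the equilibrium:", eigvals)
# One eigenvalue has a positive real part and one a negative real part, so the
# equilibrium is a saddle: linearising around it cannot describe the system's
# global, far-from-equilibrium behaviour.
```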

Mainstream economists are belatedly becoming aware of this mistake, as Olivier Blanchard stated recently:

These techniques however made sense only under a vision in which economic fluctuations were regular enough so that, by looking at the past, people and firms (and the econometricians who apply statistics to economics) could understand their nature and form expectations of the future, and simple enough so that small shocks had small effects and a shock twice as big as another had twice the effect on economic activity…

Thinking about macroeconomics was largely shaped by those assumptions. We in the field did think of the economy as roughly linear, constantly subject to different shocks, constantly fluctuating, but naturally returning to its steady state over time. Instead of talking about fluctuations, we increasingly used the term "business cycle." Even when we later developed techniques to deal with nonlinearities, this generally benign view of fluctuations remained dominant. (Blanchard September 2014, "Where Danger Lurks")

This is not a problem for heterodox economists, since they don't take as an article of faith that the economy is stable. But it does make it much more difficult to evaluate the properties of a model, fit it to data, and so on. For this, we need funding and time, not a casual dismissal of the current state of heterodox mathematical economics.

Can’t make forecasts?

Noah is largely correct that heterodox models aren't set up to make numerical forecasts (though there are some that are). Instead, the majority of heterodox models are set up to consider existing trends, and to assess how feasible it is that they can be maintained.

Far from being a weakness, this has been a strength of the heterodox approach: it enabled Wynne Godley to warn from as long ago as 1998 that the trends in the USA's financial flows were unsustainable, and that therefore a crisis was inevitable unless these trends were reversed. At the same time that the mainstream was crowing about "The Great Moderation", Wynne was warning that "Goldilocks is doomed".

Wynne's method was both essentially simple and at the same time impossible for the mainstream to replicate, because it considered monetary flows between economic sectors, and the stocks of debt that these flows implied. The mainstream can't do this, not because it's impossible (it's basically accounting) but because they have falsely persuaded themselves that money is neutral, and they therefore don't consider the viability of monetary flows in their models.

Dividing the economy into the government, private and international sectors, Godley pointed out that the flows between them must sum to zero: an outflow from any one sector is an inflow to one of the others. Since the public sector under Clinton was running a surplus, and the trade balance was negative, the only way this could be sustained was for the private sector to "run a deficit": to increase its debt to the banking sector. This implied unsustainable levels of private debt in the future, so that the trends that gave rise to "The Great Moderation" could not continue. As Wynne and Randall Wray put it:

It has been widely recognized that there are two black spots that blemish the appearance of our Goldilocks economy: low household saving (which has actually fallen below zero) and the burgeoning trade deficit. However, commentators have not noted so clearly that public sector surpluses and international current account deficits require domestic private sector deficits. Once this is understood, it will become clear that Goldilocks is doomed. (Godley & Wray, "Is Goldilocks Doomed?", March 2000; emphasis added)
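
To make the accounting concrete, here is a back-of-the-envelope sketch with numbers of my own choosing (they are illustrative, not Godley's figures): because the three sectoral balances must sum to zero, a government surplus combined with a current account deficit can only be accommodated by a private sector deficit.

```python
# Minimal sketch of the three-sector balance identity behind Godley's warning.
# All figures are illustrative, expressed as % of GDP.
def private_balance(government_balance, current_account_balance):
    """Identity: private + government + foreign = 0, where the foreign sector's
    balance is the mirror image of the domestic current account."""
    foreign_balance = -current_account_balance
    return -(government_balance + foreign_balance)

gov = 1.5    # government surplus of 1.5% of GDP (positive = surplus)
ca = -4.0    # current account deficit of 4% of GDP (negative = deficit)
print(private_balance(gov, ca))
# -> -5.5: the private sector must run a deficit of 5.5% of GDP, adding to its
#    debt year after year for as long as the configuration persists.
```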

Had Wynne's warnings been heeded, the 2008 crisis might have been averted. But of course they weren't: instead, mainstream economists generated numerical forecasts from their DSGE models that extrapolated "The Great Moderation" into the indefinite future. And in 2008, the US (and most of the global economy) crashed into the turning point that Wynne warned was inevitably coming.

Wynne wasn't the only heterodox economist predicting a future crisis for the US and global economy at a time when mainstream Neoclassical modellers were wrongly predicting continued economic bliss. Others following Hyman Minsky's "Financial Instability Hypothesis" made similar warnings.

Noah was roughly right, but precisely wrong, when he claimed that "Minsky, though trained in math, chose not to use equations to model the economy — instead, he sketched broad ideas in plain English."

In fact, Minsky began with a mathematical model of financial instability based on Samuelson's multiplier-accelerator model (Minsky, 1957, "Monetary Systems and Accelerator Models", The American Economic Review, 47, pp. 860–883), but later abandoned it for a verbal argument (wonkish hint to Noah: abandoning this was a good idea, because Samuelson's model is economically invalid; transform it into a vector difference equation and you'll see that its matrix is invertible).

Minsky subsequently attempted to express his model using Kalecki's mathematical approach, but never quite got there. However, that didn't stop others, including me, trying to find a way to express his ideas mathematically. I succeeded in August 1992 (with the paper being published in 1995), and the most remarkable thing about it, apart from the fact that it did generate a "Minsky crisis", was that the crisis was preceded by a period of apparent stability. That is in fact what transpired in the real world: the "Great Recession" was preceded by the "Great Moderation". This was not a prediction of Minsky's verbal model itself: it was the product of putting that verbal model in mathematical form, and then seeing how it behaved. It in effect predicted that, if a Minsky crisis were to occur, then it would be preceded by a period of diminishing cycles in inflation (with the wages share of output as a proxy for inflation) and unemployment.
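
To give a flavour of what "putting the verbal model in mathematical form" involves, here is a heavily simplified sketch in Python of a Minsky-style system of the same family: three coupled equations for the wage share, the employment rate and the private debt-to-output ratio. The functional forms and parameter values below are placeholders of my own devising, not the 1995 paper's specification, and the integration scheme is deliberately crude.

```python
# Minimal sketch of a Minsky-style model (my own placeholder functional forms and
# parameters, not the 1995 paper's): wage share w, employment rate l, debt ratio d.
import numpy as np

def phillips(l):            # wage growth responds to employment (assumed linear here)
    return 1.0 * l - 0.95

def invest(profit_share):   # investment share responds to profitability (assumed linear here)
    return 0.05 + 1.5 * profit_share

nu, alpha, beta, delta, r = 3.0, 0.02, 0.015, 0.05, 0.04
# capital-output ratio, productivity growth, population growth, depreciation, interest rate

def rhs(state):
    w, l, d = state
    profit = 1.0 - w - r * d              # profit share after wages and interest
    g = invest(profit) / nu - delta       # real growth rate implied by investment
    dw = w * (phillips(l) - alpha)        # wage share dynamics
    dl = l * (g - alpha - beta)           # employment rate dynamics
    dd = invest(profit) - profit - d * g  # debt ratio: borrowing funds investment beyond profits
    return np.array([dw, dl, dd])

# Crude forward-Euler integration, purely to show how such a model is simulated.
state = np.array([0.85, 0.90, 0.5])       # initial wage share, employment rate, debt ratio
dt, T = 0.01, 30.0
for _ in range(int(T / dt)):
    state = state + dt * rhs(state)
print("wage share, employment rate, debt ratio after %.0f years:" % T, state)
# Depending on parameters and initial conditions, models of this family either settle
# into diminishing cycles or see the debt ratio grow without bound: the pattern of
# apparent stability followed by breakdown described in the text.
```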

The behaviour was so striking that I finished my paper by noting it:

From the perspective of economic theory and policy, this vision of a capitalist economy with finance requires us to go beyond that habit of mind which Keynes described so well, the excessive reliance on the (stable) recent past as a guide to the future. The chaotic dynamics explored in this paper should warn us against accepting a period of relative tranquility in a capitalist economy as anything other than a lull before the storm. (Keen, S. 1995, "Finance and Economic Breakdown: Modeling Minsky's 'Financial Instability Hypothesis'", Journal of Post Keynesian Economics 17, p. 634)

This was before the so-called "Great Moderation" was apparent in the data, let alone before Neoclassical economists like Ben Bernanke popularised the term. This was therefore an "out of sample" prediction of my model, and of Minsky's hypothesis. Had the 2008 crisis not been preceded by such a period, my model (and, to the extent that it captured its essence, Minsky's hypothesis as well) would have been disproved. But the phenomenon that my Minsky model predicted as a precursor to crisis actually occurred, along with the crisis itself.

Even without quantitative predictions, these heterodox models (Godley's stock-flow consistent projections and my nonlinear simulations) fared far better than did Neoclassical models with all their econometric bells and whistles.

Over-fitted to the data?

It is indeed true that many heterodox models have numerous parameters, and that a judicious choice of parameter values can enable a model to closely fit the existing data, yet be useless for forecasting because it tracks the noise in the data rather than the causal trends. Of course, this is equally true of mainstream models: compare, for example, the canonical Neoclassical DSGE paper "Shocks and Frictions in US Business Cycles: A Bayesian DSGE Approach" (Smets and Wouters 2007) with the equally canonical Post Keynesian "Stock-Flow Consistent Modelling" (SFCM) paper "Fiscal Policy in a Stock-Flow Consistent (SFC) Model" from the same year (Godley and Lavoie 2007). Both are linear models, and the former has substantially more parameters than the latter.

The fact that this is a serious problem for DSGE models, and not a reason why they are superior to heterodox models, is clearly stated in a new note by Olivier Blanchard:

The models … come, however, with a very large number of parameters to estimate…, a number of parameters are set a priori, through "calibration." This approach would be reasonable if these parameters were well established empirically or theoretically… But the list of parameters chosen through calibration is typically much larger, and the evidence often much fuzzier… In many cases, the choice to rely on a "standard set of parameters" is simply a way of shifting blame for the choice of parameters to previous researchers. (Olivier Blanchard, "Do DSGE Models Have a Future?", August 2016; emphasis added)

What really matters, however, as a point of distinction between two approaches that share this same flaw, is not the flaw itself, but the different variables that the models regard as essential determinants of the economy's behaviour. There we see chalk and cheese, with the heterodox choices being far more palatable because they include financial sector variables, whereas the pre-crisis DSGE models did not.

The real problems with fitting models to data arise not from over-fitting to what ends up being noise rather than signal, but from fitting a model to real-world data when the model omits crucial determinants of what actually happens in the real world, and from developing linear models of a fundamentally nonlinear real world. The former error guarantees that your carefully fitted model will match historical data superbly, but will be wildly wrong about the future because it omits key factors that determine it. The latter error guarantees that your model can extrapolate existing trends (if it includes the main determinants of the economy) but cannot capture turning points. A linear model is, by definition, linear, and straight lines don't bend.
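
A small synthetic example of my own (the data below are made up, not drawn from either paper) illustrates both failure modes at once: a model with many free parameters chases the noise in the sample, while a linear model fitted to a series with a turning point extrapolates the pre-peak trend indefinitely.

```python
# Minimal sketch with synthetic data: over-parameterised fits chase noise, and
# linear fits cannot capture a turning point they were not trained on.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0, 40, 1.0)
series = 10 + 0.8 * t - 0.025 * t**2 + rng.normal(0, 0.5, t.size)  # turning point near t = 16

train, test = slice(0, 15), slice(15, 40)        # fit on the rising phase only

linear = np.polyfit(t[train], series[train], 1)   # 2 parameters
wiggly = np.polyfit(t[train], series[train], 10)  # 11 parameters: enough to chase the noise

def rmse(coeffs, sl):
    return np.sqrt(np.mean((np.polyval(coeffs, t[sl]) - series[sl]) ** 2))

print("in-sample RMSE:     linear %.2f, degree-10 %.2f" % (rmse(linear, train), rmse(wiggly, train)))
print("out-of-sample RMSE: linear %.2f, degree-10 %.2f" % (rmse(linear, test), rmse(wiggly, test)))
# Typically the degree-10 fit wins in sample and loses badly out of sample, while
# the linear fit misses the turning point entirely: straight lines don't bend.
```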

On the "get the variables right" issue, most modern heterodox models are superior to mainstream DSGE ones, simply because most of them include the financial system and monetary stocks and flows in an intrinsic way.

On the linearity issue, most heterodox SFCM models are linear, and are therefore as flawed as their Neoclassical DSGE counterparts. But it then comes down to how these linear models are used. In the Neoclassical case, the models are used to make numerical forecasts, and therefore they extrapolate existing trends into the future. In the heterodox case, they are used to ask whether existing trends can be sustained.

The former proclivity led DSGE modellers, such as the team behind the OECD's Economic Outlook report, to extrapolate the relative tranquillity of 1993–2007 into the indefinite future in June of 2007:

Recent developments have broadly confirmed this prognosis. Indeed, the current economic situation is in many ways better than what we have experienced in years. Against that background, we have stuck to the rebalancing scenario. Our central forecast remains indeed quite benign: a soft landing in the United States, a strong and sustained recovery in Europe, a solid trajectory in Japan and buoyant activity in China and India. In line with recent trends, sustained growth in OECD economies would be underpinned by strong job creation and falling unemployment. (Cotis 2007, p. 7; emphases added)

In contrast, Godley and Wray used the SFCM approach (without an actual model) to conclude that the 1993–2007 trends were unsustainable, and that without a change in policy, a crisis was inevitable:

We hasten to add that we do not believe this projection. The economy will not continue to grow; the projected budget surpluses will not be achieved; private sector spending will not continue to outstrip income; and growth of private sector indebtedness will not accelerate… As soon as private sector spending stops growing faster than private sector income, GDP will stop growing. (Godley & Wray, "Is Goldilocks Doomed?", March 2000, p. 204)

This leads to Noah's next false point: that Neoclassical models do what heterodox ones do anyway.

We’re doing it anyway?

There are three main strands in heterodox macro modelling: what is known as "Stock-Flow Consistent Modelling" (SFCM), which was pioneered by Wynne Godley; nonlinear system dynamics modelling; and heterogeneous multi-agent modelling (there are other approaches too, including structurally estimated models and big data systems, but these are the main ones). Noah made a strong claim about the stock-flow consistent strand and Neoclassical modelling:

Some heterodox macroeconomists, it's true, do have quantitative theories. One is "stock-flow consistent" models (a confusing name, since mainstream models also maintain consistency between stocks and flows). These models, developed mainly by researchers at the Levy Economics Institute of Bard College, are large systems of many equations, usually linear equations — for an example, see this paper by Levy economists Dimitri B. Papadimitriou, Gennaro Zezza and Michalis Nikiforos. (Noah Smith, 8th August 2016)

I agree the name is confusing; perhaps it would be better if the name were "Monetary Stock-Flow Consistent Models". With that clarification, there is no way that Neoclassical DSGE models are stock-flow consistent in a monetary sense.

Even after the crisis, most Neoclassical DSGE models don't include money or debt in any intrinsic way (the financial sector turns up as another source of "frictions" that slow down a convergence to equilibrium), and they certainly don't treat the outstanding stock of private debt as a major factor in the economy.

Heterodox SFCM models do include these monetary and debt flows, and therefore the stocks as well. A trend, like that from the mid-1990s until 2000, that requires private debt to rise faster than GDP indefinitely will be identified as a problem for the economy by a heterodox SFCM model, but not by a Neoclassical DSGE one. A Neoclassical author who believes the fallacious Loanable Funds model of banking is also likely to wrongly conclude that the level of private debt is irrelevant (except perhaps during a crisis).
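
A trivial compounding exercise (with numbers of my own choosing) shows why such a trend is self-terminating: if private debt grows faster than nominal GDP, the debt-to-GDP ratio rises without limit, which is precisely the kind of warning signal a monetary stock-flow consistent model surfaces.

```python
# Back-of-the-envelope sketch (illustrative growth rates): private debt growing
# faster than nominal GDP means the debt-to-GDP ratio compounds without limit.
debt, gdp = 1.0, 1.0                  # start with private debt equal to one year's GDP
debt_growth, gdp_growth = 0.10, 0.05  # 10% p.a. debt growth vs 5% p.a. nominal GDP growth
for year in range(1, 31):
    debt *= 1 + debt_growth
    gdp *= 1 + gdp_growth
    if year % 10 == 0:
        print("year %2d: debt/GDP = %.2f" % (year, debt / gdp))
# The ratio compounds at (1.10 / 1.05) per year, reaching roughly four times GDP
# after 30 years and never stabilising, so the trend must eventually break.
```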

This makes heterodox SFCM models, such as the Kingston Financial Balances Model (KFBM) of the US economy produced by researchers at my own Department, very different to mainstream DSGE models.

No decent results from Agent-Based Models?

Noah concludes with the statement that what is known as "Agent-Based Modelling" (ABM), which is very popular in heterodox circles right now, hasn't yet produced robust results:

A second class of quantitative heterodox models, called "agent-based models," have gained some attention, but so far no robust, reliable results have emerged from the research program. (Noah Smith, 8th August 2016)

Largely speaking, this is true, if you want to use these models for macroeconomic forecasting. But they are useful for illustrating an issue that the mainstream avoids: "emergent properties". A population, even of very similar entities, can generate results that can't be extrapolated from the properties of any one entity taken in isolation. My favourite example here is what we commonly call water. There is no such thing as a "water molecule", or a "steam molecule", let alone a "snowflake molecule". All these peculiar and, to life, essential features of H2O are "emergent properties" of the interaction of large numbers of H2O molecules under different environmental conditions. None of them are properties of a single molecule of H2O taken in isolation.
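
A minimal example of emergence that anyone can run: the threshold-cascade setup below is a textbook construction of my own choosing (not drawn from Noah's work or mine), in which two populations with the same average agent produce radically different aggregate outcomes. That is exactly why a single "representative agent" cannot stand in for the population.

```python
# Minimal agent-based sketch of an emergent property: each agent acts once enough
# others are already acting. Two populations with the same *average* threshold
# behave completely differently in aggregate.
import numpy as np

def cascade_size(thresholds):
    """Agents act once the number already acting meets their personal threshold."""
    acting = 0
    while True:
        new_total = int(np.sum(thresholds <= acting))  # everyone whose threshold is now met
        if new_total == acting:
            return acting
        acting = new_total

uniform = np.arange(100)                           # thresholds 0, 1, 2, ..., 99 (mean 49.5)
clustered = np.full(100, 50)
clustered[0] = 0                                   # one instigator, everyone else needs 50 (mean 49.5)

print("uniform thresholds   -> %d of 100 agents act" % cascade_size(uniform))    # full cascade: 100
print("clustered thresholds -> %d of 100 agents act" % cascade_size(clustered))  # fizzles out: 1
```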

Neoclassical economists unintentionally proved this about isolated consumers as well, in what is known as the Sonnenschein-Mantel-Debreu theorem. But they have sidestepped its results ever since.

The theorem establishes that even if an economy consists entirely of rational utility maximizers who each, taken in isolation, can be shown to have a downward-sloping individual demand curve, the market demand curve for any given market can theoretically take any polynomial shape at all:

Can an arbitrary continuous function … be an excess demand function for some commodity in a general equilibrium economy?… we prove that every polynomial … is an excess demand function for a specified commodity in some n commodity economy… every continuous real-valued function is approximately an excess demand function. (Sonnenschein, 1972, "Market Excess Demand Functions", Econometrica 40, pp. 549–563, at pp. 549–550)

Alan Kirman suggested the proper reaction to this discovery almost 30 years ago: that the decision by the Neoclassical school, at the time of the second great controversy over economic theory, to abandon class-based analysis was unsound. Since even such a basic concept (to the Neoclassical school) as a downward-sloping demand curve could not be derived by extrapolating from the properties of an isolated individual, the only reasonable procedure was to work at the level of groups with "collectively coherent behaviour", what the Classical School called "social classes":

If we are to progress further we may well be forced to theorise in terms of groups who have collectively coherent behaviour. Thus demand and expenditure functions if they are to be set against reality must be defined at some reasonably high level of aggregation. The idea that we should start at the level of the isolated individual is one which we may well have to abandon. There is no more misleading description in modern economics than the so-called microfoundations of macroeconomics which in fact describe the behaviour of the consumption or production sector by the behaviour of one individual or firm. If we aggregate over several individuals, such a model is unjustified. (Kirman, A. 1989, "The Intrinsic Limits of Modern Economic Theory: The Emperor Has No Clothes", Economic Journal 99(395), pp. 126–139)

Instead of taking this sensible route, Neoclassical economists (mainly without consciously realising it) took the approach of making the absurd assumption that the entire economy could be treated as a single individual, in the fiction of a "representative agent".

Mendacious textbooks played a large role here, which is why I say that they did this without realising that they were doing so. Most of today's influential Neoclassical economists would have learnt their advanced micro from Hal Varian's textbook. Here's how Varian "explained" the Sonnenschein-Mantel-Debreu results:

"…it is sometimes convenient to think of the aggregate demand as the demand of some 'representative consumer'… The conditions under which this can be done are rather stringent, but a discussion of this issue is beyond the scope of this book…" (Varian 1984, p. 268)

The "convenience" of the "representative consumer" led directly to Real Business Cycle models of the macroeconomy, and thence to DSGE, which Neoclassicals are now beginning to realise was a monumental blunder.

Multi-agent modelling may not lead to a new policy-oriented theory of macroeconomics. But it acquaints those who do it with the phenomenon of emergent properties: that an aggregate does not function as a scaled-up version of the entities that comprise it. That's a lesson that Neoclassical economists still haven't absorbed.

Previous periods of crisis in economic theory

Since I began this post by calling the current debate the "5th great conflict over the nature of economics", I'd better detail the first three (the 4th being Keynes's battle in the 1930s). These were:

  • The "Methodenstreit" dispute between the Austrian and German Historical Schools, which was a dispute about a priori reasoning versus empirical data;
  • The Neoclassical revolt against the Classical school after Marx had turned the latter into the basis for a critique of capitalism, rather than a defence of it as it had been with Smith and Ricardo; and
  • The event that I personally identify as the real point at which economics went wrong: Smith's elevation of "the division of labour" to the source of rising productivity in capitalism, over the Physiocratic argument that human productivity actually emanated from employing the energy of the Sun. Though the Physiocrats were wrong that agriculture was the only "productive" sector (manufacturing being "sterile" according to them, since all it did was transform the outputs of agriculture into different forms, when in fact it harnesses "free energy" from the Sun, fossil fuels and nuclear processes to do useful work even more effectively than agriculture does), they were right that harnessing free energy is the basis of the productivity of capitalism.

I'll address this last, wonkish issue in a future post.

