Tyler Cowen has flagged this NBER paper with the comment "This kind of macro theory is underrated" (the page of one of the authors has a link to an ungated version). The original title is "Demand Shocks as Technology Shocks," and the authors are Yan Bai (University of Rochester), José-Víctor Ríos-Rull (University of Pennsylvania), and Kjetil Storesletten (University of Minnesota).

In the standard neoclassical model, output is a function of inputs such as labor and capital. There is no explicit role for demand because potential consumers are always available and Walrasian prices adjust so that all produced goods become used. In reality, customers and producers must meet in order for the produced good to be consumed, so value added depends on how well they are matched. As an example, consider a restaurant. According to the neoclassical view, the value added of a restaurant should be a function of its inputs (employees, tables, etc.), irrespective of the number of patrons and how hungry they are. Moreover, the restaurant owner would set prices so that all tables were in use. However, actual production takes place only when customers show up. The more customers demand the restaurant’s meals, the larger the value added will be. The idea that the demand for goods plays a direct role extends to many forms of production: dentists need patients, car dealers need shoppers, all producers need buyers.
This paper provides a theory where search for goods —which we, with some abuse of terminology, refer to as demand— has a productive role. The starting point is that customers search for producers, and a standard matching friction prevents Walrasian market clearing in the sense that all potential productive capacity necessarily translates into actual value added. Allowing an explicit role for demand has implications for business cycle analysis, especially for our understanding of the driving forces of business cycles. In our model, changes in search effort affect output even if conventional inputs remain constant. Demand shocks therefore influence the measured aggregate TFP. This paper quantifies how important this mechanism is for aggregate fluctuations, relative to more standard business-cycle shocks.
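As a stylized numerical illustration of the mechanism (a toy Cobb-Douglas matching function invented here for exposition, not the paper's actual model or calibration), one can see how higher customer search effort raises measured TFP even when conventional inputs are held fixed:

```python
def measured_tfp(search_effort, capacity, labor, alpha=0.5):
    """Toy illustration (NOT the paper's model): value added equals the
    number of matches between searching customers and productive capacity,
    here a Cobb-Douglas matching function capped at capacity. Measured
    "TFP" is output divided by the input bundle (labor only, for
    simplicity), so it rises with demand-side search effort even though
    inputs are unchanged."""
    matches = min(search_effort ** alpha * capacity ** (1 - alpha), capacity)
    output = matches
    return output / labor  # Solow residual when labor is the only input

low = measured_tfp(search_effort=0.5, capacity=1.0, labor=1.0)
high = measured_tfp(search_effort=0.9, capacity=1.0, labor=1.0)
# Higher search effort yields higher measured TFP with identical inputs.
```

The point of the sketch is only that a demand shock (more search) shows up in the residual that conventional growth accounting would label technology.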


Our paper is consistent with Keynes’ idea that consumer demand can have real effects. We show that this holds true even in a neoclassical model with flexible prices, amended with a product market matching friction.






Summers and his coauthors have posted this NBER paper (H/T Tyler Cowen). The original title is "The Cost of Money is Part of the Cost of Living: New Evidence on the Consumer Sentiment Anomaly," and the authors are Marijn A. Bolhuis (IMF), Judd N. L. Cramer (Harvard University), Karl Oskar Schulz (Harvard University), and Lawrence H. Summers (Harvard University).

In new NBER paper with @MA_Bolhuis, @juddcramer and Oskar Shulz, we argue that the unprecedented increase in borrowing costs is crucial to explaining the low consumer sentiment of the last two years. 1/N
With higher rates, mortgage payments, car payments, and other credit payments required to finance everyday purchases have risen as well. It is not surprising that this would affect how consumers feel about the economy. 2/N
Since Okun invented his misery index in the 1970s, economists have looked at unemployment and the inflation rate to gauge consumer sentiment. But now that unemployment is low and inflation has declined, consumer sentiment remains depressed. 3/N
Pre-1983, mortgage costs were in the CPI as were car payments pre-1998. Now, price indexes do not include borrowing costs. Thus, when interest rates jumped last year, official inflation did not fully capture the effects it would have on consumer well-being. 4/N
In the paper, we show that the variation in the current University of Michigan Index of Consumer Sentiment, which cannot be explained by official inflation and unemployment, has historically shown a strong correlation with proxies for borrowing costs. 5/N
We also show that the underlying questions in the survey provide direct evidence that concerns of consumers about borrowing costs are at historic highs, surpassed only by the Volcker-era. 6/N
We then develop alternative CPI measures that explicitly incorporate the cost of money. The CPI excludes not only mortgage costs but also personal interest payments, which increased by more than 50 percent in 2023. 7/N
We show that if we make an effort to reconstruct the CPI of Okun’s era (which would have had inflation peak last year at around 18%), we are able to explain 70% of the gap in consumer sentiment we saw last year. 8/N
We also show the sentiment gap in 2023 was not only a U.S. phenomenon as rates have jumped around the world. Overall, our paper highlights how consumers care about the cost of money, with potential for consumer sentiment to rise significantly if and when interest rates decline. 9/9
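As a toy illustration of the idea in the thread of folding borrowing costs back into a price index (the 10% expenditure weight and all numbers below are invented for illustration, not the paper's estimates):

```python
def augmented_inflation(official_cpi_inflation, borrowing_cost_inflation, weight=0.1):
    """Blend official CPI inflation with the growth rate of household
    borrowing costs, using an assumed expenditure weight for the cost of
    money. The default 10% weight is purely illustrative."""
    return (1 - weight) * official_cpi_inflation + weight * borrowing_cost_inflation

# Illustrative 2023-style numbers: 4% official inflation combined with a
# 50% rise in interest payments pushes measured inflation toward 8.6%.
print(augmented_inflation(4.0, 50.0, weight=0.1))
```

Even a small weight on a fast-rising borrowing-cost component can move the blended index well above the official figure, which is the paper's core intuition for the sentiment gap.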








Mostly Economics has highlighted this Journal of Economic Perspectives paper. The original title is "The Failure of Silicon Valley Bank and the Panic of 2023," and the author is Andrew Metrick (Yale University).

The failure of Silicon Valley Bank on March 10, 2023 brought attention to significant weaknesses across the banking system, leading to a panic that spread to other vulnerable banks. With subsequent failures of Signature Bank and First Republic Bank, the United States had three of the four largest bank failures in its history occur over a two-month period. Several features of the Silicon Valley Bank failure make it an ideal teaching case for explaining the underlying economics of banking (in general) and banking crises (specifically). This paper tries to do that.

In the conclusion, Metrick explains that bank solvency is what lets deposits play a money-like role, and that the cost to depositors of switching banks gives banks monopoly power. Solvency and liquidity therefore reinforce each other, and this is the source of bank profits; but once solvency is called into question, the doubts become self-reinforcing and push the bank toward failure, as happened with Silicon Valley Bank. It might seem that simply expanding deposit insurance would prevent this, and indeed the Transaction Account Guarantee (TAG), an unlimited guarantee of non-interest-bearing deposits that the FDIC introduced temporarily during the Lehman crisis, is regarded as a success. But Metrick argues that making such policies permanent poses several difficulties. For one, interest rates were near zero when TAG was introduced, so depositors had little incentive to move their money; that is not true in today's high-rate environment. For another, if deposit insurance were extended to interest-bearing deposits, banks would predictably use every trick available to bring all manner of financial products under deposit insurance. Finding the right level of deposit insurance is therefore hard, and that, Metrick concludes, is the challenge for bank regulation.


Mostly Economics has highlighted this SSRN paper. The original title is "Douglass North, New Institutional Economics, and Complexity Theory," and the author is John B. Davis (University of Amsterdam).

Douglass North was central to the emergence of New Institutional Economics. Less well known are his later writings where he became interested in complexity theory. He attended the second economics complexity conference at the Santa Fe Institute in 1996 on how the economy functions as a complex adaptive system, and in his 2005 Understanding the Process of Economic Change incorporated this thinking into his argument that market systems depend on how institutions evolve. North also emphasized in the 2005 book the role belief played in evolutionary processes, and drew on cognitive science, especially the famous ‘scaffolding’ idea of cognitive scientist Andy Clark – the idea that the brain and the world ‘collaborate’ to address our computational and informational needs. This chapter discusses how North’s thinking about institutions and change reflected these later investigations. It concludes with comments on his late thoughts about the problem of violence.


Mostly Economics has highlighted this short ECB piece. The original title is "Is the PMI a reliable indicator for nowcasting euro area real GDP?", and the authors are Gabe de Bondt and Lorena Saiz.

The euro area composite output Purchasing Managers' Index (PMI) tends to be strongly correlated with real GDP growth (Chart A). The composite output PMI is a diffusion index, which measures the sum of the percentage of month-on-month “higher” output responses and half the percentage of “no output change” responses. The PMI survey output question asks about the actual unit volume of output this month compared to the previous month. It indicates the degree to which output changes are diffused throughout the panel of respondents and has a no-change benchmark of 50. A simple PMI-based rule of thumb, hereafter referred to as the PMI-based tracker rule, calculates euro area quarterly real GDP growth as 10% of the quarterly average level of the composite output PMI from which a value of 50 is subtracted. This rule-of-thumb exhibited a good nowcasting performance during the pre-coronavirus (COVID-19) period. However, since the composite output PMI is a diffusion index, it provides information on the extensive margin of change (the number of firms that reported a change in output) but not on the intensive margin of change (the amount by which output changed). It implies that in periods of extreme volatility in output, such as during the COVID-19 pandemic, the level of the composite output PMI might become less informative. Another limitation of the composite output PMI is the incomplete sector coverage; the index is a weighted average of the services business activity PMI and the manufacturing output PMI, while other important sectors such as retail, construction and government are missing. Moreover, the euro area composite output PMI is based solely on the four largest euro area countries and Ireland.
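The tracker rule described in the excerpt reduces to a one-line formula. A minimal Python sketch (the monthly readings below are invented for illustration, not actual PMI data):

```python
def pmi_tracker_nowcast(monthly_pmi):
    """Nowcast quarterly real GDP growth (in %) from the monthly composite
    output PMI readings of a quarter, using the rule of thumb described in
    the text: 10% of the quarterly average PMI level minus 50."""
    quarterly_avg = sum(monthly_pmi) / len(monthly_pmi)
    return 0.1 * (quarterly_avg - 50)

# A quarter whose PMI readings average 54 implies ~0.4% q-o-q growth.
print(pmi_tracker_nowcast([53.0, 54.0, 55.0]))  # → 0.4
```

The 50 benchmark is the diffusion index's no-change point, so readings above (below) it map into positive (negative) nowcast growth.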


Chart A
Euro area composite output PMI and real GDP growth
(left axis: quarterly percentage changes, right axis: diffusion index)

Sources: Eurostat, S&P Global and ECB calculations.
Notes: The two y-axis scales reflect the PMI-based tracker rule, which is calculated as 10% of the quarterly average of the composite output PMI minus 50.


Information derived from composite and sectoral PMIs plays an important role for the mechanical short-term GDP forecasting tools used by ECB and Eurosystem staff. The ECB short-term mechanical forecasting models include basic linear regressions, which directly link quarterly averages of monthly PMI data with real GDP. These regressions are known as bridge equations because GDP predictors bridge the gap between earlier available higher frequency data, such as industrial production, and quarterly GDP. The GDP predictors are, in turn, forecast with satellite models using sectoral PMIs, among other monthly indicators. Overall, compared with other indicators, PMIs tend to have a relatively high weight in the forecasting models due to their timeliness.
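To make the idea of a bridge equation concrete, here is a hedged sketch: a one-regressor OLS of quarterly GDP growth on the quarterly average PMI, fitted with the closed-form simple-regression formulas on invented numbers (the actual ECB tools use richer specifications and satellite models):

```python
def fit_bridge_equation(pmi_quarterly_avg, gdp_growth):
    """Fit gdp_growth = a + b * (pmi - 50) by ordinary least squares,
    using the closed-form slope/intercept of simple regression."""
    x = [p - 50 for p in pmi_quarterly_avg]
    y = gdp_growth
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# Made-up training sample roughly consistent with the 0.1-per-point rule:
pmi = [52.0, 54.0, 48.0, 56.0]
gdp = [0.2, 0.4, -0.2, 0.6]
a, b = fit_bridge_equation(pmi, gdp)
nowcast = a + b * (53.0 - 50)  # nowcast for a quarter averaging PMI 53
```

The "bridge" aspect is simply that the monthly PMI is averaged up to quarterly frequency so it can sit on the right-hand side of a quarterly GDP regression.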

Chart B
Forecast accuracy of the ECB’s PMI-based short-term forecasting tool
(in percentage points)

Sources: ECB calculations.
Notes: The chart shows the root mean squared forecast error (RMSFE) and the bias, which is defined as the average difference between the forecast and the outcome. The forecasts use real-time data and are made two weeks before the official flash estimate of GDP is released by Eurostat and are evaluated against it.


Chart C
Root mean squared forecast errors in nowcasting euro area real GDP growth based on the latest GDP vintage
(in percentage points)

Sources: ECB, S&P Global and ECB staff calculations.
Notes: The numbers “3”, “2” and “1” represent the number of months before the release of the first GDP estimate. The real-time nowcast from the ECB/Eurosystem staff macroeconomic projections is available around two months before the first GDP estimate. The forecast errors are calculated using the latest available GDP vintage (19 January 2024) as a target. 2023 is based on the first three quarters of 2023 only because the first GDP vintage for the fourth quarter of 2023 was released after the cut-off date of this Bulletin.



David Autor (MIT) has posted this NBER paper (original title: "Applying AI to Rebuild Middle Class Jobs") (H/T Mostly Economics; cf. his own tweet about a NOEMA article with the same content).

While the utopian vision of the current Information Age was that computerization would flatten economic hierarchies by democratizing information, the opposite has occurred. Information, it turns out, is merely an input into a more consequential economic function, decision-making, which is the province of elite experts. The unique opportunity that AI offers to the labor market is to extend the relevance, reach, and value of human expertise. Because of AI’s capacity to weave information and rules with acquired experience to support decision-making, it can be applied to enable a larger set of workers possessing complementary knowledge to perform some of the higher-stakes decision-making tasks that are currently arrogated to elite experts, e.g., medical care to doctors, document production to lawyers, software coding to computer engineers, and undergraduate education to professors. My thesis is not a forecast but an argument about what is possible: AI, if used well, can assist with restoring the middle-skill, middle-class heart of the US labor market that has been hollowed out by automation and globalization.

The following passages are quoted at second hand from the excerpts in the Mostly Economics post.

Most “experts” of our era would be at a loss if teleported back to the 18th century. Prior to the Industrial Revolution, goods were handmade by skilled artisans: wagon wheels by wheelwrights; clothing by tailors; shoes by cobblers; timepieces by clockmakers; firearms by blacksmiths. Artisans spent years acquiring at least two broad forms of expertise: procedural expertise, meaning following highly practiced steps to produce an outcome; and expert judgment, meaning adapting those procedures to variable instances.

Although artisanal expertise was revered, its value was ultimately decimated by the rise of mass production in the 18th and 19th centuries (Hounshell, 1984). Mass production meant breaking the complex work of artisans into discrete*1, self-contained and often quite simple steps that could be carried out mechanistically by a team of production workers, aided by machinery and overseen by managers with higher education levels. Mass production was vastly more productive than artisanal work, but conditions for rank-and-file workers were typically hazardous and grueling, requiring no specialized expertise beyond a willingness to labor under punishing conditions for extremely low pay.

As the tools, processes and products of modern industry gained sophistication, demand for a new form of worker expertise — “mass expertise” — burgeoned (Goldin and Katz, 1998; Buyst et al., 2018). Workers operating and maintaining complex equipment required training and experience in machining, fitting, welding, processing chemicals, handling textiles, dyeing and calibrating precision instruments, etc. Away from the factory floor, telephone operators, typists, bookkeepers and inventory clerks served as information conduits — the information technology of their era.

Stemming from the innovations pioneered during World War II, the Computer Era (AKA the Information Age) ultimately extinguished much of the demand for mass expertise that the Industrial Revolution had fostered. The unique power of the digital computer, relative to all technologies that preceded it, was its ability to cheaply, reliably and rapidly execute cognitive and manual tasks encoded in explicit, deterministic rules, i.e., what economists called “routine tasks” and what software engineers call programs.

Like the Industrial and Computer revolutions before it, Artificial Intelligence marks an inflection point in the economic value of human expertise. To appreciate why, consider what distinguishes AI from the computing era that we’re now leaving behind. Pre-AI, computing’s core capability was its faultless and nearly costless execution of routine, procedural tasks. Its Achilles’ heel was its inability to master non-routine tasks requiring tacit knowledge. Artificial Intelligence’s capabilities are precisely the inverse.

Artificial Intelligence is this inversion technology. By providing decision support in the form of real-time guidance and guardrails, AI could enable a larger set of workers possessing complementary knowledge to perform some of the higher-stakes decision-making tasks currently arrogated to elite experts like doctors, lawyers, coders and educators. This would improve the quality of jobs for workers without college degrees, moderate earnings inequality, and — akin to what the Industrial Revolution did for consumer goods — lower the cost of key services such as healthcare, education and legal expertise.

*1: The Mostly Economics quote has "discreet" here, but the NOEMA article has "discrete," so the latter is used.



This NBER paper has been posted (H/T Mostly Economics; ungated version). The original title is "On Digital Currencies," and the author is Harald Uhlig (University of Chicago).

I discuss private and central-bank-issued digital currencies, summarizing my prior research. I argue that prices of private digital currencies such as bitcoin follow random walks or, more generally, risk-adjusted martingales. For central bank digital currencies, I argue that they give rise to the “CBDC trilemma” facing a central bank: out of the three objectives, price stability, efficiency, and monetary trust, it can achieve at most two.

  P_t y_t = D_t + Q_t B_t


  • t=0: A continuum of agents on [0, 1] is each endowed with one unit of the real good. Agents sell the good to the central bank in exchange for M units of money or CBDC, and the central bank invests the real good.
  • t=1: Each agent learns whether it is impatient (probability λ) or patient (probability 1−λ). Impatient agents spend M in this period; patient agents spend in this period or the next. The central bank observes the fraction n of agents who spent (where λ ≤ n ≤ 1) and sells a share y = y(n) ∈ [0,1] of the invested good at the market-clearing price P1.
  • t=2: Agents who carried their spending over to this period receive the interest rate i(n) and spend (1 + i(n))M. Each unit of the real good has turned into R units, and the central bank sells its remaining holdings R(1 − y) at the market-clearing price P2.


  1. Price stability: P1(n) is constant regardless of the state n.
  2. Efficiency: socially optimal risk sharing is achieved. The condition for this is y(λ) = ȳ = λx̄1 (where x̄1 is the amount of the real good consumed at t=1 by impatient agents in the model without money; in that model the social planner solves max λu(x1) + (1−λ)u(x2) s.t. λx1 + (1−λ)x2/R = 1, and x̄1 is the optimal value of x1).
  3. Monetary trust: runs on the central bank and destabilization of the currency are avoided.



  P1(n) = nM / y(n)
  P2(n) = {(1−n)(1+i(n))M} / {R(1 −y(n))}
  x1(n) = M / P1 = y(n) / n
  x2(n) = (1+i(n))M / P2 = {1−y(n)} R / {1−n}
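The four market-clearing equations above are easy to check numerically. A small sketch (the particular policy functions y(n) and i(n) below are arbitrary illustrations, not derived from the paper):

```python
def cbdc_allocations(n, M, R, y, i):
    """Evaluate the model's market-clearing prices and consumption
    allocations for a state n, money stock M, investment return R, and
    central-bank policy functions y(n) (share of goods sold at t=1) and
    i(n) (interest paid to agents who spend at t=2)."""
    P1 = n * M / y(n)
    P2 = (1 - n) * (1 + i(n)) * M / (R * (1 - y(n)))
    x1 = y(n) / n                  # equals M / P1
    x2 = (1 - y(n)) * R / (1 - n)  # equals (1 + i(n)) * M / P2
    return P1, P2, x1, x2

# Illustrative policy: sell in proportion to observed spending, pay no interest.
P1, P2, x1, x2 = cbdc_allocations(n=0.4, M=1.0, R=1.1,
                                  y=lambda n: 0.5 * n, i=lambda n: 0.0)
```

Note that x1 and x2 are written two ways in the text; the code computes the goods-market side and the comments record the money-market side, which coincide by construction of the market-clearing prices.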