Financial Services Indices: Liquidity and Economic Activity
In honour of Oswald Distinguished Professor William A Barnett
BROCHURE



CONFERENCE LOCATION: Bank of England, Threadneedle Street, London EC2R 8AH

AUTHOR: Financial Resilience Research Cluster, Birmingham Business School, University of Birmingham, Edgbaston, Birmingham B15 2TT

CENTER FOR FINANCIAL STABILITY: Bold. Innovative. Practical.


Financial Services Indices: Liquidity and Economic Activity
CONFERENCE SCHEDULE

Day 1: 23 May 2017

8.50–9.20am Registration and Coffee
9.20–9.30am Conference Opens and Welcome Address: Jane Binner
9.30–11.30am SESSION 1: Liquidity and Monetary Policy. Chair: David Aikman

PAPER 1: Robert Aliber, 'An essay on monetary turbulence and the supply of liquidity'

PAPER 2: Tim Congdon, 'What were the causes of the Great Recession? The importance of the "which aggregate" debate'

PAPER 3: Singh, 'The role of pledged collateral in liquidity metrics and monetary policy'

PAPER 4: Aikman, Lehnert, Liang and Modugno, 'Credit risk appetite and monetary policy transmission'

11.30–11.45am Break
11.45am–1.15pm SESSION 2: Producing Liquidity and Excess Liquidity. Chair: Jan Willem Van den End

PAPER 5: Kevin Fox and Erwin Diewert, 'The demand for monetary balances and the measurement of productivity'

PAPER 6: Dennis Fixler and Kim Zieschang, 'Producing liquidity'

PAPER 7: Jan Willem Van den End, 'Applying complexity theory to interest rates: Evidence of critical transitions in the Euro area'

1.15–2.15pm Lunch
2.15–2.30pm Conference Group Photo, steps of the Bank of England
2.30–3.30pm Keynote Lecture: Oswald Distinguished Professor William A Barnett, 'Unsolved problems in monetary aggregation and transmission'
3.30–3.45pm Break
3.45–5.45pm SESSION 3: Liquidity Creation and Macroeconomic Policy. Chair: Jean-Loup Soula

PAPER 8: Bowe, Kolokolova and Michalski, 'Too big to care, too small to matter: macrofinancial policy and bank liquidity creation'

PAPER 9: Goldberg, 'The supply of liquidity and real economic activity'

PAPER 10: Bezemer and Lu Zhang, 'Macroeconomic implications of liquidity creation: credit allocation and post-crisis recessions'

PAPER 11: Hasan and Soula, 'Technical efficiency in bank liquidity creation'

5.45–6.00pm Break
6.00–7.00pm Poster Competition, Atrium, Bank of England, with light refreshments and wine
7.45pm Conference Dinner, The Counting House, 50 Cornhill, London EC3V 3PD


Day 2: 24 May 2017

9.30–10.30am SESSION 4: Financial Services Indices, Liquidity and Economic Activity. Chair: Jane Binner

PAPER 12: Richard Anderson, John Duca and Barry Jones, 'A broad monetary services (liquidity) index and its long-term links to economic activity'

PAPER 13: John Keating and Lee Smith, 'The optimal monetary instrument and the (mis)use of Granger causality'

PAPER 14: Jane Binner, Logan Kelly and Jon Tepper, 'On the robustness of sluggish state-based neural networks for providing useful insight into the new Keynesian Phillips curve'

10.30–10.45am Break
10.45am–12.45pm SESSION 5: Structural Change, Volatility and Shocks. Chair: Taniya Ghosh

PAPER 15: Rakesh Bissoondeeal, Michael Karaglou and Jane Binner, 'Structural changes and the role of money in the UK'

PAPER 16: Costas Milas and Michael Ellington, 'Identifying aggregate liquidity shocks in conjunction with monetary policy shocks: An application using UK data'

PAPER 17: Makram El Shagi and Logan Kelly, 'What can we learn from country level liquidity in the EMU?'

PAPER 18: Bhadury and Ghosh, 'Has money lost its relevance? Determining a solution to the exchange rate disconnect puzzle in the small open economies'

12.45–1.45pm Lunch
1.45–2.45pm Keynote Address: Lawrence Goodman, Centre for Financial Stability, New York, 'Preventative macroprudential policy'
2.45–3.00pm Break
3.00–5.00pm SESSION 6: Tests of Index Numbers, Separability and Price Duals. Chair: Victor Valcarcel

PAPER 19: William A Barnett and Jinan Liu, 'User cost of credit card services under intertemporal non-separability'

PAPER 20: Per Hjertstrand, Gerald Whitney and James Swofford, 'Panel data tests of index numbers and revealed preference rankings'

PAPER 21: Sajid Chaudhry, Jane Binner, James Swofford and Andrew Mullineux, 'Scotland as an optimum currency area'

PAPER 22: Victor Valcarcel, 'Interest rate pass-through: Divisia user costs of monetary assets and the Federal funds rate'

5.00pm Conference Close
6.00pm Editorial Board Meeting, Guest Editors Special Issue Meeting


Financial Services Indices: Liquidity and Economic Activity
PAPER ABSTRACTS

PAPER 1: Robert Aliber, University of Chicago, 'AN ESSAY ON MONETARY TURBULENCE AND THE SUPPLY OF LIQUIDITY'

Abstract: The last 35 years have been the most turbulent in monetary history. There have been more than 100 banking crises, many in one of four waves, and most of these crises have been 'twinned' with currency crises. Moreover, the deviations between the market prices of currencies have been much larger than ever before. The purpose of this essay is to explain why there have been so many banking crises and why they have often occurred together with currency crises. The answer is that the floating currency arrangement is inherently unstable, because an increase in cross-border investment flows to a country leads to an increase in the price of its securities and an increase in the price of its currency. The essay is based on a general equilibrium view that links the market in currencies with the markets in bonds, stocks and real estate: the increase in cross-border investment inflows to a country leads to an increase in household wealth as an integral part of the adjustment process, to ensure that the country's current account deficit increases as its capital account surplus increases; otherwise the market in the country's currency would not clear. The increase in the country's external indebtedness is much more rapid than the increase in its GDP. When the lenders recognise that the indebted country is on a non-sustainable trajectory, their demand for the borrowers' IOUs declines, and the price of its securities and the price of its currency decline. A banking crisis may follow if the decline in the price of the currency is large, since it leads immediately to a sharp increase in the total indebtedness of the borrowers with liabilities denominated in a foreign currency.

When currencies are no longer anchored to parities, each central bank has much more monetary independence, and investors have much more incentive to change the currency composition of the securities in their portfolios. Changes in investor demand for foreign securities lead to much more price risk in both the markets for currencies and the markets for securities. There has been a scissors-like movement in the market for liquidity: the 'demand' for liquidity by traders and investors has increased, while the supply of liquidity has declined because the price risks are much larger.

Keywords: banking crises, market efficiency, market failure, flexible exchange rates, the transfer problem process

JEL Codes: E44, F31, F32, F33, F34, F38


PAPER 2: Professor Tim Congdon CBE, Chairman, Institute of International Monetary Research at the University of Buckingham, 'WHAT WERE THE CAUSES OF THE GREAT RECESSION? THE IMPORTANCE OF THE "WHICH AGGREGATE" DEBATE'

Abstract: Monetary economists have long debated which measure of the quantity of money is the most useful in macroeconomic analysis. Before, during and after the Great Recession (to be understood, roughly speaking, as the six quarters to mid-2009), the growth rates of different money aggregates diverged sharply in the leading economies. The 'which aggregate' debate was therefore of particular importance. The focus in this paper is on the USA's experience, although the behaviour of money in other economies, and its relationship to prices and spending in them, is mentioned where relevant and interesting.

The argument is that broadly-defined money has long been the correct concept to use in interpreting macroeconomic developments, and that its merits became clear in the Great Recession and its aftermath. Broad money is to be viewed as including virtually all money-like assets (and certainly most of the deposit liabilities of the commercial banking system). For the purposes of the paper, broad money is identified with the M3 money measure, for which the Federal Reserve prepared data until early 2006. It will be shown that, with its flow-of-funds data, the Federal Reserve is still publishing information that enables an approximate M3 aggregate to be estimated. Further, the M3 money holdings of the US economy's main sectors (households, non-financial business and financial business) can be tracked from the flow-of-funds numbers.

The last decade has seen the lowest average annual percentage increase in nominal GDP since the 1930s. The growth rate of M3 has also been the lowest since the 1930s. This similarity of the rates of change contrasts with the behaviour of M1. M1 increased strongly during the Great Recession and afterwards; its rate of increase in the 2009–15 period was over double that in the preceding 48 years, with its behaviour sharply divergent from nominal GDP. With M2 the discrepancy is less marked, but a discrepancy remains.

Apart from its insights into the causation of the Great Recession, the paper will have two provocative conclusions. First, the Federal Reserve should resume publication of an M3 aggregate and, following the Bank of England's example, it should decompose broad money into its sector constituents. Second, the interesting patterns in inter-sectoral money flows that seem to be recurrent in cyclical episodes can be monitored only from a simple-sum broad money aggregate. Divisia indices prepared from aggregate economy-wide data cannot identify the patterns, while, arguably, these patterns are important in understanding the transmission mechanism from money to the economy.

The paper's main thesis has already been developed in an informal way in a 2016 article in the journal Central Banking. The aim of the paper will be to present the analysis in a more academically rigorous form.


PAPER 3: Manmohan Singh, International Monetary Fund, 'THE ROLE OF PLEDGED COLLATERAL IN LIQUIDITY METRICS AND MONETARY POLICY'

Abstract: Collateral does not flow in a vacuum and needs balance sheet(s) to move within the financial system. Pledged collateral needs to be considered along with money metrics to fully understand the liquidity in the markets. This paper analyses the securities-lending, derivatives and prime-brokerage markets as suppliers of collateral (much has already been written on the repo market). Going forward, the official sector's choice of the balance sheet(s) that allow the flow of liquidity (i.e. money and collateral) should be transparent and driven by market forces, not by ad hoc allocation by central banks. Otherwise, the outcome may be suboptimal on many fronts: for monetary policy transmission, for smooth money market functioning and, ultimately, for market liquidity.

Keywords: collateral velocity, securities lending, prime brokerage, OTC derivatives, repo

JEL Codes: G21, G28, F33, K22

PAPER 4: David Aikman, Bank of England; Andreas Lehnert, Federal Reserve Board; Nellie Liang, Federal Reserve Board; and Michele Modugno, Federal Reserve Board, 'CREDIT RISK APPETITE AND MONETARY POLICY TRANSMISSION'

Abstract: We show that US economic performance and monetary policy transmission depend on nonfinancial sector credit, and that the effects are nonlinear. When credit is below its trend, increases in risk appetite lead to sustained increases in output. In contrast, when credit is above trend, initial expansions are followed by additional excess borrowing and subsequent contractions, suggesting an intertemporal trade-off for economic activity. Also, tighter monetary policy is ineffective at slowing the economy when credit is high, consistent with evidence of less transmission of policy changes to distant forward Treasury rates in high-credit periods.

Keywords: financial stability, financial conditions, credit, asset bubbles, monetary policy

JEL Codes: E58, E65, G28


PAPER 5: Kevin Fox, University of New South Wales, and Erwin Diewert, University of British Columbia, 'THE DEMAND FOR MONETARY BALANCES AND THE MEASUREMENT OF PRODUCTIVITY'

Abstract: Firms in advanced economies have greatly increased their cash holdings since the mid-1990s. While this has been observed, and the reasons debated, by central bankers, international agencies and academics, it remains somewhat of a puzzle. This paper explores possible reasons for this increase and the implications for understanding productivity growth. Monetary holdings have an opportunity cost: allocating firm financial capital into monetary deposits means that investment in real assets is reduced. Traditional measures of Total Factor Productivity (TFP) do not take these holdings of monetary assets into account. Given the recent large increases in these holdings, ex ante it can be expected that adding these monetary assets to the list of traditional sources of capital services will reduce the measured TFP of the business sector. Using a new data set on the US aggregate (corporate and non-corporate) business sector, we measure this effect, noting the implications of this expanded definition of capital services for the System of National Accounts. We also estimate industry elasticities of demand to hold monetary balances using the Normalized Quadratic functional form. A key finding is that the accumulation of monetary holdings is primarily a phenomenon of the non-corporate business sector.

We have found that, while conceptually more correct, adding real money balances to our input aggregate does not change aggregate measured productivity performance very much for the corporate sector. This is because, even though there is some variation, the asset share is relatively small. The impact on the non-corporate sector is larger, especially in the latter decades of the sample, when currency and deposit holdings increased substantially, especially relative to other asset holdings. Finally, the relative productivity of individual firms can be significantly affected by differences in money holdings, even if there is little aggregate effect at the sectoral level. Indeed, understanding productivity differences between small and large firms can be enhanced by taking into account currency and deposits: small firms are often credit constrained and therefore have greater cash holdings. Similarly, accounting for cash holdings can provide an augmented understanding of productivity and profitability in studies of firm dynamics. In addition, understanding productivity differences between risky and less risky sectors and firms can be informed by differences in money balances, where, for example, dependence on R&D is taken as a proxy for risk. Hence this paper provides a framework and empirical results for a more comprehensive understanding of productivity growth and dynamics.
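A stylised growth-accounting sketch of why the expanded input list lowers measured TFP (our notation, for illustration only, not the paper's estimating framework):

```latex
\ln \mathrm{TFP}_t = \ln Y_t - s_L \ln L_t - s_K \ln K_t - s_M \ln M_t ,
\qquad s_L + s_K + s_M = 1 ,
```

where $M_t$ is real monetary balances treated as a capital-services input with user-cost share $s_M$. Moving from $s_M = 0$ to a positive share reassigns the services of money holdings from the residual to measured inputs, so the residual (TFP) falls.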


PAPER 6: Dennis Fixler, Bureau of Economic Analysis, and Kim Zieschang, International Monetary Fund, 'PRODUCING LIQUIDITY'

Abstract: Based on a paper presented at the 2015 Meeting of the Society for Economic Measurement, Paris. Dennis Fixler is Chief Economist, Bureau of Economic Analysis, and Kim Zieschang is Adjunct Professor of Economics, University of Queensland. The views expressed in this paper are those of the authors and should not be attributed to the Bureau of Economic Analysis. JEL codes: E01 (Measurement and Data on National Income and Product Accounts). Commercial banks are a primary producer of liquidity in an economy. Despite their importance, there is no consensus on the measurement of the liquidity service and, for that matter, of bank output in general. This lack of consensus exists at the microeconomic level and at the level of the national accounts, which attempt to capture the significant role of bank services in the output of the economy. The current national accounts measure of bank output is termed financial intermediation services indirectly measured, or 'FISIM', springing from the 1993 version of the System of National Accounts (the 1993 SNA). Calculating FISIM under the 2008 version of the national accounting standards is simple and generally practical, provided the compiler has a key datum: the 'reference rate of interest'. The calculation is essentially: Output = (reference rate of interest minus deposit rate) times deposit liabilities, plus (loan rate minus reference rate of interest) times loan assets.
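In symbols, with $D$ for deposit liabilities, $L$ for loan assets, and $r^{\mathrm{ref}}$, $r^{D}$, $r^{L}$ for the reference, deposit and loan rates (notation ours), the calculation above reads:

```latex
\mathrm{FISIM} = \left(r^{\mathrm{ref}} - r^{D}\right) D + \left(r^{L} - r^{\mathrm{ref}}\right) L .
```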

As the deposit and loan financial instrument coverage of this formula implies, the current national accounting standards apply it to deposit-takers, such as banks, as well as to non-deposit-taking, loan-making financial institutions, such as finance companies and money lenders.

A first issue in the treatment of financial services, since the 1993 version of the standards introduced the reference rate concept, has been the lack of consensus on how the reference rate should be determined. Generally the idea has been to select an exogenous reference rate; a government security rate, or a combination of them, is often used because it captures the risk-free reference rate that underlies the user cost of money, as in Barnett (1978). Some have proposed alternative exogenous reference rates that are tied to the market-determined risk of the security to which the reference rate is to be applied. In any event, we argue that the reference rate should be endogenously determined and should be the bank's calculated cost of capital: the overall rate of return paid to all sources of funding, including equity, on the liability side of the balance sheet. Our 'reference rate of interest' is therefore individual to each bank, rather than an economy-wide constant. A second issue in the national accounts dialogue on financial services is the scope of financial instruments that should be associated with the SNA's indirect financial services measure. The 2008 SNA narrowed the scope of FISIM to the deposit and loan positions of financial corporations, but previous versions included interest income flows on essentially all financial instruments. A return to the broad 1993 financial instrument scope is nevertheless a research agenda item for the next version of the SNA, and appears essential to align the SNA with the scope of liquidity measured by the money and banking literature and the associated standards for compiling financial statistics. We argue that FISIM should cover all financial instruments.


Armed with the cost of capital reference rate and full financial balance sheet instrument scope, we derive the production identity (value of output ≡ cost of production) from the income ≡ expense and balance sheet identities, generating a FISIM-like calculation of output with a single cost of capital reference rate for each enterprise, rather than for the whole economy. Note that, by computing specific user cost prices of bank services, it is in principle possible to aggregate them into a price index for liquidity services and, correspondingly, to obtain a quantity index of liquidity services.

Such aggregate measures would be useful in tracing the financial intermediation process into GDP or another aggregate measure of economic activity

On examining the SNA-type production identity for an individual bank, we find the cost side contains a term within operating surplus (the equity leverage premium) that depends on the bank's financing: the debt and equity composition of the liability side of its balance sheet. Further, it is inherent in the definition of the cost of capital reference rate that the equity leverage premium is identically equal to what we will term produced liquidity, within the part of the bank's SNA financial services output coming from the debt instruments on the liability side of its balance sheet, prominent among which are deposits. Given that banks transform liabilities into assets, the equity leverage premium and the produced liquidity are tied to the risk bearing undertaken by the bank. With an exogenous reference rate, the risk bearing is completely embedded in the user cost price of the asset or liability product. In our model, because the entire balance sheet is used, the risk bearing is tied to equity holders.

This paper proposes a resolution to the scope and methodology issues in the ongoing national accounts conversation on financial services, particularly on the provision of liquidity by debt-issuing enterprises, and suggests that the equity leverage premium now included in the nominal output of banks be offset by an intermediate insurance input supplied by their equity holders. This retains the current standards' origination of liquidity with banks (but also extends it to other debt-issuing enterprises), while better exposing the link between the leverage risk bearing (provision of debt guarantees) of equity-holding sectors and the production of liquidity by banks (and other debt-issuing enterprises). With the developed framework, issues such as the measurement of output and productivity of the providers of liquidity services and other financial services can also be addressed and incorporated into macroeconomic statistics.


PAPER 7: Jan Willem Van den End, De Nederlandsche Bank, the Netherlands, 'APPLYING COMPLEXITY THEORY TO INTEREST RATES: EVIDENCE OF CRITICAL TRANSITIONS IN THE EURO AREA'

Abstract: We apply complexity theory to financial markets to show that excess liquidity created by the Eurosystem has led to critical transitions in the configuration of interest rates. Complexity indicators turn out to be useful signals of tipping points and subsequent regime shifts in interest rates. We find that the critical transitions are related to the increase of excess liquidity in the euro area. These insights can help central banks to strike the right balance between the intention to support the financial system by injecting liquidity and the potential unintended side-effects on market functioning.

Keywords: interest rates, central banks and their policies, monetary policy

JEL Codes: E43, E58, E52


PAPER 8: Michael Bowe, Alliance Manchester Business School, University of Manchester, and University of Vaasa; Olga Kolokolova, Alliance Manchester Business School, University of Manchester; and Marcin Michalski, Alliance Manchester Business School, University of Manchester, 'TOO BIG TO CARE, TOO SMALL TO MATTER: MACROFINANCIAL POLICY AND BANK LIQUIDITY CREATION'

Abstract: We estimate the volume of liquidity creation by US bank holding companies between 1997 and 2015 and examine the impact of changes in macrofinancial policies on the dynamics of this process. We focus on three major policy developments occurring in the aftermath of the 2007–09 financial crisis: bank capital regulation reform, monetary stimulus through quantitative easing, and the Troubled Asset Relief Program (TARP).

We use the three-step procedure proposed by Berger and Bouwman (2009) to calculate the dollar amount of liquidity a financial institution creates. Initially, we classify all balance sheet items and off-balance sheet activities of an institution as liquid, semi-liquid or illiquid, to which we then assign liquidity weights of +1/2 (illiquid assets and liquid liabilities), 0 (semi-liquid assets and liabilities) or −1/2 (liquid assets, illiquid liabilities and equity), respectively. The dollar volume of liquidity creation is then calculated as the liquidity-weighted sum of the items identified in the first step. We find that the total amount of liquidity creation by banks in the sample increases by a factor of 3.65, from $1.4 trillion in 1997Q1 to $5.1 trillion in 2015Q4. Indeed, the volume of liquidity creation increases at a faster pace than the gross domestic product of the United States, which rises by a factor of 2.1 during the same period.
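A minimal sketch of the weighting step under these assumptions (the category names and figures are illustrative, not the authors' classification of specific instruments):

```python
# Berger-Bouwman-style liquidity creation: a liquidity-weighted sum of
# balance sheet categories classified as liquid, semi-liquid or illiquid.
WEIGHTS = {
    "illiquid_assets": 0.5, "semiliquid_assets": 0.0, "liquid_assets": -0.5,
    "liquid_liabilities": 0.5, "semiliquid_liabilities": 0.0,
    "illiquid_liabilities": -0.5, "equity": -0.5,
}

def liquidity_creation(balance_sheet):
    """Dollar liquidity created by one bank in one period."""
    return sum(WEIGHTS[item] * amount for item, amount in balance_sheet.items())

# A bank funding illiquid loans with liquid deposits creates liquidity:
bank = {
    "illiquid_assets": 60.0, "semiliquid_assets": 20.0, "liquid_assets": 20.0,
    "liquid_liabilities": 70.0, "semiliquid_liabilities": 10.0,
    "illiquid_liabilities": 10.0, "equity": 10.0,
}
print(liquidity_creation(bank))  # 30 - 10 + 35 - 5 - 5 = 45.0
```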

The results of panel regressions reveal that the dynamics of bank liquidity creation differ considerably between small and large institutions. The level of bank capital requirements and the stance of monetary policy affect the liquidity creation of both small and medium-sized banks. Liquidity creation of the largest banks, which control over 80% of the banking system's assets, remains unaffected.

We find that changes in the amount of liquidity creation by small banks per $1 of their gross total assets are positively related to changes in the term spread, but inversely related to changes in their Tier 1 capital ratios. Further, we show that the volume of liquidity creation is positively related to the riskiness of a bank's assets, as measured by the ratio of risk-weighted assets to gross total assets, regardless of its size classification. We establish that TARP has negative short-term effects on small and medium banks and no immediate impact on the liquidity creation of the largest institutions in the sample. In contrast, participation in TARP leads to a long-term decline in liquidity provision per dollar of assets of the largest banks. This persists even after the completion of the programme and the repayment of TARP funding. As nearly all of the largest TARP-recipient banks in the sample are subsequently classified as systemically important financial institutions, our results suggest that the increased regulatory scrutiny may adversely affect their ability to create liquidity.

By demonstrating that the stance of monetary policy and the level of bank capital requirements do not tangibly enhance the liquidity provision efficiency of the largest systemically important institutions in the system, our study offers important insights for the design of effective macroprudential policies.


PAPER 9: Jonathan Goldberg, Federal Reserve Board, 'THE SUPPLY OF LIQUIDITY AND REAL ECONOMIC ACTIVITY'

Abstract: This paper identifies shocks to the supply of liquidity by dealer firms and investigates their effects on real economic activity. First, I develop a simple theoretical model of dealer intermediation; then, in a structural VAR model, I use sign restrictions derived from the theoretical model to identify liquidity supply shocks. Liquidity supply shocks that are orthogonal to information contained in macroeconomic and asset price variables have considerable predictive power for economic activity. Moreover, positive liquidity supply shocks cause large and persistent increases in real activity.

Keywords: liquidity, dealer intermediation, risk-taking, real activity, liquidity shocks

JEL Codes: G10, G12, G17, G24


PAPER 10: Dirk Bezemer, University of Groningen, the Netherlands, and Lu Zhang, Sustainable Finance Lab and Utrecht University, 'MACROECONOMIC IMPLICATIONS OF LIQUIDITY CREATION: CREDIT ALLOCATION AND POST-CRISIS RECESSIONS'

Abstract: In this paper we address the macroeconomic implications of liquidity creation through bank lending and the impacts of liquidity on economic activity. We note that liquidity created through bank lending can be channelled into the real sector, in support of economic activity, or into financial and real estate markets, in support of capital gains. We collected macro-level data on bank credit aggregates over 2000–12 for 57 economies, categorised according to the use of credit. We note the long-term shift in the allocation of bank credit creation away from non-financial business lending and towards financial and, especially, real estate markets. We then present new evidence on the channels from pre-crisis credit allocation to the severity of post-crisis recessions.

Our first contribution is to show that it is not just the level but the composition of debt (defined as the share of mortgage credit in total credit) that matters. A second contribution is to analyse the channels. We collect additional industry-level data across 20 industries for a subset of economies, and analyse the effect of changes in the pre-crisis composition of debt on total GDP and on investment, consumption and capital allocation. We find that changes in the share of household mortgage credit before the crisis have a significant effect on recession severity after the 2007 crisis. This is not the case for any other credit category, nor for growth of total bank credit. We address the causality challenge by using the difference between IMF growth forecasts and growth realisations; this filters out country-specific drivers of both debt and income growth. We address the model selection challenge by using Bayesian model averaging. This indicates that the change in credit composition is among the three most robust determinants of post-crisis recession severity, together with income levels and the current account balance. The findings are robust to a wide range of control variables and to the different responses across advanced/emerging and EMU/non-EMU economies.

We then delve into the channels from the change in debt composition to income growth loss. The literature to date has focused on negative wealth effects on consumption, for which we find strong evidence. In addition, we find evidence for two investment channels: a loan supply effect and a capital allocation effect. In the industry-level analysis, we find that in economies which experienced a larger change in debt composition before 2008, there was a larger reduction of available credit and weaker capital re-allocation towards sectors with higher value-added. This effect is observed already before the crisis, and very strongly after the crisis. We discuss policy implications and future research.

Keywords: private credit, mortgages, crisis, output loss, investment, capital allocation

JEL Codes: C11, C15, E01, O4


PAPER 11: Iftekhar Hasan, Gabelli School of Business, Fordham University, and Jean-Loup Soula, Strasbourg University, LaRGE Research Centre, 'TECHNICAL EFFICIENCY IN BANK LIQUIDITY CREATION'

Abstract: This paper generates an optimum bank liquidity creation benchmark by tracing an efficient frontier in liquidity creation (bank intermediation) and questions why some banks are more efficient than others in such activities. Evidence reveals that medium-sized banks lie closest to the efficient frontier, irrespective of their business models. Small (large) banks, focused on traditional banking activities, are found to be the most (least) efficient in creating liquidity through on-balance sheet items, whereas large banks, involved in non-traditional activities, are found to be most efficient in off-balance sheet liquidity creation. Additionally, the liquidity efficiency of small banks was more resilient during the 2007–08 financial crisis relative to other banks.

Keywords: banks, technical efficiency, liquidity creation, diversification

JEL Codes: G21, G28, G32


PAPER 12: Richard Anderson, Lindenwood University; John Duca, Federal Reserve Bank of Dallas; and Barry Jones, Department of Economics, State University of New York, 'A BROAD MONETARY SERVICES (LIQUIDITY) INDEX AND ITS LONG-TERM LINKS TO ECONOMIC ACTIVITY'

Abstract: Liquid assets play a crucial role in economic activity as the medium in which payments are received and made. 'Sudden stops' in financial markets, during which liquid assets are hoarded, are periods when economic activity slows abruptly. Further, it is the sine qua non of financial intermediation to alter the measured relative and absolute quantities of liquid assets. In this way, the observed quantities of liquid assets reflect both the path of past economic activity and anticipations of future activity. The quantities must be regarded as arising endogenously within an intertemporal general equilibrium model of the economy, à la Tobin (1958) and Merton (1971). Economic modelling and analysis traditionally proceed by combining relatively high-dimension lists of specific assets into lower-dimension 'monetary aggregates'. The defining characteristic of the assets included is that all are available to facilitate the exchange of goods and services at a transaction cost less than infinity. That is, all included assets may be sold or used as collateral for the purchase and sale of goods and services, and thereby provide liquidity services, which can be tracked by measured opportunity costs of foregone interest; these measured opportunity costs may not in practice reflect all transaction costs. The last caveat particularly applies to household holdings of mutual fund assets outside of money market funds. Our study contributes to the literature in two key ways. First, it expands a conventional Divisia measure of money services to account for the liquidity provided by such mutual fund assets. Second, it explores the long-run connections between economic activity and monetary aggregates constructed as index numbers from 1929 to 2016, finding that the inclusion of mutual fund liquidity services results in a Divisia measure of money that has a much more stable velocity.
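For reference, the Divisia (Törnqvist-Theil) monetary services index that such measures build on grows as a user-cost-share-weighted average of component growth rates; this is the standard Barnett (1980) setup, stated in our notation rather than anything specific to this paper's data:

```latex
\Delta \ln M_t = \sum_i \bar{s}_{it}\,\Delta \ln m_{it}, \qquad
s_{it} = \frac{\pi_{it}\, m_{it}}{\sum_j \pi_{jt}\, m_{jt}}, \qquad
\pi_{it} = \frac{R_t - r_{it}}{1 + R_t},
```

where $m_{it}$ is the real balance of asset $i$, $r_{it}$ its own rate of return, $R_t$ the benchmark rate, $\pi_{it}$ the user cost, and $\bar{s}_{it} = (s_{it} + s_{i,t-1})/2$.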

We thank Emil Mihalov and Tyler Atkinson for research assistance. The views expressed are those of the authors and do not necessarily reflect those of the Federal Reserve Bank of Dallas or the Federal Reserve System. Any errors are our own.


PAPER 13: John W. Keating, University of Kansas, and A. Lee Smith, Federal Reserve Bank of Kansas City, 'THE OPTIMAL MONETARY INSTRUMENT AND THE (MIS)USE OF GRANGER CAUSALITY'

Abstract: Is it better to use an interest rate or a monetary aggregate to conduct monetary policy? Operating procedures designed around interest rates are overwhelmingly preferred by central bankers. This paper re-examines the question in light of a New Keynesian DSGE (dynamic stochastic general equilibrium) model. Our calibrated model is very similar to many others in the literature, except that we follow Belongia and Ireland (2014) by augmenting the standard framework with a simple model of financial intermediation. We assume banks operate in competitive markets, maximising profits subject to financial market disturbances.

We use this model to examine the welfare consequences of alternative choices for the monetary policy instrument. For the interest rate policy, we employ the specification from Clarida, Gali and Gertler (2000), in which the central bank reacts to expected future output and inflation gaps; a gap is defined as the percentage deviation from target. We compare this rule to a k-percent rule for each monetary aggregate. We consider three alternative aggregates determined within our model: the monetary base, the simple sum measure of money, and the Divisia measure.

Welfare results are striking. While the interest rate dominates the monetary base, both simple sum and Divisia k-percent rules outperform the interest rate. In fact, the Divisia rule is overall best. This is an interesting finding because it contradicts the view of most central bankers that interest rates are superior. Note that if we replaced k-percent money growth rules with rules allowing an appropriate reaction to macroeconomic variables, as in the Clarida, Gali and Gertler (2000) interest rate rule, the welfare benefits of simple sum and Divisia money would be even greater.

Next, we study the performance of Granger causality tests in the context of data generated from our model. For this we assume the Clarida, Gali and Gertler (2000) interest rate rule characterises monetary policy. While it is not optimal from a welfare perspective, this rule is used for positive, rather than normative, purposes: research suggests their framework provides a fairly good description of actual Fed behaviour.

For each of our four potential monetary instruments, we test for Granger causality with respect to output and then with respect to prices. We find the interest rate Granger-causes both variables at extremely high significance levels. The same result is obtained for the monetary base. Simple sum money also Granger-causes prices at a highly significant level, but only causes output at the 10% level. The test results for Divisia are weakest of all: Divisia fails to Granger-cause output, and the evidence for prices is only at the 10% level.
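A minimal sketch of this kind of test (simulated stand-in data; the paper applies the tests to series generated from its DSGE model):

```python
# Granger-causality check: does a candidate monetary instrument help
# predict output beyond output's own lags?
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
T = 400
money = np.zeros(T)
output = np.zeros(T)
for t in range(1, T):
    money[t] = 0.9 * money[t - 1] + rng.normal()
    output[t] = 0.5 * output[t - 1] + 0.3 * money[t - 1] + rng.normal()  # money leads output

data = pd.DataFrame({"output": output, "money": money})
# H0: the second column ('money') does NOT Granger-cause the first ('output').
results = grangercausalitytests(data[["output", "money"]], maxlag=4)
print({lag: round(res[0]["ssr_ftest"][1], 4) for lag, res in results.items()})  # p-values
```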

What do we learn from this investigation? First, a quantity aggregate used as the monetary instrument may have significant welfare benefits compared to an interest rate. Second, a weighted monetary aggregate (Divisia) performs best of all the candidate instruments we considered. Third, Granger causality tests may be a poor method for selecting the best monetary policy instrument. In our model, Granger causality tests suggest at best a weak effect of Divisia on inflation and no effect on output. Thus, if instrument choice is based solely on Granger causality test results, an inferior policy instrument will be selected.

Keywords: monetary policy instrument, monetary aggregates, Granger causality, Divisia aggregates

JEL Codes: C43, C32, E37, E44, E52


PAPER 14: Jane Binner, University of Birmingham; Logan Kelly, University of Wisconsin; and Jon Tepper, Nottingham Trent University, 'ON THE ROBUSTNESS OF SLUGGISH STATE-BASED NEURAL NETWORKS FOR PROVIDING USEFUL INSIGHT INTO THE NEW KEYNESIAN PHILLIPS CURVE'

Abstract: The New Keynesian Phillips curve implies that the output gap, the deviation of actual output from its natural level due to nominal rigidities, drives the dynamics of inflation relative to expected inflation and lagged inflation. This paper exploits the empirical success of the New Keynesian Phillips curve in explaining the USA's inflation dynamics using a new nonlinear model of the output gap. We estimate the output gap using the artificial intelligence method of multi-recurrent neural networks (MRNs), a type of neural network that is able to establish variable sensitivity to recent and more historic temporal information through the use of layer- and self-recurrent links, forming a so-called sluggish state-based memory. The MRN is able to robustly latch onto structural interactions among inflation, oil prices and real output in the USA to produce a set of interesting impulse responses. We present our empirical results using monthly data spanning 1960–2016, and contrast our new nonlinear models of the output gap with traditional measures in fitting the New Keynesian Phillips curve, to provide useful insights for inflation dynamics and monetary policy analysis in the USA.

PAPER 15: Rakesh Bissoondeeal, Aston University; Michael Karaglou, Aston University; and Jane Binner, University of Birmingham, 'STRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UK'

Abstract: We investigate whether or not monetary aggregates are important in determining output. In addition to the official Simple Sum measure of money, we employ the weighted Divisia aggregate. We use data-driven procedures to identify breaks in the data. We find that monetary aggregates, particularly Divisia, are important in determining output, but the results are sensitive to the time period employed. There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them. The results also suggest that the recovery from the financial crisis could have been faster if money had not been hoarded.

Keywords: monetary aggregates, Divisia, IS curve, structural breaks

JEL Codes: C22, E52


PAPER 16: Costas Milas, University of Liverpool, and Michael Ellington, University of Liverpool, 'IDENTIFYING AGGREGATE LIQUIDITY SHOCKS IN CONJUNCTION WITH MONETARY POLICY SHOCKS: AN APPLICATION USING UK DATA'

Abstract: We propose a new identification scheme for aggregate liquidity shocks, in harmony with conventional monetary policy shocks, in a partially identified structural VAR model. We fit a Bayesian time-varying parameter VAR model to UK data from 1955 to 2016. The transmission mechanism of aggregate liquidity shocks changes substantially through time, with the magnitude of these shocks increasing during recessions. We provide statistically significant evidence in favour of asymmetric contributions of these shocks during the implementation of Quantitative Easing relative to the 2008 recession. Aggregate liquidity shocks explain 32% and 47% of the variance in real GDP and inflation, respectively, at business cycle frequency during the Great Recession. (Preliminary draft: please do not cite.)

Keywords: liquidity shocks, time-varying parameter VAR, money growth

JEL Codes: E32, E47, E52, E58

PAPER 17: Makram El Shagi, Henan University, China, and Logan Kelly, University of Wisconsin, 'WHAT CAN WE LEARN FROM COUNTRY LEVEL LIQUIDITY IN THE EMU?'



PAPER 18: Soumya Suvra Bhadury, National Council of Applied Economic Research (NCAER), New Delhi, India, and Taniya Ghosh, Indira Gandhi Institute of Development Research (IGIDR), Mumbai, India, 'HAS MONEY LOST ITS RELEVANCE? DETERMINING A SOLUTION TO THE EXCHANGE RATE DISCONNECT PUZZLE IN THE SMALL OPEN ECONOMIES'

Abstract: This study uses monthly data from 2000 to 2015 and utilises an open-economy structural vector autoregression model to identify the monetary policy shock responsible for exchange rate fluctuations in the economies of India, Poland and the UK. Following Kim and Roubini (J. Monet. Econ. 45(3), 561–586, 2000), our model is shown to fit these small open economies well, estimating theoretically correct and significant responses of prices, output and the exchange rate to a monetary policy tightening. The importance of the monetary policy shock is determined by examining the variance decomposition of forecast errors, impulse response functions and out-of-sample forecasts. Following Barnett (J. Econometrics 14 (September), 11–48, 1980), we adopt a superior monetary measure, namely the aggregation-theoretic Divisia monetary aggregate, in our model. The significance of adopting precisely measured money in the exchange rate model follows from the comparison between models with no money, with simple-sum monetary aggregates, and with Divisia monetary measures. A bootstrap Granger causality test establishes a strong causal link from money, especially Divisia money, to the exchange rate. Additionally, the empirical results provide three important findings. The first is that the estimated responses of output, prices, money and the exchange rate to monetary policy shocks are significant in models that adopt monetary aggregates. The second is that Divisia money helps monetary policy explain more of the fluctuation in the exchange rate. The third supports the inclusion of Divisia money for better out-of-sample forecasting of the exchange rate.
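A compact illustration of the reduced-form machinery behind such an exercise (simulated stand-in data; the paper's identification is structural, following Kim and Roubini, which this sketch does not implement):

```python
# Reduced-form VAR with impulse responses and forecast-error variance
# decomposition (FEVD), the building blocks of the analysis described above.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(1)
levels = rng.normal(size=(200, 3)).cumsum(axis=0)  # placeholder series
data = pd.DataFrame(
    levels, columns=["output", "divisia_money", "exchange_rate"]
).diff().dropna()

res = VAR(data).fit(maxlags=6, ic="aic")  # lag length chosen by AIC
irf = res.irf(12)                         # impulse responses (Cholesky orthogonalised)
fevd = res.fevd(12)                       # 12-step variance decomposition
print(fevd.decomp[2, -1])                 # shares of exchange-rate variance at horizon 12
```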

Keywords: monetary policy, monetary aggregates, Divisia, structural VAR, exchange rate overshooting, liquidity puzzle, price puzzle, exchange rate disconnect puzzle, forward discount bias puzzle

JEL Codes: C32, E41, E51, E52, F31, F41, F47


PAPER 19: William A Barnett and Jinan Liu, University of Kansas, 'USER COST OF CREDIT CARD SERVICES UNDER INTERTEMPORAL NON-SEPARABILITY'

Abstract: This paper examines the user cost of monetary assets and credit card services under the assumption of intertemporal nonseparability and interest rate risk. Barnett and Su (2016) derived theory permitting the inclusion of credit card transaction services in Divisia monetary aggregates. The risk adjustment in their theory is based on CCAPM under intertemporal separability. The risk adjustment by their method is expected to be small, as has been the case in prior studies of CCAPM risk adjustment of asset returns; the equity premium puzzle focusses on that downward bias in the CCAPM risk adjustment. But credit card interest rates, aggregated over consumers, are much more volatile than interest rates on monetary assets. While the known downward bias of CCAPM risk adjustments is of little concern with Divisia monetary aggregates containing only low-risk monetary assets, that downward bias cannot be ignored once credit card services are included. We believe that extending to intertemporal nonseparability will provide a more plausible risk adjustment, as has been emphasised by Barnett and Wu (2015).

In this paper, we extend the credit-card-augmented Divisia monetary quantity aggregates and user costs to the case of risk aversion and intertemporal nonseparability in consumption. Our results are for the 'representative consumer', aggregated over all consumers. While credit card interest-rate risk may be low for many consumers, the volatility of credit card interest rates for the representative consumer is high, as reflected by the high volatility of the Federal Reserve's data on credit card interest rates aggregated over consumers. The relationship between that volatility and the theory of aggregation over consumers under risk is beyond the scope of this research, but is a serious matter meriting future research.

To implement our theory, we introduce a pricing kernel into our model, in accordance with the approach advocated by Barnett and Wu (2015). We assume that the pricing kernel is a linear function of the rate of return on a well-diversified wealth portfolio. We find that the risk adjustment of the credit-card-services user cost to its certainty-equivalence level can be measured by its beta. That beta depends upon the covariance between the interest rates on credit card services and on the wealth portfolio of the consumer, in a manner analogous to the standard CAPM adjustment.
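In CAPM-style notation (ours, for illustration), with $r_c$ the credit card interest rate and $r_W$ the return on the wealth portfolio:

```latex
\beta_c = \frac{\operatorname{Cov}(r_c,\, r_W)}{\operatorname{Var}(r_W)} ,
```

so the risk-adjusted user cost equals its certainty-equivalent level plus a premium proportional to $\beta_c$.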

Credit card services' risk premiums depend on their market portfolio risk exposure, which is measured by the beta of the credit card interest rates. The larger the beta, through risk exposure to the wealth portfolio, the larger the risk adjustment. If the beta were very small, then the user-cost risk adjustment would be very small; in that case, the unadjusted Divisia monetary index would be a good proxy, even with high volatility of credit card interest rates. One method of introducing intertemporal nonseparability is to assume habit formation; we explore that possibility. We are currently conducting research on the empirical implementation of the theory proposed in this paper. We believe that, under intertemporal nonseparability, we will be able to generate a more accurate credit-card-augmented Divisia monetary index to explain the available empirical data.

Keywords: Divisia index, monetary aggregation, intertemporal nonseparability, credit card services, risk adjustment

JEL Codes: C43, D81, E03, E40, E41, E44, E51, G12


PAPER 20: Per Hjertstrand, Research Institute of Industrial Economics (Institutet för Näringslivsforskning), Stockholm; Gerald A. Whitney, University of New Orleans; and James L. Swofford, University of South Alabama, 'PANEL DATA TESTS OF INDEX NUMBERS AND REVEALED PREFERENCE RANKINGS'

Abstract: For weakly separable blockings of goods from panel data, we construct aggregates using index numbers. We examine how well these aggregates 'fit' the data by investigating how close they come to solving revealed preference conditions.

Samuelson (1983) discussed the relationship of index number theory to revealed preference analysis, saying 'Index number theory is shown to be merely an aspect of the theory of revealed preference…'. Revealed preference has been used to test whether data on prices and quantities can be rationalised by a well-behaved utility function that is weakly separable in some subset of goods. Weak separability allows for a separation of a subset of goods into a sub-utility, or aggregator, function. This is a necessary condition for the existence of an economic aggregate, and the existence of an economic aggregate is a justification for the use of superlative index numbers. Superlative indexes are exact if they can provide a second-order approximation to a particular aggregator function.

Varian's (1983) revealed preference conditions for weak separability do not rely on a particular functional form. The solution values to these conditions can be interpreted as levels of utility. These solution values, while not unique, are consistent with the preferences revealed by the data: if period t is preferred to period s, then the utility level assigned to t is greater than or equal to that assigned to s. Since indexes need not mirror preferences, this property is not guaranteed to hold after aggregation.

Barnett and Choi's (2008) definition spans all superlative index numbers. We consider aggregates based on two superlative index numbers, the Fisher and the Walsh, and on the non-superlative Paasche and Laspeyres indexes, which are exact for a first-order approximation to an aggregator function. Because of its widespread use by central banks to construct monetary aggregates, we also consider the simple sum aggregate and the index number version of the simple sum, the Dutot index.
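For reference, the price-index formulas in play (standard definitions, stated in our notation; $p_i$, $q_i$ are prices and quantities, with superscripts 0 and 1 for the base and comparison periods):

```latex
P_L = \frac{\sum_i p_i^1 q_i^0}{\sum_i p_i^0 q_i^0}, \qquad
P_P = \frac{\sum_i p_i^1 q_i^1}{\sum_i p_i^0 q_i^1}, \qquad
P_F = \sqrt{P_L\, P_P}, \qquad
P_W = \frac{\sum_i p_i^1 \sqrt{q_i^0 q_i^1}}{\sum_i p_i^0 \sqrt{q_i^0 q_i^1}}, \qquad
P_D = \frac{\sum_i p_i^1}{\sum_i p_i^0}.
```

The Fisher ($P_F$) and Walsh ($P_W$) indexes are superlative; Laspeyres ($P_L$) and Paasche ($P_P$) are exact only to first order; the Dutot index ($P_D$) is the ratio of average prices.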

We investigate how close the aggregates come to solving the revealed preference conditions for weak separability when it is known that a solution exists. We use the revealed preference solution values to check how well the indexes reflect the preference orderings in the data: comparing the direction of change between adjacent periods, computing preference orderings implied by transitivity, and then comparing the rankings revealed by the aggregates to the rankings from the weak separability conditions. We calculate how much the aggregates need to be perturbed in order to satisfy weak separability, and test whether the aggregates and solution values are equal in distribution.

Using panel data, we find that as the number of goods increases, the superiority of the superlative indexes manifests itself. This is consistent with the critique of the Dutot index by Fisher (1922) and with Barnett's (1980) critique of simple sum monetary aggregates.

Keywords: superlative index numbers, aggregation, preference orderings, weak separability

JEL Code: C43


PAPER 21: Sajid Chaudhry, University of Birmingham; Jane Binner, University of Birmingham; James Swofford, University of South Alabama; and Andrew Mullineux, University of Birmingham, 'SCOTLAND AS AN OPTIMUM CURRENCY AREA'

Abstract: The June 2016 UK referendum on continued EU membership, in which the people of Scotland voted to remain while the rest of the UK voted to leave, once again makes the issue of whether Scotland is an optimum currency area very topical. England voted strongly to leave, whilst Scotland backed remain by 62% to 38%. The Scottish government published its draft bill on a second independence referendum in October 2016. The move does not mean another referendum will definitely be held, but it does raise the possibility that Scotland might choose independence, staying in the EU without the rest of the UK. If Scotland charts a course of independence from the rest of the UK, then it would likely either issue its own currency or join or form another currency area. In this paper we test the microeconomic foundations of a common currency area for Scotland, the UK, and the rest of the UK without Scotland. We find that the UK without Scotland meets the microeconomic criteria for a common currency area, while the UK and Scotland alone each have some small violations of these conditions. We also find differences between the UK-less-Scotland and Scotland economies in loan data. With respect to further research into the development of monetary aggregation theory, we recommend that this might be profitably directed towards exploring more sophisticated aggregation procedures, as suggested by Barnett (forthcoming), or towards incorporating risk-bearing assets into the money measures.

Keywords: Scottish independence, common currency areas, microeconomic foundations

PAPER 22: Victor J. Valcarcel, University of Texas at Dallas, 'INTEREST RATE PASS-THROUGH: DIVISIA USER COSTS OF MONETARY ASSETS AND THE FEDERAL FUNDS RATE'

Abstract: Evidence of substantial pass-through in short-term rates and other rates of financial and monetary assets has typically been rejected by the data. In a first, this paper investigates level and volatility transmission among the Federal Funds rate and the user costs of various monetary assets, which include instruments of both public debt (e.g. T-bills) and private debt (e.g. commercial paper). Results suggest substantial and time-varying pass-through. Higher degrees of bi-directional pass-through occur between the Federal Funds rate and the user costs of more liquid assets, both in levels and in volatilities. Federal Funds rate spillovers also propagate faster onto more liquid rates. These findings have important implications for monetary transmission, not only across the term structure but also along markets of varying liquidity.

Keywords: user cost of money, federal funds rate, volatility spillovers, VAR, monetary transmission, interest rate channel

JEL Codes: E30, E31, E65

15255 © University of Birmingham 2017. Printed on a recycled grade paper containing 100% post-consumer waste.

Edgbaston, Birmingham, B15 2TT, United Kingdom

www.birmingham.ac.uk





Page 3: Liquidity and Economic Activity - University of Birmingham · 2017-05-22 · Financial Services Indices Liquidity and Economic Activity In honour of Oswald Distinguished Professor

3Liquidity and Economic Activity Conference

Financial Services Indices Liquidity and Economic Activity CONFERENCE SCHEDULE

Day 1 23 May 2017 850ndash920am Registration and Coffee920ndash930am Conference Opens and Welcome Address Jane Binner930ndash1130am

SESSION 1 Liquidity and Monetary PolicyChair David Aikman

nPAPER 1 Robert Aliber lsquoAn essay on monetary turbulence and the supply of liquidityrsquo

nPAPER 2 Tim Congdon lsquoWhat were the causes of the Great Recession The importance of the ldquowhich aggregaterdquo debatersquoever

nPAPER 3 Singh lsquoThe role of pledged collateral in liquidity metrics and monetary policyrsquo

nPAPER 4 Aikman Lehnert Liang Modugno lsquoCredit risk appetite and monetary policy transmissionrsquo

1130ndash1145am Break1145amndash115pm

SESSION 2Producing Liquidity and Excess LiquidityChair Jan Willem Van den End

nPAPER 5 Kevin Fox and Erwin Diewert lsquoThe demand for monetary balances and the measurement of productivityrsquo

nPAPER 6 Dennis Fixler and Kim Zieschang lsquoProducing liquidityrsquo

nPAPER 7 Jan Willem Van den End lsquoApplying complexity theory to interest rates Evidence of critical transitions in the Euro arearsquo

115ndash215pm Lunch 215ndash230pm Conference Group Photo ndash steps of the Bank of England230ndash330pm Keynote Lecture Oswald Distinguished Professor William A Barnett ndash lsquoUnsolved problems in monetary aggregation and transmissionrsquo330ndash345pm Break345ndash545pm

SESSION 3Liquidity Creation and Macroeconomic PolicyChair Jean-Loup Soula

nPAPER 8 Bowe Kolokolova and Michalski lsquoToo big to care too small to matter macro financial policy and bank liquidity creationrsquo

nPAPER 9 Goldberg lsquoThe supply of liquidity and real economic activityrsquo

nPAPER 10 Bezemer and Lu Zhang lsquoMacroeconomic implications of liquidity creation credit allocation and post-crisis recessionsrsquo

nPAPER 11 Hasan and Soula lsquoTechnical efficiency in bank liquidity creationrsquo

545ndash600pm Break 600ndash700pm Poster Competition ndash Atrium Bank of England with light refreshments and wine 745pm Conference Dinner ndash The Counting House 50 Cornhill London EC3V 3PD

4 Liquidity and Economic Activity Conference

Day 2 24 May 2017 930ndash1030am

SESSION 4 Financial Services Indices Liquidity and Economic ActivityChair Jane Binner

nPAPER 12 Richard Anderson John Duca and Barry Jones lsquoA broad monetary services (liquidity) index and its long-term links to economic activityrsquo

nPAPER 13 John Keating and Lee Smith lsquoThe optimal monetary instrument and the (mis) use of Granger causalityrsquo

nPAPER 14 Jane Binner Logan Kelly and Jon Tepper lsquoOn the robustness of sluggish state-based neural networks for providing useful insight into the new Keynesian Phillips curversquo

1030ndash1045am Break1045amndash1245pm

SESSION 5Structural Change Volatility and ShocksChair Taniya Ghosh

nPAPER 15 Rakesh Bissoondeeal Michael Karaglou Jane Binner lsquoStructural changes and the role of money in the UKrsquo

nPAPER 16 Costas Milas and Michael Ellington lsquoIdentifying aggregate liquidity shocks in conjunction with monetary policy shocks An application using UK datarsquo

nPAPER 17 Makram El Shagi and Logan Kelly lsquoWhat can we learn from country level liquidity in the EMUrsquo

nPAPER 18 Bhadury and Ghosh lsquoHas money lost its relevance Determining a solution to the exchange rate disconnect puzzle in the small open economiesrsquo

1245ndash145pm Lunch145ndash245pm Keynote Address Lawrence Goodman Centre for Financial Stability New York lsquoPreventative macroprudential policyrsquo245ndash300pm Break300ndash500pm

SESSION 6Tests of Index Numbers Separability and Price DualsChair Victor Valcarcel

nPAPER 19 William A Barnett and Jinan Liu lsquoUser cost of credit card services under intertemporal non-separabilityrsquo

nPAPER 20 Per Hjertstrand Gerald Whitney and James Swofford lsquoPanel data tests of index numbers and revealed preference rankingsrsquo

nPAPER 21 Sajid Chaudhry Jane Binner James Swofford and Andrew Mullineux lsquoScotland as an optimum currency arearsquo

nPAPER 22 Victor Valcarcel lsquoInterest rate pass-through Divisia user costs of monetary assets and the Federal funds ratersquo

500pm Conference Close600pm Editorial Board Meeting ndash Guest Editors Special Issue Meeting


Financial Services Indices, Liquidity and Economic Activity: PAPER ABSTRACTS

PAPER 1: Robert Aliber, University of Chicago, 'AN ESSAY ON MONETARY TURBULENCE AND THE SUPPLY OF LIQUIDITY'

Abstract: The last 35 years have been the most turbulent in monetary history. There have been more than 100 banking crises, many in one of four waves; most of these crises have been 'twinned' with currency crises. Moreover, the deviations between the market prices of currencies have been much larger than ever before. The purpose of this essay is to explain why there have been so many banking crises and why they have often occurred together with currency crises. The answer is that the floating currency arrangement is inherently unstable, because an increase in cross-border investment flows to a country leads to an increase in the price of its securities and an increase in the price of its currency. The essay is based on a general equilibrium view that links the market in currencies with the markets in bonds, stocks and real estate: the increase in cross-border investment inflows to a country leads to an increase in household wealth as an integral part of the adjustment process, to ensure that the country's current account deficit increases as its capital account surplus increases; otherwise the market in the country's currency would not clear. The increase in the country's external indebtedness is much more rapid than the increase in its GDP. When the lenders recognise that the indebted country is on a non-sustainable trajectory, their demand for the borrowers' IOUs declines, and the price of its securities and the price of its currency decline. A banking crisis may follow if the decline in the price of the currency is large, since it leads immediately to a sharp increase in the total indebtedness of the borrowers with liabilities denominated in a foreign currency.

When currencies are no longer anchored to parities, each central bank has much more monetary independence, and investors have much more incentive to change the currency composition of the securities in their portfolios. Changes in investor demand for foreign securities lead to much more price risk in both the markets for currencies and the markets for securities. There has been a scissors-like movement in the market for liquidity: the 'demand' for liquidity by traders and investors has increased, while the supply of liquidity has declined because the price risks are much larger.

Keywords: banking crises, market efficiency, market failure, flexible exchange rates, the transfer problem process

JEL Codes: E44, F31, F32, F33, F34, F38


PAPER 2: Professor Tim Congdon CBE, Chairman, Institute of International Monetary Research at the University of Buckingham, 'WHAT WERE THE CAUSES OF THE GREAT RECESSION? THE IMPORTANCE OF THE "WHICH AGGREGATE" DEBATE'

Abstract: Monetary economists have long debated which measure of the quantity of money is the most useful in macroeconomic analysis. Before, during and after the Great Recession (to be understood, roughly speaking, as the six quarters to mid-2009), the growth rates of different money aggregates diverged sharply in the leading economies. The 'which aggregate' debate was therefore of particular importance. The focus in this paper is on the USA's experience, although the behaviour of money in other economies, and its relationship to prices and spending in them, is mentioned where relevant and interesting.

The argument is that broadly-defined money has long been the correct concept to use in interpreting macroeconomic developments, and that its merits became clear in the Great Recession and its aftermath. Broad money is to be viewed as including virtually all money-like assets (and certainly most of the deposit liabilities of the commercial banking system). For the purposes of the paper, broad money is identified with the M3 money measure, for which the Federal Reserve prepared data until early 2006. It will be shown that, with its flow-of-funds data, the Federal Reserve is still publishing information that enables an approximate M3 aggregate to be estimated. Further, the M3 money holdings of the US economy's main sectors – households, non-financial business and financial business – can be tracked from the flow-of-funds numbers.

The last decade has seen the lowest average annual percentage increase in nominal GDP since the 1930s. The growth rate of M3 has also been the lowest since the 1930s. This similarity of the rates of change contrasts with the behaviour of M1. M1 increased strongly during the Great Recession and afterwards; its rate of increase in the 2009–15 period was over double that in the preceding 48 years, with its behaviour sharply divergent from nominal GDP. With M2 the discrepancy is less marked, but a discrepancy remains.

Apart from its insights into the causation of the Great Recession, the paper will have two provocative conclusions. First, the Federal Reserve should resume publication of an M3 aggregate and, following the Bank of England's example, it should decompose broad money into its sector constituents. Second, the interesting patterns in inter-sectoral money flows that seem to be recurrent in cyclical episodes can be monitored only from a simple-sum broad money aggregate. Divisia indices prepared from aggregate economy-wide data cannot identify the patterns, while – arguably – these patterns are important in understanding the transmission mechanism from money to the economy.

The paper's main thesis has already been developed in an informal way in a 2016 article in the journal Central Banking. The aim of the paper will be to present the analysis in a more academically rigorous form.


PAPER 3: Manmohan Singh, International Monetary Fund, 'THE ROLE OF PLEDGED COLLATERAL IN LIQUIDITY METRICS AND MONETARY POLICY'

Abstract: Collateral does not flow in a vacuum; it needs balance sheet(s) to move within the financial system. Pledged collateral needs to be considered along with money metrics to fully understand liquidity in the markets. This paper analyses the securities-lending, derivatives and prime-brokerage markets as suppliers of collateral (much has already been written on the repo market). Going forward, the official sector's choice of the balance sheet(s) that allow the flow of liquidity (i.e. money and collateral) should be transparent and driven by market forces, not by ad hoc allocation by central banks. Otherwise the outcome may be suboptimal on many fronts: for monetary policy transmission, for smooth money market functioning and, ultimately, for market liquidity.

Keywords: collateral, velocity, securities lending, prime brokerage, OTC derivatives, repo

JEL Codes: G21, G28, F33, K22

PAPER 4: David Aikman, Bank of England; Andreas Lehnert, Federal Reserve Board; Nellie Liang, Federal Reserve Board; Michele Modugno, Federal Reserve Board, 'CREDIT, RISK APPETITE AND MONETARY POLICY TRANSMISSION'

Abstract: We show that US economic performance and monetary policy transmission depend on nonfinancial sector credit, and that the effects are nonlinear. When credit is below its trend, increases in risk appetite lead to sustained increases in output. In contrast, when credit is above trend, initial expansions are followed by additional excess borrowing and subsequent contractions, suggesting an inter-temporal trade-off for economic activity. Also, tighter monetary policy is ineffective at slowing the economy when credit is high, consistent with evidence of less transmission of policy changes to distant forward Treasury rates in high-credit periods.

Keywords: financial stability, financial conditions, credit, asset bubbles, monetary policy

JEL Codes: E58, E65, G28


PAPER 5: Kevin Fox, University of New South Wales, and Erwin Diewert, University of British Columbia, 'THE DEMAND FOR MONETARY BALANCES AND THE MEASUREMENT OF PRODUCTIVITY'

Abstract: Firms in advanced economies have greatly increased their cash holdings since the mid-1990s. While this has been observed, and the reasons debated, by central bankers, international agencies and academics, it remains somewhat of a puzzle. This paper explores possible reasons for this increase and the implications for understanding productivity growth. Monetary holdings have an opportunity cost: allocating firm financial capital into monetary deposits means that investment in real assets is reduced. Traditional measures of Total Factor Productivity (TFP) do not take into account these holdings of monetary assets. Given the recent large increases in these holdings, ex ante it can be expected that adding these monetary assets to the list of traditional sources of capital services will reduce the measured TFP of the business sector. Using a new data set on the US aggregate (corporate and non-corporate) business sector, we measure this effect, noting the implications of this expanded definition of capital services for the System of National Accounts. We also estimate industry elasticities of demand to hold monetary balances, using the Normalized Quadratic functional form. A key finding is that the accumulation of monetary holdings is primarily a phenomenon of the non-corporate business sector.

We find that, while conceptually more correct, adding real money balances to our input aggregate does not change aggregate measured productivity performance very much for the corporate sector. This is because, even though there is some variation, the asset share is relatively small. The impact on the non-corporate sector is larger, especially in the latter decades of the sample, when currency and deposit holdings increased substantially, especially relative to other asset holdings. Finally, the relative productivity of individual firms can be significantly affected by differences in money holdings, even if there is little aggregate effect at the sectoral level. Indeed, understanding productivity differences between small and large firms can be enhanced by taking into account currency and deposits: small firms are often credit constrained and therefore have greater cash holdings. Similarly, accounting for cash holdings can provide an augmented understanding of productivity and profitability in studies of firm dynamics. In addition, understanding productivity differences between risky and less risky sectors and firms can be informed by differences in money balances, where, for example, dependence on R&D is taken as a proxy for risk. Hence this paper provides a framework and empirical results for a more comprehensive understanding of productivity growth and dynamics.


PAPER 6: Dennis Fixler, Bureau of Economic Analysis, and Kim Zieschang, International Monetary Fund, 'PRODUCING LIQUIDITY'

Abstract: (Based on a paper presented at the 2015 Meeting of the Society for Economic Measurement, Paris. Dennis Fixler is Chief Economist, Bureau of Economic Analysis; Kim Zieschang is Adjunct Professor of Economics, University of Queensland. The views expressed in this paper are those of the authors and should not be attributed to the Bureau of Economic Analysis. JEL code: E01, Measurement and Data on National Income and Product Accounts.) Commercial banks are a primary producer of liquidity in an economy. Despite their importance, there is no consensus on the measurement of liquidity services and, for that matter, bank output in general. This lack of consensus exists at the microeconomic level and at the level of the national accounts, which attempt to capture the significant role of bank services in the output of the economy. The current national accounts measure of bank output is termed 'financial intermediation services indirectly measured', or FISIM, springing from the 1993 version of the System of National Accounts (the 1993 SNA). Calculating FISIM under the 2008 version of the national accounting standards is simple and generally practical, provided the compiler has a key datum: the 'reference rate of interest'. The calculation is essentially: Output = (Reference rate − Deposit rate) × Deposit liabilities + (Loan rate − Reference rate) × Loan assets.

As the deposit and loan financial instrument coverage of this formula implies, the current national accounting standards apply it to deposit-takers such as banks, as well as to non-deposit-taking, loan-making financial institutions such as finance companies and money lenders.
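Written out in symbols (notation ours, not the SNA's), the calculation above is:

```latex
% FISIM output of a bank, as in the calculation described above.
% Notation (ours): D = deposit liabilities, L = loan assets,
% r_D = deposit rate, r_L = loan rate, r* = reference rate of interest.
\[
  \text{Output} \;=\; (r^{*} - r_{D})\, D \;+\; (r_{L} - r^{*})\, L
\]
```

Each term is a user-cost margin multiplied by a balance sheet stock: a deposit margin below the reference rate and a loan margin above it.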

A first issue in the treatment of financial services, ever since the 1993 version of the standards introduced the reference rate concept, has been the lack of consensus on how the reference rate should be determined. Generally the idea has been to select an exogenous reference rate; a government security rate, or a combination of such rates, is often used because it captures the risk-free rate that underlies the user cost of money, as in Barnett (1978). Some have proposed alternative exogenous reference rates tied to the market-determined risk of the security to which the reference rate is to be applied. In any event, we argue that the reference rate should be endogenously determined and should be the bank's calculated cost of capital: the overall rate of return paid to all sources of funding, including equity, on the liability side of the balance sheet. Our 'reference rate of interest' is therefore individual to each bank, rather than an economy-wide constant. A second issue in the national accounts dialogue on financial services is the scope of financial instruments that should be associated with the SNA's indirect financial services measure. The 2008 SNA narrowed the scope of FISIM to the deposit and loan positions of financial corporations, but previous versions included interest income flows on essentially all financial instruments. Return to the broad 1993 financial instrument scope is nevertheless a research agenda item for the next version of the SNA, and appears essential to align the SNA with the scope of liquidity measured by the money and banking literature and the associated standards for compiling financial statistics. We argue that FISIM should cover all financial instruments.


Armed with the cost-of-capital reference rate and full financial balance sheet instrument scope, we derive the production identity (value of output ≡ cost of production) from the income ≡ expense and balance sheet identities, generating a FISIM-like calculation of output with a single cost-of-capital reference rate for each enterprise, rather than for the whole economy. Note that, by computing specific user cost prices of bank services, it is in principle possible to aggregate them into a price index for liquidity services and, correspondingly, to obtain a quantity index of liquidity services.

Such aggregate measures would be useful in tracing the financial intermediation process into GDP or another aggregate measure of economic activity

On examining the SNA-type production identity for an individual bank, we find the cost side contains a term within operating surplus – the equity leverage premium – that depends on the bank's financing: the debt and equity composition of the liability side of its balance sheet. Further, it is inherent in the definition of the cost of capital reference rate that the equity leverage premium is identically equal to what we will term produced liquidity within the part of SNA financial services output of the bank coming from the debt instruments on the liability side of its balance sheet, prominent among which are deposits. Given that banks transform liabilities into assets, the equity leverage premium and the produced liquidity are tied to the risk bearing undertaken by the bank. With an exogenous reference rate, the risk bearing is completely embedded in the user cost price of the asset or liability product. In our model, because the entire balance sheet is used, the risk bearing is tied to equity holders.

This paper proposes a resolution to the scope and methodology issues in the ongoing national accounts conversation on financial services, particularly on the provision of liquidity by debt-issuing enterprises, and suggests that the equity leverage premium now included in the nominal output of banks be offset by an intermediate insurance input supplied by their equity holders. This retains the current standards' origination of liquidity with banks (but also extends it to other debt-issuing enterprises), while better exposing the link between the leverage risk bearing (provision of debt guarantees) of equity-holding sectors and the production of liquidity by banks (and other debt-issuing enterprises). With the developed framework, the output and productivity of the providers of liquidity services and other financial services can also be measured and incorporated into macroeconomic statistics.


PAPER 7: Jan Willem Van den End, De Nederlandsche Bank, the Netherlands, 'APPLYING COMPLEXITY THEORY TO INTEREST RATES: EVIDENCE OF CRITICAL TRANSITIONS IN THE EURO AREA'

Abstract: We apply complexity theory to financial markets to show that excess liquidity created by the Eurosystem has led to critical transitions in the configuration of interest rates. Complexity indicators turn out to be useful signals of tipping points and subsequent regime shifts in interest rates. We find that the critical transitions are related to the increase of excess liquidity in the euro area. These insights can help central banks to strike the right balance between the intention to support the financial system by injecting liquidity and the potential unintended side-effects on market functioning.
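The abstract does not spell out which complexity indicators are used; one standard early-warning statistic in the critical-transitions literature is rising lag-1 autocorrelation ('critical slowing down'). A minimal sketch on simulated data (the function and the data are ours, for illustration only):

```python
import numpy as np

def rolling_lag1_autocorr(x, window=60):
    """Rolling lag-1 autocorrelation: a generic early-warning
    indicator of critical transitions (values drifting towards 1
    suggest critical slowing down ahead of a tipping point)."""
    x = np.asarray(x, dtype=float)
    out = np.full(len(x), np.nan)
    for t in range(window, len(x)):
        seg = x[t - window:t]
        out[t] = np.corrcoef(seg[:-1], seg[1:])[0, 1]
    return out

# Illustration on a simulated interest-rate series (not euro-area data)
rng = np.random.default_rng(1)
rates = np.cumsum(rng.normal(0.0, 0.02, size=500))  # random-walk stand-in
indicator = rolling_lag1_autocorr(rates)
```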

Keywords: interest rates, central banks and their policies, monetary policy

JEL Codes: E43, E58, E52


PAPER 8: Michael Bowe, Alliance Manchester Business School, University of Manchester, and University of Vaasa; Olga Kolokolova, Alliance Manchester Business School, University of Manchester; and Marcin Michalski, Alliance Manchester Business School, University of Manchester, 'TOO BIG TO CARE, TOO SMALL TO MATTER: MACROFINANCIAL POLICY AND BANK LIQUIDITY CREATION'

Abstract: We estimate the volume of liquidity creation by US bank holding companies between 1997 and 2015, and examine the impact of changes in macrofinancial policies on the dynamics of this process. We focus on three major policy developments occurring in the aftermath of the 2007–09 financial crisis: bank capital regulation reform, monetary stimulus through quantitative easing, and the Troubled Asset Relief Program (TARP).

We use the three-step procedure proposed by Berger and Bouwman (2009) to calculate the dollar amount of liquidity a financial institution creates. Initially, we classify all balance sheet items and off-balance sheet activities of an institution as liquid, semi-liquid or illiquid. We then assign liquidity weights of +1/2 (illiquid assets and liquid liabilities), 0 (semi-liquid assets and liabilities) or −1/2 (liquid assets, illiquid liabilities and equity), respectively. The dollar volume of liquidity creation is then calculated as the liquidity-weighted sum of the items identified in the first step. We find that the total amount of liquidity creation by banks in the sample increases by a factor of 3.65, from $1.4 trillion in 1997Q1 to $5.1 trillion in 2015Q4. Indeed, the volume of liquidity creation increases at a faster pace than the gross domestic product of the United States, which rises by a factor of 2.1 during the same period.
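As a concrete reading of the weighting scheme just described, the calculation can be sketched as follows (the category assignments for the hypothetical bank are ours; Berger and Bouwman's full classification of individual balance sheet and off-balance sheet items is far more detailed):

```python
# Liquidity weights from the Berger-Bouwman (2009) scheme described
# above: +1/2 for illiquid assets and liquid liabilities, 0 for
# semi-liquid items, -1/2 for liquid assets, illiquid liabilities
# and equity.
WEIGHTS = {
    "illiquid_assets": 0.5, "liquid_liabilities": 0.5,
    "semiliquid_assets": 0.0, "semiliquid_liabilities": 0.0,
    "liquid_assets": -0.5, "illiquid_liabilities": -0.5,
    "equity": -0.5,
}

def liquidity_creation(items: dict) -> float:
    """Dollar liquidity creation: the liquidity-weighted sum of the
    classified balance sheet (and off-balance sheet) items."""
    return sum(WEIGHTS[k] * v for k, v in items.items())

# Hypothetical bank, $bn: 0.5*(60 + 55) - 0.5*(25 + 10 + 15) = 32.5
bank = {"illiquid_assets": 60, "semiliquid_assets": 15,
        "liquid_assets": 25, "liquid_liabilities": 55,
        "semiliquid_liabilities": 20, "illiquid_liabilities": 10,
        "equity": 15}
print(liquidity_creation(bank))
```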

The results of panel regressions reveal that the dynamics of bank liquidity creation differ considerably between small and large institutions. The level of bank capital requirements and the stance of monetary policy impact the liquidity creation of both small and medium-sized banks. Liquidity creation of the largest banks, which control over 80% of the banking system's assets, remains unaffected.

We find that changes in the amount of liquidity creation by small banks, per $1 of their gross total assets, are positively related to changes in the term spread but inversely related to changes in their Tier 1 capital ratios. Further, we show that the volume of liquidity creation is positively related to the riskiness of a bank's assets, as measured by the ratio of risk-weighted assets to gross total assets, regardless of its size classification. We establish that TARP has negative short-term effects on small and medium-sized banks, and no immediate impact on the liquidity creation of the largest institutions in the sample. In contrast, participation in TARP leads to a long-term decline in liquidity provision per dollar of assets of the largest banks. This persists even after the completion of the programme and repayment of TARP funding. As nearly all of the largest TARP-recipient banks in the sample are subsequently classified as systemically important financial institutions, our results suggest that the increased regulatory scrutiny may adversely affect their ability to create liquidity.

By demonstrating that the stance of monetary policy and the level of bank capital requirements do not tangibly enhance the liquidity provision efficiency of the largest, systemically important institutions in the system, our study offers important insights for the design of effective macroprudential policies.


PAPER 9: Jonathan Goldberg, Federal Reserve Board, 'THE SUPPLY OF LIQUIDITY AND REAL ECONOMIC ACTIVITY'

Abstract: This paper identifies shocks to the supply of liquidity by dealer firms and investigates their effects on real economic activity. First, I develop a simple theoretical model of dealer intermediation; then, in a structural VAR model, I use sign restrictions derived from the theoretical model to identify liquidity supply shocks. Liquidity supply shocks that are orthogonal to information contained in macroeconomic and asset price variables have considerable predictive power for economic activity. Moreover, positive liquidity supply shocks cause large and persistent increases in real activity.
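The abstract does not give the identification algorithm; a common way to impose sign restrictions in an SVAR is the random-rotation method of Rubio-Ramírez, Waggoner and Zha (2010). A schematic sketch under that assumption (the two-variable system and the particular restrictions are ours, purely for illustration):

```python
import numpy as np

def sign_identified_impact(sigma_u, satisfies, max_draws=10_000, seed=0):
    """Draw candidate impact matrices B = chol(Sigma_u) @ Q, with Q a
    random orthogonal matrix, and return one whose impact responses
    satisfy the sign restrictions. Generic sketch of sign-restricted
    SVAR identification, not the paper's exact procedure."""
    rng = np.random.default_rng(seed)
    C = np.linalg.cholesky(sigma_u)
    for _ in range(max_draws):
        Q, _ = np.linalg.qr(rng.standard_normal(sigma_u.shape))
        B = C @ Q
        if satisfies(B):
            return B
    raise RuntimeError("no rotation satisfied the sign restrictions")

# Illustration: a liquidity supply shock (column 0) that raises dealer
# positions (row 0) and lowers spreads (row 1) on impact.
sigma_u = np.array([[1.0, 0.3],
                    [0.3, 1.0]])   # reduced-form residual covariance
B = sign_identified_impact(sigma_u, lambda B: B[0, 0] > 0 and B[1, 0] < 0)
```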

Keywords: liquidity, dealer intermediation, risk-taking, real activity, liquidity shocks

JEL Codes: G10, G12, G17, G24


PAPER 10: Dirk Bezemer, University of Groningen, the Netherlands, and Lu Zhang, Sustainable Finance Lab and Utrecht University, 'MACROECONOMIC IMPLICATIONS OF LIQUIDITY CREATION: CREDIT ALLOCATION AND POST-CRISIS RECESSIONS'

Abstract: In this paper we address the macroeconomic implications of liquidity creation through bank lending, and the impact of liquidity on economic activity. We note that liquidity created through bank lending can be channelled into the real sector, in support of economic activity, or into financial and real estate markets, in support of capital gains. We collected macro-level data on bank credit aggregates over 2000–12 for 57 economies, categorised according to the use of credit. We note the long-term shift in the allocation of bank credit creation away from non-financial business lending and towards financial and especially real estate markets. We then present new evidence on the channels from pre-crisis credit allocation to the severity of post-crisis recessions.

Our first contribution is to show that it is not just the level but the composition of debt (defined as the share of mortgage credit in total credit) that matters. A second contribution is to analyse the channels. We collect additional industry-level data across 20 industries for a subset of economies, and analyse the effect of changes in the pre-crisis composition of debt on total GDP and on investment, consumption and capital allocation. We find that changes in the share of household mortgage credit before the crisis have a significant effect on recession severity after the 2007 crisis. This is not the case for any other credit category, nor for growth of total bank credit. We address the causality challenge by using the difference between IMF growth forecasts and growth realisations; this filters out country-specific drivers of both debt and income growth. We address the model selection challenge by using Bayesian model averaging, which indicates that the change in credit composition is among the three most robust determinants of post-crisis recession severity, along with income levels and the current account balance. The findings are robust to a wide range of control variables and to the different responses across advanced/emerging and EMU/non-EMU economies.

We then delve into the channels from changes in debt composition to income growth loss. The literature to date has focused on negative wealth effects on consumption, for which we find strong evidence. In addition, we find evidence for two investment channels: a loan supply effect and a capital allocation effect. In the industry-level analysis we find that, in economies which experienced a larger change in debt composition before 2008, there was a larger reduction in available credit and weaker capital re-allocation towards sectors with higher value-added. This effect is observed already before the crisis, and very strongly after the crisis. We discuss policy implications and future research.

Keywords: private credit, mortgages, crisis, output loss, investment, capital allocation

JEL Codes: C11, C15, E01, O4


PAPER 11: Iftekhar Hasan, Gabelli School of Business, Fordham University, and Jean-Loup Soula, Strasbourg University, LaRGE Research Centre, 'TECHNICAL EFFICIENCY IN BANK LIQUIDITY CREATION'

Abstract: This paper generates an optimum bank liquidity creation benchmark by tracing an efficient frontier in liquidity creation (bank intermediation), and asks why some banks are more efficient than others in such activities. Evidence reveals that medium-sized banks lie closest to the efficient frontier, irrespective of their business models. Small (large) banks – focused on traditional banking activities – are found to be the most (least) efficient in creating liquidity through on-balance sheet items, whereas large banks – involved in non-traditional activities – are found to be most efficient in off-balance sheet liquidity creation. Additionally, the liquidity efficiency of small banks was more resilient during the 2007–08 financial crisis than that of other banks.

Keywords: banks, technical efficiency, liquidity creation, diversification

JEL Codes: G21, G28, G32


PAPER 12: Richard Anderson, Lindenwood University; John Duca, Federal Reserve Bank of Dallas; and Barry Jones, Department of Economics, State University of New York, 'A BROAD MONETARY SERVICES (LIQUIDITY) INDEX AND ITS LONG-TERM LINKS TO ECONOMIC ACTIVITY'

Abstract: Liquid assets play a crucial role in economic activity as the medium in which payments are received and made. 'Sudden stops' in financial markets – during which liquid assets are hoarded – are periods when economic activity slows abruptly. Further, it is the sine qua non of financial intermediation to alter the measured relative and absolute quantities of liquid assets. In this way, the observed quantities of liquid assets reflect both the path of past economic activity and anticipations of future activity. The quantities must be regarded as arising endogenously within an intertemporal general equilibrium model of the economy, à la Tobin (1958) and Merton (1971). Economic modelling and analysis traditionally proceed by combining relatively high-dimension lists of specific assets into lower-dimension 'monetary aggregates'. The defining characteristic of the assets included is that all are available to facilitate the exchange of goods and services at a transaction cost less than infinity. That is, all included assets may be sold, or used as collateral, for the purchase and sale of goods and services, and thereby provide liquidity services, which can be tracked by measured opportunity costs of foregone interest; these measured opportunity costs may not in practice reflect all transaction costs. The last caveat particularly applies to household holdings of mutual fund assets outside of money market funds. Our study contributes to the literature in two key ways. First, it expands a conventional Divisia measure of money services to account for the liquidity provided by such mutual fund assets. Second, it explores the long-run connections between economic activity and monetary aggregates constructed as index numbers from 1929 to 2016, finding that the inclusion of mutual fund liquidity services results in a Divisia measure of money with a much more stable velocity.
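For reference, the Divisia monetary services index that the paper broadens is standardly computed as a Törnqvist chain index over user-cost expenditure shares; a minimal sketch with hypothetical asset data:

```python
import numpy as np

def divisia_log_growth(q0, q1, uc0, uc1):
    """One-period log growth of a Tornqvist-Divisia monetary services
    index: share-weighted growth of asset quantities, with shares
    given by user-cost expenditure, averaged over adjacent periods."""
    q0, q1, uc0, uc1 = (np.asarray(v, dtype=float) for v in (q0, q1, uc0, uc1))
    s0 = uc0 * q0 / np.sum(uc0 * q0)   # expenditure shares, period 0
    s1 = uc1 * q1 / np.sum(uc1 * q1)   # expenditure shares, period 1
    return float(np.sum(0.5 * (s0 + s1) * np.log(q1 / q0)))

# Two assets: a narrow-money component and a mutual-fund-like component
# (quantities and user costs are made up for illustration).
g = divisia_log_growth(q0=[100.0, 50.0], q1=[102.0, 56.0],
                       uc0=[0.050, 0.010], uc1=[0.050, 0.012])
print(f"Divisia services growth: {g:.4f}")
```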

We thank Emil Mihalov and Tyler Atkinson for research assistance. The views expressed are those of the authors and do not necessarily reflect those of the Federal Reserve Bank of Dallas or the Federal Reserve System. Any errors are our own.


PAPER 13: John W Keating, University of Kansas, and A Lee Smith, Federal Reserve Bank of Kansas City, 'THE OPTIMAL MONETARY INSTRUMENT AND THE (MIS)USE OF GRANGER CAUSALITY'

Abstract: Is it better to use an interest rate or a monetary aggregate to conduct monetary policy? Operating procedures designed around interest rates are overwhelmingly preferred by central bankers. This paper re-examines the question in light of a New Keynesian DSGE (dynamic stochastic general equilibrium) model. Our calibrated model is very similar to many others in the literature, except that we follow Belongia and Ireland (2014) in augmenting the standard framework with a simple model of financial intermediation. We assume banks operate in competitive markets, maximising profits subject to financial market disturbances.

We use this model to examine the welfare consequences of alternative choices of monetary policy instrument. For the interest rate policy, we employ the specification of Clarida, Gali and Gertler (2000), in which the central bank reacts to expected future output and inflation gaps; a gap is defined as the percentage deviation from target. We compare this rule to a k-percent rule for each monetary aggregate. We consider three alternative aggregates determined within our model: the monetary base, the simple sum measure of money, and the Divisia measure.
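Schematically (our notation, not the paper's), the two classes of rules being compared are a forward-looking interest rate rule and a constant-growth rule for money:

```latex
% Clarida-Gali-Gertler-style rule: react to expected future gaps
% (pi* = inflation target, x = output gap, rho = interest smoothing).
\[
  i_t \;=\; \rho\, i_{t-1} \;+\; (1-\rho)\bigl[\phi_{\pi}\,
  \mathrm{E}_t(\pi_{t+1}-\pi^{*}) \;+\; \phi_{x}\,\mathrm{E}_t\, x_{t+1}\bigr]
\]
% k-percent rule for an aggregate M (base, simple sum, or Divisia):
\[
  \Delta \ln M_t \;=\; k
\]
```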

Welfare results are striking. While the interest rate dominates the monetary base, both simple sum and Divisia k-percent rules outperform the interest rate. In fact, the Divisia rule is best overall. This is an interesting finding, because it contradicts the view of most central bankers that interest rates are superior. Note that if we replaced k-percent money growth rules with rules allowing an appropriate reaction to macroeconomic variables, as in the Clarida, Gali and Gertler (2000) interest rate rule, the welfare benefits of simple sum and Divisia money would be even greater.

Next, we study the performance of Granger causality tests in the context of data generated from our model. For this we assume the Clarida, Gali and Gertler (2000) interest rate rule characterises monetary policy. While it is not optimal from a welfare perspective, this rule is used for normative purposes: research suggests their framework provides a fairly good description of actual Fed behaviour.

For each of our four potential monetary instruments, we test for Granger causality with respect to output and then with respect to prices. We find the interest rate Granger-causes both variables at extremely high significance levels. The same result is obtained for the monetary base. Simple sum money also Granger-causes prices at a highly significant level, but only causes output at the 10% level. The test results for Divisia are the weakest of all: Divisia fails to Granger-cause output, and the evidence for prices is only at the 10% level.

What do we learn from this investigation? First, a quantity aggregate used as the monetary instrument may have significant welfare benefits compared to an interest rate. Second, a weighted monetary aggregate (Divisia) performs best of all the candidate instruments we considered. Third, Granger causality tests may be a poor method for selecting the best monetary policy instrument. In our model, Granger causality tests suggest at best a weak effect of Divisia on inflation, and no effect on output. Thus, if instrument choice were based solely on Granger causality test results, an inferior policy instrument would be selected.

Keywords: monetary policy instrument, monetary aggregates, Granger causality, Divisia aggregates

JEL Codes: C43, C32, E37, E44, E52


PAPER 14: Jane Binner, University of Birmingham; Logan Kelly, University of Wisconsin; and Jon Tepper, Nottingham Trent University, 'ON THE ROBUSTNESS OF SLUGGISH STATE-BASED NEURAL NETWORKS FOR PROVIDING USEFUL INSIGHT INTO THE NEW KEYNESIAN PHILLIPS CURVE'

Abstract: The New Keynesian Phillips curve implies that the output gap – the deviation of actual output from its natural level due to nominal rigidities – drives the dynamics of inflation relative to expected inflation and lagged inflation. This paper exploits the empirical success of the New Keynesian Phillips curve in explaining the USA's inflation dynamics, using a new nonlinear model of the output gap. We estimate the output gap using the artificial intelligence method of multi-recurrent neural networks (MRNs), a type of neural network that is able to establish variable sensitivity to recent and more historic temporal information through the use of layer- and self-recurrent links, forming a so-called sluggish state-based memory. The MRN is able robustly to latch onto structural interactions amongst inflation, oil prices and real output in the USA, producing a set of interesting impulse responses. We present our empirical results using monthly data spanning 1960–2016, and contrast our new nonlinear models of the output gap with traditional measures in fitting the New Keynesian Phillips curve, providing useful insights for inflation dynamics and monetary policy analysis in the USA.
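The 'sluggish state' of the abstract can be read schematically as a memory bank that mixes its own past with the current hidden state; the cell below is our illustrative rendering of that idea, not the authors' exact MRN architecture:

```python
import numpy as np

def mrn_step(x_t, h_prev, m_prev, Wx, Wh, Wm, alpha=0.9):
    """One step of a sluggish state-based recurrent cell: the memory m
    mixes its own past (self-recurrence) with the current hidden state
    (layer recurrence), so alpha tunes sensitivity to recent versus
    more historic information. Schematic sketch only."""
    h_t = np.tanh(Wx @ x_t + Wh @ h_prev + Wm @ m_prev)
    m_t = alpha * m_prev + (1.0 - alpha) * h_t   # sluggish memory update
    return h_t, m_t

# Tiny illustration: 3 inputs (inflation, oil price, output), 5 hidden units
rng = np.random.default_rng(0)
Wx, Wh, Wm = (rng.normal(0, 0.3, s) for s in [(5, 3), (5, 5), (5, 5)])
h = np.zeros(5)
m = np.zeros(5)
for x in rng.normal(size=(10, 3)):   # ten months of (standardised) data
    h, m = mrn_step(x, h, m, Wx, Wh, Wm)
```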

PAPER 15: Rakesh Bissoondeeal, Aston University; Michael Karaglou, Aston University; and Jane Binner, University of Birmingham, 'STRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UK'

Abstract: We investigate whether or not monetary aggregates are important in determining output. In addition to the official Simple Sum measure of money, we employ the weighted Divisia aggregate. We use data-driven procedures to identify breaks in the data. We find that monetary aggregates, particularly Divisia, are important in determining output, but the results are sensitive to the time period employed. There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them. The results also suggest that the recovery from the financial crisis could have been faster if money had not been hoarded.

Keywords: monetary aggregates, Divisia, IS curve, structural breaks

JEL Codes: C22, E52


PAPER 16: Costas Milas, University of Liverpool, and Michael Ellington, University of Liverpool, 'IDENTIFYING AGGREGATE LIQUIDITY SHOCKS IN CONJUNCTION WITH MONETARY POLICY SHOCKS: AN APPLICATION USING UK DATA'

Abstract: We propose a new identification scheme for aggregate liquidity shocks, in harmony with conventional monetary policy shocks, in a partially identified structural VAR model. We fit a Bayesian time-varying parameter VAR model to UK data from 1955 to 2016. The transmission mechanism of aggregate liquidity shocks changes substantially through time, with the magnitude of these shocks increasing during recessions. We provide statistically significant evidence in favour of asymmetric contributions of these shocks during the implementation of Quantitative Easing relative to the 2008 recession. Aggregate liquidity shocks explain 32% and 47% of the variance in real GDP and inflation, respectively, at business cycle frequency during the Great Recession. (Preliminary draft: please do not cite.)

Keywords: liquidity shocks, time-varying parameter VAR, money growth

JEL Codes: E32, E47, E52, E58

PAPER 17: Makram El Shagi, Henan University, China, and Logan Kelly, University of Wisconsin, 'WHAT CAN WE LEARN FROM COUNTRY LEVEL LIQUIDITY IN THE EMU?'



PAPER 18: Soumya Suvra Bhadury, National Council of Applied Economic Research (NCAER), New Delhi, India, and Taniya Ghosh, Indira Gandhi Institute of Development Research (IGIDR), Mumbai, India, 'HAS MONEY LOST ITS RELEVANCE? DETERMINING A SOLUTION TO THE EXCHANGE RATE DISCONNECT PUZZLE IN THE SMALL OPEN ECONOMIES'

Abstract: This study uses monthly data from 2000 to 2015 and utilises an open-economy structural vector autoregression model to identify the monetary policy shock responsible for exchange rate fluctuations in the economies of India, Poland and the UK. Following Kim and Roubini (Journal of Monetary Economics, 45(3): 561–586, 2000), our model is shown to fit these small open economies well, estimating theoretically correct and significant responses of price, output and the exchange rate to a monetary policy tightening. The importance of the monetary policy shock is determined by examining the variance decomposition of forecast error, impulse response functions and out-of-sample forecasts. Following Barnett (Journal of Econometrics, 14 (September): 11–48, 1980), we adopt a superior monetary measure, namely the aggregation-theoretic Divisia monetary aggregate, in our model. The significance of adopting precisely measured money in the exchange rate model follows from the comparison between models with no money, with simple-sum monetary aggregates, and with Divisia monetary measures. The bootstrap Granger causality test establishes a strong causal link from money, especially Divisia money, to the exchange rate. Additionally, the empirical results provide three important findings. The first suggests that the estimated responses of output, prices, money and the exchange rate to monetary policy shocks are significant in models that adopt monetary aggregates. The second indicates that Divisia money facilitates monetary policy in explaining more of the fluctuation of the exchange rate. The third supports the inclusion of Divisia money for better out-of-sample forecasting of the exchange rate.

Keywords: monetary policy, monetary aggregates, Divisia, structural VAR, exchange rate overshooting, liquidity puzzle, price puzzle, exchange rate disconnect puzzle, forward discount bias puzzle

JEL Codes: C32, E41, E51, E52, F31, F41, F47


PAPER 19: William A Barnett and Jinan Liu, University of Kansas, 'USER COST OF CREDIT CARD SERVICES UNDER INTERTEMPORAL NON-SEPARABILITY'

Abstract: This paper examines the user cost of monetary assets and credit card services under the assumption of intertemporal nonseparability and interest rate risk. Barnett and Su (2016) derived theory permitting the inclusion of credit card transaction services in Divisia monetary aggregates. The risk adjustment in their theory is based on CCAPM under intertemporal separability. The risk adjustment by their method is expected to be small, as has been the case in prior studies of CCAPM risk adjustment of asset returns; the equity premium puzzle focuses on that downward bias in the CCAPM risk adjustment. But credit card interest rates, aggregated over consumers, are much more volatile than interest rates on monetary assets. While the known downward bias of CCAPM risk adjustments is of little concern with Divisia monetary aggregates containing only low-risk monetary assets, that downward bias cannot be ignored once credit card services are included. We believe that extending to intertemporal nonseparability will provide a more plausible risk adjustment, as has been emphasised by Barnett and Wu (2015).

In this paper, we extend the credit-card-augmented Divisia monetary quantity aggregates and user costs to the case of risk aversion and intertemporal nonseparability in consumption. Our results are for the 'representative consumer', aggregated over all consumers. While credit card interest-rate risk may be low for many individual consumers, the volatility of credit card interest rates for the representative consumer is high, as reflected by the high volatility of the Federal Reserve's data on credit card interest rates aggregated over consumers. The relationship between that volatility and the theory of aggregation over consumers under risk is beyond the scope of this research, but is a serious matter meriting future study.

To implement our theory, we introduce a pricing kernel into our model, in accordance with the approach advocated by Barnett and Wu (2015). We assume that the pricing kernel is a linear function of the rate of return on a well-diversified wealth portfolio. We find that the risk adjustment of the credit-card-services user cost to its certainty-equivalence level can be measured by its beta, which depends upon the covariance between the interest rates on credit card services and the rate of return on the wealth portfolio of the consumer, in a manner analogous to the standard CAPM adjustment.
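In CAPM-style notation (ours, not the paper's), the adjustment just described is:

```latex
% Beta of credit card services: e_c = credit card interest rate,
% r_w = return on the well-diversified wealth portfolio.
\[
  \beta_{c} \;=\; \frac{\operatorname{Cov}(e_{c},\, r_{w})}
                       {\operatorname{Var}(r_{w})}
\]
% The gap between the risk-adjusted user cost and its
% certainty-equivalent level scales with beta_c and the premium on
% the wealth portfolio (r_f = risk-free rate); schematic only:
\[
  \tilde{\pi}_{c} \;-\; \pi_{c}^{\,\mathrm{CE}} \;\propto\;
  \beta_{c}\,\bigl(\mathrm{E}(r_{w}) - r_{f}\bigr)
\]
```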

Credit card services' risk premiums depend on their market portfolio risk exposure, which is measured by the beta of the credit card interest rates: the larger the beta, through risk exposure to the wealth portfolio, the larger the risk adjustment. If the beta were very small, the user-cost risk adjustment would be very small; in that case, the unadjusted Divisia monetary index would be a good proxy, even with high volatility of credit card interest rates. One method of introducing intertemporal nonseparability is to assume habit formation; we explore that possibility. We are currently conducting research on the empirical implementation of the theory proposed in this paper. We believe that, under intertemporal nonseparability, we will be able to generate a more accurate credit-card-augmented Divisia monetary index to explain the available empirical data.

Keywords: Divisia index, monetary aggregation, intertemporal nonseparability, credit card services, risk adjustment

JEL Codes: C43, D81, E03, E40, E41, E44, E51, G12


PAPER 20: Per Hjertstrand, Research Institute of Industrial Economics (Institutet för Näringslivsforskning), Stockholm; Gerald A Whitney, University of New Orleans; and James L Swofford, University of South Alabama, 'PANEL DATA TESTS OF INDEX NUMBERS AND REVEALED PREFERENCE RANKINGS'

Abstract: For weakly separable blockings of goods from panel data, we construct aggregates using index numbers. We examine how well these aggregates 'fit' the data by investigating how close they come to solving revealed preference conditions.

Samuelson (1983) discussed the relationship of index number theory to revealed preference analysis, saying 'Index number theory is shown to be merely an aspect of the theory of revealed preference…'. Revealed preference analysis has been used to test whether data on prices and quantities can be rationalised by a well-behaved utility function that is weakly separable in some subset of goods. Weak separability allows for a separation of a subset of goods into a sub-utility, or aggregator, function. This is a necessary condition for the existence of an economic aggregate, and the existence of an economic aggregate is a justification for the use of superlative index numbers. Superlative indexes are exact if they can provide a second-order approximation to a particular aggregator function.

Varian's (1983) revealed preference conditions for weak separability do not rely on a particular functional form. The solution values to these conditions can be interpreted as levels of utility. These solution values, while not unique, are consistent with the preferences revealed by the data: if period t is preferred to period s, then the utility level assigned to t is greater than or equal to that assigned to s. Since indexes need not mirror preferences, this property is not guaranteed to hold after aggregation.

Barnett and Choi's (2008) definition spans all superlative index numbers. We consider aggregates based on two superlative index numbers, the Fisher and the Walsh, and on the non-superlative Paasche and Laspeyres indexes, which are exact for a first-order approximation to an aggregator function. Because of its widespread use by central banks to construct monetary aggregates, we also consider the simple sum aggregate and the index number version of the simple sum, the Dutot index.

We investigate how close aggregates come to solving the revealed preference conditions for weak separability when it is known that a solution exists. We use the revealed preference solution values to check how well the indexes reflect the preference orderings in the data, by comparing the direction of change between adjacent periods, computing preference orderings implied by transitivity, and then comparing the rankings revealed by the aggregates to the rankings from the weak separability conditions. We calculate how much the aggregates need to be perturbed in order to satisfy weak separability, and we test whether the aggregates and solution values are equal in distribution.
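The building block of such revealed-preference tests is a GARP check over the panel's prices and quantities; a compact sketch follows (Varian's full weak-separability conditions add more structure than this):

```python
import numpy as np

def garp_violations(P, Q):
    """Find violations of the Generalized Axiom of Revealed Preference
    for T observations of prices P (T x n) and quantities Q (T x n).
    Returns pairs (t, s) where bundle t is revealed preferred to s,
    yet bundle t is strictly cheaper than bundle s at s's prices."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    T = len(Q)
    cost = P @ Q.T                    # cost[t, s] = p_t . q_s
    own = np.diag(cost).copy()        # own[t] = p_t . q_t
    R = own[:, None] >= cost          # direct revealed preference
    for k in range(T):                # Warshall transitive closure
        R = R | (R[:, [k]] & R[[k], :])
    return [(t, s) for t in range(T) for s in range(T)
            if R[t, s] and own[s] > cost[s, t]]

# Two-period, two-good example with no violations
P = [[1.0, 2.0], [2.0, 1.0]]
Q = [[3.0, 1.0], [1.0, 3.0]]
print(garp_violations(P, Q))          # -> []
```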

Using panel data, we find that as the number of goods increases, the superiority of the superlative indexes manifests itself. This is consistent with Fisher's (1922) critique of the Dutot index and Barnett's (1980) critique of simple sum monetary aggregates.

Keywords: superlative index numbers, aggregation, preference orderings, weak separability

JEL Code: C43


PAPER 21: Sajid Chaudhry, University of Birmingham; Jane Binner, University of Birmingham; James Swofford, University of South Alabama; and Andrew Mullineux, University of Birmingham, 'SCOTLAND AS AN OPTIMUM CURRENCY AREA'

Abstract: The June 2016 UK referendum on continued EU membership, in which the people of Scotland voted to remain while the rest of the UK voted to leave, once again makes the issue of whether Scotland is an optimum currency area very topical. England voted strongly to leave the EU, whilst Scotland backed remain by 62% to 38%. The Scottish government published its draft bill on a second independence referendum in October 2016. The move does not mean another referendum will definitely be held, but it does raise the possibility that Scotland might choose independence, staying in the EU without the rest of the UK. If Scotland charts a course of independence from the rest of the UK, it would likely either issue its own currency or join or form another currency area. In this paper we test the microeconomic foundations of a common currency area for Scotland, for the UK, and for the rest of the UK without Scotland. We find that the UK without Scotland meets the microeconomic criteria for a common currency area, while the UK and Scotland alone each exhibit some small violations of these conditions. We also find differences between the economies of the UK less Scotland and of Scotland in loan data. With respect to further research into the development of monetary aggregation theory, we recommend that this might profitably be directed towards exploring more sophisticated aggregation procedures, as suggested by Barnett (forthcoming), or towards incorporating risk-bearing assets into the money measures.

Keywords: Scottish independence, common currency areas, microeconomic foundations

PAPER 22: Victor J Valcarcel, University of Texas at Dallas, 'INTEREST RATE PASS-THROUGH: DIVISIA USER COSTS OF MONETARY ASSETS AND THE FEDERAL FUNDS RATE'

Abstract: Evidence of substantial pass-through in short-term rates and other rates of financial and monetary assets has typically been rejected by the data. In a first, this paper investigates level and volatility transmission among the Federal Funds rate and the user costs of various monetary assets, which include instruments of both public debt (e.g. T-bills) and private debt (e.g. commercial paper). Results suggest substantial and time-varying pass-through. Higher degrees of bi-directional pass-through occur between the Federal Funds rate and the user costs of more liquid assets, both in levels and in volatilities. Federal Funds rate spillovers also propagate faster onto more liquid rates. These findings have important implications for monetary transmission, not only across the term structure but also along markets of varying liquidity.

Keywords: user cost of money, federal funds rate, volatility spillovers, VAR, monetary transmission, interest rate channel

JEL Codes: E30, E31, E65

15255 © University of Birmingham 2017. Printed on a recycled grade paper containing 100% post-consumer waste.

Edgbaston, Birmingham, B15 2TT, United Kingdom

www.birmingham.ac.uk

Designed and printed by

'Triple-crown' accredited

Page 4: Liquidity and Economic Activity - University of Birmingham · 2017-05-22 · Financial Services Indices Liquidity and Economic Activity In honour of Oswald Distinguished Professor

4 Liquidity and Economic Activity Conference

Day 2 24 May 2017 930ndash1030am

SESSION 4 Financial Services Indices Liquidity and Economic ActivityChair Jane Binner

nPAPER 12 Richard Anderson John Duca and Barry Jones lsquoA broad monetary services (liquidity) index and its long-term links to economic activityrsquo

nPAPER 13 John Keating and Lee Smith lsquoThe optimal monetary instrument and the (mis) use of Granger causalityrsquo

nPAPER 14 Jane Binner Logan Kelly and Jon Tepper lsquoOn the robustness of sluggish state-based neural networks for providing useful insight into the new Keynesian Phillips curversquo

1030ndash1045am Break1045amndash1245pm

SESSION 5Structural Change Volatility and ShocksChair Taniya Ghosh

nPAPER 15 Rakesh Bissoondeeal Michael Karaglou Jane Binner lsquoStructural changes and the role of money in the UKrsquo

nPAPER 16 Costas Milas and Michael Ellington lsquoIdentifying aggregate liquidity shocks in conjunction with monetary policy shocks An application using UK datarsquo

nPAPER 17 Makram El Shagi and Logan Kelly lsquoWhat can we learn from country level liquidity in the EMUrsquo

nPAPER 18 Bhadury and Ghosh lsquoHas money lost its relevance Determining a solution to the exchange rate disconnect puzzle in the small open economiesrsquo

1245ndash145pm Lunch145ndash245pm Keynote Address Lawrence Goodman Centre for Financial Stability New York lsquoPreventative macroprudential policyrsquo245ndash300pm Break300ndash500pm

SESSION 6Tests of Index Numbers Separability and Price DualsChair Victor Valcarcel

nPAPER 19 William A Barnett and Jinan Liu lsquoUser cost of credit card services under intertemporal non-separabilityrsquo

nPAPER 20 Per Hjertstrand Gerald Whitney and James Swofford lsquoPanel data tests of index numbers and revealed preference rankingsrsquo

nPAPER 21 Sajid Chaudhry Jane Binner James Swofford and Andrew Mullineux lsquoScotland as an optimum currency arearsquo

nPAPER 22 Victor Valcarcel lsquoInterest rate pass-through Divisia user costs of monetary assets and the Federal funds ratersquo

500pm Conference Close600pm Editorial Board Meeting ndash Guest Editors Special Issue Meeting

5Liquidity and Economic Activity Conference

Financial Services Indices Liquidity and Economic Activity PAPER ABSTRACTS

PAPER 1Robert Aliber University of Chicago lsquoAN ESSAY ON MONETARY TURBULENCE AND THE SUPPLY OF LIQUIDITYrsquo

Abstract The last 35 years have been the most turbulent in monetary history There have more than 100 banking crises many in one of four waves most of these crises have been lsquotwinnedrsquo with currency crises Moreover the deviations between the market prices of currencies have been much larger than ever before The purpose of this essay is to explain why there have been so many banking crises and why they have often occurred together with currency crises the answer is that the floating currency arrangement is inherently unstable because an increase in cross-border investment flows to a country leads to an increase in the price of its securities and the increase in the price of its currency The essay is based on a general equilibrium view that links the market in currencies with the markets in bonds stocks and real estate the increase in cross-border investment inflows to a country leads to an increase in the household wealth as an integral part of the adjustment process to ensure that the countryrsquos current account deficit increases as its capital account surplus increases otherwise the market in the countryrsquos currency would not clear The increase in the countryrsquos external indebtedness is much more rapid than the increase in its GDP When the lenders recognise that the indebted country

is on a non-sustainable trajectory their demand for the borrowersrsquo IOUs declines and the price of its securities and the price of its currency declines A banking crisis may follow if the decline in the price of the currency is large since it leads immediately to a sharp increase in the total indebtedness of the borrowers with liabilities denominated in a foreign currency

When currencies are no longer anchored to parities each central bank has much more monetary independence and investors have much more incentive to change the currency composition of the securities in their portfolios Changes in investor demand for foreign securities lead to much more price risk in both the markets for currencies and the markets for securities There has been a scissors-like movement in the market for liquidity the lsquodemandrsquo for liquidity by traders and investors has increased while the supply of liquidity has declined because the price risks are much larger Keywords banking crises market efficiency market failure flexible exchange rates the transfer problem process

JEL Codes E44 F31 F32 F33 F34 F38

6 Liquidity and Economic Activity Conference

PAPER 2Professor Tim Congdon CBE Chairman Institute of International Monetary Research at the University of Buckingham lsquoWHAT WERE THE CAUSES OF THE GREAT RECESSION THE IMPORTANCE OF THE ldquoWHICH AGGREGATErdquo DEBATErsquo

Abstract Monetary economists have long debated which measure of the quantity of money is the most useful in macroeconomic analysis Before during and after the Great Recession (to be understood ndash roughly speaking as the six quarters to mid-2009) the growth rates of different money aggregates diverged sharply in the leading economies The lsquowhich aggregatersquo debate was therefore of particular importance The focus in this paper is on the USArsquos experience although the behaviour of money in other economies and its relationship to prices and spending in them is mentioned where relevant and interesting

The argument is that broadly-defined money has long been the correct concept to use in interpreting macroeconomic developments and that its merits became clear in the Great Recession and its aftermath Broad money is to be viewed as including virtually all money-like assets (and certainly most of the deposit liabilities of the commercial banking system) For the purposes of the paper broad money is identified with the M3 money measure for which the Federal Reserve prepared data until early 2006 It will be shown that with its flow-of-funds data the Federal Reserve is still publishing information that enables an approximate M3 aggregate to be estimated Further the M3 money holdings

of the US economyrsquos main sectors ndash households non-financial business and financial business ndash can be tracked from the flow-of-funds numbers

The last decade has seen the lowest average annual increase per cent in nominal GDP since the 1930s The growth rate of M3 has also been the lowest since the 1930s This similarity of the rates of change contrasts with the behaviour of M1 M1 increased strongly during the Great Recession and afterwards Its rate of increase in the 2009ndash15 period was over double that in the preceding 48 years with its behaviour sharply divergent from nominal GDP With M2 the discrepancy is less marked but a discrepancy remains

Apart from its insights into the causation of the Great Recession the paper will have two provocative conclusions First the Federal Reserve should resume publication of an M3 aggregate and following the Bank of Englandrsquos example it should de-compose broad money into its sector constituents Second the interesting patterns in inter-sectoral money flows that seem to be recurrent in cyclical episodes can be monitored only from a simple-sum broad money aggregate Divisia indices prepared from aggregate economy-wide data cannot identify the patterns while ndash arguably ndash these patterns are important in understanding the transmission mechanism from money to the economy

The paper's main thesis has already been developed in an informal way in a 2016 article in the journal Central Banking. The aim of the paper will be to present the analysis in a more academically rigorous form.


PAPER 3: Manmohan Singh, International Monetary Fund, 'THE ROLE OF PLEDGED COLLATERAL IN LIQUIDITY METRICS AND MONETARY POLICY'

Abstract: Collateral does not flow in a vacuum: it needs balance sheet(s) to move within the financial system. Pledged collateral therefore needs to be considered along with money metrics to fully understand liquidity in the markets. This paper analyses the securities-lending, derivatives and prime-brokerage markets as suppliers of collateral (much has already been written on the repo market). Going forward, the official sector's choice of the balance sheet(s) that allow the flow of liquidity (ie money and collateral) should be transparent and driven by market forces, not by ad hoc allocation by central banks. Otherwise the outcome may be suboptimal on many fronts: for monetary policy transmission, for smooth money market functioning and, ultimately, for market liquidity.

Keywords: collateral velocity, securities lending, prime brokerage, OTC derivatives, repo

JEL Codes: G21, G28, F33, K22

PAPER 4: David Aikman, Bank of England; Andreas Lehnert, Federal Reserve Bank; Nellie Liang, Federal Reserve Bank; and Michele Modugno, Federal Reserve Bank, 'CREDIT, RISK APPETITE AND MONETARY POLICY TRANSMISSION'

Abstract: We show that US economic performance and monetary policy transmission depend on nonfinancial sector credit, and the effects are nonlinear. When credit is below its trend, increases in risk appetite lead to sustained increases in output. In contrast, when credit is above trend, initial expansions are followed by additional excess borrowing and subsequent contractions, suggesting an inter-temporal trade-off for economic activity. Also, tighter monetary policy is ineffective at slowing the economy when credit is high, consistent with evidence of less transmission of policy changes to distant forward Treasury rates in high-credit periods.

Keywords: financial stability, financial conditions, credit, asset bubbles, monetary policy

JEL Codes: E58, E65, G28


PAPER 5: Kevin Fox, University of New South Wales, and Erwin Diewert, University of British Columbia, 'THE DEMAND FOR MONETARY BALANCES AND THE MEASUREMENT OF PRODUCTIVITY'

Abstract: Firms in advanced economies have greatly increased their cash holdings since the mid-1990s. While this has been observed, and the reasons debated, by central bankers, international agencies and academics, it remains somewhat of a puzzle. This paper explores possible reasons for this increase and the implications for understanding productivity growth. Monetary holdings have an opportunity cost: allocating firm financial capital into monetary deposits means that investment in real assets is reduced. Traditional measures of Total Factor Productivity (TFP) do not take into account these holdings of monetary assets. Given the recent large increases in these holdings, ex ante it can be expected that adding these monetary assets to the list of traditional sources of capital services will reduce the measured TFP of the business sector. Using a new data set on the US aggregate (corporate and non-corporate) business sector, we measure this effect, noting the implications for the System of National Accounts of this expanded definition of capital services. We also estimate industry elasticities of demand to hold monetary balances, using the Normalized Quadratic functional form. A key finding is that the accumulation of monetary holdings is primarily a phenomenon of the non-corporate business sector.
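
In symbols (our notation, not the authors'), the adjustment amounts to adding the service flow of real monetary balances to the conventional input aggregate:

```latex
\mathrm{TFP}_t = \frac{Y_t}{X(K_t, L_t)}
\qquad\longrightarrow\qquad
\mathrm{TFP}^{*}_t = \frac{Y_t}{X^{*}(K_t, L_t, M_t)},
```

so that, with monetary balances M_t growing faster than other inputs, measured input growth rises and measured TFP growth falls, as the abstract anticipates.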

We have found that, while conceptually more correct, adding real money balances to our input aggregate does not change aggregate measured productivity performance very much for the corporate sector. This is because, even though there is some variation, the asset share is relatively small. The impact on the non-corporate sector is larger, especially in the latter decades of the sample, when currency and deposit holdings increased substantially, especially relative to other asset holdings. Finally, the relative productivity of individual firms can be significantly impacted by differences in money holdings, even if there is little aggregate effect at the sectoral level. Indeed, understanding productivity differences between small and large firms can be enhanced by taking into account currency and deposits: small firms are often credit constrained and therefore have greater cash holdings. Similarly, accounting for cash holdings can provide an augmented understanding of productivity and profitability in studies of firm dynamics. In addition, understanding productivity differences between risky and less risky sectors and firms can be informed by differences in money balances, where, eg, dependence on R&D is taken as a proxy for risk. Hence this paper provides a framework and empirical results for a more comprehensive understanding of productivity growth and dynamics.


PAPER 6: Dennis Fixler, Bureau of Economic Analysis, and Kim Zieschang, International Monetary Fund, 'PRODUCING LIQUIDITY'

Abstract: Based on a paper presented at the 2015 Meeting of the Society for Economic Measurement, Paris. Dennis Fixler is Chief Economist, Bureau of Economic Analysis; Kim Zieschang is Adjunct Professor of Economics, University of Queensland. The views expressed in this paper are those of the authors and should not be attributed to the Bureau of Economic Analysis. JEL Codes: E01 (Measurement and Data on National Income and Product Accounts). Commercial banks are a primary producer of liquidity in an economy. Despite their importance, there is no consensus on the measurement of the liquidity service nor, for that matter, of bank output in general. This lack of consensus exists at the microeconomic level and at the level of the national accounts, which attempt to capture the significant role of bank services in the output of the economy. The current national accounts measure of bank output is termed financial intermediation services indirectly measured, or 'FISIM', springing from the 1993 version of the System of National Accounts (the 1993 SNA). Calculating FISIM under the 2008 version of the national accounting standards is simple and generally practical, provided the compiler has a key datum: the 'reference rate of interest'. The calculation is essentially:

Output = (Reference rate of interest − Deposit rate) × Deposit liabilities + (Loan rate − Reference rate of interest) × Loan assets
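
A minimal numerical sketch of that calculation (all figures hypothetical):

```python
# Illustrative FISIM calculation following the formula above; figures are invented.
reference_rate = 0.03   # 'reference rate of interest'
deposit_rate = 0.01     # average rate paid on deposits
loan_rate = 0.06        # average rate charged on loans
deposits = 800.0        # deposit liabilities
loans = 900.0           # loan assets

depositor_services = (reference_rate - deposit_rate) * deposits  # 16.0
borrower_services = (loan_rate - reference_rate) * loans         # 27.0
fisim_output = depositor_services + borrower_services
print(fisim_output)  # 43.0
```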

As the deposit and loan financial instrument coverage of this formula implies, the current national accounting standards apply it to deposit-takers, such as banks, as well as to non-deposit-taking, loan-making financial institutions, such as finance companies and money lenders.

A first issue in the treatment of financial services, since the 1993 version of the standards introduced the reference rate concept, has been the lack of consensus on how it should be determined. Generally, the idea has been to select an exogenous reference rate; a government security rate, or a combination of such rates, is often used because it captures the risk-free reference rate that underlies the user cost of money, as in Barnett (1978). Some have proposed alternative exogenous reference rates that are tied to the market-determined risk of the security to which the reference rate is to be applied. In any event, we argue that the reference rate should be endogenously determined and should be the bank's calculated cost of capital: the overall rate of return paid to all sources of funding, including equity, on the liability side of the balance sheet. Our 'reference rate of interest' is therefore individual to each bank, rather than an economy-wide constant. A second issue in the national accounts dialogue on financial services is the scope of the financial instruments that should be associated with the SNA's indirect financial services measure. The 2008 SNA narrowed the scope of FISIM to the deposit and loan positions of financial corporations, but previous versions included interest income flows on essentially all financial instruments. Return to the broad 1993 financial instrument scope is nevertheless a research agenda item for the next version of the SNA, and appears essential to align the SNA with the scope of liquidity measured by the money and banking literature and the associated standards for compiling financial statistics. We argue that FISIM should cover all financial instruments.
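
In symbols (our notation), the proposed bank-specific reference rate is a funding-weighted cost of capital rather than an economy-wide constant:

```latex
\rho_b = \frac{\sum_{j} r_{bj} D_{bj} + r^{E}_{b} E_{b}}{\sum_{j} D_{bj} + E_{b}},
```

where D_{bj} are bank b's debt liabilities, r_{bj} the rates paid on them, and E_b and r^E_b its equity and the return paid to equity holders.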


Armed with the cost of capital reference rate and full financial balance sheet instrument scope, we derive the production identity (value of output ≡ cost of production) from the income ≡ expense and balance sheet identities, generating a FISIM-like calculation of output with a single cost of capital reference rate for each enterprise, rather than for the whole economy. Note that, by computing specific user cost prices of bank services, it is in principle possible to aggregate them into a price index for liquidity services and, correspondingly, to obtain a quantity index of liquidity services.

Such aggregate measures would be useful in tracing the financial intermediation process into GDP or another aggregate measure of economic activity.

On examining the SNA-type production identity for an individual bank, we find that the cost side contains a term within operating surplus – the equity leverage premium – that depends on the bank's financing: the debt and equity composition of the liability side of its balance sheet. Further, it is inherent in the definition of the cost of capital reference rate that the equity leverage premium is identically equal to what we will term produced liquidity within the part of the SNA financial services output of the bank coming from the debt instruments on the liability side of its balance sheet, prominent among which are deposits. Given that banks transform liabilities into assets, the equity-leverage premium and the produced liquidity are tied to the risk bearing undertaken by the bank. With an exogenous reference rate, the risk bearing is completely embedded in the user cost price of the asset or liability product. In our model, because the entire balance sheet is used, the risk bearing is tied to equity holders.

This paper proposes a resolution to the scope and methodology issues in the ongoing national accounts conversation on financial services, particularly on the provision of liquidity by debt-issuing enterprises. It suggests that the equity leverage premium now included in the nominal output of banks be offset by an intermediate insurance input supplied by their equity holders. This retains the current standards' origination of liquidity with banks (but also extends it to other debt-issuing enterprises), while better exposing the link between the leverage risk bearing (provision of debt guarantees) of equity-holding sectors and the production of liquidity by banks (and other debt-issuing enterprises). With the developed framework, the output and productivity of the providers of liquidity services and other financial services can also be measured and incorporated into macroeconomic statistics.


PAPER 7: Jan Willem Van den End, De Nederlandsche Bank, the Netherlands, 'APPLYING COMPLEXITY THEORY TO INTEREST RATES: EVIDENCE OF CRITICAL TRANSITIONS IN THE EURO AREA'

Abstract: We apply complexity theory to financial markets to show that excess liquidity created by the Eurosystem has led to critical transitions in the configuration of interest rates. Complexity indicators turn out to be useful signals of tipping points and subsequent regime shifts in interest rates. We find that the critical transitions are related to the increase of excess liquidity in the euro area. These insights can help central banks to strike the right balance between the intention to support the financial system by injecting liquidity and potential unintended side-effects on market functioning.
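
The abstract does not spell out the indicators used; a generic sketch of two early-warning statistics commonly used to flag critical transitions ('critical slowing down') – rolling lag-1 autocorrelation and rolling variance – might look like this:

```python
import pandas as pd

def early_warning_indicators(rates: pd.Series, window: int = 250) -> pd.DataFrame:
    """Rolling lag-1 autocorrelation and variance of an interest rate series.
    Sustained rises in both are classic early-warning signals that a system
    is approaching a tipping point."""
    ac1 = rates.rolling(window).apply(lambda x: x.autocorr(lag=1), raw=False)
    var = rates.rolling(window).var()
    return pd.DataFrame({"lag1_autocorr": ac1, "variance": var})
```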

Keywords: interest rates, central banks and their policies, monetary policy

JEL Codes: E43, E58, E52


PAPER 8: Michael Bowe, Alliance Manchester Business School, University of Manchester, and University of Vaasa; Olga Kolokolova, Alliance Manchester Business School, University of Manchester; and Marcin Michalski, Alliance Manchester Business School, University of Manchester, 'TOO BIG TO CARE, TOO SMALL TO MATTER: MACROFINANCIAL POLICY AND BANK LIQUIDITY CREATION'

Abstract: We estimate the volume of liquidity creation by US bank holding companies between 1997 and 2015, and examine the impact of changes in macrofinancial policies on the dynamics of this process. We focus on three major policy developments occurring in the aftermath of the 2007–09 financial crisis: bank capital regulation reform, monetary stimulus through quantitative easing, and the Troubled Asset Relief Program (TARP).

We use the three-step procedure proposed by Berger and Bouwman (2009) to calculate the dollar amount of liquidity a financial institution creates. Initially, we classify all balance sheet items and off-balance sheet activities of an institution as liquid, semi-liquid or illiquid, to which we then assign liquidity weights of +½ (illiquid assets and liquid liabilities), 0 (semi-liquid assets and liabilities) or −½ (liquid assets, illiquid liabilities and equity), respectively. The dollar volume of liquidity creation is then calculated as the liquidity-weighted sum of the items identified in the first step. We find that the total amount of liquidity creation by banks in the sample increases by a factor of 3.65, from $1.4 trillion in 1997Q1 to $5.1 trillion in 2015Q4. Indeed, the volume of liquidity creation increases at a faster pace than the gross domestic product of the United States, which rises by a factor of 2.1 during the same period.
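
A stylised sketch of the weighting step (hypothetical balance sheet items, not the paper's data):

```python
# Berger-Bouwman style liquidity creation: +0.5 for illiquid assets and liquid
# liabilities, 0 for semi-liquid items, -0.5 for liquid assets, illiquid
# liabilities and equity. All figures are invented.
items = {
    # name: (dollar amount, liquidity weight)
    "business_loans":        (400.0, +0.5),  # illiquid asset
    "transaction_deposits":  (500.0, +0.5),  # liquid liability
    "residential_mortgages": (200.0,  0.0),  # semi-liquid asset
    "time_deposits":         (150.0,  0.0),  # semi-liquid liability
    "cash_and_securities":   (300.0, -0.5),  # liquid asset
    "equity":                (100.0, -0.5),
}
liquidity_created = sum(amount * weight for amount, weight in items.values())
print(liquidity_created)  # 250.0
```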

The results of panel regressions reveal that the dynamics of bank liquidity creation differ considerably between small and large institutions. The level of bank capital requirements and the stance of monetary policy impact the liquidity creation of both small and medium-sized banks. Liquidity creation of the largest banks, which control over 80% of the banking system's assets, remains unaffected.

We find that changes in the amount of liquidity creation by small banks, per $1 of their gross total assets, are positively related to changes in the term spread but inversely related to changes in their Tier 1 capital ratios. Further, we show that the volume of liquidity creation is positively related to the riskiness of a bank's assets, as measured by the ratio of risk-weighted assets to gross total assets, regardless of its size classification. We establish that TARP has negative short-term effects on small and medium banks and no immediate impact on the liquidity creation of the largest institutions in the sample. In contrast, participation in TARP leads to a long-term decline in liquidity provision per dollar of assets of the largest banks. This persists even after the completion of the programme and repayment of TARP funding. As nearly all of the largest TARP-recipient banks in the sample are subsequently classified as systemically important financial institutions, our results suggest that the increased regulatory scrutiny may adversely affect their ability to create liquidity.

By demonstrating that the stance of monetary policy and the level of bank capital requirements do not tangibly enhance the liquidity provision efficiency of the largest, systemically important institutions in the system, our study offers important insights for the design of effective macroprudential policies.


PAPER 9: Jonathan Goldberg, Federal Reserve Board, 'THE SUPPLY OF LIQUIDITY AND REAL ECONOMIC ACTIVITY'

Abstract: This paper identifies shocks to the supply of liquidity by dealer firms and investigates their effects on real economic activity. First, I develop a simple theoretical model of dealer intermediation; then, in a structural VAR model, I use sign restrictions derived from the theoretical model to identify liquidity supply shocks. Liquidity supply shocks that are orthogonal to information contained in macroeconomic and asset price variables have considerable predictive power for economic activity. Moreover, positive liquidity supply shocks cause large and persistent increases in real activity.
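
As an illustration of this identification step, a minimal sign-restriction draw in the spirit of the standard rotation algorithm (the variables and restrictions below are hypothetical, not the paper's):

```python
import numpy as np

def draw_impact_matrix(sigma_u: np.ndarray, satisfies_signs, max_tries: int = 10_000):
    """Rotate the Cholesky factor of the reduced-form covariance by random
    orthogonal matrices until the implied impact responses satisfy the sign
    restrictions encoded in `satisfies_signs`."""
    chol = np.linalg.cholesky(sigma_u)
    n = sigma_u.shape[0]
    rng = np.random.default_rng(0)
    for _ in range(max_tries):
        q, r = np.linalg.qr(rng.standard_normal((n, n)))
        q = q @ np.diag(np.sign(np.diag(r)))  # normalisation for a uniform draw
        impact = chol @ q                     # candidate impact matrix
        if satisfies_signs(impact):
            return impact
    raise RuntimeError("no rotation satisfied the sign restrictions")

# Hypothetical restriction: a liquidity supply shock (column 0) raises dealer
# positions (row 0) and lowers spreads (row 1) on impact.
sigma = np.array([[1.0, 0.2], [0.2, 1.0]])
impact = draw_impact_matrix(sigma, lambda b: b[0, 0] > 0 and b[1, 0] < 0)
```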

Keywords: liquidity, dealer intermediation, risk-taking, real activity, liquidity shocks

JEL Codes: G10, G12, G17, G24


PAPER 10: Dirk Bezemer, University of Groningen, the Netherlands, and Lu Zhang, Sustainable Finance Lab and Utrecht University, 'MACROECONOMIC IMPLICATIONS OF LIQUIDITY CREATION: CREDIT ALLOCATION AND POST-CRISIS RECESSIONS'

Abstract: In this paper we address the macroeconomic implications of liquidity creation through bank lending and the impacts of liquidity on economic activity. We note that liquidity created through bank lending can be channelled into the real sector, in support of economic activity, or into financial and real estate markets, in support of capital gains. We collected macro-level data on bank credit aggregates over 2000–12 for 57 economies, categorised according to the use of credit. We note the long-term shift in the allocation of bank credit creation away from non-financial business lending and towards financial and, especially, real estate markets. We then present new evidence on the channels from pre-crisis credit allocation to the severity of post-crisis recessions.

Our first contribution is to show that it is not just the level but the composition of debt (defined as the share of mortgage credit in total credit) that matters. A second contribution is to analyse the channels. We collect additional industry-level data across 20 industries for a subset of economies, and analyse the effect of changes in the pre-crisis composition of debt on total GDP and on investment, consumption and capital allocation. We find that changes in the share of household mortgage credit before the crisis have a significant effect on recession severity after the 2007 crisis. This is not the case for any other credit category, nor for growth of total bank credit. We address the causality challenge by using the difference between IMF growth forecasts and growth realisations; this filters out country-specific drivers of both debt and income growth. We address the model selection challenge by using Bayesian model averaging, which indicates that the change in credit composition is among the three most robust determinants of post-crisis recession severity, together with income levels and the current account balance. The findings are robust to a wide range of control variables and to the different responses across advanced/emerging and EMU/non-EMU economies.

We then delve into the channels from the change in debt composition to the loss in income growth. The literature to date has focused on negative wealth effects on consumption, for which we find strong evidence. In addition, we find evidence for two investment channels: a loan supply effect and a capital allocation effect. In the industry-level analysis, we find that in economies which experienced a larger change in debt composition before 2008, there was a larger reduction in available credit and weaker capital re-allocation towards sectors with higher value-added. This effect is observed already before the crisis, and very strongly after the crisis. We discuss policy implications and future research.

Keywords: private credit, mortgages, crisis, output loss, investment, capital allocation

JEL Codes: C11, C15, E01, O4


PAPER 11: Iftekhar Hasan, Gabelli School of Business, Fordham University, and Jean-Loup Soula, Strasbourg University, LaRGE Research Centre, 'TECHNICAL EFFICIENCY IN BANK LIQUIDITY CREATION'

Abstract: This paper generates an optimum bank liquidity creation benchmark by tracing an efficient frontier in liquidity creation (bank intermediation), and questions why some banks are more efficient than others in such activities. Evidence reveals that medium-sized banks lie closest to the efficient frontier, irrespective of their business models. Small (large) banks – focused on traditional banking activities – are found to be the most (least) efficient in creating liquidity through on-balance sheet items, whereas large banks – involved in non-traditional activities – are found to be most efficient in off-balance sheet liquidity creation. Additionally, the liquidity efficiency of small banks was more resilient during the 2007–08 financial crisis relative to other banks.

Keywords: banks, technical efficiency, liquidity creation, diversification

JEL Codes: G21, G28, G32


PAPER 12: Richard Anderson, Lindenwood University; John Duca, Federal Reserve Bank of Dallas; and Barry Jones, Department of Economics, State University of New York, 'A BROAD MONETARY SERVICES (LIQUIDITY) INDEX AND ITS LONG-TERM LINKS TO ECONOMIC ACTIVITY'

Abstract: Liquid assets play a crucial role in economic activity as the medium in which payments are received and made. 'Sudden stops' in financial markets – during which liquid assets are hoarded – are periods when economic activity slows abruptly. Further, it is the sine qua non of financial intermediation to alter the measured relative and absolute quantities of liquid assets. In this way, the observed quantities of liquid assets reflect both the path of past economic activity and anticipations of future activity. The quantities must be regarded as arising endogenously within an intertemporal general equilibrium model of the economy, à la Tobin (1958) and Merton (1971). Economic modelling and analysis traditionally proceed by combining relatively high-dimension lists of specific assets into lower-dimension 'monetary aggregates'. The defining characteristic of the assets included is that all are available to facilitate the exchange of goods and services at a transaction cost less than infinity. That is, all included assets may be sold or used as collateral for the purchase and sale of goods and services, and thereby provide liquidity services, which can be tracked by the measured opportunity costs of foregone interest; these may not, in practice, reflect all transaction costs. The last caveat applies particularly to household holdings of mutual fund assets outside of money market funds. Our study contributes to the literature in two key ways. First, it expands a conventional Divisia measure of money services to account for the liquidity provided by such mutual fund assets. Second, it explores the long-run connections between economic activity and monetary aggregates constructed as index numbers from 1929 to 2016, finding that the inclusion of mutual fund liquidity services results in a Divisia measure of money that has a much more stable velocity.
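
A minimal sketch of the standard Divisia (Törnqvist) construction that the paper extends to mutual fund assets (notation ours: q are asset quantities, r their own rates, R the benchmark rate):

```python
import numpy as np

def divisia_growth(q0, q1, r0, r1, R0, R1):
    """One-period Tornqvist-Divisia growth rate of monetary services.
    User costs follow Barnett (1978): (R - r) / (1 + R)."""
    q0, q1, r0, r1 = map(np.asarray, (q0, q1, r0, r1))
    uc0 = (R0 - r0) / (1 + R0)        # user costs in each period
    uc1 = (R1 - r1) / (1 + R1)
    s0 = uc0 * q0 / (uc0 * q0).sum()  # expenditure shares
    s1 = uc1 * q1 / (uc1 * q1).sum()
    sbar = 0.5 * (s0 + s1)            # averaged shares
    return float((sbar * np.log(q1 / q0)).sum())

# Example: three assets, benchmark rate falling from 5% to 4%.
g = divisia_growth(q0=[100, 50, 30], q1=[104, 51, 33],
                   r0=[0.00, 0.02, 0.03], r1=[0.00, 0.02, 0.025],
                   R0=0.05, R1=0.04)
```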

We thank Emil Mihalov and Tyler Atkinson for research assistance. The views expressed are those of the authors and do not necessarily reflect those of the Federal Reserve Bank of Dallas or the Federal Reserve System. Any errors are our own.


PAPER 13: John W Keating, University of Kansas, and A Lee Smith, Federal Reserve Bank of Kansas City, 'THE OPTIMAL MONETARY INSTRUMENT AND THE (MIS)USE OF GRANGER CAUSALITY'

Abstract: Is it better to use an interest rate or a monetary aggregate to conduct monetary policy? Operating procedures designed around interest rates are overwhelmingly preferred by central bankers. This paper re-examines the question in light of a New Keynesian DSGE (dynamic stochastic general equilibrium) model. Our calibrated model is very similar to many others in the literature, except that we follow Belongia and Ireland (2014) by augmenting the standard framework with a simple model of financial intermediation. We assume banks operate in competitive markets, maximising profits subject to financial market disturbances.

We use this model to examine the welfare consequences of alternative choices of monetary policy instrument. For the interest rate policy, we employ the specification from Clarida, Gali and Gertler (2000), in which the central bank reacts to expected future output and inflation gaps, where a gap is defined as the percentage deviation from target. We compare this rule to a k-percent rule for each monetary aggregate. We consider three alternative aggregates determined within our model: the monetary base, the simple sum measure of money and the Divisia measure.
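
In our notation, the forward-looking rule takes the familiar form

```latex
i_t = \bar{\imath} + \beta \left( E_t \pi_{t+1} - \pi^{*} \right) + \gamma \, E_t x_{t+1},
```

with inflation target \pi^{*} and expected output gap x_{t+1}, and is compared against constant-growth rules \Delta \ln M_t = k for each of the three aggregates.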

Welfare results are striking. While the interest rate dominates the monetary base, both simple sum and Divisia k-percent rules outperform the interest rate; in fact, the Divisia rule is best overall. This is an interesting finding because it contradicts the view of most central bankers that interest rates are superior. Note that if we replaced k-percent money growth rules with rules allowing an appropriate reaction to macroeconomic variables, as in the Clarida, Gali and Gertler (2000) interest rate rule, the welfare benefits of simple sum and Divisia money would be even greater.

Next, we study the performance of Granger Causality tests in the context of data generated from our model. For this, we assume the Clarida, Gali and Gertler (2000) interest rate rule characterises monetary policy. While it is not optimal from a welfare perspective, this rule is used for positive purposes: research suggests their framework provides a fairly good description of actual Fed behaviour.

For each of our four potential monetary instruments, we test for Granger Causality with respect to output and then with respect to prices. We find that the interest rate Granger causes both variables at extremely high significance levels. The same result is obtained for the monetary base. Simple sum money also Granger causes prices at a highly significant level, but only causes output at the 10% level. The test results for Divisia are weakest of all: Divisia fails to Granger cause output, and the evidence for prices is only at the 10% level.
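
For readers who want to replicate the flavour of these tests, a minimal bivariate Granger causality check (with placeholder random series standing in for the model-generated data):

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

# Does money growth (column 1) Granger-cause output growth (column 0)?
# Placeholder white-noise series; in the paper the data are model-simulated.
rng = np.random.default_rng(1)
output_growth = rng.standard_normal(200)
money_growth = rng.standard_normal(200)
data = np.column_stack([output_growth, money_growth])

results = grangercausalitytests(data, maxlag=4)  # F-tests at lags 1..4
```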

What do we learn from this investigation? First, a quantity aggregate used as the monetary instrument may have significant welfare benefits compared to an interest rate. Second, a weighted monetary aggregate (Divisia) performs best of all the candidate instruments we considered. Third, Granger Causality tests may be a poor method for selecting the best monetary policy instrument: in our model, they suggest at best a weak effect of Divisia on inflation and no effect on output. Thus, if the instrument choice were based solely on Granger Causality test results, an inferior policy instrument would be selected.

Keywords: monetary policy instrument, monetary aggregates, Granger Causality, Divisia aggregates

JEL Codes: C43, C32, E37, E44, E52


PAPER 14: Jane Binner, University of Birmingham; Logan Kelly, University of Wisconsin; and Jon Tepper, Nottingham Trent University, 'ON THE ROBUSTNESS OF SLUGGISH STATE-BASED NEURAL NETWORKS FOR PROVIDING USEFUL INSIGHT INTO THE NEW KEYNESIAN PHILLIPS CURVE'

Abstract: The New Keynesian Phillips curve implies that the output gap – the deviation of actual output from its natural level due to nominal rigidities – drives the dynamics of inflation relative to expected inflation and lagged inflation. This paper exploits the empirical success of the New Keynesian Phillips curve in explaining the USA's inflation dynamics using a new nonlinear model of the output gap. We estimate the output gap using the artificial intelligence method of multi-recurrent neural networks (MRNs), a type of neural network that is able to establish variable sensitivity to recent and more historic temporal information through the use of layer- and self-recurrent links that form a so-called sluggish state-based memory. The MRN is able to latch robustly onto structural interactions amongst inflation, oil prices and real output in the USA to produce a set of interesting impulse responses. We present our empirical results using monthly data spanning 1960–2016, and contrast our new nonlinear models of the output gap with traditional measures in fitting the New Keynesian Phillips curve, providing useful insights for inflation dynamics and monetary policy analysis in the USA.
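
The abstract does not give the network equations; one plausible reading of a 'sluggish state-based memory' with layer- and self-recurrent links, sketched here purely as an illustration under that assumption:

```python
import numpy as np

def mrn_step(x_t, h_prev, s_prev, W, U, V, alpha=0.5):
    """One step of a multi-recurrent network with a sluggish context state.
    The context s is an exponential average of past hidden states, so the
    network stays sensitive to both recent and more historic information.
    Illustrative only; the authors' exact architecture may differ."""
    h_t = np.tanh(W @ x_t + U @ h_prev + V @ s_prev)  # layer recurrence
    s_t = alpha * s_prev + (1.0 - alpha) * h_t        # sluggish self-recurrence
    return h_t, s_t
```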

PAPER 15: Rakesh Bissoondeeal, Aston University; Michael Karaglou, Aston University; and Jane Binner, University of Birmingham, 'STRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UK'

Abstract: We investigate whether or not monetary aggregates are important in determining output. In addition to the official Simple Sum measure of money, we employ the weighted Divisia aggregate. We use data-driven procedures to identify breaks in the data. We find that monetary aggregates, particularly Divisia, are important in determining output, but the results are sensitive to the time period employed. There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them. The results also suggest that the recovery from the financial crisis could have been faster if money had not been hoarded.

Keywords: monetary aggregates, Divisia, IS curve, structural breaks

JEL Codes: C22, E52


PAPER 16: Costas Milas, University of Liverpool, and Michael Ellington, University of Liverpool, 'IDENTIFYING AGGREGATE LIQUIDITY SHOCKS IN CONJUNCTION WITH MONETARY POLICY SHOCKS: AN APPLICATION USING UK DATA'

Abstract: We propose a new identification scheme for aggregate liquidity shocks, in harmony with conventional monetary policy shocks, in a partially identified structural VAR model. We fit a Bayesian time-varying parameter VAR model to UK data from 1955 to 2016. The transmission mechanism of aggregate liquidity shocks changes substantially through time, with the magnitude of these shocks increasing during recessions. We provide statistically significant evidence in favour of asymmetric contributions of these shocks during the implementation of Quantitative Easing relative to the 2008 recession. Aggregate liquidity shocks explain 32% and 47% of the variance in real GDP and inflation, respectively, at business cycle frequency during the Great Recession. (Preliminary draft: please do not cite.)

Keywords: liquidity shocks, time-varying parameter VAR, money growth

JEL Codes: E32, E47, E52, E58

PAPER 17: Makram El Shagi, Henan University, China, and Logan Kelly, University of Wisconsin, 'STRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UK'

Abstract: We investigate whether or not monetary aggregates are important in determining output. In addition to the official Simple Sum measure of money, we employ the weighted Divisia aggregate. We use data-driven procedures to identify breaks in the data. We find that monetary aggregates, particularly Divisia, are important in determining output, but the results are sensitive to the time period employed. There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them. The results also suggest that the recovery from the financial crisis could have been faster if money had not been hoarded.

Keywords: monetary aggregates, Divisia, IS curve, structural breaks

JEL Codes: C22, E52


PAPER 18: Soumya Suvra Bhadury, National Council of Applied Economic Research (NCAER), New Delhi, India, and Taniya Ghosh, Indira Gandhi Institute of Development Research (IGIDR), Mumbai, India, 'HAS MONEY LOST ITS RELEVANCE? DETERMINING A SOLUTION TO THE EXCHANGE RATE DISCONNECT PUZZLE IN THE SMALL OPEN ECONOMIES'

Abstract: This study uses monthly data from 2000 to 2015 and utilises an open-economy structural vector autoregression model to identify the monetary policy shock responsible for exchange rate fluctuations in the economies of India, Poland and the UK. Following Kim and Roubini (J Monet Econ 45(3): 561–586, 2000), our model is shown to fit these small open economies well, estimating theoretically correct and significant responses of price, output and the exchange rate to a monetary policy tightening. The importance of the monetary policy shock is determined by examining the variance decomposition of the forecast error, the impulse response function and the out-of-sample forecast. Following Barnett (J Econ 14 (September): 11–48, 1980), we adopt a superior monetary measure, namely the aggregation-theoretic Divisia monetary aggregate, in our model. The significance of adopting precisely measured money in the exchange rate model follows from the comparison between models with no money, with simple-sum monetary aggregates and with Divisia monetary measures. The bootstrap Granger causality test establishes a strong causal link from money, especially Divisia money, to the exchange rate. Additionally, the empirical results provide three important findings. The first suggests that the estimated responses of output, prices, money and the exchange rate to monetary policy shocks are significant in models that adopt monetary aggregates. The second indicates that Divisia money helps monetary policy to explain more of the fluctuation of the exchange rate. The third supports the inclusion of Divisia money for better out-of-sample forecasting of the exchange rate.

Keywords: monetary policy, monetary aggregates, Divisia, structural VAR, exchange rate overshooting, liquidity puzzle, price puzzle, exchange rate disconnect puzzle, forward discount bias puzzle

JEL Codes: C32, E41, E51, E52, F31, F41, F47


PAPER 19: William A Barnett and Jinan Liu, University of Kansas, 'USER COST OF CREDIT CARD SERVICES UNDER INTERTEMPORAL NON-SEPARABILITY'

Abstract: This paper examines the user cost of monetary assets and credit card services under the assumption of intertemporal nonseparability and interest rate risk. Barnett and Su (2016) derived theory permitting the inclusion of credit card transaction services in Divisia monetary aggregates. The risk adjustment in their theory is based on CCAPM under intertemporal separability. The risk adjustment by their method is expected to be small, as has been the case in prior studies of CCAPM risk adjustment of asset returns; the equity premium puzzle focusses on that downward bias in the CCAPM risk adjustment. But credit card interest rates, aggregated over consumers, are much more volatile than interest rates on monetary assets. While the known downward bias of CCAPM risk adjustments is of little concern with Divisia monetary aggregates containing only low-risk monetary assets, that downward bias cannot be ignored once credit card services are included. We believe that extending to intertemporal nonseparability will provide a more plausible risk adjustment, as has been emphasised by Barnett and Wu (2015).

In this paper, we extend the credit-card-augmented Divisia monetary quantity aggregates and user costs to the case of risk aversion and intertemporal nonseparability in consumption. Our results are for the 'representative consumer', aggregated over all consumers. While credit card interest-rate risk may be low for many consumers, the volatility of credit card interest rates for the representative consumer is high, as reflected by the high volatility of the Federal Reserve's data on credit card interest rates aggregated over consumers. The relationship between that volatility and the theory of aggregation over consumers under risk is beyond the scope of this research, but it is a serious matter meriting future research.

To implement our theory, we introduce a pricing kernel into our model, in accordance with the approach advocated by Barnett and Wu (2015). We assume that the pricing kernel is a linear function of the rate of return on a well-diversified wealth portfolio. We find that the risk adjustment of the credit-card-services user cost to its certainty-equivalence level can be measured by its beta. That beta depends upon the covariance between the interest rates on credit card services and the return on the wealth portfolio of the consumer, in a manner analogous to the standard CAPM adjustment.

Credit card services' risk premiums depend on their market portfolio risk exposure, which is measured by the beta of the credit card interest rates. The larger the beta, through risk exposure to the wealth portfolio, the larger the risk adjustment. If the beta were very small, then the user-cost risk adjustment would be very small; in that case, the unadjusted Divisia monetary index would be a good proxy, even with high volatility of credit card interest rates. One method of introducing intertemporal nonseparability is to assume habit formation, and we explore that possibility. We are currently conducting research on empirical implementation of the theory proposed in this paper. We believe that, under intertemporal nonseparability, we will be able to generate a more accurate credit-card-augmented Divisia monetary index to explain the available empirical data.
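
In CAPM-style notation (ours, not the paper's), the adjustment described here is

```latex
\beta_{c} = \frac{\operatorname{Cov}(r_{c}, r_{W})}{\operatorname{Var}(r_{W})},
\qquad
\pi_{c} = \pi_{c}^{CE} + \beta_{c} \, \phi_{W},
```

where r_c is the credit card interest rate, r_W the return on the wealth portfolio, \pi_c^{CE} the certainty-equivalent user cost and \phi_W the price of wealth-portfolio risk.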

Keywords: Divisia index, monetary aggregation, intertemporal nonseparability, credit card services, risk adjustment

JEL Codes: C43, D81, E03, E40, E41, E44, E51, G12


PAPER 20: Per Hjertstrand, Research Institute of Industrial Economics (Institutet för Näringslivsforskning), Stockholm; Gerald A Whitney, University of New Orleans; and James L Swofford, University of South Alabama, 'PANEL DATA TESTS OF INDEX NUMBERS AND REVEALED PREFERENCE RANKINGS'

Abstract: For weakly separable blockings of goods from panel data, we construct aggregates using index numbers. We examine how well these aggregates 'fit' the data by investigating how close they come to solving revealed preference conditions.

Samuelson (1983) discussed the relationship of index number theory to revealed preference analysis, saying 'Index number theory is shown to be merely an aspect of the theory of revealed preference…'. Revealed preference analysis has been used to test whether data on prices and quantities can be rationalised by a well-behaved utility function that is weakly separable in some subset of goods. Weak separability allows for a separation of a subset of goods into a sub-utility, or aggregator, function. This is a necessary condition for the existence of an economic aggregate, and the existence of an economic aggregate is a justification for the use of superlative index numbers. Superlative indexes are exact if they can provide a second-order approximation to a particular aggregator function.

Varian's (1983) revealed preference conditions for weak separability do not rely on a particular functional form. The solution values to these conditions can be interpreted as levels of utility. These solution values, while not unique, are consistent with the preferences revealed by the data: if period t is preferred to period s, then the utility level assigned to t is greater than or equal to that assigned to s. Since indexes need not mirror preferences, this property is not guaranteed to hold after aggregation.

Barnett and Choi's (2008) definition spans all superlative index numbers. We consider aggregates based on two superlative index numbers, the Fisher and the Walsh, and on the non-superlative Paasche and Laspeyres indexes, which are exact for a first-order approximation to an aggregator function. Because of its widespread use by central banks to construct monetary aggregates, we also consider the simple sum aggregate and the index number version of the simple sum, the Dutot index.

We investigate how close aggregates come to solving the revealed preference conditions for weak separability when it is known that a solution exists. We use the revealed preference solution values to check how well the indexes reflect the preference orderings in the data, by comparing the direction of change between adjacent periods, computing preference orderings implied by transitivity, and then comparing the rankings revealed by the aggregates to the rankings from the weak separability conditions. We calculate how much the aggregates need to be perturbed in order to satisfy weak separability, and we test whether the aggregates and solution values are equal in distribution.
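
A compact version of the consistency check underlying such tests (here only the GARP step, not Varian's full weak-separability conditions) can be coded directly:

```python
import numpy as np

def violates_garp(P: np.ndarray, Q: np.ndarray) -> bool:
    """Generalized Axiom of Revealed Preference check for T observations of
    prices P and quantities Q (both T x n). GARP consistency is the starting
    point for the revealed preference conditions discussed above."""
    cost = P @ Q.T                    # cost[t, s] = p_t . q_s
    own = np.diag(cost)[:, None]      # own[t] = p_t . q_t
    R = own >= cost                   # t directly (weakly) revealed preferred to s
    T = len(own)
    for k in range(T):                # Warshall transitive closure
        R = R | (R[:, [k]] & R[[k], :])
    strict = own > cost               # t strictly directly preferred to s
    return bool((R & strict.T).any()) # violation: t R s while s strictly beats t
```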

Using panel data, we find that, as the number of goods increases, the superiority of the superlative indexes manifests itself. This is consistent with Fisher's (1922) critique of the Dutot index and Barnett's (1980) critique of simple sum monetary aggregates.

Keywords: superlative index numbers, aggregation, preference orderings, weak separability

JEL Code: C43


PAPER 21: Sajid Chaudhry, University of Birmingham; Jane Binner, University of Birmingham; James Swofford, University of South Alabama; and Andrew Mullineux, University of Birmingham, 'SCOTLAND AS AN OPTIMUM CURRENCY AREA'

Abstract: The June 2016 UK referendum on continued EU membership, in which the people of Scotland voted to remain while the rest of the UK voted to leave, once again makes the issue of whether Scotland is an optimal currency area very topical. England voted strongly to leave, whilst Scotland backed remain by 62% to 38%. The Scottish government published its draft bill on a second independence referendum in October 2016. The move does not mean another referendum will definitely be held, but it does raise the possibility that Scotland might choose independence and continued EU membership without the rest of the UK. If Scotland charts a course of independence from the rest of the UK, it would likely either issue its own currency or join or form another currency area. In this paper we test the microeconomic foundations of a common currency area for Scotland, for the UK, and for the rest of the UK without Scotland. We find that the UK without Scotland meets the microeconomic criteria for a common currency area, while the UK and Scotland alone have some small violations of these conditions. We also find differences between the UK-less-Scotland and Scotland economies in loan data. With respect to further research into the development of monetary aggregation theory, we recommend that this might be profitably directed towards exploring more sophisticated aggregation procedures, as suggested by Barnett (forthcoming), or towards incorporating risk-bearing assets into the money measures.

Keywords: Scottish independence, common currency areas, microeconomic foundations

PAPER 22: Victor J Valcarcel, University of Texas at Dallas, 'INTEREST RATE PASS-THROUGH, DIVISIA USER COSTS OF MONETARY ASSETS AND THE FEDERAL FUNDS RATE'

Abstract: Evidence of substantial pass-through in short-term rates and other rates of financial and monetary assets has typically been rejected by the data. In a first, this paper investigates level and volatility transmission among the Federal Funds rate and the user costs of various monetary assets, which include instruments of both public debt (eg T-bills) and private debt (eg commercial paper). Results suggest substantial and time-varying pass-through. Higher degrees of bi-directional pass-through occur between the Federal Funds rate and the user costs of more liquid assets, both in levels and in volatilities. Federal Funds rate spillovers propagate faster onto more liquid rates as well. These findings have important implications for monetary transmission, not only across the term structure but along markets of varying liquidity.
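
The user costs in question are the standard Barnett (1978) user costs of monetary assets; in the usual notation,

```latex
u_{it} = \frac{R_{t} - r_{it}}{1 + R_{t}},
```

where R_t is the benchmark rate and r_{it} the own rate of return on monetary asset i, so pass-through from the Federal Funds rate operates through both R_t and the own rates r_{it}.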

Keywords: user cost of money, federal funds rate, volatility spillovers, VAR, monetary transmission, interest rate channel

JEL Codes: E30, E31, E65

15255 © University of Birmingham 2017. Printed on a recycled grade paper containing 100% post-consumer waste.

Edgbaston, Birmingham, B15 2TT, United Kingdom

www.birmingham.ac.uk


Page 5: Liquidity and Economic Activity - University of Birmingham · 2017-05-22 · Financial Services Indices Liquidity and Economic Activity In honour of Oswald Distinguished Professor

5Liquidity and Economic Activity Conference

Financial Services Indices Liquidity and Economic Activity PAPER ABSTRACTS

PAPER 1Robert Aliber University of Chicago lsquoAN ESSAY ON MONETARY TURBULENCE AND THE SUPPLY OF LIQUIDITYrsquo

Abstract The last 35 years have been the most turbulent in monetary history There have more than 100 banking crises many in one of four waves most of these crises have been lsquotwinnedrsquo with currency crises Moreover the deviations between the market prices of currencies have been much larger than ever before The purpose of this essay is to explain why there have been so many banking crises and why they have often occurred together with currency crises the answer is that the floating currency arrangement is inherently unstable because an increase in cross-border investment flows to a country leads to an increase in the price of its securities and the increase in the price of its currency The essay is based on a general equilibrium view that links the market in currencies with the markets in bonds stocks and real estate the increase in cross-border investment inflows to a country leads to an increase in the household wealth as an integral part of the adjustment process to ensure that the countryrsquos current account deficit increases as its capital account surplus increases otherwise the market in the countryrsquos currency would not clear The increase in the countryrsquos external indebtedness is much more rapid than the increase in its GDP When the lenders recognise that the indebted country

is on a non-sustainable trajectory their demand for the borrowersrsquo IOUs declines and the price of its securities and the price of its currency declines A banking crisis may follow if the decline in the price of the currency is large since it leads immediately to a sharp increase in the total indebtedness of the borrowers with liabilities denominated in a foreign currency

When currencies are no longer anchored to parities each central bank has much more monetary independence and investors have much more incentive to change the currency composition of the securities in their portfolios Changes in investor demand for foreign securities lead to much more price risk in both the markets for currencies and the markets for securities There has been a scissors-like movement in the market for liquidity the lsquodemandrsquo for liquidity by traders and investors has increased while the supply of liquidity has declined because the price risks are much larger Keywords banking crises market efficiency market failure flexible exchange rates the transfer problem process

JEL Codes E44 F31 F32 F33 F34 F38

6 Liquidity and Economic Activity Conference

PAPER 2Professor Tim Congdon CBE Chairman Institute of International Monetary Research at the University of Buckingham lsquoWHAT WERE THE CAUSES OF THE GREAT RECESSION THE IMPORTANCE OF THE ldquoWHICH AGGREGATErdquo DEBATErsquo

Abstract Monetary economists have long debated which measure of the quantity of money is the most useful in macroeconomic analysis Before during and after the Great Recession (to be understood ndash roughly speaking as the six quarters to mid-2009) the growth rates of different money aggregates diverged sharply in the leading economies The lsquowhich aggregatersquo debate was therefore of particular importance The focus in this paper is on the USArsquos experience although the behaviour of money in other economies and its relationship to prices and spending in them is mentioned where relevant and interesting

The argument is that broadly-defined money has long been the correct concept to use in interpreting macroeconomic developments and that its merits became clear in the Great Recession and its aftermath Broad money is to be viewed as including virtually all money-like assets (and certainly most of the deposit liabilities of the commercial banking system) For the purposes of the paper broad money is identified with the M3 money measure for which the Federal Reserve prepared data until early 2006 It will be shown that with its flow-of-funds data the Federal Reserve is still publishing information that enables an approximate M3 aggregate to be estimated Further the M3 money holdings

of the US economyrsquos main sectors ndash households non-financial business and financial business ndash can be tracked from the flow-of-funds numbers

The last decade has seen the lowest average annual increase per cent in nominal GDP since the 1930s The growth rate of M3 has also been the lowest since the 1930s This similarity of the rates of change contrasts with the behaviour of M1 M1 increased strongly during the Great Recession and afterwards Its rate of increase in the 2009ndash15 period was over double that in the preceding 48 years with its behaviour sharply divergent from nominal GDP With M2 the discrepancy is less marked but a discrepancy remains

Apart from its insights into the causation of the Great Recession the paper will have two provocative conclusions First the Federal Reserve should resume publication of an M3 aggregate and following the Bank of Englandrsquos example it should de-compose broad money into its sector constituents Second the interesting patterns in inter-sectoral money flows that seem to be recurrent in cyclical episodes can be monitored only from a simple-sum broad money aggregate Divisia indices prepared from aggregate economy-wide data cannot identify the patterns while ndash arguably ndash these patterns are important in understanding the transmission mechanism from money to the economy

The paperrsquos main thesis has already been developed in an informal way in an article in the 2016 Central Banking journal The aim of the paper will be to present the analysis in a more academically rigorous form

7Liquidity and Economic Activity Conference

PAPER 3Manmohan Singh International Monetary Fund lsquoTHE ROLE OF PLEDGED COLLATERAL IN LIQUIDITY METRICS AND MONETARY POLICYrsquo

Abstract Collateral does not flow in a vacuum and needs balance sheet(s) to move within the financial system Pledged collateral needs to be considered along with money metrics to fully understand the liquidity in the markets This paper analyses securities-lending derivatives and prime-brokerage markets as suppliers of collateral (as much has been written on the repo market) Going forward the official sectorrsquos choice of balance sheet(s) that allows the flow of liquidity (ie money and collateral) should be transparent and driven by market forces and not by ad hoc allocation by central banks Else this may be suboptimal on many fronts for monetary policy transmission for smooth money market functioning and ultimately for market liquidity

Keywords collateral velocity securities lending prime brokerage OTC derivatives repo

JEL Codes G21 G28 F33 K22

PAPER 4David Aikman Bank of England Andreas Lehnert Federal Reserve Bank Nellie Liang Federal Reserve Bank Michele Modugno Federal Reserve Bank lsquoCREDIT RISK APPETITE AND MONETARY POLICY TRANSMISSIONrsquo

Abstract We show that US economic performance and monetary policy transmission depend on nonfinancial sector credit and the effects are nonlinear When credit is below its trend increases in risk appetite lead to sustained increases in output In contrast when credit is above trend initial expansions are followed by additional excess borrowing and subsequent contractions suggesting an inter-temporal trade-off for economic activity Also tighter monetary policy is ineffective at slowing the economy when credit is high consistent with evidence of less transmission of policy changes to distant forward Treasury rates in high-credit periods

Keywords financial stability financial conditions credit asset bubbles monetary policy

JEL Codes E58 E65 G28

8 Liquidity and Economic Activity Conference

PAPER 5Kevin Fox University of New South Wales and Erwin Diewert University of British ColumbialsquoTHE DEMAND FOR MONETARY BALANCES AND THE MEASUREMENT OF PRODUCTIVITYrsquo

Abstract Firms in advanced economies have greatly increased their cash holdings since the mid-1990s While this has been observed and the reasons debated by central bankers international agencies and academics it remains somewhat of a puzzle This paper explores possible reasons for this increase and the implications for understanding productivity growth Monetary holdings have an opportunity cost ie allocating firm financial capital into monetary deposits means that investment in real assets is reduced Traditional measures of Total Factor Productivity (TFP) do not take into account these holdings of monetary assets Given the recent large increases in these holdings ex ante it can be expected that adding these monetary assets to the list of traditional sources of capital services will reduce the TFP of the business sector Using a new data set on the US aggregate (corporate and non-corporate) business sector we measure this effect for the noting the implications for the System of National Accounts of this expanded definition of capital services Also industry elasticities of demand to hold monetary balances using the Normalized Quadratic functional form A key finding is that the accumulation of monetary holdings is primarily a phenomenon of the non-corporate business sector

We have found that while conceptually more correct adding real money balances to our input aggregate does not change aggregate measured productivity performance very much for the corporate sector This is because even though there is some variation the asset share is relatively small The impact on the non-corporate sector is larger especially in the latter decades of the sample when currency and deposit holdings increased substantially especially relative to other asset holdings Finally the relative productivity of individual firms can be significantly impacted by differences in money holdings even if there is little aggregate effect at the sectoral level Indeed understanding productivity differences between small and large firms can be enhanced by taking into account currency and deposits small firms are often credit constrained and therefore have greater cash holdings Similarly accounting for cash holdings can provide an augmented understanding of productivity and profitability in studies of firm dynamics In addition understanding productivity differences between risky and less risky sectors and firms can be informed by differences in money balances where eg dependence on RampD is taken as a proxy for risk Hence this paper provides a framework and empirical results for a more comprehensive understanding of productivity growth and dynamics

9Liquidity and Economic Activity Conference

PAPER 6Dennis Fixler Bureau of Economic Analysis and Kim Zieschang International Monetary Fund lsquoPRODUCING LIQUIDITYrsquo

Abstract Based on a paper presented at the 2015 Meeting of the Society for Economic Measurement Paris Dennis Fixler is Chief Economist Bureau of Economic Analysis and Kim Zieschang is Adjunct Professor of Economics University of Queensland The views expressed in this paper are those of the authors and should not be attributed to the Bureau of Economic Analysis JEL codes E01 Measurement and Data on National Income and Product Accounts Commercial banks are a primary producer of liquidity in an economy Despite their importance there is no consensus on the measurement of the liquidity service and for the matter bank output in general This lack of consensus exits at the microeconomic level and at that level of the national accounts which attempts to capture the significant role of bank services in the output of the economy The current national accounts measure of bank output is termed financial intermediation services indirectly measured or lsquoFISIMrsquo springing from the 1993 version of the System of National accounts (the 1993 SNA) Calculating FISIM under the 2008 version of national accounting standards is simple and generally practical provided the compiler has a key datummdashthe lsquoreference rate of interestrsquo The calculation is essentially Output = (Reference rate of interest minus Deposit rate) times Deposit liabilities + (Loan rate minus Reference rate of interest) times Loan assets

As the deposit and loan financial instrument coverage of this formula implies the current national accounting standards apply it to deposit-takers such as banks as well as to

non-deposit-taking loan-making financial institutions such as finance companies and money lenders

A first issue in treatment of financial services since the 1993 version of the standards introduced the reference rate concept has been lack of consensus on how it should be determined Generally the idea has been to select an exogenous reference rate a government security rate or a combination of them is often used because it captures the risk-free reference rate that underlies the user cost of money as in Barnett (1978) Some have proposed alternative exogenous reference rates that are tied to the market determined risk of the security to which the reference rate is to be applied In any event we argue that the reference rate should be endogenously determined and should be the bankrsquos calculated cost of capital the overall rate of return paid to all sources of funding including equity on the liability side of the balance sheet Our lsquoreference rate of interestrsquo is therefore individual to each bank rather than an economy-wide constant A second issue in the national accounts dialogue on financial services is the scope of financial instruments that should be associated with the SNArsquos indirect financial services measure The 2008 SNA narrowed the scope of FISIM to the deposit and loan positions of financial corporations but previous versions included interest income flows on essentially all financial instruments Return to the broad 1993 financial instrument scope is nevertheless a research agenda item for the next version of the SNA and appears essential to align the SNA with the scope of liquidity measured by the money and banking literature and the associated standards for compiling financial statistics We argue that FISIM should cover all financial instruments


Armed with the cost of capital reference rate and full financial balance sheet instrument scope, we derive the production identity (value of output ≡ cost of production) from the income ≡ expense and balance sheet identities, generating a FISIM-like calculation of output with a single cost of capital reference rate for each enterprise, rather than for the whole economy. Note that by computing specific user cost prices of bank services, it is in principle possible to aggregate them into a price index for liquidity services and, correspondingly, to obtain a quantity index of liquidity services. Such aggregate measures would be useful in tracing the financial intermediation process into GDP or another aggregate measure of economic activity.

On examining the SNA-type production identity for an individual bank, we find the cost side contains a term within operating surplus – the equity leverage premium – that depends on the bank's financing: the debt and equity composition of the liability side of its balance sheet. Further, it is inherent in the definition of the cost of capital reference rate that the equity leverage premium is identically equal to what we will term produced liquidity within the part of SNA financial services output of the bank coming from the debt instruments on the liability side of its balance sheet, prominent among which are deposits. Given that banks transform liabilities into assets, the equity leverage premium and the produced liquidity are tied to the risk bearing undertaken by the bank. With an exogenous reference rate, the risk bearing is completely embedded in the user cost price of the asset or liability product. In our model, because the entire balance sheet is used, the risk bearing is tied to equity holders.

This paper proposes a resolution to the scope and methodology issues in the ongoing national accounts conversation on financial services, particularly on the provision of liquidity by debt-issuing enterprises, and suggests that the equity leverage premium now included in the nominal output of banks be offset by an intermediate insurance input supplied by their equity holders. This retains the current standards' origination of liquidity with banks (but also extends it to other debt-issuing enterprises), while better exposing the link between the leverage risk bearing (provision of debt guarantees) of equity-holding sectors and the production of liquidity by banks (and other debt-issuing enterprises). With the developed framework, issues such as the output and productivity of the providers of liquidity services and other financial services can also be measured and incorporated into macroeconomic statistics.


PAPER 7: Jan Willem Van den End, De Nederlandsche Bank, the Netherlands, 'APPLYING COMPLEXITY THEORY TO INTEREST RATES: EVIDENCE OF CRITICAL TRANSITIONS IN THE EURO AREA'

Abstract: We apply complexity theory to financial markets to show that excess liquidity created by the Eurosystem has led to critical transitions in the configuration of interest rates. Complexity indicators turn out to be useful signals of tipping points and subsequent regime shifts in interest rates. We find that the critical transitions are related to the increase of excess liquidity in the euro area. These insights can help central banks to strike the right balance between the intention to support the financial system by injecting liquidity and potential unintended side-effects on market functioning.

Keywords: interest rates, central banks and their policies, monetary policy

JEL Codes: E43, E58, E52


PAPER 8: Michael Bowe, Alliance Manchester Business School, University of Manchester, and University of Vaasa; Olga Kolokolova, Alliance Manchester Business School, University of Manchester; and Marcin Michalski, Alliance Manchester Business School, University of Manchester, 'TOO BIG TO CARE, TOO SMALL TO MATTER: MACROFINANCIAL POLICY AND BANK LIQUIDITY CREATION'

Abstract: We estimate the volume of liquidity creation by US bank holding companies between 1997 and 2015 and examine the impact of changes in macrofinancial policies on the dynamics of this process. We focus on three major policy developments occurring in the aftermath of the 2007–09 financial crisis: bank capital regulation reform, monetary stimulus through quantitative easing, and the Troubled Asset Relief Program (TARP).

We use the three-step procedure proposed by Berger and Bouwman (2009) to calculate the dollar amount of liquidity a financial institution creates. Initially, we classify all balance sheet items and off-balance sheet activities of an institution as liquid, semi-liquid or illiquid, to which we then assign liquidity weights of +1/2 (illiquid assets and liquid liabilities), 0 (semi-liquid assets and liabilities) or −1/2 (liquid assets, illiquid liabilities and equity), respectively. The dollar volume of liquidity creation is then calculated as the liquidity-weighted sum of the items identified in the first step. We find that the total amount of liquidity creation by banks in the sample increases by a factor of 3.65, from $1.4 trillion in 1997Q1 to $5.1 trillion in 2015Q4. Indeed, the volume of liquidity creation increases at a faster pace than the gross domestic product of the United States, which rises by a factor of 2.1 during the same period.
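A stylised sketch of the Berger–Bouwman weighting scheme described above (the classifications and dollar figures below are hypothetical):

```python
# Stylised sketch of the Berger-Bouwman (2009) liquidity creation measure.
# Classifications and dollar amounts below are hypothetical.

WEIGHTS = {"illiquid_assets": +0.5, "liquid_liabilities": +0.5,
           "semiliquid_assets": 0.0, "semiliquid_liabilities": 0.0,
           "liquid_assets": -0.5, "illiquid_liabilities": -0.5,
           "equity": -0.5}

def liquidity_creation(balance_sheet):
    """Liquidity-weighted sum over classified balance sheet items ($)."""
    return sum(WEIGHTS[category] * amount
               for category, amount in balance_sheet.items())

bank = {"illiquid_assets": 60.0,       # e.g. business loans
        "semiliquid_assets": 25.0,     # e.g. residential mortgages
        "liquid_assets": 15.0,         # e.g. cash and securities
        "liquid_liabilities": 70.0,    # e.g. transaction deposits
        "illiquid_liabilities": 10.0,  # e.g. subordinated debt
        "equity": 20.0}

print(liquidity_creation(bank))  # 0.5*(60+70) - 0.5*(15+10+20) = 42.5
```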

The results of panel regressions reveal that the dynamics of bank liquidity creation differ considerably between small and large institutions. The level of bank capital requirements and the stance of monetary policy impact the liquidity creation of both small and medium-sized banks. Liquidity creation of the largest banks, which control over 80% of the banking system's assets, remains unaffected.

We find that changes in the amount of liquidity creation by small banks per $1 of their gross total assets are positively related to changes in the term spread, but inversely related to changes in their Tier 1 capital ratios. Further, we show that the volume of liquidity creation is positively related to the riskiness of a bank's assets, as measured by the ratio of risk-weighted assets to gross total assets, regardless of its size classification. We establish that TARP has negative short-term effects on small and medium-sized banks and no immediate impact on the liquidity creation of the largest institutions in the sample. In contrast, participation in TARP leads to a long-term decline in liquidity provision per dollar of assets of the largest banks. This persists even after the completion of the programme and repayment of TARP funding. As nearly all of the largest TARP-recipient banks in the sample are subsequently classified as systemically important financial institutions, our results suggest that the increased regulatory scrutiny may adversely affect their ability to create liquidity.

By demonstrating that the stance of monetary policy and the level of bank capital requirements do not tangibly enhance the liquidity provision efficiency of the largest, systemically important institutions in the system, our study offers important insights for the design of effective macroprudential policies.


PAPER 9: Jonathan Goldberg, Federal Reserve Board, 'THE SUPPLY OF LIQUIDITY AND REAL ECONOMIC ACTIVITY'

Abstract: This paper identifies shocks to the supply of liquidity by dealer firms and investigates their effects on real economic activity. First, I develop a simple theoretical model of dealer intermediation; then, in a structural VAR model, I use sign restrictions derived from the theoretical model to identify liquidity supply shocks. Liquidity supply shocks that are orthogonal to information contained in macroeconomic and asset price variables have considerable predictive power for economic activity. Moreover, positive liquidity supply shocks cause large and persistent increases in real activity.
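For readers unfamiliar with the identification step, here is a minimal sketch of sign-restriction identification in the spirit described above; the sign pattern and covariance matrix are invented for illustration, not taken from the paper:

```python
# Minimal sketch of sign-restriction identification in a structural VAR.
# The sign pattern below is illustrative, not the paper's actual restrictions.
import numpy as np

rng = np.random.default_rng(0)

# Suppose a 3-variable VAR (liquidity, output, asset prices) with
# reduced-form residual covariance Sigma (hypothetical numbers).
Sigma = np.array([[1.0, 0.3, 0.2],
                  [0.3, 1.0, 0.4],
                  [0.2, 0.4, 1.0]])
P = np.linalg.cholesky(Sigma)  # one valid impact matrix

# Illustrative restriction: a liquidity supply shock (column 0) raises
# liquidity, output and asset prices on impact.
def satisfies_signs(B):
    return (B[:, 0] > 0).all()

accepted = []
for _ in range(10_000):
    # Draw a random orthonormal rotation via QR decomposition.
    Q, R = np.linalg.qr(rng.standard_normal((3, 3)))
    Q = Q @ np.diag(np.sign(np.diag(R)))  # normalise column signs
    B = P @ Q                             # candidate structural impact matrix
    if satisfies_signs(B):
        accepted.append(B)

print(len(accepted), "accepted rotations")
```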

Keywords: liquidity, dealer intermediation, risk-taking, real activity, liquidity shocks

JEL Codes: G10, G12, G17, G24


PAPER 10: Dirk Bezemer, University of Groningen, the Netherlands, and Lu Zhang, Sustainable Finance Lab and Utrecht University, 'MACROECONOMIC IMPLICATIONS OF LIQUIDITY CREATION: CREDIT ALLOCATION AND POST-CRISIS RECESSIONS'

Abstract: In this paper we address the macroeconomic implications of liquidity creation through bank lending and the impacts of liquidity on economic activity. We note that liquidity created through bank lending can be channelled into the real sector, in support of economic activity, or into financial and real estate markets, in support of capital gains. We collected macro-level data on bank credit aggregates over 2000–12 for 57 economies, categorised according to the use of credit. We note the long-term shift in the allocation of bank credit creation away from non-financial business lending and towards financial and especially real estate markets. We then present new evidence on the channels from pre-crisis credit allocation to the severity of post-crisis recessions.

Our first contribution is to show that it is not just the level but the composition of debt (defined as the share of mortgage credit in total credit) that matters. A second contribution is to analyse the channels. We collect additional industry-level data across 20 industries for a subset of economies, and analyse the effect of changes in the pre-crisis composition of debt on total GDP and on investment, consumption and capital allocation. We find that changes in the share of household mortgage credit before the crisis have a significant effect on recession severity after the 2007 crisis. This is not the case for any other credit category, nor for growth of total bank credit. We address the causality challenge by using the difference between IMF growth forecasts and growth realisations; this filters out country-specific drivers of both debt and income growth. We address the model selection challenge by using Bayesian model averaging, which indicates that the change in credit composition is among the three most robust determinants of post-crisis recession severity, along with income levels and the current account balance. The findings are robust to a wide range of control variables and to the different responses across advanced/emerging and EMU/non-EMU economies.
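As a rough illustration of the model-averaging step, a BIC-weighted sketch over candidate regressions (a common approximation to Bayesian model averaging; the data and variable names are hypothetical, not the paper's):

```python
# Rough sketch of Bayesian model averaging via BIC weights over all
# subsets of candidate regressors. Data and variable names are hypothetical.
from itertools import combinations
import numpy as np

rng = np.random.default_rng(1)
n = 60
X = rng.standard_normal((n, 4))
names = ["d_mortgage_share", "income", "current_account", "credit_growth"]
y = 0.8 * X[:, 0] + 0.4 * X[:, 2] + rng.standard_normal(n)

def bic(y, X):
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    sigma2 = resid @ resid / len(y)
    return len(y) * np.log(sigma2) + X1.shape[1] * np.log(len(y))

models, bics = [], []
for k in range(1, 5):
    for subset in combinations(range(4), k):
        models.append(subset)
        bics.append(bic(y, X[:, list(subset)]))

w = np.exp(-0.5 * (np.array(bics) - min(bics)))
w /= w.sum()  # approximate posterior model weights

# Posterior inclusion probability of each regressor.
for j, name in enumerate(names):
    pip = sum(wi for wi, m in zip(w, models) if j in m)
    print(f"{name}: {pip:.2f}")
```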

We then delve into the channels from the change in debt composition to income growth loss. The literature to date has focused on negative wealth effects on consumption, for which we find strong evidence. In addition, we find evidence for two investment channels: a loan supply effect and a capital allocation effect. In the industry-level analysis, we find that in economies which experienced a larger change in debt composition before 2008, there was a larger reduction of credit available and weaker capital re-allocation towards sectors with higher value-added. This effect is observed already before the crisis, and very strongly after the crisis. We discuss policy implications and future research.

Keywords: private credit, mortgages, crisis, output loss, investment, capital allocation

JEL Codes: C11, C15, E01, O4


PAPER 11: Iftekhar Hasan, Gabelli School of Business, Fordham University, and Jean-Loup Soula, Strasbourg University, LaRGE Research Centre, 'TECHNICAL EFFICIENCY IN BANK LIQUIDITY CREATION'

Abstract: This paper generates an optimum bank liquidity creation benchmark by tracing an efficient frontier in liquidity creation (bank intermediation) and asks why some banks are more efficient than others in such activities. Evidence reveals that medium-sized banks are closest to the efficient frontier, irrespective of their business models. Small (large) banks – focused on traditional banking activities – are found to be the most (least) efficient in creating liquidity through on-balance sheet items, whereas large banks – involved in non-traditional activities – are found to be most efficient in off-balance sheet liquidity creation. Additionally, the liquidity efficiency of small banks was more resilient during the 2007–08 financial crisis relative to other banks.
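One standard way to trace such an efficient frontier is data envelopment analysis (DEA); below is a minimal input-oriented sketch under constant returns to scale (the paper's exact estimator may differ, and the inputs/outputs are invented):

```python
# Minimal input-oriented DEA sketch (constant returns to scale), one way
# to trace an efficiency frontier; inputs and outputs are hypothetical.
import numpy as np
from scipy.optimize import linprog

# Rows: banks. Inputs, e.g. (deposits, equity); output, e.g. liquidity created.
X = np.array([[10.0, 2.0], [20.0, 5.0], [15.0, 3.0]])  # inputs
Y = np.array([[8.0], [12.0], [11.0]])                  # outputs
n, m, s = X.shape[0], X.shape[1], Y.shape[1]

def efficiency(j):
    """Solve min theta s.t. X'lam <= theta*x_j, Y'lam >= y_j, lam >= 0."""
    # Decision variables: [theta, lam_1, ..., lam_n]
    c = np.r_[1.0, np.zeros(n)]
    # Input constraints: sum_k lam_k X[k,i] - theta*X[j,i] <= 0
    A_in = np.c_[-X[j], X.T]
    b_in = np.zeros(m)
    # Output constraints: -sum_k lam_k Y[k,r] <= -Y[j,r]
    A_out = np.c_[np.zeros(s), -Y.T]
    b_out = -Y[j]
    res = linprog(c, A_ub=np.r_[A_in, A_out], b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun  # theta = 1 means bank j lies on the frontier

for j in range(n):
    print(f"bank {j}: efficiency {efficiency(j):.3f}")
```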

Keywords: banks, technical efficiency, liquidity creation, diversification

JEL Codes: G21, G28, G32


PAPER 12: Richard Anderson, Lindenwood University, John Duca, Federal Reserve Bank of Dallas, and Barry Jones, Department of Economics, State University of New York, 'A BROAD MONETARY SERVICES (LIQUIDITY) INDEX AND ITS LONG-TERM LINKS TO ECONOMIC ACTIVITY'

Abstract: Liquid assets play a crucial role in economic activity as the medium in which payments are received and made. 'Sudden stops' in financial markets – during which liquid assets are hoarded – are periods when economic activity slows abruptly. Further, it is the sine qua non of financial intermediation to alter the measured relative and absolute quantities of liquid assets. In this way, the observed quantities of liquid assets reflect both the path of past economic activity and anticipations of future activity. The quantities must be regarded as arising endogenously within an intertemporal general equilibrium model of the economy, à la Tobin (1958) and Merton (1971). Economic modelling and analysis traditionally proceed by combining relatively high-dimension lists of specific assets into lower-dimension 'monetary aggregates'. The defining characteristic of the assets included is that all are available to facilitate the exchange of goods and services at a transaction cost less than infinity. That is, all included assets may be sold or used as collateral for the purchase and sale of goods and services, and thereby provide liquidity services, which can be tracked by measured opportunity costs of foregone interest; these may not, in practice, reflect all transaction costs. The last caveat particularly applies to household holdings of mutual fund assets outside of money market funds.

Our study contributes to the literature in two key ways. First, it expands a conventional Divisia measure of money services to account for the liquidity provided by such mutual fund assets. Second, it explores the long-run connections between economic activity and monetary aggregates constructed as index numbers from 1929 to 2016, finding that the inclusion of mutual fund liquidity services results in a Divisia measure of money that has a much more stable velocity.
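For reference, a minimal sketch of how a Divisia (Törnqvist) monetary services index is built from asset quantities and user costs; the benchmark rate and data below are invented, and the authors' broad index additionally covers mutual fund assets:

```python
# Minimal sketch of a Divisia (Tornqvist) monetary services index.
# Quantities, own rates and the benchmark rate are hypothetical.
import numpy as np

# Two periods, three assets (e.g. currency, deposits, money funds).
q = np.array([[100.0, 200.0, 50.0],    # period t-1 quantities
              [102.0, 210.0, 60.0]])   # period t quantities
r = np.array([[0.00, 0.01, 0.02],      # own rates of return, t-1
              [0.00, 0.01, 0.02]])     # own rates of return, t
R = np.array([0.05, 0.05])             # benchmark rate per period

# User cost of each asset: (R - r) / (1 + R), as in Barnett (1978).
u = (R[:, None] - r) / (1 + R[:, None])

# Expenditure shares on monetary services.
s = u * q / (u * q).sum(axis=1, keepdims=True)

# Tornqvist-Theil growth: average shares times log quantity growth.
g = ((s[0] + s[1]) / 2 * np.log(q[1] / q[0])).sum()
print(f"Divisia index growth: {100 * g:.2f}% per period")
```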

We thank Emil Mihalov and Tyler Atkinson for research assistance. The views expressed are those of the authors and do not necessarily reflect those of the Federal Reserve Bank of Dallas or the Federal Reserve System. Any errors are our own.


PAPER 13: John W Keating, University of Kansas, and A Lee Smith, Federal Reserve Bank of Kansas City, 'THE OPTIMAL MONETARY INSTRUMENT AND THE (MIS)USE OF GRANGER CAUSALITY'

Abstract: Is it better to use an interest rate or a monetary aggregate to conduct monetary policy? Operating procedures designed around interest rates are overwhelmingly preferred by central bankers. This paper re-examines the question in light of a New Keynesian DSGE (dynamic stochastic general equilibrium) model. Our calibrated model is very similar to many others in the literature, except that we follow Belongia and Ireland (2014) by augmenting the standard framework with a simple model of financial intermediation. We assume banks operate in competitive markets, maximising profits subject to financial market disturbances.

We use this model to examine the welfare consequences of alternative choices of monetary policy instrument. For the interest rate policy, we employ the specification from Clarida, Gali and Gertler (2000), in which the central bank reacts to expected future output and inflation gaps; a gap is defined as the percent deviation from target. We compare this rule to a k-percent rule for each monetary aggregate. We consider three alternative aggregates determined within our model: the monetary base, the simple sum measure of money, and the Divisia measure.
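In sketch form, the two families of rules under comparison (coefficients are illustrative, not the paper's calibration):

```python
# Sketch of the two candidate policy rules; coefficients are illustrative.

def cgg_interest_rate_rule(exp_inflation_gap, exp_output_gap,
                           phi_pi=1.5, phi_y=0.5):
    """Clarida-Gali-Gertler-style rule: react to expected future gaps."""
    return phi_pi * exp_inflation_gap + phi_y * exp_output_gap

def k_percent_rule(k=0.02):
    """Fixed money growth: the chosen aggregate grows k percent per period."""
    return k
```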

Welfare results are striking. While the interest rate dominates the monetary base, both simple sum and Divisia k-percent rules outperform the interest rate; in fact, the Divisia rule is overall best. This is an interesting finding, because it contradicts the view of most central bankers that interest rates are superior. Note that if we replaced k-percent money growth rules with rules allowing an appropriate reaction to macroeconomic variables, as in the Clarida, Gali and Gertler (2000) interest rate rule, the welfare benefits of simple sum and Divisia money would be even greater.

Next, we study the performance of Granger causality tests in the context of data generated from our model. For this, we assume the Clarida, Gali and Gertler (2000) interest rate rule characterises monetary policy. While it is not optimal from a welfare perspective, this rule is used for positive (descriptive) rather than normative purposes: research suggests their framework provides a fairly good description of actual Fed behaviour.

For each of our four potential monetary instruments, we test for Granger causality with respect to output and then with respect to prices. We find the interest rate Granger-causes both variables at extremely high significance levels. The same result is obtained for the monetary base. Simple sum money also Granger-causes prices at a highly significant level, but only causes output at the 10% level. The test results for Divisia are weakest of all: Divisia fails to Granger-cause output, and the evidence for prices is significant only at the 10% level.
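A minimal sketch of this kind of test on simulated data (statsmodels expects a two-column array and tests whether the second column Granger-causes the first):

```python
# Minimal sketch of a Granger causality test on simulated data.
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(2)
T = 300
money = np.zeros(T)
output = np.zeros(T)
for t in range(1, T):
    money[t] = 0.7 * money[t - 1] + rng.standard_normal()
    # Output responds to lagged money, so money should Granger-cause output.
    output[t] = 0.5 * output[t - 1] + 0.3 * money[t - 1] + rng.standard_normal()

# Column order: [effect, cause]; H0 is that column 2 does not cause column 1.
data = np.column_stack([output, money])
results = grangercausalitytests(data, maxlag=4)
# Inspect e.g. results[1][0]["ssr_ftest"] for the lag-1 F-statistic and p-value.
```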

What do we learn from this investigation? First, a quantity aggregate as monetary instrument may have significant welfare benefits compared to an interest rate. Second, a weighted monetary aggregate (Divisia) performs best of all the candidate instruments we considered. Third, Granger causality tests may be a poor method for selecting the best monetary policy instrument. In our model, Granger causality tests suggest at best a weak effect of Divisia on inflation and no effect on output. Thus, if instrument choice is based solely on Granger causality test results, an inferior policy instrument will be selected.

Keywords: monetary policy instrument, monetary aggregates, Granger causality, Divisia aggregates

JEL Codes: C43, C32, E37, E44, E52


PAPER 14: Jane Binner, University of Birmingham, Logan Kelly, University of Wisconsin, and Jon Tepper, Nottingham Trent University, 'ON THE ROBUSTNESS OF SLUGGISH STATE-BASED NEURAL NETWORKS FOR PROVIDING USEFUL INSIGHT INTO THE NEW KEYNESIAN PHILLIPS CURVE'

Abstract: The New Keynesian Phillips curve implies that the output gap – the deviation of actual output from its natural level due to nominal rigidities – drives the dynamics of inflation relative to expected inflation and lagged inflation. This paper exploits the empirical success of the New Keynesian Phillips curve in explaining the USA's inflation dynamics, using a new nonlinear model of the output gap. We estimate the output gap using the artificial intelligence method of multi-recurrent neural networks (MRNs), a type of neural network that is able to establish variable sensitivity to recent and more historic temporal information through the use of layer- and self-recurrent links, forming a so-called sluggish state-based memory. The MRN is able to robustly latch onto structural interactions amongst inflation, oil prices and real output in the USA to produce a set of interesting impulse responses. We present our empirical results using monthly data spanning 1960–2016, and contrast our new nonlinear model of the output gap with traditional measures in fitting the New Keynesian Phillips curve, to provide useful insights for inflation dynamics and monetary policy analysis in the USA.
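As a rough illustration of the 'sluggish state' idea (not the authors' exact architecture), consider a memory bank that mixes its previous state with the current hidden activation at a bank-specific rate:

```python
# Rough sketch of a "sluggish" state update, the core idea behind
# multi-recurrent networks; this is not the authors' exact architecture.
import numpy as np

def sluggish_update(state, hidden, alphas):
    """Each memory bank k updates as a convex mix of past state and
    current hidden activation: s_k <- alpha_k * s_k + (1 - alpha_k) * h."""
    return alphas * state + (1.0 - alphas) * hidden

# Three banks: fast (alpha=0.1), medium (0.5) and slow/sluggish (0.9).
alphas = np.array([0.1, 0.5, 0.9])
state = np.zeros(3)
for hidden in [1.0, 0.0, 0.0, 0.0]:   # one-off impulse in the hidden unit
    state = sluggish_update(state, hidden, alphas)
print(state)  # the sluggish bank retains the impulse longest
```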

PAPER 15: Rakesh Bissoondeeal, Aston University, Michael Karaglou, Aston University, and Jane Binner, University of Birmingham, 'STRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UK'

Abstract: We investigate whether or not monetary aggregates are important in determining output. In addition to the official simple sum measure of money, we employ the weighted Divisia aggregate. We use data-driven procedures to identify breaks in the data. We find that monetary aggregates, particularly Divisia, are important in determining output, but the results are sensitive to the time period employed. There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them. The results also suggest that the recovery from the financial crisis could have been faster if money had not been hoarded.

Keywords: monetary aggregates, Divisia, IS curve, structural breaks

JEL Codes: C22, E52


PAPER 16: Costas Milas, University of Liverpool, and Michael Ellington, University of Liverpool, 'IDENTIFYING AGGREGATE LIQUIDITY SHOCKS IN CONJUNCTION WITH MONETARY POLICY SHOCKS: AN APPLICATION USING UK DATA'

Abstract: We propose a new identification scheme for aggregate liquidity shocks, in harmony with conventional monetary policy shocks, in a partially identified structural VAR model. We fit a Bayesian time-varying parameter VAR model to UK data from 1955 to 2016. The transmission mechanism of aggregate liquidity shocks changes substantially throughout time, with the magnitude of these shocks increasing during recessions. We provide statistically significant evidence in favour of asymmetric contributions of these shocks during the implementation of quantitative easing relative to the 2008 recession. Aggregate liquidity shocks explain 32% and 47% of the variance in real GDP and inflation, respectively, at business cycle frequency during the Great Recession. Preliminary draft: please do not cite.

Keywords: liquidity shocks, time-varying parameter VAR, money growth

JEL Codes: E32, E47, E52, E58

PAPER 17: Makram El Shagi, Henan University, China, and Logan Kelly, University of Wisconsin, 'STRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UK'

Abstract: We investigate whether or not monetary aggregates are important in determining output. In addition to the official simple sum measure of money, we employ the weighted Divisia aggregate. We use data-driven procedures to identify breaks in the data. We find that monetary aggregates, particularly Divisia, are important in determining output, but the results are sensitive to the time period employed. There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them. The results also suggest that the recovery from the financial crisis could have been faster if money had not been hoarded.

Keywords: monetary aggregates, Divisia, IS curve, structural breaks

JEL Codes: C22, E52


PAPER 18: Soumya Suvra Bhadury, National Council of Applied Economic Research (NCAER), New Delhi, India, and Taniya Ghosh, Indira Gandhi Institute of Development Research (IGIDR), Mumbai, India, 'HAS MONEY LOST ITS RELEVANCE? DETERMINING A SOLUTION TO THE EXCHANGE RATE DISCONNECT PUZZLE IN THE SMALL OPEN ECONOMIES'

Abstract: This study uses monthly data from 2000 to 2015 and utilises an open-economy structural vector autoregression model to identify the monetary policy shock responsible for exchange rate fluctuations in the economies of India, Poland and the UK. Following Kim and Roubini (J Monet Econ 45(3): 561–586, 2000), our model is shown to fit these small open economies well, estimating theoretically correct and significant responses of price, output and the exchange rate to a monetary policy tightening. The importance of the monetary policy shock is determined by examining the variance decomposition of forecast errors, impulse response functions and out-of-sample forecasts. Following Barnett (J Econ 14 (September): 11–48, 1980), we adopt a superior monetary measure, namely the aggregation-theoretic Divisia monetary aggregate, in our model. The significance of adopting precisely measured money in the exchange rate model follows from the comparison between models with no money, simple sum monetary aggregates, and Divisia monetary measures. A bootstrap Granger causality test establishes a strong causal link from money, especially Divisia money, to the exchange rate. Additionally, the empirical results provide three important findings. The first is that the estimated responses of output, prices, money and the exchange rate to monetary policy shocks are significant in models that adopt monetary aggregates. The second is that Divisia money facilitates monetary policy in explaining more of the fluctuation of the exchange rate. The third supports the inclusion of Divisia money for better out-of-sample forecasting of the exchange rate.

Keywords: monetary policy, monetary aggregates, Divisia, structural VAR, exchange rate overshooting, liquidity puzzle, price puzzle, exchange rate disconnect puzzle, forward discount bias puzzle

JEL Codes: C32, E41, E51, E52, F31, F41, F47


PAPER 19: William A Barnett and Jinan Liu, University of Kansas, 'USER COST OF CREDIT CARD SERVICES UNDER INTERTEMPORAL NON-SEPARABILITY'

Abstract: This paper examines the user cost of monetary assets and credit card services under the assumption of intertemporal nonseparability and interest rate risk. Barnett and Su (2016) derived theory permitting inclusion of credit card transaction services into Divisia monetary aggregates. The risk adjustment in their theory is based on CCAPM under intertemporal separability. The risk adjustment by their method is expected to be small, as has been the case in prior studies of CCAPM risk adjustment of asset returns; the equity premium puzzle focuses on that downward bias in the CCAPM risk adjustment. But credit card interest rates, aggregated over consumers, are much more volatile than interest rates on monetary assets. While the known downward bias of CCAPM risk adjustments is of little concern with Divisia monetary aggregates containing only low-risk monetary assets, that downward bias cannot be ignored once credit card services are included. We believe that extending to intertemporal nonseparability will provide a more plausible risk adjustment, as has been emphasised by Barnett and Wu (2015).

In this paper, we extend the credit-card-augmented Divisia monetary quantity aggregates and user costs to the case of risk aversion and intertemporal nonseparability in consumption. Our results are for the 'representative consumer', aggregated over all consumers. While credit card interest-rate risk may be low for many consumers, the volatility of credit card interest rates for the representative consumer is high, as reflected by the high volatility of the Federal Reserve's data on credit card interest rates aggregated over consumers. The relationship between that volatility and the theory of aggregation over consumers under risk is beyond the scope of this research, but is a serious matter meriting future study.

To implement our theory, we introduce a pricing kernel into our model, in accordance with the approach advocated by Barnett and Wu (2015). We assume that the pricing kernel is a linear function of the rate of return on a well-diversified wealth portfolio. We find that the risk adjustment of the credit-card-services user cost to its certainty-equivalence level can be measured by its beta. That beta depends upon the covariance between the interest rates on credit card services and the return on the wealth portfolio of the consumer, in a manner analogous to the standard CAPM adjustment.
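In sketch form, the analogy with a standard CAPM beta (the series below are simulated; the paper's actual adjustment is derived from its pricing kernel):

```python
# Sketch of a CAPM-style beta for credit card interest rates against a
# wealth portfolio return, as an analogy to the risk adjustment described
# above; the series below are simulated, not the paper's data.
import numpy as np

rng = np.random.default_rng(3)
r_wealth = 0.05 + 0.10 * rng.standard_normal(200)  # wealth portfolio return
r_cc = 0.15 + 0.6 * (r_wealth - 0.05) + 0.02 * rng.standard_normal(200)

# beta = cov(credit card rate, wealth return) / var(wealth return)
beta = np.cov(r_cc, r_wealth)[0, 1] / np.var(r_wealth, ddof=1)
print(f"beta ~ {beta:.2f}")  # larger beta -> larger user-cost risk adjustment
```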

Credit card services' risk premiums depend on their market portfolio risk exposure, which is measured by the beta of the credit card interest rates. The larger the beta, through risk exposure to the wealth portfolio, the larger the risk adjustment. If the beta were very small, then the user-cost risk adjustment would be very small; in that case, the unadjusted Divisia monetary index would be a good proxy, even with high volatility of credit card interest rates. One method of introducing intertemporal nonseparability is to assume habit formation; we explore that possibility. We are currently conducting research on empirical implementation of the theory proposed in this paper. We believe that under intertemporal nonseparability we will be able to generate a more accurate credit-card-augmented Divisia monetary index to explain the available empirical data.

Keywords: Divisia index, monetary aggregation, intertemporal nonseparability, credit card services, risk adjustment

JEL Codes: C43, D81, E03, E40, E41, E44, E51, G12


PAPER 20: Per Hjertstrand, Research Institute of Industrial Economics (Institutet för Näringslivsforskning), Stockholm, Gerald A Whitney, University of New Orleans, and James L Swofford, University of South Alabama, 'PANEL DATA TESTS OF INDEX NUMBERS AND REVEALED PREFERENCE RANKINGS'

Abstract: For weakly separable blockings of goods from panel data, we construct aggregates using index numbers. We examine how well these aggregates 'fit' the data by investigating how close they come to solving revealed preference conditions.

Samuelson (1983) discussed the relationship of index number theory to revealed preference analysis, saying 'Index number theory is shown to be merely an aspect of the theory of revealed preference…'. Revealed preference has been used to test whether data on prices and quantities can be rationalised by a well-behaved utility function that is weakly separable in some subset of goods. Weak separability allows for a separation of a subset of goods into a sub-utility, or aggregator, function. This is a necessary condition for the existence of an economic aggregate, and the existence of an economic aggregate is a justification for the use of superlative index numbers. Superlative indexes are exact if they can provide a second-order approximation to a particular aggregator function.

Varian's (1983) revealed preference conditions for weak separability do not rely on a particular functional form. The solution values to these conditions can be interpreted as levels of utility. These solution values, while not unique, are consistent with the preferences revealed by the data: if period t is preferred to period s, then the utility level assigned to t is greater than or equal to that assigned to s. Since indexes need not mirror preferences, this property is not guaranteed to hold after aggregation.

Barnett and Choi's (2008) definition spans all superlative index numbers. We consider aggregates based on two superlative index numbers, the Fisher and the Walsh, and on the non-superlative Paasche and Laspeyres indexes, which are exact for a first-order approximation to an aggregator function. Because of its widespread use by central banks to construct monetary aggregates, we also consider the simple sum aggregate and the index number version of the simple sum, the Dutot index.

We investigate how close aggregates come to solving the revealed preference conditions for weak separability when it is known that a solution exists. We use the revealed preference solution values to check how well the indexes reflect the preference orderings in the data by comparing the direction of change between adjacent periods, computing preference orderings implied by transitivity, and then comparing the rankings revealed by the aggregates to the rankings from the weak separability conditions. We calculate how much the aggregates need to be perturbed in order to satisfy weak separability, and test whether the aggregates and solution values are equal in distribution.
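As background, a minimal sketch of the kind of revealed preference (GARP) consistency check these conditions build on, using a transitive closure over directly-revealed-preferred relations (prices and quantities are invented):

```python
# Minimal sketch of a GARP (revealed preference) consistency check:
# build the directly-revealed-preferred relation, take its transitive
# closure, and look for violations. Prices/quantities are hypothetical.
import numpy as np

p = np.array([[1.0, 2.0], [2.0, 1.0], [1.5, 1.5]])  # prices, T periods
q = np.array([[3.0, 1.0], [1.0, 3.0], [2.0, 2.0]])  # chosen bundles

T = len(p)
cost = p @ q.T        # cost[t, s] = cost of bundle s at period-t prices
own = np.diag(cost)   # cost of each period's own bundle

# R0[t, s]: bundle t directly revealed preferred to s (s was affordable).
R0 = cost <= own[:, None]

# Transitive closure via boolean Floyd-Warshall.
R = R0.copy()
for k in range(T):
    R |= np.outer(R[:, k], R[k, :])

# Violation: t revealed preferred to s, yet s strictly directly preferred to t.
P0 = cost < own[:, None]   # strict direct preference
violations = [(t, s) for t in range(T) for s in range(T)
              if R[t, s] and P0[s, t]]
print("GARP violations:", violations)
```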

Using panel data, we find that as the number of goods increases, the superiority of the superlative indexes manifests itself. This is consistent with Fisher's (1922) critique of the Dutot index and Barnett's (1980) critique of simple sum monetary aggregates.

Keywords: superlative index numbers, aggregation, preference orderings, weak separability

JEL Code: C43


PAPER 21: Sajid Chaudhry, University of Birmingham, Jane Binner, University of Birmingham, James Swofford, University of South Alabama, and Andrew Mullineux, University of Birmingham, 'SCOTLAND AS AN OPTIMUM CURRENCY AREA'

Abstract: The June UK referendum on continued EU membership, in which the people of Scotland voted to remain while the rest of the UK voted to leave, once again makes the issue of whether Scotland is an optimal currency area very topical. England voted strongly to leave, whilst Scotland backed remain by 62% to 38%. The Scottish government published its draft bill on a second independence referendum this October. The move does not mean another referendum will definitely be held, but it does raise the possibility that Scotland might choose independence, staying in the EU without the rest of the UK. If Scotland charts a course of independence from the rest of the UK, then it would likely either issue its own currency or join or form another currency area. In this paper, we test the microeconomic foundations of a common currency area for Scotland, the UK, and the rest of the UK without Scotland. We find that the UK without Scotland meets the microeconomic criteria for a common currency area, while the UK and Scotland alone have some small violations of these conditions. We also find differences between the UK-less-Scotland and Scotland economies in loan data. With respect to further research into the development of monetary aggregation theory, we recommend that this might be profitably directed towards exploring more sophisticated aggregation procedures, as suggested by Barnett (forthcoming), or towards incorporating risk-bearing assets into the money measures.

Keywords: Scottish independence, common currency areas, microeconomic foundations

PAPER 22: Victor J Valcarcel, University of Texas at Dallas, 'INTEREST RATE PASS-THROUGH, DIVISIA USER COSTS OF MONETARY ASSETS AND THE FEDERAL FUNDS RATE'

Abstract: Evidence of substantial pass-through in short-term rates and other rates of financial and monetary assets has typically been rejected by the data. In a first, this paper investigates level and volatility transmission among the Federal Funds rate and the user costs of various monetary assets, which include both instruments of public debt (eg t-bills) and private debt (eg commercial paper). Results suggest substantial and time-varying pass-through. Higher degrees of bi-directional pass-through occur between the Federal Funds rate and the user cost of more liquid assets, both in levels and in volatilities. Federal Funds rate spillovers propagate faster onto more liquid rates as well. These findings have important implications for monetary transmission, not only across the term structure but along markets of varying liquidity.

Keywords: user cost of money, federal funds rate, volatility spillovers, VAR, monetary transmission, interest rate channel

JEL Codes: E30, E31, E65

15255 © University of Birmingham 2017. Printed on a recycled grade paper containing 100% post-consumer waste.

Edgbaston, Birmingham, B15 2TT, United Kingdom

www.birmingham.ac.uk




9Liquidity and Economic Activity Conference

PAPER 6Dennis Fixler Bureau of Economic Analysis and Kim Zieschang International Monetary Fund lsquoPRODUCING LIQUIDITYrsquo

Abstract Based on a paper presented at the 2015 Meeting of the Society for Economic Measurement Paris Dennis Fixler is Chief Economist Bureau of Economic Analysis and Kim Zieschang is Adjunct Professor of Economics University of Queensland The views expressed in this paper are those of the authors and should not be attributed to the Bureau of Economic Analysis JEL codes E01 Measurement and Data on National Income and Product Accounts Commercial banks are a primary producer of liquidity in an economy Despite their importance there is no consensus on the measurement of the liquidity service and for the matter bank output in general This lack of consensus exits at the microeconomic level and at that level of the national accounts which attempts to capture the significant role of bank services in the output of the economy The current national accounts measure of bank output is termed financial intermediation services indirectly measured or lsquoFISIMrsquo springing from the 1993 version of the System of National accounts (the 1993 SNA) Calculating FISIM under the 2008 version of national accounting standards is simple and generally practical provided the compiler has a key datummdashthe lsquoreference rate of interestrsquo The calculation is essentially Output = (Reference rate of interest minus Deposit rate) times Deposit liabilities + (Loan rate minus Reference rate of interest) times Loan assets

As the deposit and loan financial instrument coverage of this formula implies the current national accounting standards apply it to deposit-takers such as banks as well as to

non-deposit-taking loan-making financial institutions such as finance companies and money lenders

A first issue in treatment of financial services since the 1993 version of the standards introduced the reference rate concept has been lack of consensus on how it should be determined Generally the idea has been to select an exogenous reference rate a government security rate or a combination of them is often used because it captures the risk-free reference rate that underlies the user cost of money as in Barnett (1978) Some have proposed alternative exogenous reference rates that are tied to the market determined risk of the security to which the reference rate is to be applied In any event we argue that the reference rate should be endogenously determined and should be the bankrsquos calculated cost of capital the overall rate of return paid to all sources of funding including equity on the liability side of the balance sheet Our lsquoreference rate of interestrsquo is therefore individual to each bank rather than an economy-wide constant A second issue in the national accounts dialogue on financial services is the scope of financial instruments that should be associated with the SNArsquos indirect financial services measure The 2008 SNA narrowed the scope of FISIM to the deposit and loan positions of financial corporations but previous versions included interest income flows on essentially all financial instruments Return to the broad 1993 financial instrument scope is nevertheless a research agenda item for the next version of the SNA and appears essential to align the SNA with the scope of liquidity measured by the money and banking literature and the associated standards for compiling financial statistics We argue that FISIM should cover all financial instruments

10 Liquidity and Economic Activity Conference

Armed with the cost of capital reference rate and full financial balance sheet instrument scope we derive the production identity (value of output equiv cost of production) from the income equiv expense and balanced sheet identities generating a FISIM like calculation of output with a single cost of capital reference rate for each enterprise rather than for the whole economy Note that by computing specific user cost prices of bank services it is in principle possible to aggregate them into a price index for liquidity services and correspondingly obtain a quantity index of liquidity services

Such aggregate measures would be useful in tracing the financial intermediation process into GDP or another aggregate measure of economic activity

On examining the SNA-type production identity for an individual bank we find the cost side contains a term within operating surplus ndash the equity leverage premium ndash that depends on the bankrsquos financing ndash the debt and equity composition of the liability side of its balance sheet Further it is inherent in the definition of the cost of capital reference rate that the equity leverage premium is identically equal to what we will term produced liquidity within the part of SNA financial services output of the bank coming from the debt instruments on the liability side of the its

balance sheet prominent among which being deposits Given that banks transform liabilities into assets the equity-leverage premium and the produced liquidity is tied to the risk bearing undertaken by the bank With an exogenous reference rate the risk bearing is completely embedded in the user cost price of the asset or liability product In our model because the entire balance sheet is used the risk bearing is tied to equity holders

This paper proposes a resolution to the scope and methodology issues in the ongoing national accounts conversation on financial services particularly on provision of liquidity by debt issuing enterprises and suggests that the equity leverage premium now included in the nominal output of banks be offset by an intermediate insurance input supplied by their equity holders This retains the current standardsrsquo origination of liquidity with banks (but also extends it to other debt issuing enterprises) while better exposing the link between the leverage risk bearing (provision of debt guarantees) of equity holding sectors and production of liquidity by banks (and other debt issuing enterprises) With the developed framework issues such as the measurement of output and productivity of the providers of liquidity services and other financial services can also be measured and incorporated into macroeconomic statistics

11Liquidity and Economic Activity Conference

PAPER 7Jan Willem Van den End De Nederlandsche Bank the Netherlands lsquoAPPLYING COMPLEXITY THEORY TO INTEREST RATES EVIDENCE OF CRITICAL TRANSITIONS IN THE EURO AREArsquo

Abstract We apply complexity theory to financial markets to show that excess liquidity created by the Eurosystem has led to critical transitions in the configuration of interest rates Complexity indicators turn out to be useful signals of tipping points and subsequent regime shifts in interest rates We find that the critical transitions are related to the increase of excess liquidity in the euro area These insights can help central banks to strike the right balance between the intention to support the financial system by injecting liquidity and potential unintended side-effects on market functioning

Keywords interest rates central banks and their policies monetary policy

JEL Codes E43 E58 E52

12 Liquidity and Economic Activity Conference

PAPER 8Michael Bowe Alliance Manchester Business School University of Manchester and University of Vaasa Olga Kolokolova Alliance Manchester Business School University of Manchester and Marcin Michalski Alliance Manchester Business School University of Manchester lsquoTOO BIG TO CARE TOO SMALL TO MATTER MACRO FINANCIAL POLICY AND BANK LIQUIDITY CREATIONrsquo

Abstract We estimate the volume of liquidity creation by US bank holding companies between 1997 and 2015 and examine the impact of changes in macrofinancial policies on the dynamics of this process We focus on three major policy developments occurring in the aftermath of the 2007ndash09 financial crisis bank capital regulation reform monetary stimulus through quantitative easing and the Troubled Asset Relief Program (TARP)

We use the three-step procedure proposed by Berger and Bouwman (2009) to calculate the dollar amount of liquidity a financial institution creates Initially we classify all balance sheet items and o_-balance sheet activities of an institution as liquid semi-liquid or illiquid to which we then assign liquidity weights of +1=2 (illiquid assets and liquid liabilities) 0 (semi-liquid assets and liabilities) or 10485761=2 (liquid assets illiquid liabilities and equity) respectively The dollar volume of liquidity creation is then calculated as liquidity-weighted sum of the items identified in the first step We find that the total amount of liquidity creation by banks in the sample increases by a factor of 365 from $14 trillion in 1997Q1 to $51 trillion in 2015Q4 Indeed the volume of liquidity creation increases at a faster pace than the gross domestic product of the United States which rises by a factor of 21 during the same period

The results of panel regressions reveal that the dynamics of bank liquidity creation differ considerably between small and large institutions The level of bank capital requirements and the stance of monetary policy impact the liquidity creation of both small and medium-sized banks Liquidity creation of the largest banks which control over 80 of the banking systemrsquos assets remains unaffected

We find that changes in the amount of liquidity creation by small banks per $1 of their gross total assets are positively related to changes in the term spread but inversely related to changes in their Tier 1 capital ratios Further we show that the volume of liquidity creation is positively related to the riskiness of a bankrsquos assets as measured by the ratio of risk-weighted assets to gross total assets regardless of its size classification We establish that TARP has negative short-term effects on small and medium banks and no immediate impact on the liquidity creation of the largest institutions in the sample In contrast participation in TARP leads to a long-term decline in liquidity provision per dollar of assets of the largest banks This persists even after the completion of the programme and repayment of TARP funding As nearly all of the largest TARP-recipient banks in the sample are subsequently classified as systemically important financial institutions our results suggest that the increased regulatory scrutiny may adversely affect their ability to create liquidity

By demonstrating that the stance of monetary policy and the level of bank capital requirements do not tangibly enhance the liquidity provision efficiency of the largest systemically important institutions in the system our study offers important insights for the design of effective macroprudential policies

13Liquidity and Economic Activity Conference

PAPER 9 Jonathan Goldberg Federal Reserve Board lsquoTHE SUPPLY OF LIQUIDITY AND REAL ECONOMIC ACTIVITYrsquo

Abstract This paper identifies shocks to the supply of liquidity by dealer firms and investigates their effects on real economic activity First I develop a simple theoretical model of dealer intermediation then in a structural VAR model I use sign restrictions derived from the theoretical model to identify liquidity supply shocks Liquidity supply shocks that are orthogonal to information contained in macroeconomic and asset price variables have considerable predictive power for economic activity Moreover positive liquidity supply shocks cause large and persistent increases in real activity

Keywords liquidity dealer intermediation risk-taking real activity liquidity shocks

JEL Codes G10 G12 G17 G24

14 Liquidity and Economic Activity Conference

PAPER 10 Dirk Bezemer University of Groningen the Netherlands and Lu Zhang Sustainable Finance Lab and Utrecht University lsquoMACROECONOMIC IMPLICATIONS OF LIQUIDITY CREATION CREDIT ALLOCATION AND POST CRISIS RECESSIONSrsquo

Abstract In this paper we address macroeconomic implications of liquidity creation through bank lending and the impacts of liquidity on economic activity We note that liquidity created through bank lending can be channeled into the real sector in support of economic activity or in financial and real estate markets in support of capital gains We collected macro-level data on bank credit aggregates over 2000ndash12 for 57 economies categorise according to the use of credit We note the long-term shift in the allocation of bank credit creation away from non-financial business lending and towards financial and especially real estate markets We then present new evidence on the channels from credit allocation pre-crisis to the severity of post-crisis recessions

Our first contribution is to show that it is not just the level but the composition of debt (defined as the share of mortgage credit in total credit) that matters A second contribution is to analyse the channels We collect additional industry-level data across 20 industries for a subset of economies We analyze the effect of changes in the pre-crisis composition of debt on total GDP and on investment consumption and capital allocation We find that changes in the share of household mortgage credit before the crisis have a significant effect on recession

severity after the 2007 crisis This is not the case for any other credit category nor for growth of total bank credit We address the causality challenge by using the difference between IMF growth forecasts and growth realisations This filters out country-specific drivers of both debt and income growth We address the model selection challenge by using Bayesian averaging models This indicates that the change in credit composition is among the three most robust determinants of post-crisis recession severity with income levels and current account balance The findings are robust to a wide range of control variables and to the different responses across advancedemerging and EMUnon-EMU economies

We then delve into the channels from change in debt composition to income growth loss. The literature to date has focused on negative wealth effects on consumption, for which we find strong evidence. In addition, we find evidence for two investment channels: a loan supply effect and a capital allocation effect. In the industry-level analysis, we find that in economies which experienced a larger change in debt composition before 2008, there was a larger reduction of credit available and weaker capital re-allocation towards sectors with higher value-added. This effect is observed already before the crisis, and very strongly after the crisis. We discuss policy implications and future research.
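As a rough illustration of Bayesian model averaging of the kind invoked above (a sketch under the common BIC approximation with equal prior model weights; not the authors' estimator):

    import itertools
    import numpy as np

    def bma_inclusion_probs(y, X, names):
        """Posterior inclusion probability of each regressor, averaging
        over all subsets of columns of X via the BIC approximation."""
        n, k = X.shape
        bics, models = [], []
        for r in range(1, k + 1):
            for subset in itertools.combinations(range(k), r):
                Z = np.column_stack([np.ones(n), X[:, list(subset)]])
                beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
                rss = float(np.sum((y - Z @ beta) ** 2))
                bics.append(n * np.log(rss / n) + Z.shape[1] * np.log(n))
                models.append(set(subset))
        w = np.exp(-0.5 * (np.array(bics) - min(bics)))
        w /= w.sum()
        return {names[j]: float(sum(w[m] for m, s in enumerate(models) if j in s))
                for j in range(k)}

A regressor whose inclusion probability stays high in this averaging, as reported above for the change in credit composition, is a robust determinant in this sense.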

Keywords: private credit, mortgages, crisis, output loss, investment, capital allocation

JEL Codes: C11, C15, E01, O4


PAPER 11: Iftekhar Hasan, Gabelli School of Business, Fordham University, and Jean-Loup Soula, Strasbourg University, LaRGE Research Centre, 'TECHNICAL EFFICIENCY IN BANK LIQUIDITY CREATION'

Abstract: This paper generates an optimum bank liquidity creation benchmark by tracing an efficient frontier in liquidity creation (bank intermediation), and asks why some banks are more efficient than others in such activities. Evidence reveals that medium-sized banks lie closest to the efficient frontier, irrespective of their business models. Small (large) banks – focused on traditional banking activities – are found to be the most (least) efficient in creating liquidity through on-balance sheet items, whereas large banks – involved in non-traditional activities – are found to be the most efficient in off-balance sheet liquidity creation. Additionally, the liquidity efficiency of small banks was more resilient during the 2007–08 financial crisis relative to other banks.

Keywords: banks, technical efficiency, liquidity creation, diversification

JEL Codes: G21, G28, G32


PAPER 12: Richard Anderson, Lindenwood University, John Duca, Federal Reserve Bank of Dallas, and Barry Jones, Department of Economics, State University of New York, 'A BROAD MONETARY SERVICES (LIQUIDITY) INDEX AND ITS LONG-TERM LINKS TO ECONOMIC ACTIVITY'

Abstract: Liquid assets play a crucial role in economic activity as the medium in which payments are received and made. 'Sudden stops' in financial markets – during which liquid assets are hoarded – are periods when economic activity slows abruptly. Further, it is the sine qua non of financial intermediation to alter the measured relative and absolute quantities of liquid assets. In this way, the observed quantities of liquid assets reflect both the path of past economic activity and anticipations of future activity. The quantities must be regarded as arising endogenously within an intertemporal general equilibrium model of the economy, à la Tobin (1958) and Merton (1971). Economic modeling and analysis traditionally proceeds by combining relatively high-dimension lists of specific assets into lower-dimension 'monetary aggregates'. The defining characteristic of the assets included is that all are available to facilitate the exchange of goods and services at a transaction cost less than infinity. That is, all included assets may be sold or used as collateral for the purchase and sale of goods and services, and thereby provide liquidity services, which can be tracked by measured opportunity costs of foregone interest – costs which may not, in practice, reflect all transaction costs. The last caveat particularly applies to household holdings of mutual fund assets outside of money market funds. Our study contributes to the literature in two key ways. First, it expands a conventional Divisia measure of money services to account for the liquidity provided by such mutual fund assets. Second, it then explores the long-run connections between economic activity and monetary aggregates constructed as index numbers from 1929 to 2016, finding that the inclusion of mutual fund liquidity services results in a Divisia measure of money that has a much more stable velocity.
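For reference, the Törnqvist–Theil approximation of the Divisia monetary services index that underlies such aggregates is, in standard notation:

\[ \ln M_t - \ln M_{t-1} = \sum_i \bar{s}_{it} \left( \ln m_{it} - \ln m_{i,t-1} \right), \qquad \bar{s}_{it} = \tfrac{1}{2}\left(s_{it} + s_{i,t-1}\right), \qquad s_{it} = \frac{u_{it} m_{it}}{\sum_j u_{jt} m_{jt}}, \]

where \( m_{it} \) is the balance of asset i, \( u_{it} = (R_t - r_{it})/(1 + R_t) \) is its user cost as in Barnett (1978), \( r_{it} \) its own rate and \( R_t \) the benchmark rate.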

We thank Emil Mihalov and Tyler Atkinson for research assistance. The views expressed are those of the authors and do not necessarily reflect those of the Federal Reserve Bank of Dallas or the Federal Reserve System. Any errors are our own.


PAPER 13: John W Keating, University of Kansas, and A Lee Smith, Federal Reserve Bank of Kansas City, 'THE OPTIMAL MONETARY INSTRUMENT AND THE (MIS)USE OF GRANGER CAUSALITY'

Abstract: Is it better to use an interest rate or a monetary aggregate to conduct monetary policy? Operating procedures designed around interest rates are overwhelmingly preferred by central bankers. This paper re-examines the question in light of a New Keynesian DSGE (dynamic stochastic general equilibrium) model. Our calibrated model is very similar to many others in the literature, except that we follow Belongia and Ireland (2014) by augmenting the standard framework with a simple model of financial intermediation. We assume banks operate in competitive markets, maximising profits subject to financial market disturbances.

We use this model to examine the welfare consequences of alternative choices for the monetary policy instrument. For the interest rate policy, we employ the specification from Clarida, Gali and Gertler (2000), in which the central bank reacts to expected future output and inflation gaps, a gap being defined as the percent deviation from target. We compare this rule to a k-percent rule for each monetary aggregate. We consider three alternative aggregates determined within our model: the monetary base, the simple sum measure of money and the Divisia measure.
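In generic notation (ours, not necessarily the paper's exact calibration), the competing rules are a forward-looking interest rate rule and constant-growth money rules:

\[ i_t = \rho\, i_{t-1} + (1 - \rho)\left[\bar{\imath} + \phi_\pi \left(E_t \pi_{t+1} - \pi^*\right) + \phi_x\, E_t x_{t+1}\right], \qquad \Delta \ln M_t = k, \]

with \( M_t \) set in turn to the monetary base, simple sum money or Divisia money.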

Welfare results are striking. While the interest rate dominates the monetary base, both simple sum and Divisia k-percent rules outperform the interest rate. In fact, the Divisia rule is best overall. This is an interesting finding because it contradicts the view of most central bankers that interest rates are superior. Note that if we replaced k-percent money growth rules with rules allowing an appropriate reaction to macroeconomic variables, as in the Clarida, Gali and Gertler (2000) interest rate rule, the welfare benefits of simple sum and Divisia money would be even greater.

Next, we study the performance of Granger Causality tests in the context of data generated from our model. For this, we assume the Clarida, Gali and Gertler (2000) interest rate rule characterises monetary policy. While it is not optimal from a welfare perspective, this rule serves a positive (descriptive) purpose: research suggests their framework provides a fairly good description of actual Fed behaviour.

For each of our four potential monetary instruments, we test for Granger Causality with respect to output and then with respect to prices. We find the interest rate Granger Causes both variables at extremely high significance levels. The same result is obtained for the monetary base. Simple sum money also Granger Causes prices at a highly significant level, but only causes output at the 10% level. The test results for Divisia are the weakest of all: Divisia fails to Granger Cause output, and the evidence for prices is only at the 10% level.

What do we learn from this investigation? First, a quantity aggregate used as the monetary instrument may have significant welfare benefits compared to an interest rate. Second, a weighted monetary aggregate (Divisia) performs best of all the candidate instruments we considered. Third, Granger Causality tests may be a poor method for selecting the best monetary policy instrument: in our model, Granger Causality tests suggest at best a weak effect of Divisia on inflation and no effect on output. Thus, if instrument choice is based solely on Granger Causality test results, an inferior policy instrument will be selected.
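A minimal illustration of the kind of Granger Causality test involved, run on simulated data in which money should by construction cause output (illustrative only; not the authors' code, model or data):

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(1)
    T = 400
    money, output = np.zeros(T), np.zeros(T)
    for t in range(1, T):
        money[t] = 0.8 * money[t - 1] + rng.standard_normal()
        # output loads on lagged money, so money Granger causes output here
        output[t] = 0.5 * output[t - 1] + 0.3 * money[t - 1] + rng.standard_normal()

    data = pd.DataFrame({'output': output, 'money': money})
    # tests the null that the second column does not Granger cause the first
    grangercausalitytests(data[['output', 'money']], maxlag=4)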

Keywords: monetary policy instrument, monetary aggregates, Granger Causality, Divisia aggregates

JEL Codes: C43, C32, E37, E44, E52


PAPER 14: Jane Binner, University of Birmingham, Logan Kelly, University of Wisconsin, and Jon Tepper, Nottingham Trent University, 'ON THE ROBUSTNESS OF SLUGGISH STATE-BASED NEURAL NETWORKS FOR PROVIDING USEFUL INSIGHT INTO THE NEW KEYNESIAN PHILLIPS CURVE'

Abstract: The New Keynesian Phillips curve implies that the output gap – the deviation of actual output from its natural level due to nominal rigidities – drives the dynamics of inflation relative to expected inflation and lagged inflation. This paper exploits the empirical success of the New Keynesian Phillips curve in explaining the USA's inflation dynamics using a new nonlinear model of the output gap. We estimate the output gap using the artificial intelligence method of multi-recurrent neural networks (MRNs), a type of neural network that is able to establish variable sensitivity to recent and more historic temporal information through the use of layer- and self-recurrent links, forming a so-called sluggish state-based memory. The MRN is able to robustly latch onto structural interactions amongst inflation, oil prices and real output in the USA to produce a set of interesting impulse responses. We present our empirical results using monthly data spanning 1960–2016 and contrast our new nonlinear models of the output gap with traditional measures in fitting the New Keynesian Phillips curve, to provide useful insights for inflation dynamics and monetary policy analysis in the USA.
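One plausible formalisation of such a sluggish state-based memory (a sketch in our notation; the authors' exact architecture may differ) augments the recurrent hidden state \( h_t \) with a slowly decaying context state \( m_t \):

\[ h_t = f\left(W_x x_t + W_h h_{t-1} + W_m m_{t-1}\right), \qquad m_t = \alpha\, m_{t-1} + (1 - \alpha)\, h_t, \quad 0 < \alpha < 1, \]

so that larger values of \( \alpha \) make the memory respond more sluggishly to recent inputs while retaining more historic information.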

PAPER 15: Rakesh Bissoondeeal, Aston University, Michael Karaglou, Aston University, and Jane Binner, University of Birmingham, 'STRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UK'

Abstract: We investigate whether or not monetary aggregates are important in determining output. In addition to the official Simple Sum measure of money, we employ the weighted Divisia aggregate. We use data-driven procedures to identify breaks in the data. We find that monetary aggregates, particularly Divisia, are important in determining output, but the results are sensitive to the time period employed. There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them. The results also suggest that the recovery from the financial crisis could have been faster if money was not being hoarded.
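For intuition, data-driven break detection of the kind described can be sketched as follows (using binary segmentation from the ruptures package as a stand-in; the authors' exact procedure is not specified here):

    import numpy as np
    import ruptures as rpt

    rng = np.random.default_rng(2)
    # toy output-growth series with a mean shift after observation 200
    y = np.concatenate([rng.normal(0.6, 1.0, 200), rng.normal(-0.2, 1.0, 100)])

    algo = rpt.Binseg(model='l2').fit(y)
    print(algo.predict(n_bkps=1))   # estimated break indices, e.g. [200, 300]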

Keywords: monetary aggregates, Divisia, IS curve, structural breaks

JEL Codes: C22, E52


PAPER 16: Costas Milas, University of Liverpool, and Michael Ellington, University of Liverpool, 'IDENTIFYING AGGREGATE LIQUIDITY SHOCKS IN CONJUNCTION WITH MONETARY POLICY SHOCKS: AN APPLICATION USING UK DATA'

Abstract: We propose a new identification scheme for aggregate liquidity shocks, in harmony with conventional monetary policy shocks, in a partially identified structural VAR model. We fit a Bayesian time-varying parameter VAR model to UK data from 1955 to 2016. The transmission mechanism of aggregate liquidity shocks changes substantially through time, with the magnitude of these shocks increasing during recessions. We provide statistically significant evidence in favour of asymmetric contributions of these shocks during the implementation of Quantitative Easing relative to the 2008 recession. Aggregate liquidity shocks explain 32% and 47%, respectively, of the variance in real GDP and inflation at business cycle frequency during the Great Recession. (Preliminary draft: please do not cite.)
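In generic state-space form, a time-varying parameter VAR of this kind lets the coefficients drift as random walks (standard notation, not the authors' exact specification):

\[ y_t = c_t + \sum_{j=1}^{p} B_{j,t}\, y_{t-j} + \varepsilon_t, \quad \varepsilon_t \sim N(0, \Omega_t), \qquad \theta_t = \theta_{t-1} + \nu_t, \]

where \( \theta_t \) stacks \( c_t \) and the \( B_{j,t} \), and stochastic volatility in \( \Omega_t \) allows the magnitude of shocks – including liquidity shocks – to change across recessions and expansions.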

Keywords: liquidity shocks, time-varying parameter VAR, money growth

JEL Codes: E32, E47, E52, E58

PAPER 17: Makram El Shagi, Henan University, China, and Logan Kelly, University of Wisconsin, 'STRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UK'

Abstract: We investigate whether or not monetary aggregates are important in determining output. In addition to the official Simple Sum measure of money, we employ the weighted Divisia aggregate. We use data-driven procedures to identify breaks in the data. We find that monetary aggregates, particularly Divisia, are important in determining output, but the results are sensitive to the time period employed. There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them. The results also suggest that the recovery from the financial crisis could have been faster if money was not being hoarded.

Keywords: monetary aggregates, Divisia, IS curve, structural breaks

JEL Codes: C22, E52


PAPER 18: Soumya Suvra Bhadury, National Council of Applied Economic Research (NCAER), New Delhi, India, and Taniya Ghosh, Indira Gandhi Institute of Development Research (IGIDR), Mumbai, India, 'HAS MONEY LOST ITS RELEVANCE? DETERMINING A SOLUTION TO THE EXCHANGE RATE DISCONNECT PUZZLE IN THE SMALL OPEN ECONOMIES'

Abstract: This study uses monthly data from 2000 to 2015 and utilises an open-economy structural vector autoregression model to identify the monetary policy shock responsible for exchange rate fluctuations in the economies of India, Poland and the UK. Following Kim and Roubini (J Monet Econ 45(3): 561–586, 2000), our model is shown to fit these small open economies well, estimating theoretically correct and significant responses of prices, output and the exchange rate to a monetary policy tightening. The importance of the monetary policy shock is determined by examining the variance decomposition of forecast errors, impulse response functions and out-of-sample forecasts. Following Barnett (J Econ 14 (September): 11–48, 1980), we adopt a superior monetary measure, namely the aggregation-theoretic Divisia monetary aggregate, in our model. The significance of adopting precisely measured money in the exchange rate model follows from the comparison between models with no money, simple-sum monetary aggregates and Divisia monetary measures. The bootstrap Granger causality test establishes a strong causal link from money, especially Divisia money, to the exchange rate. Additionally, the empirical results provide three important findings. The first suggests that the estimated responses of output, prices, money and the exchange rate to monetary policy shocks are significant in models that adopt monetary aggregates. The second indicates that Divisia money facilitates monetary policy in explaining more of the fluctuation of the exchange rate. The third supports the inclusion of Divisia money for better out-of-sample forecasting of the exchange rate.

Keywords: monetary policy, monetary aggregates, Divisia, structural VAR, exchange rate overshooting, liquidity puzzle, price puzzle, exchange rate disconnect puzzle, forward discount bias puzzle

JEL Codes: C32, E41, E51, E52, F31, F41, F47


PAPER 19: William A Barnett and Jinan Liu, University of Kansas, 'USER COST OF CREDIT CARD SERVICES UNDER INTERTEMPORAL NON-SEPARABILITY'

Abstract: This paper examines the user cost of monetary assets and credit card services under the assumption of intertemporal nonseparability and interest rate risk. Barnett and Su (2016) derived theory permitting the inclusion of credit card transaction services into Divisia monetary aggregates. The risk adjustment in their theory is based on CCAPM under intertemporal separability. The risk adjustment by their method is expected to be small, as has been the case in prior studies of CCAPM risk adjustment of asset returns. The equity premium puzzle focusses on that downward bias in the CCAPM risk adjustment. But credit card interest rates, aggregated over consumers, are much more volatile than interest rates on monetary assets. While the known downward bias of CCAPM risk adjustments is of little concern with Divisia monetary aggregates containing only low-risk monetary assets, that downward bias cannot be ignored once credit card services are included. We believe that extending to intertemporal nonseparability will provide a more plausible risk adjustment, as has been emphasised by Barnett and Wu (2015).

In this paper, we extend the credit-card-augmented Divisia monetary quantity aggregates and user costs to the case of risk aversion and intertemporal nonseparability in consumption. Our results are for the 'representative consumer', aggregated over all consumers. While credit card interest-rate risk may be low for many consumers, the volatility of credit card interest rates for the representative consumer is high, as reflected by the high volatility of the Federal Reserve's data on credit card interest rates aggregated over consumers. The relationship between that volatility and the theory of aggregation over consumers under risk is beyond the scope of this research, but is a serious matter meriting future research.

To implement our theory, we introduce a pricing kernel into our model, in accordance with the approach advocated by Barnett and Wu (2015). We assume that the pricing kernel is a linear function of the rate of return on a well-diversified wealth portfolio. We find that the risk adjustment of the credit-card-services user cost to its certainty-equivalence level can be measured by its beta, which depends upon the covariance between the interest rates on credit card services and on the wealth portfolio of the consumer, in a manner analogous to the standard CAPM adjustment.

Credit card services' risk premiums depend on their market portfolio risk exposure, which is measured by the beta of the credit card interest rates: the larger the beta, through risk exposure to the wealth portfolio, the larger the risk adjustment. If the beta were very small, then the user-cost risk adjustment would be very small; in that case, the unadjusted Divisia monetary index would be a good proxy, even with high volatility of credit card interest rates. One method of introducing intertemporal nonseparability is to assume habit formation, and we explore that possibility. We are currently conducting research on the empirical implementation of the theory proposed in this paper. We believe that, under intertemporal nonseparability, we will be able to generate a more accurate credit-card-augmented Divisia monetary index to explain the available empirical data.
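In CAPM-style notation (a sketch; not the authors' exact expressions), the beta described above is

\[ \beta_c = \frac{\operatorname{Cov}(r_c, r_w)}{\operatorname{Var}(r_w)}, \]

where \( r_c \) is the credit card interest rate and \( r_w \) the return on the wealth portfolio, so the adjustment of the user cost away from its certainty-equivalent level is proportional to \( \beta_c \) and vanishes as \( \beta_c \to 0 \).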

Keywords: Divisia index, monetary aggregation, intertemporal nonseparability, credit card services, risk adjustment

JEL Codes: C43, D81, E03, E40, E41, E44, E51, G12


PAPER 20: Per Hjertstrand, Research Institute of Industrial Economics (Institutet för Näringslivsforskning), Stockholm, Gerald A Whitney, University of New Orleans, and James L Swofford, University of South Alabama, 'PANEL DATA TESTS OF INDEX NUMBERS AND REVEALED PREFERENCE RANKINGS'

Abstract: For weakly separable blockings of goods from panel data, we construct aggregates using index numbers. We examine how well these aggregates 'fit' the data by investigating how close they come to solving revealed preference conditions.

Samuelson (1983) discussed the relationship of index number theory to revealed preference analysis, saying 'Index number theory is shown to be merely an aspect of the theory of revealed preference…'. Revealed preference conditions have been used to test whether data on prices and quantities can be rationalised by a well-behaved utility function that is weakly separable in some subset of the goods. Weak separability allows for a separation of a subset of goods into a sub-utility, or aggregator, function. This is a necessary condition for the existence of an economic aggregate, and the existence of an economic aggregate is a justification for the use of superlative index numbers. Superlative indexes are exact if they can provide a second order approximation to a particular aggregator function.

Varian's (1983) revealed preference conditions for weak separability do not rely on a particular functional form. The solution values to these conditions can be interpreted as levels of utility. These solution values, while not unique, are consistent with the preferences revealed by the data: if period t is preferred to period s, then the utility level assigned to t is greater than or equal to that assigned to s. Since indexes need not mirror preferences, this property is not guaranteed to hold after aggregation.

Barnett and Choi's (2008) definition spans all superlative index numbers. We consider aggregates based on two superlative index numbers, the Fisher and the Walsh, and on the non-superlative Paasche and Laspeyres indexes, which are exact for a first order approximation to an aggregator function. Because of its widespread use by central banks to construct monetary aggregates, we also consider the simple sum aggregate and the index number version of the simple sum, the Dutot index.
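For concreteness, with price and quantity vectors \( (p_0, q_0) \) and \( (p_1, q_1) \) in the two periods compared, the standard definitions of the indexes just named are:

\[ P_L = \frac{p_1 \cdot q_0}{p_0 \cdot q_0}, \qquad P_P = \frac{p_1 \cdot q_1}{p_0 \cdot q_1}, \qquad P_F = \sqrt{P_L P_P}, \qquad P_W = \frac{\sum_i p_{1i} \sqrt{q_{0i} q_{1i}}}{\sum_i p_{0i} \sqrt{q_{0i} q_{1i}}}, \qquad P_D = \frac{\sum_i p_{1i}}{\sum_i p_{0i}}. \]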

We investigate how close aggregates come to solving the revealed preference conditions for weak separability when it is known that a solution exists. We use the revealed preference solution values to check how well the indexes reflect the preference orderings in the data by comparing the direction of change between adjacent periods, computing preference orderings implied by transitivity, and then comparing the rankings revealed by the aggregates to the rankings from the weak separability conditions. We calculate how much the aggregates need to be perturbed in order to satisfy weak separability, and test whether the aggregates and solution values are equal in distribution.

Using panel data, we find that as the number of goods increases, the superiority of the superlative indexes manifests itself. This is consistent with the critique of the Dutot index by Fisher (1922) and with Barnett's (1980) critique of simple sum monetary aggregates.

Keywords: superlative index numbers, aggregation, preference orderings, weak separability

JEL Code: C43


PAPER 21: Sajid Chaudhry, University of Birmingham, Jane Binner, University of Birmingham, James Swofford, University of South Alabama, and Andrew Mullineux, University of Birmingham, 'SCOTLAND AS AN OPTIMUM CURRENCY AREA'

Abstract: The June UK referendum on continued EU membership, in which the people of Scotland voted to remain while the rest of the UK voted to leave, once again makes the issue of whether Scotland is an optimal currency area very topical. England voted strongly to leave, whilst Scotland backed remain by 62% to 38%. The Scottish government published its draft bill on a second independence referendum this October. The move does not mean another referendum will definitely be held, but it does raise the possibility that Scotland might choose independence and staying in the EU without the rest of the UK. If Scotland charts a course of independence from the rest of the UK, it would likely either issue its own currency or join or form another currency area. In this paper, we test the microeconomic foundations of a common currency area for Scotland, for the UK, and for the rest of the UK without Scotland. We find that the UK without Scotland meets the microeconomic criteria for a common currency area, while the UK and Scotland alone have some small violations of these conditions. We also find differences between the UK-less-Scotland and Scotland economies in loan data. With respect to further research into the development of monetary aggregation theory, we recommend that this might be profitably directed towards exploring more sophisticated aggregation procedures, as suggested by Barnett (forthcoming), or towards incorporating risk-bearing assets into the money measures.

Keywords: Scottish independence, common currency areas, microeconomic foundations

PAPER 22: Victor J Valcarcel, University of Texas at Dallas, 'INTEREST RATE PASS-THROUGH, DIVISIA USER COSTS OF MONETARY ASSETS AND THE FEDERAL FUNDS RATE'

Abstract: Evidence of substantial pass-through among short-term rates and other rates on financial and monetary assets has typically been rejected by the data. In a first, this paper investigates level and volatility transmission among the Federal Funds rate and the user costs of various monetary assets, which include instruments of both public debt (eg t-bills) and private debt (eg commercial paper). Results suggest substantial and time-varying pass-through. Higher degrees of bi-directional pass-through occur between the Federal Funds rate and the user costs of more liquid assets, both in levels and in volatilities. Federal Funds rate spillovers also propagate faster onto more liquid rates. These findings have important implications for monetary transmission, not only across the term structure but along markets of varying liquidity.

Keywords: user cost of money, federal funds rate, volatility spillovers, VAR, monetary transmission, interest rate channel

JEL Codes: E30, E31, E65

15255 © University of Birmingham 2017. Printed on a recycled grade paper containing 100% post-consumer waste.

Edgbaston, Birmingham, B15 2TT, United Kingdom

www.birmingham.ac.uk

Designed and printed by

'Triple-crown' accredited

Page 7: Liquidity and Economic Activity - University of Birmingham · 2017-05-22 · Financial Services Indices Liquidity and Economic Activity In honour of Oswald Distinguished Professor

7Liquidity and Economic Activity Conference

PAPER 3Manmohan Singh International Monetary Fund lsquoTHE ROLE OF PLEDGED COLLATERAL IN LIQUIDITY METRICS AND MONETARY POLICYrsquo

Abstract Collateral does not flow in a vacuum and needs balance sheet(s) to move within the financial system Pledged collateral needs to be considered along with money metrics to fully understand the liquidity in the markets This paper analyses securities-lending derivatives and prime-brokerage markets as suppliers of collateral (as much has been written on the repo market) Going forward the official sectorrsquos choice of balance sheet(s) that allows the flow of liquidity (ie money and collateral) should be transparent and driven by market forces and not by ad hoc allocation by central banks Else this may be suboptimal on many fronts for monetary policy transmission for smooth money market functioning and ultimately for market liquidity

Keywords collateral velocity securities lending prime brokerage OTC derivatives repo

JEL Codes G21 G28 F33 K22

PAPER 4David Aikman Bank of England Andreas Lehnert Federal Reserve Bank Nellie Liang Federal Reserve Bank Michele Modugno Federal Reserve Bank lsquoCREDIT RISK APPETITE AND MONETARY POLICY TRANSMISSIONrsquo

Abstract We show that US economic performance and monetary policy transmission depend on nonfinancial sector credit and the effects are nonlinear When credit is below its trend increases in risk appetite lead to sustained increases in output In contrast when credit is above trend initial expansions are followed by additional excess borrowing and subsequent contractions suggesting an inter-temporal trade-off for economic activity Also tighter monetary policy is ineffective at slowing the economy when credit is high consistent with evidence of less transmission of policy changes to distant forward Treasury rates in high-credit periods

Keywords financial stability financial conditions credit asset bubbles monetary policy

JEL Codes E58 E65 G28

8 Liquidity and Economic Activity Conference

PAPER 5Kevin Fox University of New South Wales and Erwin Diewert University of British ColumbialsquoTHE DEMAND FOR MONETARY BALANCES AND THE MEASUREMENT OF PRODUCTIVITYrsquo

Abstract Firms in advanced economies have greatly increased their cash holdings since the mid-1990s While this has been observed and the reasons debated by central bankers international agencies and academics it remains somewhat of a puzzle This paper explores possible reasons for this increase and the implications for understanding productivity growth Monetary holdings have an opportunity cost ie allocating firm financial capital into monetary deposits means that investment in real assets is reduced Traditional measures of Total Factor Productivity (TFP) do not take into account these holdings of monetary assets Given the recent large increases in these holdings ex ante it can be expected that adding these monetary assets to the list of traditional sources of capital services will reduce the TFP of the business sector Using a new data set on the US aggregate (corporate and non-corporate) business sector we measure this effect for the noting the implications for the System of National Accounts of this expanded definition of capital services Also industry elasticities of demand to hold monetary balances using the Normalized Quadratic functional form A key finding is that the accumulation of monetary holdings is primarily a phenomenon of the non-corporate business sector

We have found that while conceptually more correct adding real money balances to our input aggregate does not change aggregate measured productivity performance very much for the corporate sector This is because even though there is some variation the asset share is relatively small The impact on the non-corporate sector is larger especially in the latter decades of the sample when currency and deposit holdings increased substantially especially relative to other asset holdings Finally the relative productivity of individual firms can be significantly impacted by differences in money holdings even if there is little aggregate effect at the sectoral level Indeed understanding productivity differences between small and large firms can be enhanced by taking into account currency and deposits small firms are often credit constrained and therefore have greater cash holdings Similarly accounting for cash holdings can provide an augmented understanding of productivity and profitability in studies of firm dynamics In addition understanding productivity differences between risky and less risky sectors and firms can be informed by differences in money balances where eg dependence on RampD is taken as a proxy for risk Hence this paper provides a framework and empirical results for a more comprehensive understanding of productivity growth and dynamics

9Liquidity and Economic Activity Conference

PAPER 6Dennis Fixler Bureau of Economic Analysis and Kim Zieschang International Monetary Fund lsquoPRODUCING LIQUIDITYrsquo

Abstract Based on a paper presented at the 2015 Meeting of the Society for Economic Measurement Paris Dennis Fixler is Chief Economist Bureau of Economic Analysis and Kim Zieschang is Adjunct Professor of Economics University of Queensland The views expressed in this paper are those of the authors and should not be attributed to the Bureau of Economic Analysis JEL codes E01 Measurement and Data on National Income and Product Accounts Commercial banks are a primary producer of liquidity in an economy Despite their importance there is no consensus on the measurement of the liquidity service and for the matter bank output in general This lack of consensus exits at the microeconomic level and at that level of the national accounts which attempts to capture the significant role of bank services in the output of the economy The current national accounts measure of bank output is termed financial intermediation services indirectly measured or lsquoFISIMrsquo springing from the 1993 version of the System of National accounts (the 1993 SNA) Calculating FISIM under the 2008 version of national accounting standards is simple and generally practical provided the compiler has a key datummdashthe lsquoreference rate of interestrsquo The calculation is essentially Output = (Reference rate of interest minus Deposit rate) times Deposit liabilities + (Loan rate minus Reference rate of interest) times Loan assets

As the deposit and loan financial instrument coverage of this formula implies the current national accounting standards apply it to deposit-takers such as banks as well as to

non-deposit-taking loan-making financial institutions such as finance companies and money lenders

A first issue in treatment of financial services since the 1993 version of the standards introduced the reference rate concept has been lack of consensus on how it should be determined Generally the idea has been to select an exogenous reference rate a government security rate or a combination of them is often used because it captures the risk-free reference rate that underlies the user cost of money as in Barnett (1978) Some have proposed alternative exogenous reference rates that are tied to the market determined risk of the security to which the reference rate is to be applied In any event we argue that the reference rate should be endogenously determined and should be the bankrsquos calculated cost of capital the overall rate of return paid to all sources of funding including equity on the liability side of the balance sheet Our lsquoreference rate of interestrsquo is therefore individual to each bank rather than an economy-wide constant A second issue in the national accounts dialogue on financial services is the scope of financial instruments that should be associated with the SNArsquos indirect financial services measure The 2008 SNA narrowed the scope of FISIM to the deposit and loan positions of financial corporations but previous versions included interest income flows on essentially all financial instruments Return to the broad 1993 financial instrument scope is nevertheless a research agenda item for the next version of the SNA and appears essential to align the SNA with the scope of liquidity measured by the money and banking literature and the associated standards for compiling financial statistics We argue that FISIM should cover all financial instruments

10 Liquidity and Economic Activity Conference

Armed with the cost of capital reference rate and full financial balance sheet instrument scope we derive the production identity (value of output equiv cost of production) from the income equiv expense and balanced sheet identities generating a FISIM like calculation of output with a single cost of capital reference rate for each enterprise rather than for the whole economy Note that by computing specific user cost prices of bank services it is in principle possible to aggregate them into a price index for liquidity services and correspondingly obtain a quantity index of liquidity services

Such aggregate measures would be useful in tracing the financial intermediation process into GDP or another aggregate measure of economic activity

On examining the SNA-type production identity for an individual bank we find the cost side contains a term within operating surplus ndash the equity leverage premium ndash that depends on the bankrsquos financing ndash the debt and equity composition of the liability side of its balance sheet Further it is inherent in the definition of the cost of capital reference rate that the equity leverage premium is identically equal to what we will term produced liquidity within the part of SNA financial services output of the bank coming from the debt instruments on the liability side of the its

balance sheet prominent among which being deposits Given that banks transform liabilities into assets the equity-leverage premium and the produced liquidity is tied to the risk bearing undertaken by the bank With an exogenous reference rate the risk bearing is completely embedded in the user cost price of the asset or liability product In our model because the entire balance sheet is used the risk bearing is tied to equity holders

This paper proposes a resolution to the scope and methodology issues in the ongoing national accounts conversation on financial services particularly on provision of liquidity by debt issuing enterprises and suggests that the equity leverage premium now included in the nominal output of banks be offset by an intermediate insurance input supplied by their equity holders This retains the current standardsrsquo origination of liquidity with banks (but also extends it to other debt issuing enterprises) while better exposing the link between the leverage risk bearing (provision of debt guarantees) of equity holding sectors and production of liquidity by banks (and other debt issuing enterprises) With the developed framework issues such as the measurement of output and productivity of the providers of liquidity services and other financial services can also be measured and incorporated into macroeconomic statistics

11Liquidity and Economic Activity Conference

PAPER 7Jan Willem Van den End De Nederlandsche Bank the Netherlands lsquoAPPLYING COMPLEXITY THEORY TO INTEREST RATES EVIDENCE OF CRITICAL TRANSITIONS IN THE EURO AREArsquo

Abstract We apply complexity theory to financial markets to show that excess liquidity created by the Eurosystem has led to critical transitions in the configuration of interest rates Complexity indicators turn out to be useful signals of tipping points and subsequent regime shifts in interest rates We find that the critical transitions are related to the increase of excess liquidity in the euro area These insights can help central banks to strike the right balance between the intention to support the financial system by injecting liquidity and potential unintended side-effects on market functioning

Keywords interest rates central banks and their policies monetary policy

JEL Codes E43 E58 E52

12 Liquidity and Economic Activity Conference

PAPER 8Michael Bowe Alliance Manchester Business School University of Manchester and University of Vaasa Olga Kolokolova Alliance Manchester Business School University of Manchester and Marcin Michalski Alliance Manchester Business School University of Manchester lsquoTOO BIG TO CARE TOO SMALL TO MATTER MACRO FINANCIAL POLICY AND BANK LIQUIDITY CREATIONrsquo

Abstract We estimate the volume of liquidity creation by US bank holding companies between 1997 and 2015 and examine the impact of changes in macrofinancial policies on the dynamics of this process We focus on three major policy developments occurring in the aftermath of the 2007ndash09 financial crisis bank capital regulation reform monetary stimulus through quantitative easing and the Troubled Asset Relief Program (TARP)

We use the three-step procedure proposed by Berger and Bouwman (2009) to calculate the dollar amount of liquidity a financial institution creates Initially we classify all balance sheet items and o_-balance sheet activities of an institution as liquid semi-liquid or illiquid to which we then assign liquidity weights of +1=2 (illiquid assets and liquid liabilities) 0 (semi-liquid assets and liabilities) or 10485761=2 (liquid assets illiquid liabilities and equity) respectively The dollar volume of liquidity creation is then calculated as liquidity-weighted sum of the items identified in the first step We find that the total amount of liquidity creation by banks in the sample increases by a factor of 365 from $14 trillion in 1997Q1 to $51 trillion in 2015Q4 Indeed the volume of liquidity creation increases at a faster pace than the gross domestic product of the United States which rises by a factor of 21 during the same period

The results of panel regressions reveal that the dynamics of bank liquidity creation differ considerably between small and large institutions The level of bank capital requirements and the stance of monetary policy impact the liquidity creation of both small and medium-sized banks Liquidity creation of the largest banks which control over 80 of the banking systemrsquos assets remains unaffected

We find that changes in the amount of liquidity creation by small banks per $1 of their gross total assets are positively related to changes in the term spread but inversely related to changes in their Tier 1 capital ratios Further we show that the volume of liquidity creation is positively related to the riskiness of a bankrsquos assets as measured by the ratio of risk-weighted assets to gross total assets regardless of its size classification We establish that TARP has negative short-term effects on small and medium banks and no immediate impact on the liquidity creation of the largest institutions in the sample In contrast participation in TARP leads to a long-term decline in liquidity provision per dollar of assets of the largest banks This persists even after the completion of the programme and repayment of TARP funding As nearly all of the largest TARP-recipient banks in the sample are subsequently classified as systemically important financial institutions our results suggest that the increased regulatory scrutiny may adversely affect their ability to create liquidity

By demonstrating that the stance of monetary policy and the level of bank capital requirements do not tangibly enhance the liquidity provision efficiency of the largest systemically important institutions in the system our study offers important insights for the design of effective macroprudential policies

13Liquidity and Economic Activity Conference

PAPER 9 Jonathan Goldberg Federal Reserve Board lsquoTHE SUPPLY OF LIQUIDITY AND REAL ECONOMIC ACTIVITYrsquo

Abstract This paper identifies shocks to the supply of liquidity by dealer firms and investigates their effects on real economic activity First I develop a simple theoretical model of dealer intermediation then in a structural VAR model I use sign restrictions derived from the theoretical model to identify liquidity supply shocks Liquidity supply shocks that are orthogonal to information contained in macroeconomic and asset price variables have considerable predictive power for economic activity Moreover positive liquidity supply shocks cause large and persistent increases in real activity

Keywords liquidity dealer intermediation risk-taking real activity liquidity shocks

JEL Codes G10 G12 G17 G24

14 Liquidity and Economic Activity Conference

PAPER 10 Dirk Bezemer University of Groningen the Netherlands and Lu Zhang Sustainable Finance Lab and Utrecht University lsquoMACROECONOMIC IMPLICATIONS OF LIQUIDITY CREATION CREDIT ALLOCATION AND POST CRISIS RECESSIONSrsquo

Abstract In this paper we address macroeconomic implications of liquidity creation through bank lending and the impacts of liquidity on economic activity We note that liquidity created through bank lending can be channeled into the real sector in support of economic activity or in financial and real estate markets in support of capital gains We collected macro-level data on bank credit aggregates over 2000ndash12 for 57 economies categorise according to the use of credit We note the long-term shift in the allocation of bank credit creation away from non-financial business lending and towards financial and especially real estate markets We then present new evidence on the channels from credit allocation pre-crisis to the severity of post-crisis recessions

Our first contribution is to show that it is not just the level but the composition of debt (defined as the share of mortgage credit in total credit) that matters A second contribution is to analyse the channels We collect additional industry-level data across 20 industries for a subset of economies We analyze the effect of changes in the pre-crisis composition of debt on total GDP and on investment consumption and capital allocation We find that changes in the share of household mortgage credit before the crisis have a significant effect on recession

severity after the 2007 crisis This is not the case for any other credit category nor for growth of total bank credit We address the causality challenge by using the difference between IMF growth forecasts and growth realisations This filters out country-specific drivers of both debt and income growth We address the model selection challenge by using Bayesian averaging models This indicates that the change in credit composition is among the three most robust determinants of post-crisis recession severity with income levels and current account balance The findings are robust to a wide range of control variables and to the different responses across advancedemerging and EMUnon-EMU economies

We then delve into the channels from change in debt composition to income growth loss The literature to date has focused on negative wealth effects on consumption for which we find strong evidence In addition we find evidence for two investment channels a loan supply effect and a capital allocation effect In the industry-level analysis we find that in economies which experienced a larger change in debt composition before 2008 there was a larger reduction of credit available and weaker capital re-allocation towards sectors with higher value-added This effect is observed already before the crisis and very strongly after the crisis We discuss policy implications and future research

Keywords private credit mortgages crisis output loss investment capital allocation

JEL Codes C11 C15 E01 O4

15Liquidity and Economic Activity Conference

PAPER 11 Iftekhar Hasan Gabelli School of Business Fordham University and Jean-Loup Soula Strasbourg University LaRGE Research Centre lsquoTECHNICAL EFFICIENCY IN BANK LIQUIDITY CREATIONrsquo

Abstract This paper generates an optimum bank liquidity creation benchmark by tracing an efficient frontier in liquidity creation (bank intermediation) and questions why some banks are more efficient than others in such activities Evidence reveals that medium size banks are most correlated to efficient frontier irrespective of their business models Small (large) banks ndash focused on traditional banking activities ndash are found to be the most (least) efficient in creating liquidity in on-balance sheet items whereas large banks ndash involved in non-traditional activities ndash are found to be most efficient in off-balance sheet liquidity creation Additionally the liquidity efficiency of small banks is more resilient during the 2007ndash08 financial crisis relative to other banks

Keywords banks technical efficiency liquidity creation diversification

Jel Codes G21 G28 G32

16 Liquidity and Economic Activity Conference

PAPER 12 Richard Anderson Lindenwood University John Duca Federal Reserve Bank of Dallas and Barry Jones Department of Economics State University of New York lsquoA BROAD MONETARY SERVICES (LIQUIDITY) INDEX AND ITS LONG-TERM LINKS TO ECONOMIC ACTIVITYrsquo

Abstract Liquid assets play a crucial role in economic activity as the medium in which payments are received and are made lsquoSudden stopsrsquo in financial markets ndash during which liquid assets are hoarded ndash are periods when economic activity slows abruptly Further it is the sine qua non of financial intermediation to alter the measured relative and absolute quantities of liquid assets In this way the observed quantities of liquid assets reflect both the path of past economic activity and anticipations of future activity The quantities must be regarded as arising endogenously within an intertemporal general equilibrium model of the economy a la Tobin (1958) and Merton (1971) Economic modeling and analysis traditionally proceeds by combining relatively high-dimension lists of specific assets into lower-dimension lsquomonetary aggregatesrsquo The defining characteristic of the assets included is that all are available to facilitate

the exchange of goods and services at a transaction cost less than infinity That is all included assets may be sold or used as collateral for the purchase and sale of goods and services and thereby provide liquidity services which can be tracked by measured opportunity costs of foregone interest which may not in practice reflect all transactions costs The last caveat particularly applies to household holdings of mutual fund assets outside of money market funds Our study contributes to the literature in two key ways First it expands a conventional Divisia measure of money services to account for the liquidity provided by such mutual fund assets Second it then explores the long-run connections between economic activity and monetary aggregates constructed as index numbers from 1929 to 2016 finding that the inclusion of mutual fund liquidity services results in a Divisia measure of money that has a much more stable velocity

We thank Emil Mihalov and Tyler Atkinson for research assistance The views expressed are those of the authors and do not necessarily reflect those of the Federal Reserve Bank of Dallas or the Federal Reserve System Any errors are our own

17Liquidity and Economic Activity Conference

PAPER 13 John W Keating University of Kansas and A Lee Smith KC Federal Reserve BanklsquoTHE OPTIMAL MONETARY INSTRUMENT AND THE (MIS) USE OF GRANGER CAUSALITYrsquo

Abstract Is it better to use an interest rate or a monetary aggregate to conduct monetary policy Operating procedures designed around interest rates are overwhelmingly preferred by central bankers This paper re-examines the question in light of a New Keynesian DSGE (dynamic stochastic general equilibrium) model Our calibrated model is very similar to many others in the literature except we follow Belongia and Ireland (2014) by augmenting the standard framework with a simple model of financial intermediation We assume banks operate in competitive markets maximising profits subject to financial market disturbances

We use this model to examine the welfare consequences of alternative choices for monetary policy instrument For the interest rate policy we employ the specification from Clarida Gali and Gertler (2000) in which the central bank reacts to expected future output and inflation gaps A gap is defined as the percent deviation from target We compare this rule to a k-percent rule for each monetary aggregate We consider three alternative aggregates determined within our model the monetary base the simple sum measure of money and the Divisia measure

Welfare results are striking While the interest rate dominates the monetary base both simple sum and Divisia k-percent rules outperform the interest rate In fact the Divisia rule is overall best This is an interesting finding because it contradicts the view of most central bankers that interest rates are superior Note that if we replaced k-percent money growth rules with rules allowing an appropriate reaction to macroeconomic variables

as in the Clarida Gali and Gertler (2000) interest rate rule the welfare benefits of simple sum and Divisia money would be even greater

Next we study the performance of Granger Causality tests in the context of data generated from our model For this we assume the Clarida Gali and Gertler (2000) interest rate rule characterises monetary policy While it is not optimal from a welfare perspective this rule is used for normative purposes Research suggests their framework provides a fairly good description of actual Fed behaviour

For each of our four potential monetary instruments we test for Granger Causality with respect to output and then with respect to prices We find the interest rate Granger Causes both variables at extremely high significance levels The same result is obtained for monetary base Simple sum money also Granger Causes prices at a highly significant level but only causes output at the 10 level The test results for Divisia are weakest of all Divisia fails to Granger Cause output and the evidence for prices is only at the 10 level

What do we learn from this investigation First a quantity aggregate as monetary instrument may have significant welfare benefits compared to an interest rate Second a weighted monetary aggregate (Divisia) performs best of all the candidate instruments we considered Third Granger Causality tests may be a poor method for selecting the best monetary policy instrument In our model Granger Causality tests suggest at best a weak effect of Divisia on inflation and no effect output Thus if instrument choice is based solely on Granger Causality test results an inferior policy instrument will be selected

Keywords monetary policy instrument monetary aggregates Granger Causality Divisia aggregates Jel Codes C43 C32 E37 E44 E52

18 Liquidity and Economic Activity Conference

PAPER 14Jane Binner University of Birmingham Logan Kelly University of Wisconsin and Jon Tepper Nottingham Trent UniversitylsquoON THE ROBUSTNESS OF SLUGGISH STATE-BASED NEURAL NETWORKS FOR PROVIDING USEFUL INSIGHT INTO THE NEW KEYNESIAN PHILLIPS CURVErsquo

Abstract The New Keynesian Phillips curve implies that the output gap the deviation of the actual output from its natural level due to nominal rigidities drives the dynamics of inflation relative to expected inflation and lagged inflation This paper exploits the empirical success of the New Keynesian Phillips curve in explaining the USArsquos inflation dynamics using a new nonlinear model of the output gap We estimate the output gap using the artificial intelligence method of multi-recurrent neural networks (MRNs) a type of neural network that is able to establish variable sensitivity to recent and more historic temporal information through the use of layer- and self-recurrent links to form a so-called sluggish state-based memory The MRN is able to robustly latch onto structural interactions amongst inflation oil prices and real output in the USA to produce a set of interesting impulse responses We present our empirical results using monthly data spanning 1960ndash2016 and contrast our new nonlinear models of the output gap with that of traditional measures in fitting the New Keynesian Phillips curve to provide useful insights for inflation dynamics and monetary policy analysis in the USA

PAPER 15Rakesh Bissoondeeal Aston University Michael Karaglou Aston University Jane Binner University of BirminghamlsquoSTRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UKrsquo

Abstract We investigate whether or not monetary aggregates are important in determining output In addition to the official Simple Sum measure of money we employ the weighted Divisia aggregate We use data-driven procedures to identify breaks in the data We find that monetary aggregates particularly Divisia are important in determining output but the results are sensitive to the time period employed There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them The results also suggest that the recovery from the financial crisis could have been faster if money was not being hoarded

Keywords monetary aggregates Divisia IS curve structural breaks

JEL Codes C22 E52

19Liquidity and Economic Activity Conference

PAPER 16Costas Milas University of Liverpool and Michael Ellington University of LiverpoollsquoIDENTIFYING AGGREGATE LIQUIDITY SHOCKS IN CONJUNCTION WITH MONETARY POLICY SHOCKS AN APPLICATION USING UK DATArsquo

Abstract We propose a new identification scheme for aggregate liquidity shocks in harmony with conventional monetary policy shocks in a partially identified structural VAR model We fit a Bayesian time-varying parameter VAR model to UK data from 1955 to 2016 The transmission mechanism of aggregate liquidity shocks changes substantially throughout time with the magnitude of these shocks increasing during recessions We provide statistically significant evidence in favour of asymmetric contributions of these shocks during the implementation of Quantitative Easing relative to the 2008 recession Aggregate liquidity shocks explain 32 and 47 of the variance in real GDP and inflation at business cycle frequency during the Great Recession respectively Preliminary Draft Please Do Not Cite

Keywords liquidity shocks time-varying parameter VAR money growth

JEL Codes E32 E47 E52 E58

PAPER 17Makram El Shagi Henan University China and Logan Kelly University of WisconsinlsquoSTRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UKrsquo

Abstract We investigate whether or not monetary aggregates are important in determining output In addition to the official Simple Sum measure of money we employ the weighted Divisia aggregate We use data-driven procedures to identify breaks in the data We find that monetary aggregates particularly Divisia are important in determining output but the results are sensitive to the time period employed There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them The results also suggest that the recovery from the financial crisis could have been faster if money was not being hoarded

Keywords monetary aggregates Divisia IS curve structural breaks

JEL Codes C22 E52

20 Liquidity and Economic Activity Conference

PAPER 18Soumya Suvra Bhadury National Council of Applied Economic Research (NCAER) New Delhi India and Taniya Ghosh Indira Gandhi Institute of Development Research (IGIDR) Mumbai India lsquoHAS MONEY LOST ITS RELEVANCE DETERMINING A SOLUTION TO THE EXCHANGE RATE DISCONNECT PUZZLE IN THE SMALL OPEN ECONOMIESrsquo

Abstract: This study uses monthly data from 2000 to 2015 and utilises an open-economy structural vector autoregression model to identify the monetary policy shock responsible for exchange rate fluctuations in the economies of India, Poland and the UK. Following Kim and Roubini (J. Monet. Econ. 45(3): 561–586, 2000), our model is shown to fit these small open economies well, estimating theoretically correct and significant responses of price, output and the exchange rate to a monetary policy tightening. The importance of the monetary policy shock is determined by examining the variance decomposition of forecast errors, impulse response functions and out-of-sample forecasts. Following Barnett (J. Econ. 14 (September): 11–48, 1980), we adopt a superior monetary measure, namely the aggregation-theoretic Divisia monetary aggregate, in our model. The significance of adopting precisely measured money in the exchange rate model follows from the comparison between models with no money, simple-sum monetary aggregates and Divisia monetary measures. The bootstrap Granger causality test establishes a strong causal link from money, especially Divisia money, to the exchange rate. Additionally, the empirical results provide three important findings. First, the estimated responses of output, prices, money and the exchange rate to monetary policy shocks are significant in models that adopt monetary aggregates. Second, Divisia money helps monetary policy explain more of the fluctuation of the exchange rate. Third, the results support the inclusion of Divisia money for better out-of-sample forecasting of the exchange rate.

Keywords: monetary policy, monetary aggregates, Divisia, structural VAR, exchange rate overshooting, liquidity puzzle, price puzzle, exchange rate disconnect puzzle, forward discount bias puzzle

JEL Codes: C32, E41, E51, E52, F31, F41, F47
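
The Granger-causality step is straightforward to sketch. The fragment below uses the asymptotic tests from statsmodels on simulated stand-ins for money growth and exchange-rate returns, whereas the paper applies a bootstrap version of the test:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

# Simulated data in which money growth leads exchange-rate returns by one month
rng = np.random.default_rng(2)
money = rng.normal(size=180)
fx = 0.3 * np.roll(money, 1) + rng.normal(size=180)
df = pd.DataFrame({"fx": fx, "money": money}).iloc[1:]  # drop the wrapped first obs

# Tests whether the second column Granger-causes the first, for lags 1..3;
# the returned dict holds F- and chi-squared statistics for each lag.
results = grangercausalitytests(df[["fx", "money"]], maxlag=3)
```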


PAPER 19: William A Barnett and Jinan Liu, University of Kansas. 'USER COST OF CREDIT CARD SERVICES UNDER INTERTEMPORAL NON-SEPARABILITY'

Abstract: This paper examines the user cost of monetary assets and credit card services under the assumption of intertemporal nonseparability and interest rate risk. Barnett and Su (2016) derived theory permitting the inclusion of credit card transaction services into Divisia monetary aggregates. The risk adjustment in their theory is based on CCAPM under intertemporal separability. The risk adjustment produced by their method is expected to be small, as has been the case in prior studies of CCAPM risk adjustment of asset returns. The equity premium puzzle focusses on that downward bias in the CCAPM risk adjustment. But credit card interest rates, aggregated over consumers, are much more volatile than interest rates on monetary assets. While the known downward bias of CCAPM risk adjustments is of little concern with Divisia monetary aggregates containing only low-risk monetary assets, that downward bias cannot be ignored once credit card services are included. We believe that extending to intertemporal nonseparability will provide a more plausible risk adjustment, as has been emphasised by Barnett and Wu (2015).

In this paper we extend the credit-card-augmented Divisia monetary quantity aggregates and user costs to the case of risk aversion and intertemporal nonseparability in consumption. Our results are for the 'representative consumer', aggregated over all consumers. While credit card interest-rate risk may be low for many consumers, the volatility of credit card interest rates for the representative consumer is high, as reflected by the high volatility of the Federal Reserve's data on credit card interest rates aggregated over consumers. The relationship between that volatility and the theory of aggregation over consumers under risk is beyond the scope of this research, but is a serious matter meriting future research.

To implement our theory, we introduce a pricing kernel into our model, in accordance with the approach advocated by Barnett and Wu (2015). We assume that the pricing kernel is a linear function of the rate of return on a well-diversified wealth portfolio. We find that the risk adjustment of the credit-card-services user cost to its certainty-equivalence level can be measured by its beta. That beta depends upon the covariance between the interest rates on credit card services and the return on the wealth portfolio of the consumer, in a manner analogous to the standard CAPM adjustment.

Credit card services' risk premiums depend on their market portfolio risk exposure, which is measured by the beta of the credit card interest rates. The larger the beta, through risk exposure to the wealth portfolio, the larger the risk adjustment. If the beta were very small, then the user-cost risk adjustment would be very small. In that case, the unadjusted Divisia monetary index would be a good proxy, even with high volatility of credit card interest rates. One method of introducing intertemporal nonseparability is to assume habit formation. We explore that possibility. We are currently conducting research on empirical implementation of the theory proposed in this paper. We believe that under intertemporal nonseparability we will be able to generate a more accurate credit-card-augmented Divisia monetary index to explain the available empirical data.

Keywords: Divisia index, monetary aggregation, intertemporal nonseparability, credit card services, risk adjustment

JEL Codes: C43, D81, E03, E40, E41, E44, E51, G12
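
Schematically, the beta adjustment described above parallels the textbook CAPM formula. In the illustrative notation below (an assumption for exposition, not the authors' exact derivation), e is the credit card interest rate, r^m the return on the well-diversified wealth portfolio, and R the benchmark rate:

```latex
\beta = \frac{\operatorname{Cov}(e,\, r^{m})}{\operatorname{Var}(r^{m})},
\qquad
\pi^{\mathrm{adj}} \;\approx\;
\underbrace{\frac{\mathbb{E}[e] - R}{1 + R}}_{\text{certainty-equivalence user cost}}
\;+\; \phi\,\beta ,
```

where the premium term φ rises with the expected excess return on the wealth portfolio. A near-zero beta would leave the unadjusted user cost, and hence the unadjusted Divisia index, essentially intact, as the abstract notes.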


PAPER 20: Per Hjertstrand, Research Institute of Industrial Economics (Institutet för Näringslivsforskning), Stockholm; Gerald A Whitney, University of New Orleans; and James L Swofford, University of South Alabama. 'PANEL DATA TESTS OF INDEX NUMBERS AND REVEALED PREFERENCE RANKINGS'

Abstract: For weakly separable blockings of goods from panel data, we construct aggregates using index numbers. We examine how well these aggregates "fit" the data by investigating how close they come to solving revealed preference conditions.

Samuelson (1983) discussed the relationship of index number theory to revealed preference analysis, saying 'Index number theory is shown to be merely an aspect of the theory of revealed preference…'. Revealed preference has been used to test whether data on prices and quantities can be rationalised by a well-behaved utility function that is weakly separable in some subset of two goods. Weak separability allows for a separation of a subset of goods into a sub-utility or aggregator function. This is a necessary condition for the existence of an economic aggregate. The existence of an economic aggregate is a justification for the use of superlative index numbers. Superlative indexes are exact if they can provide a second order approximation to a particular aggregator function.

Varian's (1983) revealed preference conditions for weak separability do not rely on a particular functional form. The solution values to these conditions can be interpreted as levels of utility. These solution values, while not unique, are consistent with the preferences revealed by the data. If period t is preferred to period s, then the utility level assigned to t is greater than or equal to that assigned to s. Since indexes need not mirror preferences, this property is not guaranteed to hold after aggregation.

Barnett and Choi's (2008) definition spans all superlative index numbers. We consider aggregates based on two superlative index numbers, the Fisher and the Walsh, and on the non-superlative Paasche and Laspeyres indexes, which are exact for a first order approximation to an aggregator function. Because of its widespread use by central banks to construct monetary aggregates, we also consider the simple sum aggregate and the index number version of the simple sum, the Dutot index.

We investigate how close aggregates come to solving the revealed preference conditions for weak separability when it is known that a solution exists. We use the revealed preference solution values to check how well the indexes reflect the preference orderings in the data by comparing the direction of change between adjacent periods, computing preference orderings implied by transitivity, and then comparing the rankings revealed by the aggregates to the rankings from the weak separability conditions. We calculate how much the aggregates need to be perturbed in order to satisfy weak separability, and test whether the aggregates and solution values are equal in distribution.

Using panel data, we find that, as the number of goods increases, the superiority of the superlative indexes manifests itself. This is consistent with the critique of the Dutot index by Fisher (1922) and Barnett's (1980) critique of simple sum monetary aggregates.

Keywords: superlative index numbers, aggregation, preference orderings, weak separability

JEL Code: C43
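
Each index under comparison is a one-line computation. A minimal sketch over two periods of hypothetical price and quantity data for a blocking of three goods:

```python
import numpy as np

# Hypothetical prices and quantities in periods 0 and 1
p0, p1 = np.array([1.0, 2.0, 5.0]), np.array([1.1, 2.4, 5.5])
q0, q1 = np.array([10.0, 4.0, 2.0]), np.array([9.0, 5.0, 2.0])

laspeyres = (p1 @ q0) / (p0 @ q0)          # base-period quantity weights
paasche = (p1 @ q1) / (p0 @ q1)            # current-period quantity weights
fisher = np.sqrt(laspeyres * paasche)      # superlative
walsh = (p1 @ np.sqrt(q0 * q1)) / (p0 @ np.sqrt(q0 * q1))  # superlative
dutot = p1.sum() / p0.sum()                # index-number form of the simple sum

print(laspeyres, paasche, fisher, walsh, dutot)
```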


PAPER 21: Sajid Chaudhry, University of Birmingham; Jane Binner, University of Birmingham; James Swofford, University of South Alabama; and Andrew Mullineux, University of Birmingham. 'SCOTLAND AS AN OPTIMUM CURRENCY AREA'

Abstract: The June 2016 UK referendum on continued EU membership, in which the people of Scotland voted to remain while the rest of the UK voted to leave, once again makes the issue of whether Scotland is an optimal currency area very topical. England voted strongly to leave, whilst Scotland backed remain by 62% to 38%. The Scottish government published its draft bill on a second independence referendum in October 2016. The move does not mean another referendum will definitely be held, but it does raise the possibility that Scotland might choose independence and stay in the EU without the rest of the UK. If Scotland charts a course of independence from the rest of the UK, it would likely either issue its own currency or join or form another currency area. In this paper we test the microeconomic foundations of a common currency area for Scotland, for the UK, and for the rest of the UK without Scotland. We find that the UK without Scotland meets the microeconomic criteria for a common currency area, while the UK and Scotland alone have some small violations of these conditions. We also find differences between the UK-less-Scotland and Scotland economies in loan data. With respect to further research into the development of monetary aggregation theory, we recommend that this might be profitably directed towards exploring more sophisticated aggregation procedures, as suggested by Barnett (forthcoming), or towards incorporating risk-bearing assets into the money measures.

Keywords: Scottish independence, common currency areas, microeconomic foundations

PAPER 22: Victor J Varcarcel, University of Texas at Dallas. 'INTEREST RATE PASS-THROUGH, DIVISIA USER COSTS OF MONETARY ASSETS AND THE FEDERAL FUNDS RATE'

Abstract: Evidence of substantial pass-through in short-term rates and other rates of financial and monetary assets has typically been rejected by the data. In a first, this paper investigates level and volatility transmission among the Federal Funds rate and the user costs of various monetary assets, which include both instruments of public debt (eg, t-bills) and private debt (eg, commercial paper). Results suggest substantial and time-varying pass-through. Higher degrees of bi-directional pass-through occur between the Federal Funds rate and the user costs of more liquid assets, both in levels and in volatilities. Federal Funds rate spillovers propagate faster onto more liquid rates as well. These findings have important implications for monetary transmission, not only across the term structure but along markets of varying liquidity.

Keywords: user cost of money, federal funds rate, volatility spillovers, VAR, monetary transmission, interest rate channel

JEL Codes: E30, E31, E65
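
One common way to quantify such spillovers is a Diebold-Yilmaz-style connectedness measure built from a VAR forecast-error variance decomposition; the paper's method may differ, and the series below are simulated stand-ins for the Federal Funds rate and two user-cost series:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(3)
df = pd.DataFrame(rng.normal(size=(300, 3)),
                  columns=["fed_funds", "uc_liquid", "uc_illiquid"])

res = VAR(df).fit(2)
shares = res.fevd(12).decomp[:, -1, :]  # variance shares at the 12-step horizon

# Total spillover: average share of each variable's forecast-error variance
# that is due to shocks originating in the *other* variables, in percent.
spillover = 100 * (shares.sum() - np.trace(shares)) / shares.shape[0]
print(round(float(spillover), 1))
```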

15255 © University of Birmingham 2017. Printed on a recycled grade paper containing 100% post-consumer waste.

Edgbaston, Birmingham, B15 2TT, United Kingdom

www.birmingham.ac.uk

Designed and printed by

'Triple-crown' accredited

Page 8: Liquidity and Economic Activity - University of Birmingham · 2017-05-22 · Financial Services Indices Liquidity and Economic Activity In honour of Oswald Distinguished Professor

8 Liquidity and Economic Activity Conference

PAPER 5Kevin Fox University of New South Wales and Erwin Diewert University of British ColumbialsquoTHE DEMAND FOR MONETARY BALANCES AND THE MEASUREMENT OF PRODUCTIVITYrsquo

Abstract Firms in advanced economies have greatly increased their cash holdings since the mid-1990s While this has been observed and the reasons debated by central bankers international agencies and academics it remains somewhat of a puzzle This paper explores possible reasons for this increase and the implications for understanding productivity growth Monetary holdings have an opportunity cost ie allocating firm financial capital into monetary deposits means that investment in real assets is reduced Traditional measures of Total Factor Productivity (TFP) do not take into account these holdings of monetary assets Given the recent large increases in these holdings ex ante it can be expected that adding these monetary assets to the list of traditional sources of capital services will reduce the TFP of the business sector Using a new data set on the US aggregate (corporate and non-corporate) business sector we measure this effect for the noting the implications for the System of National Accounts of this expanded definition of capital services Also industry elasticities of demand to hold monetary balances using the Normalized Quadratic functional form A key finding is that the accumulation of monetary holdings is primarily a phenomenon of the non-corporate business sector

We have found that while conceptually more correct adding real money balances to our input aggregate does not change aggregate measured productivity performance very much for the corporate sector This is because even though there is some variation the asset share is relatively small The impact on the non-corporate sector is larger especially in the latter decades of the sample when currency and deposit holdings increased substantially especially relative to other asset holdings Finally the relative productivity of individual firms can be significantly impacted by differences in money holdings even if there is little aggregate effect at the sectoral level Indeed understanding productivity differences between small and large firms can be enhanced by taking into account currency and deposits small firms are often credit constrained and therefore have greater cash holdings Similarly accounting for cash holdings can provide an augmented understanding of productivity and profitability in studies of firm dynamics In addition understanding productivity differences between risky and less risky sectors and firms can be informed by differences in money balances where eg dependence on RampD is taken as a proxy for risk Hence this paper provides a framework and empirical results for a more comprehensive understanding of productivity growth and dynamics

9Liquidity and Economic Activity Conference

PAPER 6Dennis Fixler Bureau of Economic Analysis and Kim Zieschang International Monetary Fund lsquoPRODUCING LIQUIDITYrsquo

Abstract Based on a paper presented at the 2015 Meeting of the Society for Economic Measurement Paris Dennis Fixler is Chief Economist Bureau of Economic Analysis and Kim Zieschang is Adjunct Professor of Economics University of Queensland The views expressed in this paper are those of the authors and should not be attributed to the Bureau of Economic Analysis JEL codes E01 Measurement and Data on National Income and Product Accounts Commercial banks are a primary producer of liquidity in an economy Despite their importance there is no consensus on the measurement of the liquidity service and for the matter bank output in general This lack of consensus exits at the microeconomic level and at that level of the national accounts which attempts to capture the significant role of bank services in the output of the economy The current national accounts measure of bank output is termed financial intermediation services indirectly measured or lsquoFISIMrsquo springing from the 1993 version of the System of National accounts (the 1993 SNA) Calculating FISIM under the 2008 version of national accounting standards is simple and generally practical provided the compiler has a key datummdashthe lsquoreference rate of interestrsquo The calculation is essentially Output = (Reference rate of interest minus Deposit rate) times Deposit liabilities + (Loan rate minus Reference rate of interest) times Loan assets

As the deposit and loan financial instrument coverage of this formula implies the current national accounting standards apply it to deposit-takers such as banks as well as to

non-deposit-taking loan-making financial institutions such as finance companies and money lenders

A first issue in treatment of financial services since the 1993 version of the standards introduced the reference rate concept has been lack of consensus on how it should be determined Generally the idea has been to select an exogenous reference rate a government security rate or a combination of them is often used because it captures the risk-free reference rate that underlies the user cost of money as in Barnett (1978) Some have proposed alternative exogenous reference rates that are tied to the market determined risk of the security to which the reference rate is to be applied In any event we argue that the reference rate should be endogenously determined and should be the bankrsquos calculated cost of capital the overall rate of return paid to all sources of funding including equity on the liability side of the balance sheet Our lsquoreference rate of interestrsquo is therefore individual to each bank rather than an economy-wide constant A second issue in the national accounts dialogue on financial services is the scope of financial instruments that should be associated with the SNArsquos indirect financial services measure The 2008 SNA narrowed the scope of FISIM to the deposit and loan positions of financial corporations but previous versions included interest income flows on essentially all financial instruments Return to the broad 1993 financial instrument scope is nevertheless a research agenda item for the next version of the SNA and appears essential to align the SNA with the scope of liquidity measured by the money and banking literature and the associated standards for compiling financial statistics We argue that FISIM should cover all financial instruments

10 Liquidity and Economic Activity Conference

Armed with the cost of capital reference rate and full financial balance sheet instrument scope we derive the production identity (value of output equiv cost of production) from the income equiv expense and balanced sheet identities generating a FISIM like calculation of output with a single cost of capital reference rate for each enterprise rather than for the whole economy Note that by computing specific user cost prices of bank services it is in principle possible to aggregate them into a price index for liquidity services and correspondingly obtain a quantity index of liquidity services

Such aggregate measures would be useful in tracing the financial intermediation process into GDP or another aggregate measure of economic activity

On examining the SNA-type production identity for an individual bank we find the cost side contains a term within operating surplus ndash the equity leverage premium ndash that depends on the bankrsquos financing ndash the debt and equity composition of the liability side of its balance sheet Further it is inherent in the definition of the cost of capital reference rate that the equity leverage premium is identically equal to what we will term produced liquidity within the part of SNA financial services output of the bank coming from the debt instruments on the liability side of the its

balance sheet prominent among which being deposits Given that banks transform liabilities into assets the equity-leverage premium and the produced liquidity is tied to the risk bearing undertaken by the bank With an exogenous reference rate the risk bearing is completely embedded in the user cost price of the asset or liability product In our model because the entire balance sheet is used the risk bearing is tied to equity holders

This paper proposes a resolution to the scope and methodology issues in the ongoing national accounts conversation on financial services particularly on provision of liquidity by debt issuing enterprises and suggests that the equity leverage premium now included in the nominal output of banks be offset by an intermediate insurance input supplied by their equity holders This retains the current standardsrsquo origination of liquidity with banks (but also extends it to other debt issuing enterprises) while better exposing the link between the leverage risk bearing (provision of debt guarantees) of equity holding sectors and production of liquidity by banks (and other debt issuing enterprises) With the developed framework issues such as the measurement of output and productivity of the providers of liquidity services and other financial services can also be measured and incorporated into macroeconomic statistics

11Liquidity and Economic Activity Conference

PAPER 7Jan Willem Van den End De Nederlandsche Bank the Netherlands lsquoAPPLYING COMPLEXITY THEORY TO INTEREST RATES EVIDENCE OF CRITICAL TRANSITIONS IN THE EURO AREArsquo

Abstract We apply complexity theory to financial markets to show that excess liquidity created by the Eurosystem has led to critical transitions in the configuration of interest rates Complexity indicators turn out to be useful signals of tipping points and subsequent regime shifts in interest rates We find that the critical transitions are related to the increase of excess liquidity in the euro area These insights can help central banks to strike the right balance between the intention to support the financial system by injecting liquidity and potential unintended side-effects on market functioning

Keywords interest rates central banks and their policies monetary policy

JEL Codes E43 E58 E52

12 Liquidity and Economic Activity Conference

PAPER 8Michael Bowe Alliance Manchester Business School University of Manchester and University of Vaasa Olga Kolokolova Alliance Manchester Business School University of Manchester and Marcin Michalski Alliance Manchester Business School University of Manchester lsquoTOO BIG TO CARE TOO SMALL TO MATTER MACRO FINANCIAL POLICY AND BANK LIQUIDITY CREATIONrsquo

Abstract We estimate the volume of liquidity creation by US bank holding companies between 1997 and 2015 and examine the impact of changes in macrofinancial policies on the dynamics of this process We focus on three major policy developments occurring in the aftermath of the 2007ndash09 financial crisis bank capital regulation reform monetary stimulus through quantitative easing and the Troubled Asset Relief Program (TARP)

We use the three-step procedure proposed by Berger and Bouwman (2009) to calculate the dollar amount of liquidity a financial institution creates Initially we classify all balance sheet items and o_-balance sheet activities of an institution as liquid semi-liquid or illiquid to which we then assign liquidity weights of +1=2 (illiquid assets and liquid liabilities) 0 (semi-liquid assets and liabilities) or 10485761=2 (liquid assets illiquid liabilities and equity) respectively The dollar volume of liquidity creation is then calculated as liquidity-weighted sum of the items identified in the first step We find that the total amount of liquidity creation by banks in the sample increases by a factor of 365 from $14 trillion in 1997Q1 to $51 trillion in 2015Q4 Indeed the volume of liquidity creation increases at a faster pace than the gross domestic product of the United States which rises by a factor of 21 during the same period

The results of panel regressions reveal that the dynamics of bank liquidity creation differ considerably between small and large institutions The level of bank capital requirements and the stance of monetary policy impact the liquidity creation of both small and medium-sized banks Liquidity creation of the largest banks which control over 80 of the banking systemrsquos assets remains unaffected

We find that changes in the amount of liquidity creation by small banks per $1 of their gross total assets are positively related to changes in the term spread but inversely related to changes in their Tier 1 capital ratios Further we show that the volume of liquidity creation is positively related to the riskiness of a bankrsquos assets as measured by the ratio of risk-weighted assets to gross total assets regardless of its size classification We establish that TARP has negative short-term effects on small and medium banks and no immediate impact on the liquidity creation of the largest institutions in the sample In contrast participation in TARP leads to a long-term decline in liquidity provision per dollar of assets of the largest banks This persists even after the completion of the programme and repayment of TARP funding As nearly all of the largest TARP-recipient banks in the sample are subsequently classified as systemically important financial institutions our results suggest that the increased regulatory scrutiny may adversely affect their ability to create liquidity

By demonstrating that the stance of monetary policy and the level of bank capital requirements do not tangibly enhance the liquidity provision efficiency of the largest systemically important institutions in the system our study offers important insights for the design of effective macroprudential policies

13Liquidity and Economic Activity Conference

PAPER 9 Jonathan Goldberg Federal Reserve Board lsquoTHE SUPPLY OF LIQUIDITY AND REAL ECONOMIC ACTIVITYrsquo

Abstract This paper identifies shocks to the supply of liquidity by dealer firms and investigates their effects on real economic activity First I develop a simple theoretical model of dealer intermediation then in a structural VAR model I use sign restrictions derived from the theoretical model to identify liquidity supply shocks Liquidity supply shocks that are orthogonal to information contained in macroeconomic and asset price variables have considerable predictive power for economic activity Moreover positive liquidity supply shocks cause large and persistent increases in real activity

Keywords liquidity dealer intermediation risk-taking real activity liquidity shocks

JEL Codes G10 G12 G17 G24

14 Liquidity and Economic Activity Conference

PAPER 10 Dirk Bezemer University of Groningen the Netherlands and Lu Zhang Sustainable Finance Lab and Utrecht University lsquoMACROECONOMIC IMPLICATIONS OF LIQUIDITY CREATION CREDIT ALLOCATION AND POST CRISIS RECESSIONSrsquo

Abstract In this paper we address macroeconomic implications of liquidity creation through bank lending and the impacts of liquidity on economic activity We note that liquidity created through bank lending can be channeled into the real sector in support of economic activity or in financial and real estate markets in support of capital gains We collected macro-level data on bank credit aggregates over 2000ndash12 for 57 economies categorise according to the use of credit We note the long-term shift in the allocation of bank credit creation away from non-financial business lending and towards financial and especially real estate markets We then present new evidence on the channels from credit allocation pre-crisis to the severity of post-crisis recessions

Our first contribution is to show that it is not just the level but the composition of debt (defined as the share of mortgage credit in total credit) that matters A second contribution is to analyse the channels We collect additional industry-level data across 20 industries for a subset of economies We analyze the effect of changes in the pre-crisis composition of debt on total GDP and on investment consumption and capital allocation We find that changes in the share of household mortgage credit before the crisis have a significant effect on recession

severity after the 2007 crisis This is not the case for any other credit category nor for growth of total bank credit We address the causality challenge by using the difference between IMF growth forecasts and growth realisations This filters out country-specific drivers of both debt and income growth We address the model selection challenge by using Bayesian averaging models This indicates that the change in credit composition is among the three most robust determinants of post-crisis recession severity with income levels and current account balance The findings are robust to a wide range of control variables and to the different responses across advancedemerging and EMUnon-EMU economies

We then delve into the channels from change in debt composition to income growth loss The literature to date has focused on negative wealth effects on consumption for which we find strong evidence In addition we find evidence for two investment channels a loan supply effect and a capital allocation effect In the industry-level analysis we find that in economies which experienced a larger change in debt composition before 2008 there was a larger reduction of credit available and weaker capital re-allocation towards sectors with higher value-added This effect is observed already before the crisis and very strongly after the crisis We discuss policy implications and future research

Keywords private credit mortgages crisis output loss investment capital allocation

JEL Codes C11 C15 E01 O4

15Liquidity and Economic Activity Conference

PAPER 11 Iftekhar Hasan Gabelli School of Business Fordham University and Jean-Loup Soula Strasbourg University LaRGE Research Centre lsquoTECHNICAL EFFICIENCY IN BANK LIQUIDITY CREATIONrsquo

Abstract This paper generates an optimum bank liquidity creation benchmark by tracing an efficient frontier in liquidity creation (bank intermediation) and questions why some banks are more efficient than others in such activities Evidence reveals that medium size banks are most correlated to efficient frontier irrespective of their business models Small (large) banks ndash focused on traditional banking activities ndash are found to be the most (least) efficient in creating liquidity in on-balance sheet items whereas large banks ndash involved in non-traditional activities ndash are found to be most efficient in off-balance sheet liquidity creation Additionally the liquidity efficiency of small banks is more resilient during the 2007ndash08 financial crisis relative to other banks

Keywords banks technical efficiency liquidity creation diversification

Jel Codes G21 G28 G32

16 Liquidity and Economic Activity Conference

PAPER 12 Richard Anderson Lindenwood University John Duca Federal Reserve Bank of Dallas and Barry Jones Department of Economics State University of New York lsquoA BROAD MONETARY SERVICES (LIQUIDITY) INDEX AND ITS LONG-TERM LINKS TO ECONOMIC ACTIVITYrsquo

Abstract Liquid assets play a crucial role in economic activity as the medium in which payments are received and are made lsquoSudden stopsrsquo in financial markets ndash during which liquid assets are hoarded ndash are periods when economic activity slows abruptly Further it is the sine qua non of financial intermediation to alter the measured relative and absolute quantities of liquid assets In this way the observed quantities of liquid assets reflect both the path of past economic activity and anticipations of future activity The quantities must be regarded as arising endogenously within an intertemporal general equilibrium model of the economy a la Tobin (1958) and Merton (1971) Economic modeling and analysis traditionally proceeds by combining relatively high-dimension lists of specific assets into lower-dimension lsquomonetary aggregatesrsquo The defining characteristic of the assets included is that all are available to facilitate

the exchange of goods and services at a transaction cost less than infinity That is all included assets may be sold or used as collateral for the purchase and sale of goods and services and thereby provide liquidity services which can be tracked by measured opportunity costs of foregone interest which may not in practice reflect all transactions costs The last caveat particularly applies to household holdings of mutual fund assets outside of money market funds Our study contributes to the literature in two key ways First it expands a conventional Divisia measure of money services to account for the liquidity provided by such mutual fund assets Second it then explores the long-run connections between economic activity and monetary aggregates constructed as index numbers from 1929 to 2016 finding that the inclusion of mutual fund liquidity services results in a Divisia measure of money that has a much more stable velocity

We thank Emil Mihalov and Tyler Atkinson for research assistance The views expressed are those of the authors and do not necessarily reflect those of the Federal Reserve Bank of Dallas or the Federal Reserve System Any errors are our own

17Liquidity and Economic Activity Conference

PAPER 13 John W Keating University of Kansas and A Lee Smith KC Federal Reserve BanklsquoTHE OPTIMAL MONETARY INSTRUMENT AND THE (MIS) USE OF GRANGER CAUSALITYrsquo

Abstract Is it better to use an interest rate or a monetary aggregate to conduct monetary policy Operating procedures designed around interest rates are overwhelmingly preferred by central bankers This paper re-examines the question in light of a New Keynesian DSGE (dynamic stochastic general equilibrium) model Our calibrated model is very similar to many others in the literature except we follow Belongia and Ireland (2014) by augmenting the standard framework with a simple model of financial intermediation We assume banks operate in competitive markets maximising profits subject to financial market disturbances

We use this model to examine the welfare consequences of alternative choices for monetary policy instrument For the interest rate policy we employ the specification from Clarida Gali and Gertler (2000) in which the central bank reacts to expected future output and inflation gaps A gap is defined as the percent deviation from target We compare this rule to a k-percent rule for each monetary aggregate We consider three alternative aggregates determined within our model the monetary base the simple sum measure of money and the Divisia measure

Welfare results are striking While the interest rate dominates the monetary base both simple sum and Divisia k-percent rules outperform the interest rate In fact the Divisia rule is overall best This is an interesting finding because it contradicts the view of most central bankers that interest rates are superior Note that if we replaced k-percent money growth rules with rules allowing an appropriate reaction to macroeconomic variables

as in the Clarida Gali and Gertler (2000) interest rate rule the welfare benefits of simple sum and Divisia money would be even greater

Next we study the performance of Granger Causality tests in the context of data generated from our model For this we assume the Clarida Gali and Gertler (2000) interest rate rule characterises monetary policy While it is not optimal from a welfare perspective this rule is used for normative purposes Research suggests their framework provides a fairly good description of actual Fed behaviour

For each of our four potential monetary instruments we test for Granger Causality with respect to output and then with respect to prices We find the interest rate Granger Causes both variables at extremely high significance levels The same result is obtained for monetary base Simple sum money also Granger Causes prices at a highly significant level but only causes output at the 10 level The test results for Divisia are weakest of all Divisia fails to Granger Cause output and the evidence for prices is only at the 10 level

What do we learn from this investigation First a quantity aggregate as monetary instrument may have significant welfare benefits compared to an interest rate Second a weighted monetary aggregate (Divisia) performs best of all the candidate instruments we considered Third Granger Causality tests may be a poor method for selecting the best monetary policy instrument In our model Granger Causality tests suggest at best a weak effect of Divisia on inflation and no effect output Thus if instrument choice is based solely on Granger Causality test results an inferior policy instrument will be selected

Keywords monetary policy instrument monetary aggregates Granger Causality Divisia aggregates Jel Codes C43 C32 E37 E44 E52

18 Liquidity and Economic Activity Conference

PAPER 14Jane Binner University of Birmingham Logan Kelly University of Wisconsin and Jon Tepper Nottingham Trent UniversitylsquoON THE ROBUSTNESS OF SLUGGISH STATE-BASED NEURAL NETWORKS FOR PROVIDING USEFUL INSIGHT INTO THE NEW KEYNESIAN PHILLIPS CURVErsquo

Abstract The New Keynesian Phillips curve implies that the output gap the deviation of the actual output from its natural level due to nominal rigidities drives the dynamics of inflation relative to expected inflation and lagged inflation This paper exploits the empirical success of the New Keynesian Phillips curve in explaining the USArsquos inflation dynamics using a new nonlinear model of the output gap We estimate the output gap using the artificial intelligence method of multi-recurrent neural networks (MRNs) a type of neural network that is able to establish variable sensitivity to recent and more historic temporal information through the use of layer- and self-recurrent links to form a so-called sluggish state-based memory The MRN is able to robustly latch onto structural interactions amongst inflation oil prices and real output in the USA to produce a set of interesting impulse responses We present our empirical results using monthly data spanning 1960ndash2016 and contrast our new nonlinear models of the output gap with that of traditional measures in fitting the New Keynesian Phillips curve to provide useful insights for inflation dynamics and monetary policy analysis in the USA

PAPER 15Rakesh Bissoondeeal Aston University Michael Karaglou Aston University Jane Binner University of BirminghamlsquoSTRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UKrsquo

Abstract We investigate whether or not monetary aggregates are important in determining output In addition to the official Simple Sum measure of money we employ the weighted Divisia aggregate We use data-driven procedures to identify breaks in the data We find that monetary aggregates particularly Divisia are important in determining output but the results are sensitive to the time period employed There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them The results also suggest that the recovery from the financial crisis could have been faster if money was not being hoarded

Keywords monetary aggregates Divisia IS curve structural breaks

JEL Codes C22 E52

19Liquidity and Economic Activity Conference

PAPER 16Costas Milas University of Liverpool and Michael Ellington University of LiverpoollsquoIDENTIFYING AGGREGATE LIQUIDITY SHOCKS IN CONJUNCTION WITH MONETARY POLICY SHOCKS AN APPLICATION USING UK DATArsquo

Abstract We propose a new identification scheme for aggregate liquidity shocks in harmony with conventional monetary policy shocks in a partially identified structural VAR model We fit a Bayesian time-varying parameter VAR model to UK data from 1955 to 2016 The transmission mechanism of aggregate liquidity shocks changes substantially throughout time with the magnitude of these shocks increasing during recessions We provide statistically significant evidence in favour of asymmetric contributions of these shocks during the implementation of Quantitative Easing relative to the 2008 recession Aggregate liquidity shocks explain 32 and 47 of the variance in real GDP and inflation at business cycle frequency during the Great Recession respectively Preliminary Draft Please Do Not Cite

Keywords liquidity shocks time-varying parameter VAR money growth

JEL Codes E32 E47 E52 E58

PAPER 17Makram El Shagi Henan University China and Logan Kelly University of WisconsinlsquoSTRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UKrsquo

Abstract We investigate whether or not monetary aggregates are important in determining output In addition to the official Simple Sum measure of money we employ the weighted Divisia aggregate We use data-driven procedures to identify breaks in the data We find that monetary aggregates particularly Divisia are important in determining output but the results are sensitive to the time period employed There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them The results also suggest that the recovery from the financial crisis could have been faster if money was not being hoarded

Keywords monetary aggregates Divisia IS curve structural breaks

JEL Codes C22 E52

20 Liquidity and Economic Activity Conference

PAPER 18Soumya Suvra Bhadury National Council of Applied Economic Research (NCAER) New Delhi India and Taniya Ghosh Indira Gandhi Institute of Development Research (IGIDR) Mumbai India lsquoHAS MONEY LOST ITS RELEVANCE DETERMINING A SOLUTION TO THE EXCHANGE RATE DISCONNECT PUZZLE IN THE SMALL OPEN ECONOMIESrsquo

Abstract This study uses monthly data from 2000 to 2015 and utilises an open-economy structural vector autoregression model to identify the monetary policy shock responsible for exchange rate fluctuations in the economies of India Poland and UK Following Kim and Roubini (J Monet Econ 45(3)561ndash586 2000) our model is shown to fit these small open economies well by estimating theoretically correct and significant responses of price output and exchange rate to the monetary policy tightening The importance of monetary policy shock is determined by examining the variance decomposition of forecast error impulse response function and out-of-sample forecast Following Barnett (J Econ 14 (September)11ndash48 1980) we adopt a superior monetary measure namely

aggregationndashtheoretic Divisia monetary aggregate in our model The significance of adopting precisely measured money in the exchange rate model follows the comparison between models with no-money simple-sum monetary aggregates and Divisia monetary measures The bootstrap Granger causality test establishes a strong causal link from money especially Divisia money to exchange rate Additionally the empirical results provide three important findings The first finding suggests that the estimated responses of output prices money and exchange rate to monetary policy shocks are significant in models that adopt monetary aggregates The second finding indicates that Divisia money facilitate monetary policy to explain more of the fluctuation of exchange rate The third result supports the inclusion of Divisia money for better out-of-sample forecasting of exchange rate

Keywords monetary policy monetary aggregates divisia structural VAR exchange rate overshooting liquidity puzzle price puzzle exchange rate disconnect puzzle forward discount bias puzzle

JEL Codes C32 E41 E51 E52 F31 F41 F47

21Liquidity and Economic Activity Conference

PAPER 19William A Barnett and Jinan Liu University of Kansas lsquoUSER COST OF CREDIT CARD SERVICES UNDER INTERTEMPORAL NON-SEPARABILITYrsquo

Abstract This paper examines the user cost of monetary assets and credit card services under the assumption of intertemporal nonseparability and interest rate risk Barnett and Su (2016) derived theory permitting inclusion of credit card transaction services into Divisia monetary aggregates The risk adjustment in their theory is based on CCAPM under intertemporal separability The risk adjustment by their method is expected to be small as has been the case in prior studies of CCAPM risk adjustment of asset returns The equity premium puzzle focusses on that downward bias in the CCAPM risk adjustment But credit card interest rates aggregated over consumers are much more volatile than interest rates on monetary assets While the known downward bias of CCAPM risk adjustments are of little concern with Divisia monetary aggregates containing only low-risk monetary assets that downward bias cannot be ignored once credit card services are included We believe that extending to intertemporal nonseparability will provide a more plausible risk adjustment as has been emphasised by Barnett and Wu (2015)

In this paper we extend the credit-card-augmented Divisia monetary quantity aggregates and user costs to the case of risk aversion and intertemporal nonseparability in consumption Our results are for the lsquorepresentative consumerrsquo aggregated over all consumers While credit card interest-rate risk may be low for many consumers the volatility of credit card interest rates for the representative consumer is high as reflected by the high volatility of the Federal Reserversquos data on credit card interest rates aggregated over consumers The relationship between that volatility and the theory of aggregation over consumers under risk is beyond the scope

of this research but is a serious matter meriting future research

To implement our theory we introduce a pricing kernel into our model in accordance with the approach advocated by Barnett and Wu (2015) We assume that the pricing kernel is a linear function of the rate of return on a well-diversified wealth portfolio We find that the risk adjustment of the credit-card-services user cost to its certainty equivalence level can be measured by its beta That beta depends upon the covariance between the interest rates on credit card services and on the wealth portfolio of the consumer in a manner analogous to the standard CAPM adjustment

Credit card servicesrsquo risk premiums depend on their market portfolio risk exposure which is measured by the beta of the credit card interest rates The larger the beta through risk exposure to the wealth portfolio the larger the risk adjustment If the beta were very small then the user-cost risk adjustment would be very small In that case the unadjusted Divisia monetary index would be a good proxy even with high volatility of credit card interest rates One method of introducing intertemporal nonseparability is to assume habit formation We explore that possibility We are currently conducting research on empirical implementation of the theory proposed in this paper We believe that under intertemporal nonseparablity we will be able to generate a more accurate credit-card-augmented Divisia monetary index to explain the available empirical data

Keywords divisia index monetary aggregation intertemporal nonseparability credit card services risk adjustment

JEL Codes C43 D81 E03 E40 E41 E44 E51 G12

22 Liquidity and Economic Activity Conference

PAPER 20Per Hjertstrand Research Institute of Industrial EconomicsInstitutet foumlr Naumlringslivsforskning Stockholm Gerald A Whitney University of New Orleans and James L Swofford University of South AlabamalsquoPANEL DATA TESTS OF INDEX NUMBERS AND REVEALED PREFERENCE RANKINGSrsquo

Abstract For weakly separable blockings of goods from panel data we construct aggregates using index numbers We examine how well these aggregates ldquofitrdquo the data by investigating how close they come to solving revealed preference conditions

Samuelson (1983) discussed the relationship of index number theory to revealed preference analysis saying lsquoIndex number theory is shown to be merely an aspect of the theory of revealed preferencehelliprsquo Revealed preference has been used to test whether data on prices and quantities can be rationalised by a well-behaved utility function that is weakly separable in some subset of two goods Weak separability allows for a separation of a subset of goods into a sub-utility or aggregator function This is a necessary condition for the existence of an economic aggregate The existence of an economic aggregate is a justification for the use of superlative index numbers Superlative indexes are exact if they can provide a second order approximation to a particular aggregator function

Varianrsquos (1983) revealed preference conditions for weak separability do not rely on a particular functional form The solution values to these conditions can be interpreted as levels of utility These solution values while not unique are consistent with the preferences revealed by the data If period t is preferred to period s then the utility level assigned to t is greater or equal to that assigned to s Since indexes need not mirror

preferences this property is not guaranteed to hold after aggregation

Barnett and Choirsquos (2008) definition spans all superlative index numbers We consider aggregates based on two superlative index numbers the Fisher and Walsh and the non-superlative Paasche and Laspeyres indexes that are exact for a first order approximation to an aggregator function Because of its wide-spread use by central banks to construct monetary aggregates we also consider the simple sum aggregate and the index number version of the simple sum the Dutot index

We investigate how close aggregates come to solving the revealed preference conditions for weak separability when it is known that a solution exists We use the revealed preference solution values to check how well the indexes reflect the preference orderings in the data by comparing the direction of change between adjacent periods computing preference orderings implied by transitivity and then comparing the rankings revealed by the aggregates to the rankings from the weak separability conditions We calculate how much the aggregates need to be perturbed in order to satisfy weak separability and test whether the aggregates and solution values are equal in distribution

Using panel data we find as the number of goods increases the superiority of the superlative indexes manifests themselves This is consistent with the critique of the Dutot index by Fisher (1922) and Barnettrsquos (1980) critique of simple sum monetary aggregates

Keywords superlative index numbers aggregation preference orderings weak separability

Jel Code C43

23Liquidity and Economic Activity Conference

PAPER 21Sajid Chaudhry University of Birmingham Jane Binner University of Birmingham James Swofford University of South Alabama and Andrew Mullineux University of BirminghamlsquoSCOTLAND AS AN OPTIMUM CURRENCY AREArsquo

Abstract The June UK referendum on continued EU membership where the people of Scotland voted to remain while the rest of the UK voted to leave once again makes the issue of whether Scotland is an optimal currency area very topical England voted strongly to remain in Europe whilst Scotland backed remain by 62 to 38 The Scottish government published its draft bill on a second independence referendum this October The move does not mean another referendum will definitely be held but this does raise the possibility that Scotland might choose independence and staying in the EU without the rest of the UK If Scotland charts a course of independence from the rest of the UK then they would likely either issue their own currency or join or form another currency area In this paper we test the microeconomic foundations of a common currency area for Scotland UK and the rest of the UK without Scotland We find that the UK without Scotland meets the microeconomic criteria for a common currency area while the UK and Scotland alone have some small violations of these conditions We also find differences in the UK less Scotland and Scotland economies in loan data With respect to further research into the development of the monetary aggregation theory we recommend that this might be profitably directed towards exploring more sophisticated aggregation procedures as suggested by Barnett (forthcoming) or towards incorporating risk bearing assets into the money measures

Keywords Scottish independence common currency areas microeconomic foundations

PAPER 22Victor J Varcarcel University of Texas at DallaslsquoINTEREST RATE PASS-THROUGH DIVISIA USER COSTS OF MONETARY ASSETS AND THE FEDERAL FUNDS RATErsquo

Abstract Evidence of substantial pass-through in short-term rates and other rates of financial and monetary assets has been typically rejected by the data In a first this paper investigates level and volatility transmission among the Federal Funds rate and the user costs of various monetary assets which include both instruments of public debt (eg t-bills) and private debt (eg commercial paper) Results suggest substantial and time varying pass-through Higher degrees of bi-directional pass-through occurs between the Federal Funds rate and the user cost of more liquid assets both in levels and volatilities Federal Funds rate spillovers propagate faster onto more liquid rates as well These findings have important implications for monetary transmission not only across the term structure but along markets of varying liquidity

Keywords user cost of money federal funds rate volatility spillovers VAR monetary transmission interest rate channel

JEL Codes E30 E31 E65

1525

5 copy

Uni

vers

ity o

f Birm

ingh

am 2

017

Prin

ted

on a

recy

cled

gra

de p

aper

con

tain

ing

100

pos

t-co

nsum

er w

aste

Edgbaston Birmingham B15 2TT United Kingdom

wwwbirminghamacuk

Designed and printed by

lsquoTriple-crownrsquo accredited

Page 9: Liquidity and Economic Activity - University of Birmingham · 2017-05-22 · Financial Services Indices Liquidity and Economic Activity In honour of Oswald Distinguished Professor

9Liquidity and Economic Activity Conference

PAPER 6Dennis Fixler Bureau of Economic Analysis and Kim Zieschang International Monetary Fund lsquoPRODUCING LIQUIDITYrsquo

Abstract Based on a paper presented at the 2015 Meeting of the Society for Economic Measurement Paris Dennis Fixler is Chief Economist Bureau of Economic Analysis and Kim Zieschang is Adjunct Professor of Economics University of Queensland The views expressed in this paper are those of the authors and should not be attributed to the Bureau of Economic Analysis JEL codes E01 Measurement and Data on National Income and Product Accounts Commercial banks are a primary producer of liquidity in an economy Despite their importance there is no consensus on the measurement of the liquidity service and for the matter bank output in general This lack of consensus exits at the microeconomic level and at that level of the national accounts which attempts to capture the significant role of bank services in the output of the economy The current national accounts measure of bank output is termed financial intermediation services indirectly measured or lsquoFISIMrsquo springing from the 1993 version of the System of National accounts (the 1993 SNA) Calculating FISIM under the 2008 version of national accounting standards is simple and generally practical provided the compiler has a key datummdashthe lsquoreference rate of interestrsquo The calculation is essentially Output = (Reference rate of interest minus Deposit rate) times Deposit liabilities + (Loan rate minus Reference rate of interest) times Loan assets

As the deposit and loan financial instrument coverage of this formula implies the current national accounting standards apply it to deposit-takers such as banks as well as to

non-deposit-taking loan-making financial institutions such as finance companies and money lenders

A first issue in treatment of financial services since the 1993 version of the standards introduced the reference rate concept has been lack of consensus on how it should be determined Generally the idea has been to select an exogenous reference rate a government security rate or a combination of them is often used because it captures the risk-free reference rate that underlies the user cost of money as in Barnett (1978) Some have proposed alternative exogenous reference rates that are tied to the market determined risk of the security to which the reference rate is to be applied In any event we argue that the reference rate should be endogenously determined and should be the bankrsquos calculated cost of capital the overall rate of return paid to all sources of funding including equity on the liability side of the balance sheet Our lsquoreference rate of interestrsquo is therefore individual to each bank rather than an economy-wide constant A second issue in the national accounts dialogue on financial services is the scope of financial instruments that should be associated with the SNArsquos indirect financial services measure The 2008 SNA narrowed the scope of FISIM to the deposit and loan positions of financial corporations but previous versions included interest income flows on essentially all financial instruments Return to the broad 1993 financial instrument scope is nevertheless a research agenda item for the next version of the SNA and appears essential to align the SNA with the scope of liquidity measured by the money and banking literature and the associated standards for compiling financial statistics We argue that FISIM should cover all financial instruments


Armed with the cost of capital reference rate and full financial balance sheet instrument scope, we derive the production identity (value of output ≡ cost of production) from the income ≡ expense and balance sheet identities, generating a FISIM-like calculation of output with a single cost of capital reference rate for each enterprise rather than for the whole economy. Note that by computing specific user cost prices of bank services, it is in principle possible to aggregate them into a price index for liquidity services and correspondingly obtain a quantity index of liquidity services.

Such aggregate measures would be useful in tracing the financial intermediation process into GDP or another aggregate measure of economic activity.

On examining the SNA-type production identity for an individual bank, we find the cost side contains a term within operating surplus – the equity leverage premium – that depends on the bank's financing: the debt and equity composition of the liability side of its balance sheet. Further, it is inherent in the definition of the cost of capital reference rate that the equity leverage premium is identically equal to what we will term produced liquidity within the part of the SNA financial services output of the bank coming from the debt instruments on the liability side of its balance sheet, prominent among which are deposits. Given that banks transform liabilities into assets, the equity leverage premium and the produced liquidity are tied to the risk bearing undertaken by the bank. With an exogenous reference rate, the risk bearing is completely embedded in the user cost price of the asset or liability product. In our model, because the entire balance sheet is used, the risk bearing is tied to equity holders.

This paper proposes a resolution to the scope and methodology issues in the ongoing national accounts conversation on financial services, particularly on the provision of liquidity by debt-issuing enterprises, and suggests that the equity leverage premium now included in the nominal output of banks be offset by an intermediate insurance input supplied by their equity holders. This retains the current standards' origination of liquidity with banks (but also extends it to other debt-issuing enterprises) while better exposing the link between the leverage risk bearing (provision of debt guarantees) of equity-holding sectors and the production of liquidity by banks (and other debt-issuing enterprises). With the developed framework, the output and productivity of the providers of liquidity services and other financial services can also be measured and incorporated into macroeconomic statistics.


PAPER 7: Jan Willem Van den End, De Nederlandsche Bank, the Netherlands, 'APPLYING COMPLEXITY THEORY TO INTEREST RATES: EVIDENCE OF CRITICAL TRANSITIONS IN THE EURO AREA'

Abstract: We apply complexity theory to financial markets to show that excess liquidity created by the Eurosystem has led to critical transitions in the configuration of interest rates. Complexity indicators turn out to be useful signals of tipping points and subsequent regime shifts in interest rates. We find that the critical transitions are related to the increase of excess liquidity in the euro area. These insights can help central banks to strike the right balance between the intention to support the financial system by injecting liquidity and potential unintended side-effects on market functioning.

Keywords: interest rates, central banks and their policies, monetary policy

JEL Codes: E43, E58, E52


PAPER 8: Michael Bowe, Alliance Manchester Business School, University of Manchester and University of Vaasa, Olga Kolokolova, Alliance Manchester Business School, University of Manchester, and Marcin Michalski, Alliance Manchester Business School, University of Manchester, 'TOO BIG TO CARE, TOO SMALL TO MATTER: MACRO FINANCIAL POLICY AND BANK LIQUIDITY CREATION'

Abstract: We estimate the volume of liquidity creation by US bank holding companies between 1997 and 2015 and examine the impact of changes in macrofinancial policies on the dynamics of this process. We focus on three major policy developments occurring in the aftermath of the 2007–09 financial crisis: bank capital regulation reform, monetary stimulus through quantitative easing, and the Troubled Asset Relief Program (TARP).

We use the three-step procedure proposed by Berger and Bouwman (2009) to calculate the dollar amount of liquidity a financial institution creates. Initially, we classify all balance sheet items and off-balance sheet activities of an institution as liquid, semi-liquid or illiquid, to which we then assign liquidity weights of +1/2 (illiquid assets and liquid liabilities), 0 (semi-liquid assets and liabilities) or −1/2 (liquid assets, illiquid liabilities and equity), respectively. The dollar volume of liquidity creation is then calculated as the liquidity-weighted sum of the items identified in the first step. We find that the total amount of liquidity creation by banks in the sample increases by a factor of 3.65, from $1.4 trillion in 1997Q1 to $5.1 trillion in 2015Q4. Indeed, the volume of liquidity creation increases at a faster pace than the gross domestic product of the United States, which rises by a factor of 2.1 during the same period.
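A hedged sketch of the weighting step just described, in Python; the balance sheet categories and figures are hypothetical, while the weights are those summarised above:

bank = {
    "illiquid_assets": 400.0,         # e.g. business loans
    "semiliquid_assets": 300.0,       # e.g. residential mortgages
    "liquid_assets": 300.0,           # e.g. cash and securities
    "liquid_liabilities": 500.0,      # e.g. transaction deposits
    "semiliquid_liabilities": 250.0,  # e.g. time deposits
    "illiquid_liabilities_and_equity": 250.0,
}

weights = {
    "illiquid_assets": 0.5, "semiliquid_assets": 0.0, "liquid_assets": -0.5,
    "liquid_liabilities": 0.5, "semiliquid_liabilities": 0.0,
    "illiquid_liabilities_and_equity": -0.5,
}

# Liquidity created = 0.5*(illiquid assets + liquid liabilities)
#                     - 0.5*(liquid assets + illiquid liabilities + equity)
liquidity_created = sum(weights[k] * v for k, v in bank.items())
print(liquidity_created)  # 0.5*(400+500) - 0.5*(300+250) = 175.0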

The results of panel regressions reveal that the dynamics of bank liquidity creation differ considerably between small and large institutions. The level of bank capital requirements and the stance of monetary policy impact the liquidity creation of both small and medium-sized banks. Liquidity creation of the largest banks, which control over 80% of the banking system's assets, remains unaffected.

We find that changes in the amount of liquidity creation by small banks per $1 of their gross total assets are positively related to changes in the term spread but inversely related to changes in their Tier 1 capital ratios. Further, we show that the volume of liquidity creation is positively related to the riskiness of a bank's assets, as measured by the ratio of risk-weighted assets to gross total assets, regardless of its size classification. We establish that TARP has negative short-term effects on small and medium banks and no immediate impact on the liquidity creation of the largest institutions in the sample. In contrast, participation in TARP leads to a long-term decline in liquidity provision per dollar of assets of the largest banks. This persists even after the completion of the programme and repayment of TARP funding. As nearly all of the largest TARP-recipient banks in the sample are subsequently classified as systemically important financial institutions, our results suggest that the increased regulatory scrutiny may adversely affect their ability to create liquidity.

By demonstrating that the stance of monetary policy and the level of bank capital requirements do not tangibly enhance the liquidity provision efficiency of the largest, systemically important institutions in the system, our study offers important insights for the design of effective macroprudential policies.


PAPER 9: Jonathan Goldberg, Federal Reserve Board, 'THE SUPPLY OF LIQUIDITY AND REAL ECONOMIC ACTIVITY'

Abstract: This paper identifies shocks to the supply of liquidity by dealer firms and investigates their effects on real economic activity. First, I develop a simple theoretical model of dealer intermediation; then, in a structural VAR model, I use sign restrictions derived from the theoretical model to identify liquidity supply shocks. Liquidity supply shocks that are orthogonal to information contained in macroeconomic and asset price variables have considerable predictive power for economic activity. Moreover, positive liquidity supply shocks cause large and persistent increases in real activity.
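The sign-restriction step can be illustrated with a stylised bivariate sketch in Python with NumPy; the covariance matrix and the particular restrictions (a liquidity supply shock assumed to raise dealer intermediation and lower spreads on impact) are illustrative assumptions, not the paper's actual model:

import numpy as np

rng = np.random.default_rng(0)
sigma = np.array([[1.0, 0.3],    # hypothetical reduced-form residual
                  [0.3, 0.5]])   # covariance from an estimated VAR
P = np.linalg.cholesky(sigma)    # one arbitrary factorisation of sigma

accepted = []
for _ in range(10_000):
    theta = rng.uniform(0.0, 2.0 * np.pi)
    Q = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])  # rotation, Q @ Q.T = I
    B = P @ Q                    # candidate impact matrix
    # Column 0 is the liquidity supply shock: intermediation up, spread down.
    if B[0, 0] > 0 and B[1, 0] < 0:
        accepted.append(B)

print(len(accepted), "of 10,000 rotations satisfy the sign restrictions")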

Keywords: liquidity, dealer intermediation, risk-taking, real activity, liquidity shocks

JEL Codes: G10, G12, G17, G24


PAPER 10: Dirk Bezemer, University of Groningen, the Netherlands, and Lu Zhang, Sustainable Finance Lab and Utrecht University, 'MACROECONOMIC IMPLICATIONS OF LIQUIDITY CREATION: CREDIT ALLOCATION AND POST-CRISIS RECESSIONS'

Abstract: In this paper we address the macroeconomic implications of liquidity creation through bank lending and the impacts of liquidity on economic activity. We note that liquidity created through bank lending can be channeled into the real sector, in support of economic activity, or into financial and real estate markets, in support of capital gains. We collected macro-level data on bank credit aggregates over 2000–12 for 57 economies, categorised according to the use of credit. We note the long-term shift in the allocation of bank credit creation away from non-financial business lending and towards financial and especially real estate markets. We then present new evidence on the channels from pre-crisis credit allocation to the severity of post-crisis recessions.

Our first contribution is to show that it is not just the level but the composition of debt (defined as the share of mortgage credit in total credit) that matters. A second contribution is to analyse the channels. We collect additional industry-level data across 20 industries for a subset of economies. We analyze the effect of changes in the pre-crisis composition of debt on total GDP and on investment, consumption and capital allocation. We find that changes in the share of household mortgage credit before the crisis have a significant effect on recession severity after the 2007 crisis. This is not the case for any other credit category, nor for growth of total bank credit. We address the causality challenge by using the difference between IMF growth forecasts and growth realisations; this filters out country-specific drivers of both debt and income growth. We address the model selection challenge by using Bayesian model averaging, which indicates that the change in credit composition is among the three most robust determinants of post-crisis recession severity, along with income levels and the current account balance. The findings are robust to a wide range of control variables and to the different responses across advanced/emerging and EMU/non-EMU economies.

We then delve into the channels from the change in debt composition to income growth loss. The literature to date has focused on negative wealth effects on consumption, for which we find strong evidence. In addition, we find evidence for two investment channels: a loan supply effect and a capital allocation effect. In the industry-level analysis, we find that in economies which experienced a larger change in debt composition before 2008, there was a larger reduction of credit available and weaker capital re-allocation towards sectors with higher value-added. This effect is observed already before the crisis, and very strongly after the crisis. We discuss policy implications and future research.

Keywords: private credit, mortgages, crisis, output loss, investment, capital allocation

JEL Codes: C11, C15, E01, O4


PAPER 11: Iftekhar Hasan, Gabelli School of Business, Fordham University, and Jean-Loup Soula, Strasbourg University, LaRGE Research Centre, 'TECHNICAL EFFICIENCY IN BANK LIQUIDITY CREATION'

Abstract: This paper generates an optimum bank liquidity creation benchmark by tracing an efficient frontier in liquidity creation (bank intermediation) and questions why some banks are more efficient than others in such activities. Evidence reveals that medium-sized banks are closest to the efficient frontier, irrespective of their business models. Small (large) banks – focused on traditional banking activities – are found to be the most (least) efficient in creating liquidity in on-balance sheet items, whereas large banks – involved in non-traditional activities – are found to be most efficient in off-balance sheet liquidity creation. Additionally, the liquidity efficiency of small banks was more resilient during the 2007–08 financial crisis relative to other banks.

Keywords: banks, technical efficiency, liquidity creation, diversification

JEL Codes: G21, G28, G32


PAPER 12: Richard Anderson, Lindenwood University, John Duca, Federal Reserve Bank of Dallas, and Barry Jones, Department of Economics, State University of New York, 'A BROAD MONETARY SERVICES (LIQUIDITY) INDEX AND ITS LONG-TERM LINKS TO ECONOMIC ACTIVITY'

Abstract: Liquid assets play a crucial role in economic activity as the medium in which payments are received and are made. 'Sudden stops' in financial markets – during which liquid assets are hoarded – are periods when economic activity slows abruptly. Further, it is the sine qua non of financial intermediation to alter the measured relative and absolute quantities of liquid assets. In this way, the observed quantities of liquid assets reflect both the path of past economic activity and anticipations of future activity. The quantities must be regarded as arising endogenously within an intertemporal general equilibrium model of the economy, à la Tobin (1958) and Merton (1971). Economic modeling and analysis traditionally proceeds by combining relatively high-dimension lists of specific assets into lower-dimension 'monetary aggregates'. The defining characteristic of the assets included is that all are available to facilitate the exchange of goods and services at a transaction cost less than infinity. That is, all included assets may be sold or used as collateral for the purchase and sale of goods and services, and thereby provide liquidity services, which can be tracked by the measured opportunity cost of foregone interest – a measure which may not in practice reflect all transaction costs. The last caveat particularly applies to household holdings of mutual fund assets outside of money market funds. Our study contributes to the literature in two key ways. First, it expands a conventional Divisia measure of money services to account for the liquidity provided by such mutual fund assets. Second, it then explores the long-run connections between economic activity and monetary aggregates constructed as index numbers from 1929 to 2016, finding that the inclusion of mutual fund liquidity services results in a Divisia measure of money that has a much more stable velocity.
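A minimal sketch of the Törnqvist-Theil Divisia growth rate that underlies such a monetary services index, in Python with NumPy; the asset list, quantities and rates are hypothetical, with user costs computed as (R − r_i)/(1 + R) following Barnett (1980):

import numpy as np

def divisia_growth(x0, x1, r0, r1, R0, R1):
    u0 = (R0 - r0) / (1.0 + R0)      # user costs in period 0
    u1 = (R1 - r1) / (1.0 + R1)      # user costs in period 1
    s0 = u0 * x0 / np.sum(u0 * x0)   # user-cost expenditure shares
    s1 = u1 * x1 / np.sum(u1 * x1)
    s_bar = 0.5 * (s0 + s1)          # two-period average shares
    return float(np.sum(s_bar * (np.log(x1) - np.log(x0))))

# Currency, deposits and (hypothetically) mutual fund holdings.
x0 = np.array([100.0, 50.0, 30.0]); x1 = np.array([102.0, 53.0, 36.0])
r0 = np.array([0.00, 0.01, 0.02]);  r1 = np.array([0.00, 0.01, 0.02])
print(divisia_growth(x0, x1, r0, r1, R0=0.05, R1=0.05))  # log growth rate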

We thank Emil Mihalov and Tyler Atkinson for research assistance. The views expressed are those of the authors and do not necessarily reflect those of the Federal Reserve Bank of Dallas or the Federal Reserve System. Any errors are our own.


PAPER 13: John W. Keating, University of Kansas, and A. Lee Smith, Federal Reserve Bank of Kansas City, 'THE OPTIMAL MONETARY INSTRUMENT AND THE (MIS)USE OF GRANGER CAUSALITY'

Abstract: Is it better to use an interest rate or a monetary aggregate to conduct monetary policy? Operating procedures designed around interest rates are overwhelmingly preferred by central bankers. This paper re-examines the question in light of a New Keynesian DSGE (dynamic stochastic general equilibrium) model. Our calibrated model is very similar to many others in the literature, except we follow Belongia and Ireland (2014) by augmenting the standard framework with a simple model of financial intermediation. We assume banks operate in competitive markets, maximising profits subject to financial market disturbances.

We use this model to examine the welfare consequences of alternative choices of monetary policy instrument. For the interest rate policy, we employ the specification from Clarida, Gali and Gertler (2000), in which the central bank reacts to expected future output and inflation gaps; a gap is defined as the percent deviation from target. We compare this rule to a k-percent rule for each monetary aggregate. We consider three alternative aggregates determined within our model: the monetary base, the simple sum measure of money and the Divisia measure.

Welfare results are striking. While the interest rate dominates the monetary base, both simple sum and Divisia k-percent rules outperform the interest rate; in fact, the Divisia rule is best overall. This is an interesting finding because it contradicts the view of most central bankers that interest rates are superior. Note that if we replaced k-percent money growth rules with rules allowing an appropriate reaction to macroeconomic variables, as in the Clarida, Gali and Gertler (2000) interest rate rule, the welfare benefits of simple sum and Divisia money would be even greater.

Next, we study the performance of Granger Causality tests in the context of data generated from our model. For this, we assume the Clarida, Gali and Gertler (2000) interest rate rule characterises monetary policy. While it is not optimal from a welfare perspective, this rule is used for normative purposes: research suggests their framework provides a fairly good description of actual Fed behaviour.

For each of our four potential monetary instruments, we test for Granger Causality with respect to output and then with respect to prices. We find the interest rate Granger Causes both variables at extremely high significance levels. The same result is obtained for the monetary base. Simple sum money also Granger Causes prices at a highly significant level, but only causes output at the 10% level. The test results for Divisia are weakest of all: Divisia fails to Granger Cause output, and the evidence for prices is only at the 10% level.

What do we learn from this investigation? First, a quantity aggregate as monetary instrument may have significant welfare benefits compared to an interest rate. Second, a weighted monetary aggregate (Divisia) performs best of all the candidate instruments we considered. Third, Granger Causality tests may be a poor method for selecting the best monetary policy instrument. In our model, Granger Causality tests suggest at best a weak effect of Divisia on inflation and no effect on output. Thus, if instrument choice is based solely on Granger Causality test results, an inferior policy instrument will be selected.
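A small sketch of the kind of bivariate Granger causality test at issue, in Python with statsmodels on simulated data; the paper's tests are run on data generated from its DSGE model, not on this toy process:

import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(1)
T = 400
money = np.zeros(T); output = np.zeros(T)
for t in range(1, T):
    money[t] = 0.5 * money[t - 1] + rng.normal()
    output[t] = 0.3 * output[t - 1] + 0.4 * money[t - 1] + rng.normal()

# Tests whether the second column helps predict the first: here, whether
# the simulated money series Granger causes the simulated output series.
data = pd.DataFrame({"output": output, "money": money})
grangercausalitytests(data[["output", "money"]], maxlag=4)

Small p-values on the reported F-tests reject the null of no Granger causality; by construction, money does cause output in this simulation.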

Keywords: monetary policy instrument, monetary aggregates, Granger Causality, Divisia aggregates

JEL Codes: C43, C32, E37, E44, E52


PAPER 14: Jane Binner, University of Birmingham, Logan Kelly, University of Wisconsin, and Jon Tepper, Nottingham Trent University, 'ON THE ROBUSTNESS OF SLUGGISH STATE-BASED NEURAL NETWORKS FOR PROVIDING USEFUL INSIGHT INTO THE NEW KEYNESIAN PHILLIPS CURVE'

Abstract: The New Keynesian Phillips curve implies that the output gap – the deviation of actual output from its natural level due to nominal rigidities – drives the dynamics of inflation relative to expected inflation and lagged inflation. This paper exploits the empirical success of the New Keynesian Phillips curve in explaining the USA's inflation dynamics using a new nonlinear model of the output gap. We estimate the output gap using the artificial intelligence method of multi-recurrent neural networks (MRNs), a type of neural network that is able to establish variable sensitivity to recent and more historic temporal information through the use of layer- and self-recurrent links to form a so-called sluggish state-based memory. The MRN is able to robustly latch onto structural interactions amongst inflation, oil prices and real output in the USA to produce a set of interesting impulse responses. We present our empirical results using monthly data spanning 1960–2016 and contrast our new nonlinear model of the output gap with traditional measures in fitting the New Keynesian Phillips curve, to provide useful insights for inflation dynamics and monetary policy analysis in the USA.
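A toy sketch of the 'sluggish state' idea in Python with NumPy: context units blend their previous value with the current hidden state, so older information decays slowly. The sizes, weights and single memory bank below are illustrative assumptions, not the authors' architecture:

import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid = 3, 8                 # e.g. inflation, oil price, real output
Wx = rng.normal(scale=0.3, size=(n_hid, n_in))
Wh = rng.normal(scale=0.3, size=(n_hid, n_hid))   # layer recurrence
Wc = rng.normal(scale=0.3, size=(n_hid, n_hid))   # context recurrence
alpha = 0.9                        # 'sluggishness' of the memory

def mrn_step(x, h, c):
    h_new = np.tanh(Wx @ x + Wh @ h + Wc @ c)     # recurrent hidden update
    c_new = alpha * c + (1.0 - alpha) * h_new     # slowly decaying context
    return h_new, c_new

h = np.zeros(n_hid); c = np.zeros(n_hid)
for x in rng.normal(size=(5, n_in)):              # five steps of dummy data
    h, c = mrn_step(x, h, c)
print(h.round(3))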

PAPER 15: Rakesh Bissoondeeal, Aston University, Michael Karaglou, Aston University, and Jane Binner, University of Birmingham, 'STRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UK'

Abstract: We investigate whether or not monetary aggregates are important in determining output. In addition to the official Simple Sum measure of money, we employ the weighted Divisia aggregate. We use data-driven procedures to identify breaks in the data. We find that monetary aggregates, particularly Divisia, are important in determining output, but the results are sensitive to the time period employed. There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them. The results also suggest that the recovery from the financial crisis could have been faster had money not been hoarded.

Keywords: monetary aggregates, Divisia, IS curve, structural breaks

JEL Codes: C22, E52


PAPER 16: Costas Milas, University of Liverpool, and Michael Ellington, University of Liverpool, 'IDENTIFYING AGGREGATE LIQUIDITY SHOCKS IN CONJUNCTION WITH MONETARY POLICY SHOCKS: AN APPLICATION USING UK DATA'

Abstract: We propose a new identification scheme for aggregate liquidity shocks in harmony with conventional monetary policy shocks in a partially identified structural VAR model. We fit a Bayesian time-varying parameter VAR model to UK data from 1955 to 2016. The transmission mechanism of aggregate liquidity shocks changes substantially throughout time, with the magnitude of these shocks increasing during recessions. We provide statistically significant evidence in favour of asymmetric contributions of these shocks during the implementation of Quantitative Easing relative to the 2008 recession. Aggregate liquidity shocks explain 32% and 47% of the variance in real GDP and inflation, respectively, at business cycle frequency during the Great Recession. (Preliminary draft: please do not cite.)

Keywords: liquidity shocks, time-varying parameter VAR, money growth

JEL Codes: E32, E47, E52, E58

PAPER 17: Makram El Shagi, Henan University, China, and Logan Kelly, University of Wisconsin, 'STRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UK'

Abstract: We investigate whether or not monetary aggregates are important in determining output. In addition to the official Simple Sum measure of money, we employ the weighted Divisia aggregate. We use data-driven procedures to identify breaks in the data. We find that monetary aggregates, particularly Divisia, are important in determining output, but the results are sensitive to the time period employed. There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them. The results also suggest that the recovery from the financial crisis could have been faster had money not been hoarded.

Keywords: monetary aggregates, Divisia, IS curve, structural breaks

JEL Codes: C22, E52


PAPER 18: Soumya Suvra Bhadury, National Council of Applied Economic Research (NCAER), New Delhi, India, and Taniya Ghosh, Indira Gandhi Institute of Development Research (IGIDR), Mumbai, India, 'HAS MONEY LOST ITS RELEVANCE? DETERMINING A SOLUTION TO THE EXCHANGE RATE DISCONNECT PUZZLE IN THE SMALL OPEN ECONOMIES'

Abstract: This study uses monthly data from 2000 to 2015 and utilises an open-economy structural vector autoregression model to identify the monetary policy shock responsible for exchange rate fluctuations in the economies of India, Poland and the UK. Following Kim and Roubini (J Monet Econ 45(3): 561–586, 2000), our model is shown to fit these small open economies well, estimating theoretically correct and significant responses of price, output and the exchange rate to a monetary policy tightening. The importance of the monetary policy shock is determined by examining the variance decomposition of forecast error, the impulse response function and the out-of-sample forecast. Following Barnett (J Econ 14 (September): 11–48, 1980), we adopt a superior monetary measure, namely the aggregation-theoretic Divisia monetary aggregate, in our model. The significance of adopting precisely measured money in the exchange rate model follows from the comparison between models with no money, simple-sum monetary aggregates and Divisia monetary measures. The bootstrap Granger causality test establishes a strong causal link from money, especially Divisia money, to the exchange rate. Additionally, the empirical results provide three important findings. The first suggests that the estimated responses of output, prices, money and the exchange rate to monetary policy shocks are significant in models that adopt monetary aggregates. The second indicates that Divisia money helps monetary policy explain more of the fluctuation in the exchange rate. The third supports the inclusion of Divisia money for better out-of-sample forecasting of the exchange rate.

Keywords: monetary policy, monetary aggregates, Divisia, structural VAR, exchange rate overshooting, liquidity puzzle, price puzzle, exchange rate disconnect puzzle, forward discount bias puzzle

JEL Codes: C32, E41, E51, E52, F31, F41, F47


PAPER 19: William A. Barnett and Jinan Liu, University of Kansas, 'USER COST OF CREDIT CARD SERVICES UNDER INTERTEMPORAL NON-SEPARABILITY'

Abstract: This paper examines the user cost of monetary assets and credit card services under the assumption of intertemporal nonseparability and interest rate risk. Barnett and Su (2016) derived theory permitting the inclusion of credit card transaction services into Divisia monetary aggregates. The risk adjustment in their theory is based on CCAPM under intertemporal separability. The risk adjustment by their method is expected to be small, as has been the case in prior studies of CCAPM risk adjustment of asset returns; the equity premium puzzle focusses on that downward bias in the CCAPM risk adjustment. But credit card interest rates, aggregated over consumers, are much more volatile than interest rates on monetary assets. While the known downward bias of CCAPM risk adjustments is of little concern with Divisia monetary aggregates containing only low-risk monetary assets, that downward bias cannot be ignored once credit card services are included. We believe that extending to intertemporal nonseparability will provide a more plausible risk adjustment, as has been emphasised by Barnett and Wu (2015).

In this paper, we extend the credit-card-augmented Divisia monetary quantity aggregates and user costs to the case of risk aversion and intertemporal nonseparability in consumption. Our results are for the 'representative consumer', aggregated over all consumers. While credit card interest-rate risk may be low for many consumers, the volatility of credit card interest rates for the representative consumer is high, as reflected by the high volatility of the Federal Reserve's data on credit card interest rates aggregated over consumers. The relationship between that volatility and the theory of aggregation over consumers under risk is beyond the scope of this research, but is a serious matter meriting future research.

To implement our theory, we introduce a pricing kernel into our model, in accordance with the approach advocated by Barnett and Wu (2015). We assume that the pricing kernel is a linear function of the rate of return on a well-diversified wealth portfolio. We find that the risk adjustment of the credit-card-services user cost to its certainty-equivalence level can be measured by its beta. That beta depends upon the covariance between the interest rates on credit card services and on the wealth portfolio of the consumer, in a manner analogous to the standard CAPM adjustment.

Credit card services' risk premiums depend on their market portfolio risk exposure, which is measured by the beta of the credit card interest rates. The larger the beta, through risk exposure to the wealth portfolio, the larger the risk adjustment. If the beta were very small, then the user-cost risk adjustment would be very small; in that case, the unadjusted Divisia monetary index would be a good proxy, even with high volatility of credit card interest rates. One method of introducing intertemporal nonseparability is to assume habit formation; we explore that possibility. We are currently conducting research on the empirical implementation of the theory proposed in this paper. We believe that under intertemporal nonseparability we will be able to generate a more accurate credit-card-augmented Divisia monetary index to explain the available empirical data.
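A stylised sketch of the beta described above, in Python with NumPy; the simulated series stand in for the credit card interest rate and the return on a well-diversified wealth portfolio, and the formulas are CAPM-style illustrations rather than the paper's exact derivation:

import numpy as np

rng = np.random.default_rng(2)
wealth_return = rng.normal(0.05, 0.10, size=200)
card_rate = 0.12 + 0.6 * (wealth_return - 0.05) + rng.normal(0.0, 0.02, 200)

# Beta of the credit card rate with respect to the wealth portfolio.
beta = np.cov(card_rate, wealth_return)[0, 1] / np.var(wealth_return, ddof=1)
print(round(beta, 2))  # close to 0.6 by construction

The larger this beta, the larger the adjustment of the user cost away from its certainty-equivalence level; a beta near zero would leave the unadjusted Divisia index a good proxy.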

Keywords: Divisia index, monetary aggregation, intertemporal nonseparability, credit card services, risk adjustment

JEL Codes: C43, D81, E03, E40, E41, E44, E51, G12


PAPER 20: Per Hjertstrand, Research Institute of Industrial Economics (Institutet för Näringslivsforskning), Stockholm, Gerald A. Whitney, University of New Orleans, and James L. Swofford, University of South Alabama, 'PANEL DATA TESTS OF INDEX NUMBERS AND REVEALED PREFERENCE RANKINGS'

Abstract: For weakly separable blockings of goods from panel data, we construct aggregates using index numbers. We examine how well these aggregates 'fit' the data by investigating how close they come to solving revealed preference conditions.

Samuelson (1983) discussed the relationship of index number theory to revealed preference analysis, saying 'Index number theory is shown to be merely an aspect of the theory of revealed preference…'. Revealed preference has been used to test whether data on prices and quantities can be rationalised by a well-behaved utility function that is weakly separable in some subset of goods. Weak separability allows for a separation of a subset of goods into a sub-utility, or aggregator, function. This is a necessary condition for the existence of an economic aggregate, and the existence of an economic aggregate is a justification for the use of superlative index numbers. Superlative indexes are exact if they can provide a second-order approximation to a particular aggregator function.

Varian's (1983) revealed preference conditions for weak separability do not rely on a particular functional form. The solution values to these conditions can be interpreted as levels of utility. These solution values, while not unique, are consistent with the preferences revealed by the data: if period t is preferred to period s, then the utility level assigned to t is greater than or equal to that assigned to s. Since indexes need not mirror preferences, this property is not guaranteed to hold after aggregation.

Barnett and Choi's (2008) definition spans all superlative index numbers. We consider aggregates based on two superlative index numbers, the Fisher and the Walsh, and on the non-superlative Paasche and Laspeyres indexes, which are exact for a first-order approximation to an aggregator function. Because of its widespread use by central banks to construct monetary aggregates, we also consider the simple sum aggregate and the index number version of the simple sum, the Dutot index.

We investigate how close aggregates come to solving the revealed preference conditions for weak separability when it is known that a solution exists. We use the revealed preference solution values to check how well the indexes reflect the preference orderings in the data by comparing the direction of change between adjacent periods, computing preference orderings implied by transitivity, and then comparing the rankings revealed by the aggregates to the rankings from the weak separability conditions. We calculate how much the aggregates need to be perturbed in order to satisfy weak separability, and we test whether the aggregates and solution values are equal in distribution.
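For concreteness, a minimal check of the Generalized Axiom of Revealed Preference (GARP), the kind of rationalisability condition that underlies Varian's weak separability tests, in Python with NumPy; the three price/quantity observations are hypothetical:

import numpy as np

p = np.array([[1.0, 2.0], [2.0, 1.0], [1.5, 1.5]])  # prices, one row per period
q = np.array([[3.0, 1.0], [1.0, 3.0], [2.0, 2.0]])  # chosen bundles

C = p @ q.T                   # C[t, s] = cost of bundle s at period-t prices
own = np.diag(C).copy()       # p_t . q_t, cost of the chosen bundle
R = own[:, None] >= C         # direct revealed preference: t R0 s

for k in range(len(p)):       # Warshall's algorithm: transitive closure
    R = R | (R[:, [k]] & R[[k], :])

strict = own[:, None] > C     # strict direct preference: s P0 t
violations = R & strict.T     # GARP fails if t R s while s P0 t
print("GARP satisfied:", not violations.any())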

Using panel data, we find that as the number of goods increases, the superiority of the superlative indexes manifests itself. This is consistent with the critique of the Dutot index by Fisher (1922) and Barnett's (1980) critique of simple sum monetary aggregates.

Keywords: superlative index numbers, aggregation, preference orderings, weak separability

JEL Code: C43


PAPER 21: Sajid Chaudhry, University of Birmingham, Jane Binner, University of Birmingham, James Swofford, University of South Alabama, and Andrew Mullineux, University of Birmingham, 'SCOTLAND AS AN OPTIMUM CURRENCY AREA'

Abstract: The June UK referendum on continued EU membership, in which the people of Scotland voted to remain while the rest of the UK voted to leave, once again makes the issue of whether Scotland is an optimum currency area very topical. England voted strongly to leave, whilst Scotland backed remain by 62% to 38%. The Scottish government published its draft bill on a second independence referendum this October. The move does not mean another referendum will definitely be held, but it does raise the possibility that Scotland might choose independence, staying in the EU without the rest of the UK. If Scotland charts a course of independence from the rest of the UK, then it would likely either issue its own currency or join or form another currency area. In this paper, we test the microeconomic foundations of a common currency area for Scotland, the UK, and the rest of the UK without Scotland. We find that the UK without Scotland meets the microeconomic criteria for a common currency area, while the UK and Scotland alone have some small violations of these conditions. We also find differences between the UK-less-Scotland and Scotland economies in loan data. With respect to further research into the development of monetary aggregation theory, we recommend that this might be profitably directed towards exploring more sophisticated aggregation procedures, as suggested by Barnett (forthcoming), or towards incorporating risk-bearing assets into the money measures.

Keywords: Scottish independence, common currency areas, microeconomic foundations

PAPER 22: Victor J. Varcarcel, University of Texas at Dallas, 'INTEREST RATE PASS-THROUGH, DIVISIA USER COSTS OF MONETARY ASSETS AND THE FEDERAL FUNDS RATE'

Abstract: Evidence of substantial pass-through in short-term rates and other rates of financial and monetary assets has typically been rejected by the data. In a first, this paper investigates level and volatility transmission among the Federal Funds rate and the user costs of various monetary assets, which include both instruments of public debt (e.g. T-bills) and private debt (e.g. commercial paper). Results suggest substantial and time-varying pass-through. Higher degrees of bi-directional pass-through occur between the Federal Funds rate and the user cost of more liquid assets, both in levels and in volatilities. Federal Funds rate spillovers propagate faster onto more liquid rates as well. These findings have important implications for monetary transmission, not only across the term structure but along markets of varying liquidity.

Keywords: user cost of money, federal funds rate, volatility spillovers, VAR, monetary transmission, interest rate channel

JEL Codes: E30, E31, E65

15255 © University of Birmingham 2017. Printed on a recycled grade paper containing 100% post-consumer waste.

Edgbaston Birmingham B15 2TT United Kingdom

www.birmingham.ac.uk


Page 10: Liquidity and Economic Activity - University of Birmingham · 2017-05-22 · Financial Services Indices Liquidity and Economic Activity In honour of Oswald Distinguished Professor

10 Liquidity and Economic Activity Conference

Armed with the cost of capital reference rate and full financial balance sheet instrument scope we derive the production identity (value of output equiv cost of production) from the income equiv expense and balanced sheet identities generating a FISIM like calculation of output with a single cost of capital reference rate for each enterprise rather than for the whole economy Note that by computing specific user cost prices of bank services it is in principle possible to aggregate them into a price index for liquidity services and correspondingly obtain a quantity index of liquidity services

Such aggregate measures would be useful in tracing the financial intermediation process into GDP or another aggregate measure of economic activity

On examining the SNA-type production identity for an individual bank we find the cost side contains a term within operating surplus ndash the equity leverage premium ndash that depends on the bankrsquos financing ndash the debt and equity composition of the liability side of its balance sheet Further it is inherent in the definition of the cost of capital reference rate that the equity leverage premium is identically equal to what we will term produced liquidity within the part of SNA financial services output of the bank coming from the debt instruments on the liability side of the its

balance sheet prominent among which being deposits Given that banks transform liabilities into assets the equity-leverage premium and the produced liquidity is tied to the risk bearing undertaken by the bank With an exogenous reference rate the risk bearing is completely embedded in the user cost price of the asset or liability product In our model because the entire balance sheet is used the risk bearing is tied to equity holders

This paper proposes a resolution to the scope and methodology issues in the ongoing national accounts conversation on financial services particularly on provision of liquidity by debt issuing enterprises and suggests that the equity leverage premium now included in the nominal output of banks be offset by an intermediate insurance input supplied by their equity holders This retains the current standardsrsquo origination of liquidity with banks (but also extends it to other debt issuing enterprises) while better exposing the link between the leverage risk bearing (provision of debt guarantees) of equity holding sectors and production of liquidity by banks (and other debt issuing enterprises) With the developed framework issues such as the measurement of output and productivity of the providers of liquidity services and other financial services can also be measured and incorporated into macroeconomic statistics

11Liquidity and Economic Activity Conference

PAPER 7Jan Willem Van den End De Nederlandsche Bank the Netherlands lsquoAPPLYING COMPLEXITY THEORY TO INTEREST RATES EVIDENCE OF CRITICAL TRANSITIONS IN THE EURO AREArsquo

Abstract We apply complexity theory to financial markets to show that excess liquidity created by the Eurosystem has led to critical transitions in the configuration of interest rates Complexity indicators turn out to be useful signals of tipping points and subsequent regime shifts in interest rates We find that the critical transitions are related to the increase of excess liquidity in the euro area These insights can help central banks to strike the right balance between the intention to support the financial system by injecting liquidity and potential unintended side-effects on market functioning

Keywords interest rates central banks and their policies monetary policy

JEL Codes E43 E58 E52

12 Liquidity and Economic Activity Conference

PAPER 8Michael Bowe Alliance Manchester Business School University of Manchester and University of Vaasa Olga Kolokolova Alliance Manchester Business School University of Manchester and Marcin Michalski Alliance Manchester Business School University of Manchester lsquoTOO BIG TO CARE TOO SMALL TO MATTER MACRO FINANCIAL POLICY AND BANK LIQUIDITY CREATIONrsquo

Abstract We estimate the volume of liquidity creation by US bank holding companies between 1997 and 2015 and examine the impact of changes in macrofinancial policies on the dynamics of this process We focus on three major policy developments occurring in the aftermath of the 2007ndash09 financial crisis bank capital regulation reform monetary stimulus through quantitative easing and the Troubled Asset Relief Program (TARP)

We use the three-step procedure proposed by Berger and Bouwman (2009) to calculate the dollar amount of liquidity a financial institution creates Initially we classify all balance sheet items and o_-balance sheet activities of an institution as liquid semi-liquid or illiquid to which we then assign liquidity weights of +1=2 (illiquid assets and liquid liabilities) 0 (semi-liquid assets and liabilities) or 10485761=2 (liquid assets illiquid liabilities and equity) respectively The dollar volume of liquidity creation is then calculated as liquidity-weighted sum of the items identified in the first step We find that the total amount of liquidity creation by banks in the sample increases by a factor of 365 from $14 trillion in 1997Q1 to $51 trillion in 2015Q4 Indeed the volume of liquidity creation increases at a faster pace than the gross domestic product of the United States which rises by a factor of 21 during the same period

The results of panel regressions reveal that the dynamics of bank liquidity creation differ considerably between small and large institutions The level of bank capital requirements and the stance of monetary policy impact the liquidity creation of both small and medium-sized banks Liquidity creation of the largest banks which control over 80 of the banking systemrsquos assets remains unaffected

We find that changes in the amount of liquidity creation by small banks per $1 of their gross total assets are positively related to changes in the term spread but inversely related to changes in their Tier 1 capital ratios Further we show that the volume of liquidity creation is positively related to the riskiness of a bankrsquos assets as measured by the ratio of risk-weighted assets to gross total assets regardless of its size classification We establish that TARP has negative short-term effects on small and medium banks and no immediate impact on the liquidity creation of the largest institutions in the sample In contrast participation in TARP leads to a long-term decline in liquidity provision per dollar of assets of the largest banks This persists even after the completion of the programme and repayment of TARP funding As nearly all of the largest TARP-recipient banks in the sample are subsequently classified as systemically important financial institutions our results suggest that the increased regulatory scrutiny may adversely affect their ability to create liquidity

By demonstrating that the stance of monetary policy and the level of bank capital requirements do not tangibly enhance the liquidity provision efficiency of the largest systemically important institutions in the system our study offers important insights for the design of effective macroprudential policies

13Liquidity and Economic Activity Conference

PAPER 9 Jonathan Goldberg Federal Reserve Board lsquoTHE SUPPLY OF LIQUIDITY AND REAL ECONOMIC ACTIVITYrsquo

Abstract This paper identifies shocks to the supply of liquidity by dealer firms and investigates their effects on real economic activity First I develop a simple theoretical model of dealer intermediation then in a structural VAR model I use sign restrictions derived from the theoretical model to identify liquidity supply shocks Liquidity supply shocks that are orthogonal to information contained in macroeconomic and asset price variables have considerable predictive power for economic activity Moreover positive liquidity supply shocks cause large and persistent increases in real activity

Keywords liquidity dealer intermediation risk-taking real activity liquidity shocks

JEL Codes G10 G12 G17 G24

14 Liquidity and Economic Activity Conference

PAPER 10 Dirk Bezemer University of Groningen the Netherlands and Lu Zhang Sustainable Finance Lab and Utrecht University lsquoMACROECONOMIC IMPLICATIONS OF LIQUIDITY CREATION CREDIT ALLOCATION AND POST CRISIS RECESSIONSrsquo

Abstract In this paper we address macroeconomic implications of liquidity creation through bank lending and the impacts of liquidity on economic activity We note that liquidity created through bank lending can be channeled into the real sector in support of economic activity or in financial and real estate markets in support of capital gains We collected macro-level data on bank credit aggregates over 2000ndash12 for 57 economies categorise according to the use of credit We note the long-term shift in the allocation of bank credit creation away from non-financial business lending and towards financial and especially real estate markets We then present new evidence on the channels from credit allocation pre-crisis to the severity of post-crisis recessions

Our first contribution is to show that it is not just the level but the composition of debt (defined as the share of mortgage credit in total credit) that matters A second contribution is to analyse the channels We collect additional industry-level data across 20 industries for a subset of economies We analyze the effect of changes in the pre-crisis composition of debt on total GDP and on investment consumption and capital allocation We find that changes in the share of household mortgage credit before the crisis have a significant effect on recession

severity after the 2007 crisis This is not the case for any other credit category nor for growth of total bank credit We address the causality challenge by using the difference between IMF growth forecasts and growth realisations This filters out country-specific drivers of both debt and income growth We address the model selection challenge by using Bayesian averaging models This indicates that the change in credit composition is among the three most robust determinants of post-crisis recession severity with income levels and current account balance The findings are robust to a wide range of control variables and to the different responses across advancedemerging and EMUnon-EMU economies

We then delve into the channels from change in debt composition to income growth loss The literature to date has focused on negative wealth effects on consumption for which we find strong evidence In addition we find evidence for two investment channels a loan supply effect and a capital allocation effect In the industry-level analysis we find that in economies which experienced a larger change in debt composition before 2008 there was a larger reduction of credit available and weaker capital re-allocation towards sectors with higher value-added This effect is observed already before the crisis and very strongly after the crisis We discuss policy implications and future research

Keywords private credit mortgages crisis output loss investment capital allocation

JEL Codes C11 C15 E01 O4

15Liquidity and Economic Activity Conference

PAPER 11 Iftekhar Hasan Gabelli School of Business Fordham University and Jean-Loup Soula Strasbourg University LaRGE Research Centre lsquoTECHNICAL EFFICIENCY IN BANK LIQUIDITY CREATIONrsquo

Abstract This paper generates an optimum bank liquidity creation benchmark by tracing an efficient frontier in liquidity creation (bank intermediation) and questions why some banks are more efficient than others in such activities Evidence reveals that medium size banks are most correlated to efficient frontier irrespective of their business models Small (large) banks ndash focused on traditional banking activities ndash are found to be the most (least) efficient in creating liquidity in on-balance sheet items whereas large banks ndash involved in non-traditional activities ndash are found to be most efficient in off-balance sheet liquidity creation Additionally the liquidity efficiency of small banks is more resilient during the 2007ndash08 financial crisis relative to other banks

Keywords banks technical efficiency liquidity creation diversification

Jel Codes G21 G28 G32

16 Liquidity and Economic Activity Conference

PAPER 12 Richard Anderson Lindenwood University John Duca Federal Reserve Bank of Dallas and Barry Jones Department of Economics State University of New York lsquoA BROAD MONETARY SERVICES (LIQUIDITY) INDEX AND ITS LONG-TERM LINKS TO ECONOMIC ACTIVITYrsquo

Abstract Liquid assets play a crucial role in economic activity as the medium in which payments are received and are made lsquoSudden stopsrsquo in financial markets ndash during which liquid assets are hoarded ndash are periods when economic activity slows abruptly Further it is the sine qua non of financial intermediation to alter the measured relative and absolute quantities of liquid assets In this way the observed quantities of liquid assets reflect both the path of past economic activity and anticipations of future activity The quantities must be regarded as arising endogenously within an intertemporal general equilibrium model of the economy a la Tobin (1958) and Merton (1971) Economic modeling and analysis traditionally proceeds by combining relatively high-dimension lists of specific assets into lower-dimension lsquomonetary aggregatesrsquo The defining characteristic of the assets included is that all are available to facilitate

the exchange of goods and services at a transaction cost less than infinity That is all included assets may be sold or used as collateral for the purchase and sale of goods and services and thereby provide liquidity services which can be tracked by measured opportunity costs of foregone interest which may not in practice reflect all transactions costs The last caveat particularly applies to household holdings of mutual fund assets outside of money market funds Our study contributes to the literature in two key ways First it expands a conventional Divisia measure of money services to account for the liquidity provided by such mutual fund assets Second it then explores the long-run connections between economic activity and monetary aggregates constructed as index numbers from 1929 to 2016 finding that the inclusion of mutual fund liquidity services results in a Divisia measure of money that has a much more stable velocity

We thank Emil Mihalov and Tyler Atkinson for research assistance The views expressed are those of the authors and do not necessarily reflect those of the Federal Reserve Bank of Dallas or the Federal Reserve System Any errors are our own

17Liquidity and Economic Activity Conference

PAPER 13 John W Keating University of Kansas and A Lee Smith KC Federal Reserve BanklsquoTHE OPTIMAL MONETARY INSTRUMENT AND THE (MIS) USE OF GRANGER CAUSALITYrsquo

Abstract Is it better to use an interest rate or a monetary aggregate to conduct monetary policy Operating procedures designed around interest rates are overwhelmingly preferred by central bankers This paper re-examines the question in light of a New Keynesian DSGE (dynamic stochastic general equilibrium) model Our calibrated model is very similar to many others in the literature except we follow Belongia and Ireland (2014) by augmenting the standard framework with a simple model of financial intermediation We assume banks operate in competitive markets maximising profits subject to financial market disturbances

We use this model to examine the welfare consequences of alternative choices for monetary policy instrument For the interest rate policy we employ the specification from Clarida Gali and Gertler (2000) in which the central bank reacts to expected future output and inflation gaps A gap is defined as the percent deviation from target We compare this rule to a k-percent rule for each monetary aggregate We consider three alternative aggregates determined within our model the monetary base the simple sum measure of money and the Divisia measure

Welfare results are striking While the interest rate dominates the monetary base both simple sum and Divisia k-percent rules outperform the interest rate In fact the Divisia rule is overall best This is an interesting finding because it contradicts the view of most central bankers that interest rates are superior Note that if we replaced k-percent money growth rules with rules allowing an appropriate reaction to macroeconomic variables

as in the Clarida Gali and Gertler (2000) interest rate rule the welfare benefits of simple sum and Divisia money would be even greater

Next we study the performance of Granger Causality tests in the context of data generated from our model For this we assume the Clarida Gali and Gertler (2000) interest rate rule characterises monetary policy While it is not optimal from a welfare perspective this rule is used for normative purposes Research suggests their framework provides a fairly good description of actual Fed behaviour

For each of our four potential monetary instruments we test for Granger Causality with respect to output and then with respect to prices We find the interest rate Granger Causes both variables at extremely high significance levels The same result is obtained for monetary base Simple sum money also Granger Causes prices at a highly significant level but only causes output at the 10 level The test results for Divisia are weakest of all Divisia fails to Granger Cause output and the evidence for prices is only at the 10 level

What do we learn from this investigation First a quantity aggregate as monetary instrument may have significant welfare benefits compared to an interest rate Second a weighted monetary aggregate (Divisia) performs best of all the candidate instruments we considered Third Granger Causality tests may be a poor method for selecting the best monetary policy instrument In our model Granger Causality tests suggest at best a weak effect of Divisia on inflation and no effect output Thus if instrument choice is based solely on Granger Causality test results an inferior policy instrument will be selected

Keywords monetary policy instrument monetary aggregates Granger Causality Divisia aggregates Jel Codes C43 C32 E37 E44 E52

18 Liquidity and Economic Activity Conference

PAPER 14Jane Binner University of Birmingham Logan Kelly University of Wisconsin and Jon Tepper Nottingham Trent UniversitylsquoON THE ROBUSTNESS OF SLUGGISH STATE-BASED NEURAL NETWORKS FOR PROVIDING USEFUL INSIGHT INTO THE NEW KEYNESIAN PHILLIPS CURVErsquo

Abstract The New Keynesian Phillips curve implies that the output gap the deviation of the actual output from its natural level due to nominal rigidities drives the dynamics of inflation relative to expected inflation and lagged inflation This paper exploits the empirical success of the New Keynesian Phillips curve in explaining the USArsquos inflation dynamics using a new nonlinear model of the output gap We estimate the output gap using the artificial intelligence method of multi-recurrent neural networks (MRNs) a type of neural network that is able to establish variable sensitivity to recent and more historic temporal information through the use of layer- and self-recurrent links to form a so-called sluggish state-based memory The MRN is able to robustly latch onto structural interactions amongst inflation oil prices and real output in the USA to produce a set of interesting impulse responses We present our empirical results using monthly data spanning 1960ndash2016 and contrast our new nonlinear models of the output gap with that of traditional measures in fitting the New Keynesian Phillips curve to provide useful insights for inflation dynamics and monetary policy analysis in the USA

PAPER 15Rakesh Bissoondeeal Aston University Michael Karaglou Aston University Jane Binner University of BirminghamlsquoSTRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UKrsquo

Abstract We investigate whether or not monetary aggregates are important in determining output In addition to the official Simple Sum measure of money we employ the weighted Divisia aggregate We use data-driven procedures to identify breaks in the data We find that monetary aggregates particularly Divisia are important in determining output but the results are sensitive to the time period employed There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them The results also suggest that the recovery from the financial crisis could have been faster if money was not being hoarded

Keywords monetary aggregates Divisia IS curve structural breaks

JEL Codes C22 E52

19Liquidity and Economic Activity Conference

PAPER 16Costas Milas University of Liverpool and Michael Ellington University of LiverpoollsquoIDENTIFYING AGGREGATE LIQUIDITY SHOCKS IN CONJUNCTION WITH MONETARY POLICY SHOCKS AN APPLICATION USING UK DATArsquo

Abstract We propose a new identification scheme for aggregate liquidity shocks in harmony with conventional monetary policy shocks in a partially identified structural VAR model We fit a Bayesian time-varying parameter VAR model to UK data from 1955 to 2016 The transmission mechanism of aggregate liquidity shocks changes substantially throughout time with the magnitude of these shocks increasing during recessions We provide statistically significant evidence in favour of asymmetric contributions of these shocks during the implementation of Quantitative Easing relative to the 2008 recession Aggregate liquidity shocks explain 32 and 47 of the variance in real GDP and inflation at business cycle frequency during the Great Recession respectively Preliminary Draft Please Do Not Cite

Keywords: liquidity shocks, time-varying parameter VAR, money growth

JEL Codes: E32, E47, E52, E58

PAPER 17: Makram El Shagi, Henan University, China, and Logan Kelly, University of Wisconsin. 'STRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UK'

Abstract: We investigate whether or not monetary aggregates are important in determining output. In addition to the official Simple Sum measure of money, we employ the weighted Divisia aggregate. We use data-driven procedures to identify breaks in the data. We find that monetary aggregates, particularly Divisia, are important in determining output, but the results are sensitive to the time period employed. There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them. The results also suggest that the recovery from the financial crisis could have been faster if money had not been hoarded.

Keywords: monetary aggregates, Divisia, IS curve, structural breaks

JEL Codes: C22, E52


PAPER 18: Soumya Suvra Bhadury, National Council of Applied Economic Research (NCAER), New Delhi, India, and Taniya Ghosh, Indira Gandhi Institute of Development Research (IGIDR), Mumbai, India. 'HAS MONEY LOST ITS RELEVANCE? DETERMINING A SOLUTION TO THE EXCHANGE RATE DISCONNECT PUZZLE IN THE SMALL OPEN ECONOMIES'

Abstract: This study uses monthly data from 2000 to 2015 and an open-economy structural vector autoregression model to identify the monetary policy shock responsible for exchange rate fluctuations in the economies of India, Poland and the UK. Following Kim and Roubini (Journal of Monetary Economics 45(3): 561–586, 2000), our model is shown to fit these small open economies well, estimating theoretically correct and significant responses of prices, output and the exchange rate to a monetary policy tightening. The importance of the monetary policy shock is determined by examining the forecast error variance decomposition, impulse response functions and out-of-sample forecasts. Following Barnett (Journal of Econometrics 14 (September): 11–48, 1980), we adopt a superior monetary measure, namely the aggregation-theoretic Divisia monetary aggregate, in our model. The significance of adopting precisely measured money in the exchange rate model follows from the comparison between models with no money, simple-sum monetary aggregates and Divisia monetary measures. A bootstrap Granger causality test establishes a strong causal link from money, especially Divisia money, to the exchange rate. Additionally, the empirical results provide three important findings. The first is that the estimated responses of output, prices, money and the exchange rate to monetary policy shocks are significant in models that adopt monetary aggregates. The second is that Divisia money helps monetary policy explain more of the fluctuation of the exchange rate. The third supports the inclusion of Divisia money for better out-of-sample forecasting of the exchange rate.
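
As an illustration of the bootstrap Granger causality step, a minimal sketch in Python follows; the function and variable names are hypothetical, and the authors' actual lag selection and bootstrap scheme may differ.

import numpy as np

def granger_f_stat(y, x, lags):
    # F statistic for H0: lags of x do not help predict y, comparing
    # restricted (own lags only) and unrestricted OLS regressions.
    T = len(y)
    Y = y[lags:]
    X_r = np.column_stack([np.ones(T - lags)] +
                          [y[lags - l:T - l] for l in range(1, lags + 1)])
    X_u = np.column_stack([X_r] +
                          [x[lags - l:T - l] for l in range(1, lags + 1)])
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(X_r), rss(X_u)
    df1, df2 = lags, T - lags - X_u.shape[1]
    return ((rss_r - rss_u) / df1) / (rss_u / df2)

def bootstrap_granger(y, x, lags=4, n_boot=999, seed=0):
    # Residual bootstrap under H0: regenerate y recursively from its
    # own-lag autoregression and recompute the F statistic each draw.
    rng = np.random.default_rng(seed)
    f_obs = granger_f_stat(y, x, lags)
    T = len(y)
    X_r = np.column_stack([np.ones(T - lags)] +
                          [y[lags - l:T - l] for l in range(1, lags + 1)])
    b = np.linalg.lstsq(X_r, y[lags:], rcond=None)[0]
    resid = y[lags:] - X_r @ b
    exceed = 0
    for _ in range(n_boot):
        e = rng.choice(resid, size=T - lags, replace=True)
        y_b = y.astype(float).copy()
        for t in range(lags, T):
            y_b[t] = b[0] + sum(b[l] * y_b[t - l]
                                for l in range(1, lags + 1)) + e[t - lags]
        if granger_f_stat(y_b, x, lags) >= f_obs:
            exceed += 1
    return f_obs, (exceed + 1) / (n_boot + 1)

Here y might hold exchange rate changes and x Divisia money growth; the returned p-value is the bootstrap analogue of the usual asymptotic F-test p-value.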

Keywords: monetary policy, monetary aggregates, Divisia, structural VAR, exchange rate overshooting, liquidity puzzle, price puzzle, exchange rate disconnect puzzle, forward discount bias puzzle

JEL Codes: C32, E41, E51, E52, F31, F41, F47


PAPER 19: William A Barnett and Jinan Liu, University of Kansas. 'USER COST OF CREDIT CARD SERVICES UNDER INTERTEMPORAL NON-SEPARABILITY'

Abstract: This paper examines the user cost of monetary assets and credit card services under the assumption of intertemporal nonseparability and interest rate risk. Barnett and Su (2016) derived theory permitting the inclusion of credit card transaction services in Divisia monetary aggregates. The risk adjustment in their theory is based on CCAPM under intertemporal separability. The risk adjustment produced by their method is expected to be small, as has been the case in prior studies of CCAPM risk adjustment of asset returns. The equity premium puzzle focusses on that downward bias in the CCAPM risk adjustment. But credit card interest rates, aggregated over consumers, are much more volatile than interest rates on monetary assets. While the known downward bias of CCAPM risk adjustments is of little concern with Divisia monetary aggregates containing only low-risk monetary assets, that downward bias cannot be ignored once credit card services are included. We believe that extending to intertemporal nonseparability will provide a more plausible risk adjustment, as has been emphasised by Barnett and Wu (2015).

In this paper we extend the credit-card-augmented Divisia monetary quantity aggregates and user costs to the case of risk aversion and intertemporal nonseparability in consumption. Our results are for the 'representative consumer', aggregated over all consumers. While credit card interest-rate risk may be low for many consumers, the volatility of credit card interest rates for the representative consumer is high, as reflected by the high volatility of the Federal Reserve's data on credit card interest rates aggregated over consumers. The relationship between that volatility and the theory of aggregation over consumers under risk is beyond the scope of this research, but is a serious matter meriting future research.

To implement our theory, we introduce a pricing kernel into our model, in accordance with the approach advocated by Barnett and Wu (2015). We assume that the pricing kernel is a linear function of the rate of return on a well-diversified wealth portfolio. We find that the risk adjustment of the credit-card-services user cost to its certainty-equivalent level can be measured by its beta. That beta depends upon the covariance between the interest rates on credit card services and on the wealth portfolio of the consumer, in a manner analogous to the standard CAPM adjustment.
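
In generic CAPM-style notation (a sketch of the standard object for orientation, not the paper's exact derivation), the beta described here would take the form

\beta_c = \frac{\mathrm{Cov}(r_{c,t}, r_{w,t})}{\mathrm{Var}(r_{w,t})}

where r_{c,t} is the credit card interest rate and r_{w,t} is the return on the well-diversified wealth portfolio; the risk-adjusted user cost then deviates from its certainty-equivalent level in proportion to \beta_c.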

Credit card services' risk premiums depend on their market portfolio risk exposure, which is measured by the beta of the credit card interest rates. The larger the beta, through risk exposure to the wealth portfolio, the larger the risk adjustment. If the beta were very small, then the user-cost risk adjustment would be very small. In that case the unadjusted Divisia monetary index would be a good proxy, even with high volatility of credit card interest rates. One method of introducing intertemporal nonseparability is to assume habit formation. We explore that possibility. We are currently conducting research on empirical implementation of the theory proposed in this paper. We believe that under intertemporal nonseparability we will be able to generate a more accurate credit-card-augmented Divisia monetary index to explain the available empirical data.

Keywords: Divisia index, monetary aggregation, intertemporal nonseparability, credit card services, risk adjustment

JEL Codes: C43, D81, E03, E40, E41, E44, E51, G12


PAPER 20: Per Hjertstrand, Research Institute of Industrial Economics (Institutet för Näringslivsforskning), Stockholm; Gerald A Whitney, University of New Orleans; and James L Swofford, University of South Alabama. 'PANEL DATA TESTS OF INDEX NUMBERS AND REVEALED PREFERENCE RANKINGS'

Abstract: For weakly separable blockings of goods from panel data, we construct aggregates using index numbers. We examine how well these aggregates 'fit' the data by investigating how close they come to solving revealed preference conditions.

Samuelson (1983) discussed the relationship of index number theory to revealed preference analysis, saying 'Index number theory is shown to be merely an aspect of the theory of revealed preference…'. Revealed preference has been used to test whether data on prices and quantities can be rationalised by a well-behaved utility function that is weakly separable in some subset of goods. Weak separability allows for a separation of a subset of goods into a sub-utility or aggregator function. This is a necessary condition for the existence of an economic aggregate. The existence of an economic aggregate is a justification for the use of superlative index numbers. Superlative indexes are exact for an aggregator function that provides a second order approximation to an arbitrary aggregator function.

Varian's (1983) revealed preference conditions for weak separability do not rely on a particular functional form. The solution values to these conditions can be interpreted as levels of utility. These solution values, while not unique, are consistent with the preferences revealed by the data: if period t is preferred to period s, then the utility level assigned to t is greater than or equal to that assigned to s. Since indexes need not mirror preferences, this property is not guaranteed to hold after aggregation.

Barnett and Choi's (2008) definition spans all superlative index numbers. We consider aggregates based on two superlative index numbers, the Fisher and the Walsh, and on the non-superlative Paasche and Laspeyres indexes, which are exact for a first order approximation to an aggregator function. Because of its widespread use by central banks to construct monetary aggregates, we also consider the simple sum aggregate and the index number version of the simple sum, the Dutot index.
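
For concreteness, the bilateral indexes named here can be computed from price vectors p0, p1 and quantity vectors q0, q1 in adjacent periods roughly as follows (a minimal sketch; the paper's panel construction and separability tests are more involved):

import numpy as np

def laspeyres(p0, p1, q0, q1):
    return (p1 @ q0) / (p0 @ q0)   # base-period quantity weights

def paasche(p0, p1, q0, q1):
    return (p1 @ q1) / (p0 @ q1)   # current-period quantity weights

def fisher(p0, p1, q0, q1):
    # Superlative: geometric mean of Laspeyres and Paasche.
    return np.sqrt(laspeyres(p0, p1, q0, q1) * paasche(p0, p1, q0, q1))

def walsh(p0, p1, q0, q1):
    # Superlative: geometric-mean quantity weights.
    w = np.sqrt(q0 * q1)
    return (p1 @ w) / (p0 @ w)

def dutot(p0, p1):
    # Index number version of the simple sum: ratio of average prices.
    return p1.sum() / p0.sum()

The formulas are shown for price indexes; the corresponding quantity indexes interchange the roles of prices and quantities.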

We investigate how close aggregates come to solving the revealed preference conditions for weak separability when it is known that a solution exists. We use the revealed preference solution values to check how well the indexes reflect the preference orderings in the data: comparing the direction of change between adjacent periods, computing preference orderings implied by transitivity, and then comparing the rankings revealed by the aggregates to the rankings from the weak separability conditions. We calculate how much the aggregates need to be perturbed in order to satisfy weak separability, and test whether the aggregates and solution values are equal in distribution.

Using panel data, we find that as the number of goods increases, the superiority of the superlative indexes manifests itself. This is consistent with the critique of the Dutot index by Fisher (1922) and Barnett's (1980) critique of simple sum monetary aggregates.

Keywords: superlative index numbers, aggregation, preference orderings, weak separability

JEL Code: C43


PAPER 21: Sajid Chaudhry, University of Birmingham; Jane Binner, University of Birmingham; James Swofford, University of South Alabama; and Andrew Mullineux, University of Birmingham. 'SCOTLAND AS AN OPTIMUM CURRENCY AREA'

Abstract: The June 2016 UK referendum on continued EU membership, in which the people of Scotland voted to remain while the rest of the UK voted to leave, once again makes the issue of whether Scotland is an optimal currency area very topical. England voted to leave, whilst Scotland backed remain by 62% to 38%. The Scottish government published its draft bill on a second independence referendum in October 2016. The move does not mean another referendum will definitely be held, but it does raise the possibility that Scotland might choose independence and staying in the EU without the rest of the UK. If Scotland charts a course of independence from the rest of the UK, then it would likely either issue its own currency or join or form another currency area. In this paper we test the microeconomic foundations of a common currency area for Scotland, the UK, and the rest of the UK without Scotland. We find that the UK without Scotland meets the microeconomic criteria for a common currency area, while the UK and Scotland alone have some small violations of these conditions. We also find differences between the UK-less-Scotland and Scotland economies in loan data. With respect to further research into the development of monetary aggregation theory, we recommend that this might be profitably directed towards exploring more sophisticated aggregation procedures, as suggested by Barnett (forthcoming), or towards incorporating risk-bearing assets into the money measures.

Keywords: Scottish independence, common currency areas, microeconomic foundations

PAPER 22: Victor J Varcarcel, University of Texas at Dallas. 'INTEREST RATE PASS-THROUGH, DIVISIA USER COSTS OF MONETARY ASSETS AND THE FEDERAL FUNDS RATE'

Abstract: Evidence of substantial pass-through in short-term rates and other rates of financial and monetary assets has typically been rejected by the data. In a first, this paper investigates level and volatility transmission among the Federal Funds rate and the user costs of various monetary assets, which include instruments of both public debt (e.g. t-bills) and private debt (e.g. commercial paper). Results suggest substantial and time-varying pass-through. Higher degrees of bi-directional pass-through occur between the Federal Funds rate and the user costs of more liquid assets, both in levels and in volatilities. Federal Funds rate spillovers also propagate faster onto more liquid rates. These findings have important implications for monetary transmission, not only across the term structure but along markets of varying liquidity.
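
Level and volatility transmission of this sort is often summarised with a connectedness measure of the Diebold–Yilmaz type, computed from generalized forecast-error variance decompositions of a VAR; as background (not necessarily the author's exact metric), the total spillover index is

S(H) = 100 \times \frac{\sum_{i \neq j} \tilde{\theta}_{ij}(H)}{\sum_{i,j} \tilde{\theta}_{ij}(H)}

where \tilde{\theta}_{ij}(H) is the normalised share of the H-step forecast-error variance of variable i attributable to shocks to variable j; directional versions of the same ratio identify which rates transmit and which receive.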

Keywords: user cost of money, federal funds rate, volatility spillovers, VAR, monetary transmission, interest rate channel

JEL Codes: E30, E31, E65

15255 © University of Birmingham 2017. Printed on a recycled grade paper containing 100% post-consumer waste.

Edgbaston, Birmingham, B15 2TT, United Kingdom

www.birmingham.ac.uk



18 Liquidity and Economic Activity Conference

PAPER 14Jane Binner University of Birmingham Logan Kelly University of Wisconsin and Jon Tepper Nottingham Trent UniversitylsquoON THE ROBUSTNESS OF SLUGGISH STATE-BASED NEURAL NETWORKS FOR PROVIDING USEFUL INSIGHT INTO THE NEW KEYNESIAN PHILLIPS CURVErsquo

Abstract The New Keynesian Phillips curve implies that the output gap the deviation of the actual output from its natural level due to nominal rigidities drives the dynamics of inflation relative to expected inflation and lagged inflation This paper exploits the empirical success of the New Keynesian Phillips curve in explaining the USArsquos inflation dynamics using a new nonlinear model of the output gap We estimate the output gap using the artificial intelligence method of multi-recurrent neural networks (MRNs) a type of neural network that is able to establish variable sensitivity to recent and more historic temporal information through the use of layer- and self-recurrent links to form a so-called sluggish state-based memory The MRN is able to robustly latch onto structural interactions amongst inflation oil prices and real output in the USA to produce a set of interesting impulse responses We present our empirical results using monthly data spanning 1960ndash2016 and contrast our new nonlinear models of the output gap with that of traditional measures in fitting the New Keynesian Phillips curve to provide useful insights for inflation dynamics and monetary policy analysis in the USA

PAPER 15Rakesh Bissoondeeal Aston University Michael Karaglou Aston University Jane Binner University of BirminghamlsquoSTRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UKrsquo

Abstract We investigate whether or not monetary aggregates are important in determining output In addition to the official Simple Sum measure of money we employ the weighted Divisia aggregate We use data-driven procedures to identify breaks in the data We find that monetary aggregates particularly Divisia are important in determining output but the results are sensitive to the time period employed There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them The results also suggest that the recovery from the financial crisis could have been faster if money was not being hoarded

Keywords monetary aggregates Divisia IS curve structural breaks

JEL Codes C22 E52

19Liquidity and Economic Activity Conference

PAPER 16Costas Milas University of Liverpool and Michael Ellington University of LiverpoollsquoIDENTIFYING AGGREGATE LIQUIDITY SHOCKS IN CONJUNCTION WITH MONETARY POLICY SHOCKS AN APPLICATION USING UK DATArsquo

Abstract We propose a new identification scheme for aggregate liquidity shocks in harmony with conventional monetary policy shocks in a partially identified structural VAR model We fit a Bayesian time-varying parameter VAR model to UK data from 1955 to 2016 The transmission mechanism of aggregate liquidity shocks changes substantially throughout time with the magnitude of these shocks increasing during recessions We provide statistically significant evidence in favour of asymmetric contributions of these shocks during the implementation of Quantitative Easing relative to the 2008 recession Aggregate liquidity shocks explain 32 and 47 of the variance in real GDP and inflation at business cycle frequency during the Great Recession respectively Preliminary Draft Please Do Not Cite

Keywords liquidity shocks time-varying parameter VAR money growth

JEL Codes E32 E47 E52 E58

PAPER 17Makram El Shagi Henan University China and Logan Kelly University of WisconsinlsquoSTRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UKrsquo

Abstract We investigate whether or not monetary aggregates are important in determining output In addition to the official Simple Sum measure of money we employ the weighted Divisia aggregate We use data-driven procedures to identify breaks in the data We find that monetary aggregates particularly Divisia are important in determining output but the results are sensitive to the time period employed There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them The results also suggest that the recovery from the financial crisis could have been faster if money was not being hoarded

Keywords monetary aggregates Divisia IS curve structural breaks

JEL Codes C22 E52

20 Liquidity and Economic Activity Conference

PAPER 18Soumya Suvra Bhadury National Council of Applied Economic Research (NCAER) New Delhi India and Taniya Ghosh Indira Gandhi Institute of Development Research (IGIDR) Mumbai India lsquoHAS MONEY LOST ITS RELEVANCE DETERMINING A SOLUTION TO THE EXCHANGE RATE DISCONNECT PUZZLE IN THE SMALL OPEN ECONOMIESrsquo

Abstract This study uses monthly data from 2000 to 2015 and utilises an open-economy structural vector autoregression model to identify the monetary policy shock responsible for exchange rate fluctuations in the economies of India Poland and UK Following Kim and Roubini (J Monet Econ 45(3)561ndash586 2000) our model is shown to fit these small open economies well by estimating theoretically correct and significant responses of price output and exchange rate to the monetary policy tightening The importance of monetary policy shock is determined by examining the variance decomposition of forecast error impulse response function and out-of-sample forecast Following Barnett (J Econ 14 (September)11ndash48 1980) we adopt a superior monetary measure namely

aggregationndashtheoretic Divisia monetary aggregate in our model The significance of adopting precisely measured money in the exchange rate model follows the comparison between models with no-money simple-sum monetary aggregates and Divisia monetary measures The bootstrap Granger causality test establishes a strong causal link from money especially Divisia money to exchange rate Additionally the empirical results provide three important findings The first finding suggests that the estimated responses of output prices money and exchange rate to monetary policy shocks are significant in models that adopt monetary aggregates The second finding indicates that Divisia money facilitate monetary policy to explain more of the fluctuation of exchange rate The third result supports the inclusion of Divisia money for better out-of-sample forecasting of exchange rate

Keywords monetary policy monetary aggregates divisia structural VAR exchange rate overshooting liquidity puzzle price puzzle exchange rate disconnect puzzle forward discount bias puzzle

JEL Codes C32 E41 E51 E52 F31 F41 F47

21Liquidity and Economic Activity Conference

PAPER 19William A Barnett and Jinan Liu University of Kansas lsquoUSER COST OF CREDIT CARD SERVICES UNDER INTERTEMPORAL NON-SEPARABILITYrsquo

Abstract This paper examines the user cost of monetary assets and credit card services under the assumption of intertemporal nonseparability and interest rate risk Barnett and Su (2016) derived theory permitting inclusion of credit card transaction services into Divisia monetary aggregates The risk adjustment in their theory is based on CCAPM under intertemporal separability The risk adjustment by their method is expected to be small as has been the case in prior studies of CCAPM risk adjustment of asset returns The equity premium puzzle focusses on that downward bias in the CCAPM risk adjustment But credit card interest rates aggregated over consumers are much more volatile than interest rates on monetary assets While the known downward bias of CCAPM risk adjustments are of little concern with Divisia monetary aggregates containing only low-risk monetary assets that downward bias cannot be ignored once credit card services are included We believe that extending to intertemporal nonseparability will provide a more plausible risk adjustment as has been emphasised by Barnett and Wu (2015)

In this paper we extend the credit-card-augmented Divisia monetary quantity aggregates and user costs to the case of risk aversion and intertemporal nonseparability in consumption Our results are for the lsquorepresentative consumerrsquo aggregated over all consumers While credit card interest-rate risk may be low for many consumers the volatility of credit card interest rates for the representative consumer is high as reflected by the high volatility of the Federal Reserversquos data on credit card interest rates aggregated over consumers The relationship between that volatility and the theory of aggregation over consumers under risk is beyond the scope

of this research but is a serious matter meriting future research

To implement our theory we introduce a pricing kernel into our model in accordance with the approach advocated by Barnett and Wu (2015) We assume that the pricing kernel is a linear function of the rate of return on a well-diversified wealth portfolio We find that the risk adjustment of the credit-card-services user cost to its certainty equivalence level can be measured by its beta That beta depends upon the covariance between the interest rates on credit card services and on the wealth portfolio of the consumer in a manner analogous to the standard CAPM adjustment

Credit card servicesrsquo risk premiums depend on their market portfolio risk exposure which is measured by the beta of the credit card interest rates The larger the beta through risk exposure to the wealth portfolio the larger the risk adjustment If the beta were very small then the user-cost risk adjustment would be very small In that case the unadjusted Divisia monetary index would be a good proxy even with high volatility of credit card interest rates One method of introducing intertemporal nonseparability is to assume habit formation We explore that possibility We are currently conducting research on empirical implementation of the theory proposed in this paper We believe that under intertemporal nonseparablity we will be able to generate a more accurate credit-card-augmented Divisia monetary index to explain the available empirical data

Keywords divisia index monetary aggregation intertemporal nonseparability credit card services risk adjustment

JEL Codes C43 D81 E03 E40 E41 E44 E51 G12

22 Liquidity and Economic Activity Conference

PAPER 20Per Hjertstrand Research Institute of Industrial EconomicsInstitutet foumlr Naumlringslivsforskning Stockholm Gerald A Whitney University of New Orleans and James L Swofford University of South AlabamalsquoPANEL DATA TESTS OF INDEX NUMBERS AND REVEALED PREFERENCE RANKINGSrsquo

Abstract For weakly separable blockings of goods from panel data we construct aggregates using index numbers We examine how well these aggregates ldquofitrdquo the data by investigating how close they come to solving revealed preference conditions

Samuelson (1983) discussed the relationship of index number theory to revealed preference analysis saying lsquoIndex number theory is shown to be merely an aspect of the theory of revealed preferencehelliprsquo Revealed preference has been used to test whether data on prices and quantities can be rationalised by a well-behaved utility function that is weakly separable in some subset of two goods Weak separability allows for a separation of a subset of goods into a sub-utility or aggregator function This is a necessary condition for the existence of an economic aggregate The existence of an economic aggregate is a justification for the use of superlative index numbers Superlative indexes are exact if they can provide a second order approximation to a particular aggregator function

Varianrsquos (1983) revealed preference conditions for weak separability do not rely on a particular functional form The solution values to these conditions can be interpreted as levels of utility These solution values while not unique are consistent with the preferences revealed by the data If period t is preferred to period s then the utility level assigned to t is greater or equal to that assigned to s Since indexes need not mirror

preferences this property is not guaranteed to hold after aggregation

Barnett and Choirsquos (2008) definition spans all superlative index numbers We consider aggregates based on two superlative index numbers the Fisher and Walsh and the non-superlative Paasche and Laspeyres indexes that are exact for a first order approximation to an aggregator function Because of its wide-spread use by central banks to construct monetary aggregates we also consider the simple sum aggregate and the index number version of the simple sum the Dutot index

We investigate how close aggregates come to solving the revealed preference conditions for weak separability when it is known that a solution exists We use the revealed preference solution values to check how well the indexes reflect the preference orderings in the data by comparing the direction of change between adjacent periods computing preference orderings implied by transitivity and then comparing the rankings revealed by the aggregates to the rankings from the weak separability conditions We calculate how much the aggregates need to be perturbed in order to satisfy weak separability and test whether the aggregates and solution values are equal in distribution

Using panel data we find as the number of goods increases the superiority of the superlative indexes manifests themselves This is consistent with the critique of the Dutot index by Fisher (1922) and Barnettrsquos (1980) critique of simple sum monetary aggregates

Keywords superlative index numbers aggregation preference orderings weak separability

Jel Code C43

23Liquidity and Economic Activity Conference

PAPER 21Sajid Chaudhry University of Birmingham Jane Binner University of Birmingham James Swofford University of South Alabama and Andrew Mullineux University of BirminghamlsquoSCOTLAND AS AN OPTIMUM CURRENCY AREArsquo

Abstract The June UK referendum on continued EU membership where the people of Scotland voted to remain while the rest of the UK voted to leave once again makes the issue of whether Scotland is an optimal currency area very topical England voted strongly to remain in Europe whilst Scotland backed remain by 62 to 38 The Scottish government published its draft bill on a second independence referendum this October The move does not mean another referendum will definitely be held but this does raise the possibility that Scotland might choose independence and staying in the EU without the rest of the UK If Scotland charts a course of independence from the rest of the UK then they would likely either issue their own currency or join or form another currency area In this paper we test the microeconomic foundations of a common currency area for Scotland UK and the rest of the UK without Scotland We find that the UK without Scotland meets the microeconomic criteria for a common currency area while the UK and Scotland alone have some small violations of these conditions We also find differences in the UK less Scotland and Scotland economies in loan data With respect to further research into the development of the monetary aggregation theory we recommend that this might be profitably directed towards exploring more sophisticated aggregation procedures as suggested by Barnett (forthcoming) or towards incorporating risk bearing assets into the money measures

Keywords Scottish independence common currency areas microeconomic foundations

PAPER 22Victor J Varcarcel University of Texas at DallaslsquoINTEREST RATE PASS-THROUGH DIVISIA USER COSTS OF MONETARY ASSETS AND THE FEDERAL FUNDS RATErsquo

Abstract Evidence of substantial pass-through in short-term rates and other rates of financial and monetary assets has been typically rejected by the data In a first this paper investigates level and volatility transmission among the Federal Funds rate and the user costs of various monetary assets which include both instruments of public debt (eg t-bills) and private debt (eg commercial paper) Results suggest substantial and time varying pass-through Higher degrees of bi-directional pass-through occurs between the Federal Funds rate and the user cost of more liquid assets both in levels and volatilities Federal Funds rate spillovers propagate faster onto more liquid rates as well These findings have important implications for monetary transmission not only across the term structure but along markets of varying liquidity

Keywords user cost of money federal funds rate volatility spillovers VAR monetary transmission interest rate channel

JEL Codes E30 E31 E65

1525

5 copy

Uni

vers

ity o

f Birm

ingh

am 2

017

Prin

ted

on a

recy

cled

gra

de p

aper

con

tain

ing

100

pos

t-co

nsum

er w

aste

Edgbaston Birmingham B15 2TT United Kingdom

wwwbirminghamacuk

Designed and printed by

lsquoTriple-crownrsquo accredited

Page 12: Liquidity and Economic Activity - University of Birmingham · 2017-05-22 · Financial Services Indices Liquidity and Economic Activity In honour of Oswald Distinguished Professor

12 Liquidity and Economic Activity Conference

PAPER 8Michael Bowe Alliance Manchester Business School University of Manchester and University of Vaasa Olga Kolokolova Alliance Manchester Business School University of Manchester and Marcin Michalski Alliance Manchester Business School University of Manchester lsquoTOO BIG TO CARE TOO SMALL TO MATTER MACRO FINANCIAL POLICY AND BANK LIQUIDITY CREATIONrsquo

Abstract We estimate the volume of liquidity creation by US bank holding companies between 1997 and 2015 and examine the impact of changes in macrofinancial policies on the dynamics of this process We focus on three major policy developments occurring in the aftermath of the 2007ndash09 financial crisis bank capital regulation reform monetary stimulus through quantitative easing and the Troubled Asset Relief Program (TARP)

We use the three-step procedure proposed by Berger and Bouwman (2009) to calculate the dollar amount of liquidity a financial institution creates Initially we classify all balance sheet items and o_-balance sheet activities of an institution as liquid semi-liquid or illiquid to which we then assign liquidity weights of +1=2 (illiquid assets and liquid liabilities) 0 (semi-liquid assets and liabilities) or 10485761=2 (liquid assets illiquid liabilities and equity) respectively The dollar volume of liquidity creation is then calculated as liquidity-weighted sum of the items identified in the first step We find that the total amount of liquidity creation by banks in the sample increases by a factor of 365 from $14 trillion in 1997Q1 to $51 trillion in 2015Q4 Indeed the volume of liquidity creation increases at a faster pace than the gross domestic product of the United States which rises by a factor of 21 during the same period

The results of panel regressions reveal that the dynamics of bank liquidity creation differ considerably between small and large institutions The level of bank capital requirements and the stance of monetary policy impact the liquidity creation of both small and medium-sized banks Liquidity creation of the largest banks which control over 80 of the banking systemrsquos assets remains unaffected

We find that changes in the amount of liquidity creation by small banks per $1 of their gross total assets are positively related to changes in the term spread but inversely related to changes in their Tier 1 capital ratios Further we show that the volume of liquidity creation is positively related to the riskiness of a bankrsquos assets as measured by the ratio of risk-weighted assets to gross total assets regardless of its size classification We establish that TARP has negative short-term effects on small and medium banks and no immediate impact on the liquidity creation of the largest institutions in the sample In contrast participation in TARP leads to a long-term decline in liquidity provision per dollar of assets of the largest banks This persists even after the completion of the programme and repayment of TARP funding As nearly all of the largest TARP-recipient banks in the sample are subsequently classified as systemically important financial institutions our results suggest that the increased regulatory scrutiny may adversely affect their ability to create liquidity

By demonstrating that the stance of monetary policy and the level of bank capital requirements do not tangibly enhance the liquidity provision efficiency of the largest systemically important institutions in the system our study offers important insights for the design of effective macroprudential policies

13Liquidity and Economic Activity Conference

PAPER 9 Jonathan Goldberg Federal Reserve Board lsquoTHE SUPPLY OF LIQUIDITY AND REAL ECONOMIC ACTIVITYrsquo

Abstract This paper identifies shocks to the supply of liquidity by dealer firms and investigates their effects on real economic activity First I develop a simple theoretical model of dealer intermediation then in a structural VAR model I use sign restrictions derived from the theoretical model to identify liquidity supply shocks Liquidity supply shocks that are orthogonal to information contained in macroeconomic and asset price variables have considerable predictive power for economic activity Moreover positive liquidity supply shocks cause large and persistent increases in real activity

Keywords liquidity dealer intermediation risk-taking real activity liquidity shocks

JEL Codes G10 G12 G17 G24

14 Liquidity and Economic Activity Conference

PAPER 10 Dirk Bezemer University of Groningen the Netherlands and Lu Zhang Sustainable Finance Lab and Utrecht University lsquoMACROECONOMIC IMPLICATIONS OF LIQUIDITY CREATION CREDIT ALLOCATION AND POST CRISIS RECESSIONSrsquo

Abstract In this paper we address macroeconomic implications of liquidity creation through bank lending and the impacts of liquidity on economic activity We note that liquidity created through bank lending can be channeled into the real sector in support of economic activity or in financial and real estate markets in support of capital gains We collected macro-level data on bank credit aggregates over 2000ndash12 for 57 economies categorise according to the use of credit We note the long-term shift in the allocation of bank credit creation away from non-financial business lending and towards financial and especially real estate markets We then present new evidence on the channels from credit allocation pre-crisis to the severity of post-crisis recessions

Our first contribution is to show that it is not just the level but the composition of debt (defined as the share of mortgage credit in total credit) that matters A second contribution is to analyse the channels We collect additional industry-level data across 20 industries for a subset of economies We analyze the effect of changes in the pre-crisis composition of debt on total GDP and on investment consumption and capital allocation We find that changes in the share of household mortgage credit before the crisis have a significant effect on recession

severity after the 2007 crisis This is not the case for any other credit category nor for growth of total bank credit We address the causality challenge by using the difference between IMF growth forecasts and growth realisations This filters out country-specific drivers of both debt and income growth We address the model selection challenge by using Bayesian averaging models This indicates that the change in credit composition is among the three most robust determinants of post-crisis recession severity with income levels and current account balance The findings are robust to a wide range of control variables and to the different responses across advancedemerging and EMUnon-EMU economies

We then delve into the channels from change in debt composition to income growth loss The literature to date has focused on negative wealth effects on consumption for which we find strong evidence In addition we find evidence for two investment channels a loan supply effect and a capital allocation effect In the industry-level analysis we find that in economies which experienced a larger change in debt composition before 2008 there was a larger reduction of credit available and weaker capital re-allocation towards sectors with higher value-added This effect is observed already before the crisis and very strongly after the crisis We discuss policy implications and future research

Keywords private credit mortgages crisis output loss investment capital allocation

JEL Codes C11 C15 E01 O4

15Liquidity and Economic Activity Conference

PAPER 11 Iftekhar Hasan Gabelli School of Business Fordham University and Jean-Loup Soula Strasbourg University LaRGE Research Centre lsquoTECHNICAL EFFICIENCY IN BANK LIQUIDITY CREATIONrsquo

Abstract This paper generates an optimum bank liquidity creation benchmark by tracing an efficient frontier in liquidity creation (bank intermediation) and questions why some banks are more efficient than others in such activities Evidence reveals that medium size banks are most correlated to efficient frontier irrespective of their business models Small (large) banks ndash focused on traditional banking activities ndash are found to be the most (least) efficient in creating liquidity in on-balance sheet items whereas large banks ndash involved in non-traditional activities ndash are found to be most efficient in off-balance sheet liquidity creation Additionally the liquidity efficiency of small banks is more resilient during the 2007ndash08 financial crisis relative to other banks

Keywords banks technical efficiency liquidity creation diversification

Jel Codes G21 G28 G32

16 Liquidity and Economic Activity Conference

PAPER 12 Richard Anderson Lindenwood University John Duca Federal Reserve Bank of Dallas and Barry Jones Department of Economics State University of New York lsquoA BROAD MONETARY SERVICES (LIQUIDITY) INDEX AND ITS LONG-TERM LINKS TO ECONOMIC ACTIVITYrsquo

Abstract Liquid assets play a crucial role in economic activity as the medium in which payments are received and are made lsquoSudden stopsrsquo in financial markets ndash during which liquid assets are hoarded ndash are periods when economic activity slows abruptly Further it is the sine qua non of financial intermediation to alter the measured relative and absolute quantities of liquid assets In this way the observed quantities of liquid assets reflect both the path of past economic activity and anticipations of future activity The quantities must be regarded as arising endogenously within an intertemporal general equilibrium model of the economy a la Tobin (1958) and Merton (1971) Economic modeling and analysis traditionally proceeds by combining relatively high-dimension lists of specific assets into lower-dimension lsquomonetary aggregatesrsquo The defining characteristic of the assets included is that all are available to facilitate

the exchange of goods and services at a transaction cost less than infinity That is all included assets may be sold or used as collateral for the purchase and sale of goods and services and thereby provide liquidity services which can be tracked by measured opportunity costs of foregone interest which may not in practice reflect all transactions costs The last caveat particularly applies to household holdings of mutual fund assets outside of money market funds Our study contributes to the literature in two key ways First it expands a conventional Divisia measure of money services to account for the liquidity provided by such mutual fund assets Second it then explores the long-run connections between economic activity and monetary aggregates constructed as index numbers from 1929 to 2016 finding that the inclusion of mutual fund liquidity services results in a Divisia measure of money that has a much more stable velocity

We thank Emil Mihalov and Tyler Atkinson for research assistance. The views expressed are those of the authors and do not necessarily reflect those of the Federal Reserve Bank of Dallas or the Federal Reserve System. Any errors are our own.

PAPER 13 John W Keating, University of Kansas, and A Lee Smith, Federal Reserve Bank of Kansas City, 'THE OPTIMAL MONETARY INSTRUMENT AND THE (MIS)USE OF GRANGER CAUSALITY'

Abstract: Is it better to use an interest rate or a monetary aggregate to conduct monetary policy? Operating procedures designed around interest rates are overwhelmingly preferred by central bankers. This paper re-examines the question in light of a New Keynesian DSGE (dynamic stochastic general equilibrium) model. Our calibrated model is very similar to many others in the literature, except that we follow Belongia and Ireland (2014) by augmenting the standard framework with a simple model of financial intermediation. We assume banks operate in competitive markets, maximising profits subject to financial market disturbances.

We use this model to examine the welfare consequences of alternative choices for the monetary policy instrument. For the interest rate policy, we employ the specification from Clarida, Gali and Gertler (2000), in which the central bank reacts to expected future output and inflation gaps. A gap is defined as the percent deviation from target. We compare this rule to a k-percent rule for each monetary aggregate. We consider three alternative aggregates determined within our model: the monetary base, the simple sum measure of money and the Divisia measure.
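
For reference, a forward-looking rule of the Clarida–Gali–Gertler type can be written as follows (a stylised textbook rendering; the coefficient symbols are illustrative, not taken from the paper):

$$ i_t = \rho\, i_{t-1} + (1-\rho)\big[ \phi_\pi\, \mathbb{E}_t \pi_{t+1} + \phi_x\, \mathbb{E}_t x_{t+1} \big], $$

while a k-percent rule for any of the three aggregates simply fixes money growth, $\Delta \ln M_t = k$.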

Welfare results are striking. While the interest rate dominates the monetary base, both simple sum and Divisia k-percent rules outperform the interest rate. In fact, the Divisia rule is overall best. This is an interesting finding because it contradicts the view of most central bankers that interest rates are superior. Note that if we replaced k-percent money growth rules with rules allowing an appropriate reaction to macroeconomic variables, as in the Clarida, Gali and Gertler (2000) interest rate rule, the welfare benefits of simple sum and Divisia money would be even greater.

Next, we study the performance of Granger Causality tests in the context of data generated from our model. For this exercise, we assume the Clarida, Gali and Gertler (2000) interest rate rule characterises monetary policy. While it is not optimal from a welfare perspective, this rule is used for normative purposes; research suggests their framework provides a fairly good description of actual Fed behaviour.

For each of our four potential monetary instruments, we test for Granger Causality with respect to output and then with respect to prices. We find the interest rate Granger-causes both variables at extremely high significance levels. The same result is obtained for the monetary base. Simple sum money also Granger-causes prices at a highly significant level, but only causes output at the 10% level. The test results for Divisia are the weakest of all: Divisia fails to Granger-cause output, and the evidence for prices is only at the 10% level.

What do we learn from this investigation? First, a quantity aggregate as monetary instrument may have significant welfare benefits compared to an interest rate. Second, a weighted monetary aggregate (Divisia) performs best of all the candidate instruments we considered. Third, Granger Causality tests may be a poor method for selecting the best monetary policy instrument. In our model, Granger Causality tests suggest at best a weak effect of Divisia on inflation and no effect on output. Thus, if instrument choice is based solely on Granger Causality test results, an inferior policy instrument will be selected.
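
A bivariate Granger causality test of the kind run here is available in statsmodels; the sketch below uses simulated placeholder series rather than the model-generated data of the paper:

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
T = 200
money = rng.normal(size=T).cumsum()                      # stand-in instrument series
output = 0.4 * np.roll(money, 2) + rng.normal(size=T)    # output responds with a lag
output[:2] = rng.normal(size=2)                          # overwrite wrapped-around values

# Column order matters: the test asks whether column 2 Granger-causes column 1
data = np.column_stack([np.diff(output), np.diff(money)])
results = grangercausalitytests(data, maxlag=4)
```

Each lag order reports F and chi-squared statistics; in the paper, the analogous tests are run between each candidate instrument and output or prices.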

Keywords: monetary policy instrument, monetary aggregates, Granger Causality, Divisia aggregates
JEL Codes: C43, C32, E37, E44, E52

PAPER 14 Jane Binner, University of Birmingham, Logan Kelly, University of Wisconsin, and Jon Tepper, Nottingham Trent University, 'ON THE ROBUSTNESS OF SLUGGISH STATE-BASED NEURAL NETWORKS FOR PROVIDING USEFUL INSIGHT INTO THE NEW KEYNESIAN PHILLIPS CURVE'

Abstract: The New Keynesian Phillips curve implies that the output gap – the deviation of actual output from its natural level due to nominal rigidities – drives the dynamics of inflation relative to expected inflation and lagged inflation. This paper exploits the empirical success of the New Keynesian Phillips curve in explaining the USA's inflation dynamics using a new nonlinear model of the output gap. We estimate the output gap using the artificial intelligence method of multi-recurrent neural networks (MRNs), a type of neural network that is able to establish variable sensitivity to recent and more historic temporal information through the use of layer- and self-recurrent links to form a so-called sluggish state-based memory. The MRN is able to robustly latch onto structural interactions amongst inflation, oil prices and real output in the USA to produce a set of interesting impulse responses. We present our empirical results using monthly data spanning 1960–2016, and contrast our new nonlinear models of the output gap with traditional measures in fitting the New Keynesian Phillips curve, to provide useful insights for inflation dynamics and monetary policy analysis in the USA.
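
The 'sluggish state' mechanism can be illustrated with a minimal recurrent update in which the memory bank is a leaky mixture of past states. This is a sketch of the idea only, not the authors' implementation; the weights and the leak rate alpha are made up:

```python
import numpy as np

def sluggish_state_step(x_t, s_prev, W_in, W_rec, alpha=0.8):
    """One step of a recurrent layer with a sluggish state bank.

    The state is mostly yesterday's state (alpha) plus a little of the
    new hidden activation (1 - alpha), so information decays slowly and
    the network stays sensitive to more historic inputs.
    """
    h_t = np.tanh(W_in @ x_t + W_rec @ s_prev)    # hidden activation
    s_t = alpha * s_prev + (1.0 - alpha) * h_t    # sluggish memory update
    return h_t, s_t

rng = np.random.default_rng(1)
n_in, n_hid = 3, 8    # e.g. inflation, oil prices, real output
W_in = rng.normal(scale=0.3, size=(n_hid, n_in))
W_rec = rng.normal(scale=0.3, size=(n_hid, n_hid))
s = np.zeros(n_hid)
for x in rng.normal(size=(5, n_in)):              # five periods of fake data
    h, s = sluggish_state_step(x, s, W_in, W_rec)
```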

PAPER 15 Rakesh Bissoondeeal, Aston University, Michael Karaglou, Aston University, and Jane Binner, University of Birmingham, 'STRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UK'

Abstract: We investigate whether or not monetary aggregates are important in determining output. In addition to the official Simple Sum measure of money, we employ the weighted Divisia aggregate. We use data-driven procedures to identify breaks in the data. We find that monetary aggregates, particularly Divisia, are important in determining output, but the results are sensitive to the time period employed. There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them. The results also suggest that the recovery from the financial crisis could have been faster if money had not been hoarded.
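
The abstract does not name the break-detection procedure; Bai–Perron-type tests are the usual choice. As a rough stand-in, a change-point package such as ruptures can locate mean shifts in a series (illustrative only):

```python
import numpy as np
import ruptures as rpt

rng = np.random.default_rng(2)
# Hypothetical output-growth series with a mean shift two-thirds through
y = np.concatenate([rng.normal(0.5, 1.0, 120), rng.normal(-0.3, 1.0, 60)])

algo = rpt.Binseg(model="l2").fit(y)    # binary segmentation on the mean
breakpoints = algo.predict(n_bkps=1)    # indices of the estimated break(s)
print(breakpoints)
```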

Keywords: monetary aggregates, Divisia, IS curve, structural breaks

JEL Codes: C22, E52

PAPER 16 Costas Milas, University of Liverpool, and Michael Ellington, University of Liverpool, 'IDENTIFYING AGGREGATE LIQUIDITY SHOCKS IN CONJUNCTION WITH MONETARY POLICY SHOCKS: AN APPLICATION USING UK DATA'

Abstract: We propose a new identification scheme for aggregate liquidity shocks, in harmony with conventional monetary policy shocks, in a partially identified structural VAR model. We fit a Bayesian time-varying parameter VAR model to UK data from 1955 to 2016. The transmission mechanism of aggregate liquidity shocks changes substantially over time, with the magnitude of these shocks increasing during recessions. We provide statistically significant evidence in favour of asymmetric contributions of these shocks during the implementation of Quantitative Easing relative to the 2008 recession. Aggregate liquidity shocks explain 32% and 47% of the variance in real GDP and inflation, respectively, at business cycle frequency during the Great Recession. Preliminary draft: please do not cite.
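
A time-varying parameter VAR of the kind fitted here is usually written with random-walk coefficient drift (a generic formulation, not the authors' exact specification):

$$ y_t = X_t' \beta_t + \varepsilon_t, \qquad \varepsilon_t \sim N(0, \Sigma_t), $$
$$ \beta_t = \beta_{t-1} + \nu_t, \qquad \nu_t \sim N(0, Q), $$

where $X_t$ stacks lags of $y_t$, and stochastic volatility in $\Sigma_t$ allows shock magnitudes to rise in recessions.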

Keywords: liquidity shocks, time-varying parameter VAR, money growth

JEL Codes: E32, E47, E52, E58

PAPER 17 Makram El Shagi, Henan University, China, and Logan Kelly, University of Wisconsin, 'STRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UK'

Abstract: We investigate whether or not monetary aggregates are important in determining output. In addition to the official Simple Sum measure of money, we employ the weighted Divisia aggregate. We use data-driven procedures to identify breaks in the data. We find that monetary aggregates, particularly Divisia, are important in determining output, but the results are sensitive to the time period employed. There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them. The results also suggest that the recovery from the financial crisis could have been faster if money had not been hoarded.

Keywords: monetary aggregates, Divisia, IS curve, structural breaks

JEL Codes: C22, E52

PAPER 18 Soumya Suvra Bhadury, National Council of Applied Economic Research (NCAER), New Delhi, India, and Taniya Ghosh, Indira Gandhi Institute of Development Research (IGIDR), Mumbai, India, 'HAS MONEY LOST ITS RELEVANCE? DETERMINING A SOLUTION TO THE EXCHANGE RATE DISCONNECT PUZZLE IN THE SMALL OPEN ECONOMIES'

Abstract: This study uses monthly data from 2000 to 2015 and utilises an open-economy structural vector autoregression model to identify the monetary policy shock responsible for exchange rate fluctuations in the economies of India, Poland and the UK. Following Kim and Roubini (J Monet Econ, 45(3), 561–586, 2000), our model is shown to fit these small open economies well, estimating theoretically correct and significant responses of price, output and the exchange rate to a monetary policy tightening. The importance of the monetary policy shock is determined by examining the forecast error variance decomposition, the impulse response functions and out-of-sample forecasts. Following Barnett (J Econ, 14 (September), 11–48, 1980), we adopt a superior monetary measure, namely the aggregation-theoretic Divisia monetary aggregate, in our model. The significance of adopting precisely measured money in the exchange rate model follows from comparing models with no money, with simple-sum monetary aggregates and with Divisia monetary measures. A bootstrap Granger causality test establishes a strong causal link from money, especially Divisia money, to the exchange rate. Additionally, the empirical results provide three important findings. The first is that the estimated responses of output, prices, money and the exchange rate to monetary policy shocks are significant in models that adopt monetary aggregates. The second is that Divisia money helps monetary policy explain more of the fluctuation in the exchange rate. The third supports the inclusion of Divisia money for better out-of-sample forecasting of the exchange rate.

Keywords: monetary policy, monetary aggregates, Divisia, structural VAR, exchange rate overshooting, liquidity puzzle, price puzzle, exchange rate disconnect puzzle, forward discount bias puzzle

JEL Codes: C32, E41, E51, E52, F31, F41, F47

PAPER 19 William A Barnett and Jinan Liu, University of Kansas, 'USER COST OF CREDIT CARD SERVICES UNDER INTERTEMPORAL NON-SEPARABILITY'

Abstract: This paper examines the user cost of monetary assets and credit card services under the assumption of intertemporal nonseparability and interest rate risk. Barnett and Su (2016) derived theory permitting the inclusion of credit card transaction services in Divisia monetary aggregates. The risk adjustment in their theory is based on CCAPM under intertemporal separability. The risk adjustment produced by their method is expected to be small, as has been the case in prior studies of CCAPM risk adjustment of asset returns. The equity premium puzzle focusses on that downward bias in the CCAPM risk adjustment. But credit card interest rates, aggregated over consumers, are much more volatile than interest rates on monetary assets. While the known downward bias of CCAPM risk adjustments is of little concern with Divisia monetary aggregates containing only low-risk monetary assets, that downward bias cannot be ignored once credit card services are included. We believe that extending to intertemporal nonseparability will provide a more plausible risk adjustment, as has been emphasised by Barnett and Wu (2015).

In this paper, we extend the credit-card-augmented Divisia monetary quantity aggregates and user costs to the case of risk aversion and intertemporal nonseparability in consumption. Our results are for the 'representative consumer', aggregated over all consumers. While credit card interest-rate risk may be low for many consumers, the volatility of credit card interest rates for the representative consumer is high, as reflected by the high volatility of the Federal Reserve's data on credit card interest rates aggregated over consumers. The relationship between that volatility and the theory of aggregation over consumers under risk is beyond the scope of this research, but is a serious matter meriting future research.

To implement our theory, we introduce a pricing kernel into our model, in accordance with the approach advocated by Barnett and Wu (2015). We assume that the pricing kernel is a linear function of the rate of return on a well-diversified wealth portfolio. We find that the risk adjustment of the credit-card-services user cost to its certainty-equivalence level can be measured by its beta. That beta depends upon the covariance between the interest rates on credit card services and on the wealth portfolio of the consumer, in a manner analogous to the standard CAPM adjustment.

Credit card services' risk premiums depend on their market portfolio risk exposure, which is measured by the beta of the credit card interest rates. The larger the beta, through risk exposure to the wealth portfolio, the larger the risk adjustment. If the beta were very small, then the user-cost risk adjustment would be very small; in that case, the unadjusted Divisia monetary index would be a good proxy, even with high volatility of credit card interest rates. One method of introducing intertemporal nonseparability is to assume habit formation; we explore that possibility. We are currently conducting research on the empirical implementation of the theory proposed in this paper. We believe that, under intertemporal nonseparability, we will be able to generate a more accurate credit-card-augmented Divisia monetary index to explain the available empirical data.
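
The beta described here takes the usual CAPM form. Below is a minimal sketch, with simulated series standing in for credit card and wealth-portfolio returns, and with phi an assumed market price of risk (purely illustrative, not the authors' calibration):

```python
import numpy as np

rng = np.random.default_rng(3)
r_wealth = rng.normal(0.005, 0.02, 240)                    # wealth portfolio returns
r_cc = 0.015 + 1.4 * r_wealth + rng.normal(0, 0.03, 240)   # volatile credit card rate

# CAPM-style beta: covariance with the wealth portfolio over its variance
beta = np.cov(r_cc, r_wealth)[0, 1] / np.var(r_wealth, ddof=1)

phi = 0.5                                  # assumed price of risk (illustrative)
risk_adjustment = beta * phi * np.var(r_wealth, ddof=1)
print(beta, risk_adjustment)
```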

Keywords: Divisia index, monetary aggregation, intertemporal nonseparability, credit card services, risk adjustment

JEL Codes: C43, D81, E03, E40, E41, E44, E51, G12

PAPER 20 Per Hjertstrand, Research Institute of Industrial Economics (Institutet för Näringslivsforskning), Stockholm, Gerald A Whitney, University of New Orleans, and James L Swofford, University of South Alabama, 'PANEL DATA TESTS OF INDEX NUMBERS AND REVEALED PREFERENCE RANKINGS'

Abstract: For weakly separable blockings of goods from panel data, we construct aggregates using index numbers. We examine how well these aggregates 'fit' the data by investigating how close they come to solving revealed preference conditions.

Samuelson (1983) discussed the relationship of index number theory to revealed preference analysis, saying 'Index number theory is shown to be merely an aspect of the theory of revealed preference…'. Revealed preference has been used to test whether data on prices and quantities can be rationalised by a well-behaved utility function that is weakly separable in some subset of goods. Weak separability allows for a separation of a subset of goods into a sub-utility, or aggregator, function. This is a necessary condition for the existence of an economic aggregate. The existence of an economic aggregate is a justification for the use of superlative index numbers. Superlative indexes are exact if they can provide a second order approximation to a particular aggregator function.

Varian's (1983) revealed preference conditions for weak separability do not rely on a particular functional form. The solution values to these conditions can be interpreted as levels of utility. These solution values, while not unique, are consistent with the preferences revealed by the data: if period t is preferred to period s, then the utility level assigned to t is greater than or equal to that assigned to s. Since indexes need not mirror preferences, this property is not guaranteed to hold after aggregation.

Barnett and Choi's (2008) definition spans all superlative index numbers. We consider aggregates based on two superlative index numbers, the Fisher and the Walsh, and on the non-superlative Paasche and Laspeyres indexes, which are exact for a first order approximation to an aggregator function. Because of its widespread use by central banks to construct monetary aggregates, we also consider the simple sum aggregate and the index number version of the simple sum, the Dutot index.
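
For concreteness, the bilateral versions of these indexes are easy to state in code (standard price index formulas, applied to illustrative data):

```python
import numpy as np

def bilateral_indexes(p0, p1, q0, q1):
    """Standard bilateral price indexes between periods 0 and 1."""
    p0, p1, q0, q1 = (np.asarray(v, dtype=float) for v in (p0, p1, q0, q1))
    laspeyres = (p1 @ q0) / (p0 @ q0)       # base-period quantity weights
    paasche = (p1 @ q1) / (p0 @ q1)         # current-period quantity weights
    fisher = np.sqrt(laspeyres * paasche)   # superlative
    w = np.sqrt(q0 * q1)                    # Walsh quantity weights
    walsh = (p1 @ w) / (p0 @ w)             # superlative
    dutot = p1.sum() / p0.sum()             # index-number analogue of the simple sum
    return {"Laspeyres": laspeyres, "Paasche": paasche,
            "Fisher": fisher, "Walsh": walsh, "Dutot": dutot}

print(bilateral_indexes([1.0, 2.0], [1.1, 2.4], [10, 5], [9, 6]))
```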

We investigate how close aggregates come to solving the revealed preference conditions for weak separability when it is known that a solution exists. We use the revealed preference solution values to check how well the indexes reflect the preference orderings in the data: comparing the direction of change between adjacent periods, computing preference orderings implied by transitivity, and then comparing the rankings revealed by the aggregates to the rankings from the weak separability conditions. We calculate how much the aggregates need to be perturbed in order to satisfy weak separability, and test whether the aggregates and solution values are equal in distribution.

Using panel data, we find that as the number of goods increases, the superiority of the superlative indexes manifests itself. This is consistent with the critique of the Dutot index by Fisher (1922) and with Barnett's (1980) critique of simple sum monetary aggregates.

Keywords: superlative index numbers, aggregation, preference orderings, weak separability

JEL Code: C43

PAPER 21 Sajid Chaudhry, University of Birmingham, Jane Binner, University of Birmingham, James Swofford, University of South Alabama, and Andrew Mullineux, University of Birmingham, 'SCOTLAND AS AN OPTIMUM CURRENCY AREA'

Abstract: The June 2016 UK referendum on continued EU membership, in which the people of Scotland voted to remain while the rest of the UK voted to leave, once again makes the issue of whether Scotland is an optimal currency area very topical. England voted strongly to leave, whilst Scotland backed remain by 62% to 38%. The Scottish government published its draft bill on a second independence referendum this October. The move does not mean another referendum will definitely be held, but it does raise the possibility that Scotland might choose independence, staying in the EU without the rest of the UK. If Scotland charts a course of independence from the rest of the UK, then it would likely either issue its own currency or join or form another currency area. In this paper, we test the microeconomic foundations of a common currency area for Scotland, for the UK, and for the rest of the UK without Scotland. We find that the UK without Scotland meets the microeconomic criteria for a common currency area, while the UK and Scotland alone have some small violations of these conditions. We also find differences between the UK-less-Scotland and Scotland economies in loan data. With respect to further research into the development of monetary aggregation theory, we recommend that this might be profitably directed towards exploring more sophisticated aggregation procedures, as suggested by Barnett (forthcoming), or towards incorporating risk-bearing assets into the money measures.

Keywords: Scottish independence, common currency areas, microeconomic foundations

PAPER 22 Victor J Varcarcel, University of Texas at Dallas, 'INTEREST RATE PASS-THROUGH, DIVISIA USER COSTS OF MONETARY ASSETS AND THE FEDERAL FUNDS RATE'

Abstract: Evidence of substantial pass-through in short-term rates and other rates of financial and monetary assets has typically been rejected by the data. In a first, this paper investigates level and volatility transmission among the Federal Funds rate and the user costs of various monetary assets, which include both instruments of public debt (e.g., T-bills) and private debt (e.g., commercial paper). Results suggest substantial and time-varying pass-through. Higher degrees of bi-directional pass-through occur between the Federal Funds rate and the user cost of more liquid assets, both in levels and in volatilities. Federal Funds rate spillovers also propagate faster onto more liquid rates. These findings have important implications for monetary transmission, not only across the term structure but along markets of varying liquidity.
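
Pass-through of this kind is often summarised by regressing changes in an asset's user cost on current and lagged changes in the policy rate. The paper studies level and volatility transmission in a richer VAR setting; the fragment below is only a minimal distributed-lag illustration on simulated data:

```python
import numpy as np

rng = np.random.default_rng(4)
T = 300
d_ffr = rng.normal(0, 0.25, T)    # changes in the Federal Funds rate
# user-cost changes respond contemporaneously and with one lag
d_uc = 0.6 * d_ffr + 0.2 * np.concatenate(([0.0], d_ffr[:-1])) + rng.normal(0, 0.1, T)

# OLS on contemporaneous and lagged policy-rate changes
X = np.column_stack([np.ones(T - 1), d_ffr[1:], d_ffr[:-1]])
beta, *_ = np.linalg.lstsq(X, d_uc[1:], rcond=None)
print(beta, "long-run pass-through:", beta[1] + beta[2])
```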

Keywords: user cost of money, federal funds rate, volatility spillovers, VAR, monetary transmission, interest rate channel

JEL Codes: E30, E31, E65

© University of Birmingham 2017. Printed on a recycled grade paper containing 100% post-consumer waste.

Edgbaston Birmingham B15 2TT United Kingdom

www.birmingham.ac.uk


Page 13: Liquidity and Economic Activity - University of Birmingham · 2017-05-22 · Financial Services Indices Liquidity and Economic Activity In honour of Oswald Distinguished Professor

13Liquidity and Economic Activity Conference

PAPER 9 Jonathan Goldberg Federal Reserve Board lsquoTHE SUPPLY OF LIQUIDITY AND REAL ECONOMIC ACTIVITYrsquo

Abstract This paper identifies shocks to the supply of liquidity by dealer firms and investigates their effects on real economic activity First I develop a simple theoretical model of dealer intermediation then in a structural VAR model I use sign restrictions derived from the theoretical model to identify liquidity supply shocks Liquidity supply shocks that are orthogonal to information contained in macroeconomic and asset price variables have considerable predictive power for economic activity Moreover positive liquidity supply shocks cause large and persistent increases in real activity

Keywords liquidity dealer intermediation risk-taking real activity liquidity shocks

JEL Codes G10 G12 G17 G24

14 Liquidity and Economic Activity Conference

PAPER 10 Dirk Bezemer University of Groningen the Netherlands and Lu Zhang Sustainable Finance Lab and Utrecht University lsquoMACROECONOMIC IMPLICATIONS OF LIQUIDITY CREATION CREDIT ALLOCATION AND POST CRISIS RECESSIONSrsquo

Abstract In this paper we address macroeconomic implications of liquidity creation through bank lending and the impacts of liquidity on economic activity We note that liquidity created through bank lending can be channeled into the real sector in support of economic activity or in financial and real estate markets in support of capital gains We collected macro-level data on bank credit aggregates over 2000ndash12 for 57 economies categorise according to the use of credit We note the long-term shift in the allocation of bank credit creation away from non-financial business lending and towards financial and especially real estate markets We then present new evidence on the channels from credit allocation pre-crisis to the severity of post-crisis recessions

Our first contribution is to show that it is not just the level but the composition of debt (defined as the share of mortgage credit in total credit) that matters A second contribution is to analyse the channels We collect additional industry-level data across 20 industries for a subset of economies We analyze the effect of changes in the pre-crisis composition of debt on total GDP and on investment consumption and capital allocation We find that changes in the share of household mortgage credit before the crisis have a significant effect on recession

severity after the 2007 crisis This is not the case for any other credit category nor for growth of total bank credit We address the causality challenge by using the difference between IMF growth forecasts and growth realisations This filters out country-specific drivers of both debt and income growth We address the model selection challenge by using Bayesian averaging models This indicates that the change in credit composition is among the three most robust determinants of post-crisis recession severity with income levels and current account balance The findings are robust to a wide range of control variables and to the different responses across advancedemerging and EMUnon-EMU economies

We then delve into the channels from change in debt composition to income growth loss The literature to date has focused on negative wealth effects on consumption for which we find strong evidence In addition we find evidence for two investment channels a loan supply effect and a capital allocation effect In the industry-level analysis we find that in economies which experienced a larger change in debt composition before 2008 there was a larger reduction of credit available and weaker capital re-allocation towards sectors with higher value-added This effect is observed already before the crisis and very strongly after the crisis We discuss policy implications and future research

Keywords private credit mortgages crisis output loss investment capital allocation

JEL Codes C11 C15 E01 O4

15Liquidity and Economic Activity Conference

PAPER 11 Iftekhar Hasan Gabelli School of Business Fordham University and Jean-Loup Soula Strasbourg University LaRGE Research Centre lsquoTECHNICAL EFFICIENCY IN BANK LIQUIDITY CREATIONrsquo

Abstract This paper generates an optimum bank liquidity creation benchmark by tracing an efficient frontier in liquidity creation (bank intermediation) and questions why some banks are more efficient than others in such activities Evidence reveals that medium size banks are most correlated to efficient frontier irrespective of their business models Small (large) banks ndash focused on traditional banking activities ndash are found to be the most (least) efficient in creating liquidity in on-balance sheet items whereas large banks ndash involved in non-traditional activities ndash are found to be most efficient in off-balance sheet liquidity creation Additionally the liquidity efficiency of small banks is more resilient during the 2007ndash08 financial crisis relative to other banks

Keywords banks technical efficiency liquidity creation diversification

Jel Codes G21 G28 G32

16 Liquidity and Economic Activity Conference

PAPER 12 Richard Anderson Lindenwood University John Duca Federal Reserve Bank of Dallas and Barry Jones Department of Economics State University of New York lsquoA BROAD MONETARY SERVICES (LIQUIDITY) INDEX AND ITS LONG-TERM LINKS TO ECONOMIC ACTIVITYrsquo

Abstract Liquid assets play a crucial role in economic activity as the medium in which payments are received and are made lsquoSudden stopsrsquo in financial markets ndash during which liquid assets are hoarded ndash are periods when economic activity slows abruptly Further it is the sine qua non of financial intermediation to alter the measured relative and absolute quantities of liquid assets In this way the observed quantities of liquid assets reflect both the path of past economic activity and anticipations of future activity The quantities must be regarded as arising endogenously within an intertemporal general equilibrium model of the economy a la Tobin (1958) and Merton (1971) Economic modeling and analysis traditionally proceeds by combining relatively high-dimension lists of specific assets into lower-dimension lsquomonetary aggregatesrsquo The defining characteristic of the assets included is that all are available to facilitate

the exchange of goods and services at a transaction cost less than infinity That is all included assets may be sold or used as collateral for the purchase and sale of goods and services and thereby provide liquidity services which can be tracked by measured opportunity costs of foregone interest which may not in practice reflect all transactions costs The last caveat particularly applies to household holdings of mutual fund assets outside of money market funds Our study contributes to the literature in two key ways First it expands a conventional Divisia measure of money services to account for the liquidity provided by such mutual fund assets Second it then explores the long-run connections between economic activity and monetary aggregates constructed as index numbers from 1929 to 2016 finding that the inclusion of mutual fund liquidity services results in a Divisia measure of money that has a much more stable velocity

We thank Emil Mihalov and Tyler Atkinson for research assistance The views expressed are those of the authors and do not necessarily reflect those of the Federal Reserve Bank of Dallas or the Federal Reserve System Any errors are our own

17Liquidity and Economic Activity Conference

PAPER 13 John W Keating University of Kansas and A Lee Smith KC Federal Reserve BanklsquoTHE OPTIMAL MONETARY INSTRUMENT AND THE (MIS) USE OF GRANGER CAUSALITYrsquo

Abstract Is it better to use an interest rate or a monetary aggregate to conduct monetary policy Operating procedures designed around interest rates are overwhelmingly preferred by central bankers This paper re-examines the question in light of a New Keynesian DSGE (dynamic stochastic general equilibrium) model Our calibrated model is very similar to many others in the literature except we follow Belongia and Ireland (2014) by augmenting the standard framework with a simple model of financial intermediation We assume banks operate in competitive markets maximising profits subject to financial market disturbances

We use this model to examine the welfare consequences of alternative choices for monetary policy instrument For the interest rate policy we employ the specification from Clarida Gali and Gertler (2000) in which the central bank reacts to expected future output and inflation gaps A gap is defined as the percent deviation from target We compare this rule to a k-percent rule for each monetary aggregate We consider three alternative aggregates determined within our model the monetary base the simple sum measure of money and the Divisia measure

Welfare results are striking While the interest rate dominates the monetary base both simple sum and Divisia k-percent rules outperform the interest rate In fact the Divisia rule is overall best This is an interesting finding because it contradicts the view of most central bankers that interest rates are superior Note that if we replaced k-percent money growth rules with rules allowing an appropriate reaction to macroeconomic variables

as in the Clarida Gali and Gertler (2000) interest rate rule the welfare benefits of simple sum and Divisia money would be even greater

Next we study the performance of Granger Causality tests in the context of data generated from our model For this we assume the Clarida Gali and Gertler (2000) interest rate rule characterises monetary policy While it is not optimal from a welfare perspective this rule is used for normative purposes Research suggests their framework provides a fairly good description of actual Fed behaviour

For each of our four potential monetary instruments we test for Granger Causality with respect to output and then with respect to prices We find the interest rate Granger Causes both variables at extremely high significance levels The same result is obtained for monetary base Simple sum money also Granger Causes prices at a highly significant level but only causes output at the 10 level The test results for Divisia are weakest of all Divisia fails to Granger Cause output and the evidence for prices is only at the 10 level

What do we learn from this investigation First a quantity aggregate as monetary instrument may have significant welfare benefits compared to an interest rate Second a weighted monetary aggregate (Divisia) performs best of all the candidate instruments we considered Third Granger Causality tests may be a poor method for selecting the best monetary policy instrument In our model Granger Causality tests suggest at best a weak effect of Divisia on inflation and no effect output Thus if instrument choice is based solely on Granger Causality test results an inferior policy instrument will be selected

Keywords monetary policy instrument monetary aggregates Granger Causality Divisia aggregates Jel Codes C43 C32 E37 E44 E52

18 Liquidity and Economic Activity Conference

PAPER 14Jane Binner University of Birmingham Logan Kelly University of Wisconsin and Jon Tepper Nottingham Trent UniversitylsquoON THE ROBUSTNESS OF SLUGGISH STATE-BASED NEURAL NETWORKS FOR PROVIDING USEFUL INSIGHT INTO THE NEW KEYNESIAN PHILLIPS CURVErsquo

Abstract The New Keynesian Phillips curve implies that the output gap the deviation of the actual output from its natural level due to nominal rigidities drives the dynamics of inflation relative to expected inflation and lagged inflation This paper exploits the empirical success of the New Keynesian Phillips curve in explaining the USArsquos inflation dynamics using a new nonlinear model of the output gap We estimate the output gap using the artificial intelligence method of multi-recurrent neural networks (MRNs) a type of neural network that is able to establish variable sensitivity to recent and more historic temporal information through the use of layer- and self-recurrent links to form a so-called sluggish state-based memory The MRN is able to robustly latch onto structural interactions amongst inflation oil prices and real output in the USA to produce a set of interesting impulse responses We present our empirical results using monthly data spanning 1960ndash2016 and contrast our new nonlinear models of the output gap with that of traditional measures in fitting the New Keynesian Phillips curve to provide useful insights for inflation dynamics and monetary policy analysis in the USA

PAPER 15Rakesh Bissoondeeal Aston University Michael Karaglou Aston University Jane Binner University of BirminghamlsquoSTRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UKrsquo

Abstract We investigate whether or not monetary aggregates are important in determining output In addition to the official Simple Sum measure of money we employ the weighted Divisia aggregate We use data-driven procedures to identify breaks in the data We find that monetary aggregates particularly Divisia are important in determining output but the results are sensitive to the time period employed There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them The results also suggest that the recovery from the financial crisis could have been faster if money was not being hoarded

Keywords monetary aggregates Divisia IS curve structural breaks

JEL Codes C22 E52

19Liquidity and Economic Activity Conference

PAPER 16Costas Milas University of Liverpool and Michael Ellington University of LiverpoollsquoIDENTIFYING AGGREGATE LIQUIDITY SHOCKS IN CONJUNCTION WITH MONETARY POLICY SHOCKS AN APPLICATION USING UK DATArsquo

Abstract We propose a new identification scheme for aggregate liquidity shocks in harmony with conventional monetary policy shocks in a partially identified structural VAR model We fit a Bayesian time-varying parameter VAR model to UK data from 1955 to 2016 The transmission mechanism of aggregate liquidity shocks changes substantially throughout time with the magnitude of these shocks increasing during recessions We provide statistically significant evidence in favour of asymmetric contributions of these shocks during the implementation of Quantitative Easing relative to the 2008 recession Aggregate liquidity shocks explain 32 and 47 of the variance in real GDP and inflation at business cycle frequency during the Great Recession respectively Preliminary Draft Please Do Not Cite

Keywords liquidity shocks time-varying parameter VAR money growth

JEL Codes E32 E47 E52 E58

PAPER 17Makram El Shagi Henan University China and Logan Kelly University of WisconsinlsquoSTRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UKrsquo

Abstract We investigate whether or not monetary aggregates are important in determining output In addition to the official Simple Sum measure of money we employ the weighted Divisia aggregate We use data-driven procedures to identify breaks in the data We find that monetary aggregates particularly Divisia are important in determining output but the results are sensitive to the time period employed There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them The results also suggest that the recovery from the financial crisis could have been faster if money was not being hoarded

Keywords monetary aggregates Divisia IS curve structural breaks

JEL Codes C22 E52

20 Liquidity and Economic Activity Conference

PAPER 18Soumya Suvra Bhadury National Council of Applied Economic Research (NCAER) New Delhi India and Taniya Ghosh Indira Gandhi Institute of Development Research (IGIDR) Mumbai India lsquoHAS MONEY LOST ITS RELEVANCE DETERMINING A SOLUTION TO THE EXCHANGE RATE DISCONNECT PUZZLE IN THE SMALL OPEN ECONOMIESrsquo

Abstract This study uses monthly data from 2000 to 2015 and utilises an open-economy structural vector autoregression model to identify the monetary policy shock responsible for exchange rate fluctuations in the economies of India Poland and UK Following Kim and Roubini (J Monet Econ 45(3)561ndash586 2000) our model is shown to fit these small open economies well by estimating theoretically correct and significant responses of price output and exchange rate to the monetary policy tightening The importance of monetary policy shock is determined by examining the variance decomposition of forecast error impulse response function and out-of-sample forecast Following Barnett (J Econ 14 (September)11ndash48 1980) we adopt a superior monetary measure namely

aggregationndashtheoretic Divisia monetary aggregate in our model The significance of adopting precisely measured money in the exchange rate model follows the comparison between models with no-money simple-sum monetary aggregates and Divisia monetary measures The bootstrap Granger causality test establishes a strong causal link from money especially Divisia money to exchange rate Additionally the empirical results provide three important findings The first finding suggests that the estimated responses of output prices money and exchange rate to monetary policy shocks are significant in models that adopt monetary aggregates The second finding indicates that Divisia money facilitate monetary policy to explain more of the fluctuation of exchange rate The third result supports the inclusion of Divisia money for better out-of-sample forecasting of exchange rate

Keywords monetary policy monetary aggregates divisia structural VAR exchange rate overshooting liquidity puzzle price puzzle exchange rate disconnect puzzle forward discount bias puzzle

JEL Codes C32 E41 E51 E52 F31 F41 F47

21Liquidity and Economic Activity Conference

PAPER 19William A Barnett and Jinan Liu University of Kansas lsquoUSER COST OF CREDIT CARD SERVICES UNDER INTERTEMPORAL NON-SEPARABILITYrsquo

Abstract This paper examines the user cost of monetary assets and credit card services under the assumption of intertemporal nonseparability and interest rate risk Barnett and Su (2016) derived theory permitting inclusion of credit card transaction services into Divisia monetary aggregates The risk adjustment in their theory is based on CCAPM under intertemporal separability The risk adjustment by their method is expected to be small as has been the case in prior studies of CCAPM risk adjustment of asset returns The equity premium puzzle focusses on that downward bias in the CCAPM risk adjustment But credit card interest rates aggregated over consumers are much more volatile than interest rates on monetary assets While the known downward bias of CCAPM risk adjustments are of little concern with Divisia monetary aggregates containing only low-risk monetary assets that downward bias cannot be ignored once credit card services are included We believe that extending to intertemporal nonseparability will provide a more plausible risk adjustment as has been emphasised by Barnett and Wu (2015)

In this paper we extend the credit-card-augmented Divisia monetary quantity aggregates and user costs to the case of risk aversion and intertemporal nonseparability in consumption Our results are for the lsquorepresentative consumerrsquo aggregated over all consumers While credit card interest-rate risk may be low for many consumers the volatility of credit card interest rates for the representative consumer is high as reflected by the high volatility of the Federal Reserversquos data on credit card interest rates aggregated over consumers The relationship between that volatility and the theory of aggregation over consumers under risk is beyond the scope

of this research but is a serious matter meriting future research

To implement our theory we introduce a pricing kernel into our model in accordance with the approach advocated by Barnett and Wu (2015) We assume that the pricing kernel is a linear function of the rate of return on a well-diversified wealth portfolio We find that the risk adjustment of the credit-card-services user cost to its certainty equivalence level can be measured by its beta That beta depends upon the covariance between the interest rates on credit card services and on the wealth portfolio of the consumer in a manner analogous to the standard CAPM adjustment

Credit card servicesrsquo risk premiums depend on their market portfolio risk exposure which is measured by the beta of the credit card interest rates The larger the beta through risk exposure to the wealth portfolio the larger the risk adjustment If the beta were very small then the user-cost risk adjustment would be very small In that case the unadjusted Divisia monetary index would be a good proxy even with high volatility of credit card interest rates One method of introducing intertemporal nonseparability is to assume habit formation We explore that possibility We are currently conducting research on empirical implementation of the theory proposed in this paper We believe that under intertemporal nonseparablity we will be able to generate a more accurate credit-card-augmented Divisia monetary index to explain the available empirical data

Keywords divisia index monetary aggregation intertemporal nonseparability credit card services risk adjustment

JEL Codes C43 D81 E03 E40 E41 E44 E51 G12

22 Liquidity and Economic Activity Conference

PAPER 20Per Hjertstrand Research Institute of Industrial EconomicsInstitutet foumlr Naumlringslivsforskning Stockholm Gerald A Whitney University of New Orleans and James L Swofford University of South AlabamalsquoPANEL DATA TESTS OF INDEX NUMBERS AND REVEALED PREFERENCE RANKINGSrsquo

Abstract For weakly separable blockings of goods from panel data we construct aggregates using index numbers We examine how well these aggregates ldquofitrdquo the data by investigating how close they come to solving revealed preference conditions

Samuelson (1983) discussed the relationship of index number theory to revealed preference analysis saying lsquoIndex number theory is shown to be merely an aspect of the theory of revealed preferencehelliprsquo Revealed preference has been used to test whether data on prices and quantities can be rationalised by a well-behaved utility function that is weakly separable in some subset of two goods Weak separability allows for a separation of a subset of goods into a sub-utility or aggregator function This is a necessary condition for the existence of an economic aggregate The existence of an economic aggregate is a justification for the use of superlative index numbers Superlative indexes are exact if they can provide a second order approximation to a particular aggregator function

Varianrsquos (1983) revealed preference conditions for weak separability do not rely on a particular functional form The solution values to these conditions can be interpreted as levels of utility These solution values while not unique are consistent with the preferences revealed by the data If period t is preferred to period s then the utility level assigned to t is greater or equal to that assigned to s Since indexes need not mirror

preferences this property is not guaranteed to hold after aggregation

Barnett and Choirsquos (2008) definition spans all superlative index numbers We consider aggregates based on two superlative index numbers the Fisher and Walsh and the non-superlative Paasche and Laspeyres indexes that are exact for a first order approximation to an aggregator function Because of its wide-spread use by central banks to construct monetary aggregates we also consider the simple sum aggregate and the index number version of the simple sum the Dutot index

We investigate how close aggregates come to solving the revealed preference conditions for weak separability when it is known that a solution exists We use the revealed preference solution values to check how well the indexes reflect the preference orderings in the data by comparing the direction of change between adjacent periods computing preference orderings implied by transitivity and then comparing the rankings revealed by the aggregates to the rankings from the weak separability conditions We calculate how much the aggregates need to be perturbed in order to satisfy weak separability and test whether the aggregates and solution values are equal in distribution

Using panel data we find as the number of goods increases the superiority of the superlative indexes manifests themselves This is consistent with the critique of the Dutot index by Fisher (1922) and Barnettrsquos (1980) critique of simple sum monetary aggregates

Keywords superlative index numbers aggregation preference orderings weak separability

Jel Code C43

23Liquidity and Economic Activity Conference

PAPER 21Sajid Chaudhry University of Birmingham Jane Binner University of Birmingham James Swofford University of South Alabama and Andrew Mullineux University of BirminghamlsquoSCOTLAND AS AN OPTIMUM CURRENCY AREArsquo

Abstract The June UK referendum on continued EU membership where the people of Scotland voted to remain while the rest of the UK voted to leave once again makes the issue of whether Scotland is an optimal currency area very topical England voted strongly to remain in Europe whilst Scotland backed remain by 62 to 38 The Scottish government published its draft bill on a second independence referendum this October The move does not mean another referendum will definitely be held but this does raise the possibility that Scotland might choose independence and staying in the EU without the rest of the UK If Scotland charts a course of independence from the rest of the UK then they would likely either issue their own currency or join or form another currency area In this paper we test the microeconomic foundations of a common currency area for Scotland UK and the rest of the UK without Scotland We find that the UK without Scotland meets the microeconomic criteria for a common currency area while the UK and Scotland alone have some small violations of these conditions We also find differences in the UK less Scotland and Scotland economies in loan data With respect to further research into the development of the monetary aggregation theory we recommend that this might be profitably directed towards exploring more sophisticated aggregation procedures as suggested by Barnett (forthcoming) or towards incorporating risk bearing assets into the money measures

Keywords Scottish independence common currency areas microeconomic foundations

PAPER 22Victor J Varcarcel University of Texas at DallaslsquoINTEREST RATE PASS-THROUGH DIVISIA USER COSTS OF MONETARY ASSETS AND THE FEDERAL FUNDS RATErsquo

Abstract Evidence of substantial pass-through in short-term rates and other rates of financial and monetary assets has been typically rejected by the data In a first this paper investigates level and volatility transmission among the Federal Funds rate and the user costs of various monetary assets which include both instruments of public debt (eg t-bills) and private debt (eg commercial paper) Results suggest substantial and time varying pass-through Higher degrees of bi-directional pass-through occurs between the Federal Funds rate and the user cost of more liquid assets both in levels and volatilities Federal Funds rate spillovers propagate faster onto more liquid rates as well These findings have important implications for monetary transmission not only across the term structure but along markets of varying liquidity

Keywords user cost of money federal funds rate volatility spillovers VAR monetary transmission interest rate channel

JEL Codes E30 E31 E65

1525

5 copy

Uni

vers

ity o

f Birm

ingh

am 2

017

Prin

ted

on a

recy

cled

gra

de p

aper

con

tain

ing

100

pos

t-co

nsum

er w

aste

Edgbaston Birmingham B15 2TT United Kingdom

wwwbirminghamacuk

Designed and printed by

lsquoTriple-crownrsquo accredited

Page 14: Liquidity and Economic Activity - University of Birmingham · 2017-05-22 · Financial Services Indices Liquidity and Economic Activity In honour of Oswald Distinguished Professor

14 Liquidity and Economic Activity Conference

PAPER 10 Dirk Bezemer University of Groningen the Netherlands and Lu Zhang Sustainable Finance Lab and Utrecht University lsquoMACROECONOMIC IMPLICATIONS OF LIQUIDITY CREATION CREDIT ALLOCATION AND POST CRISIS RECESSIONSrsquo

Abstract In this paper we address macroeconomic implications of liquidity creation through bank lending and the impacts of liquidity on economic activity We note that liquidity created through bank lending can be channeled into the real sector in support of economic activity or in financial and real estate markets in support of capital gains We collected macro-level data on bank credit aggregates over 2000ndash12 for 57 economies categorise according to the use of credit We note the long-term shift in the allocation of bank credit creation away from non-financial business lending and towards financial and especially real estate markets We then present new evidence on the channels from credit allocation pre-crisis to the severity of post-crisis recessions

Our first contribution is to show that it is not just the level but the composition of debt (defined as the share of mortgage credit in total credit) that matters A second contribution is to analyse the channels We collect additional industry-level data across 20 industries for a subset of economies We analyze the effect of changes in the pre-crisis composition of debt on total GDP and on investment consumption and capital allocation We find that changes in the share of household mortgage credit before the crisis have a significant effect on recession

severity after the 2007 crisis This is not the case for any other credit category nor for growth of total bank credit We address the causality challenge by using the difference between IMF growth forecasts and growth realisations This filters out country-specific drivers of both debt and income growth We address the model selection challenge by using Bayesian averaging models This indicates that the change in credit composition is among the three most robust determinants of post-crisis recession severity with income levels and current account balance The findings are robust to a wide range of control variables and to the different responses across advancedemerging and EMUnon-EMU economies

We then delve into the channels from change in debt composition to income growth loss The literature to date has focused on negative wealth effects on consumption for which we find strong evidence In addition we find evidence for two investment channels a loan supply effect and a capital allocation effect In the industry-level analysis we find that in economies which experienced a larger change in debt composition before 2008 there was a larger reduction of credit available and weaker capital re-allocation towards sectors with higher value-added This effect is observed already before the crisis and very strongly after the crisis We discuss policy implications and future research

Keywords private credit mortgages crisis output loss investment capital allocation

JEL Codes C11 C15 E01 O4

15Liquidity and Economic Activity Conference

PAPER 11 Iftekhar Hasan Gabelli School of Business Fordham University and Jean-Loup Soula Strasbourg University LaRGE Research Centre lsquoTECHNICAL EFFICIENCY IN BANK LIQUIDITY CREATIONrsquo

Abstract This paper generates an optimum bank liquidity creation benchmark by tracing an efficient frontier in liquidity creation (bank intermediation) and questions why some banks are more efficient than others in such activities Evidence reveals that medium size banks are most correlated to efficient frontier irrespective of their business models Small (large) banks ndash focused on traditional banking activities ndash are found to be the most (least) efficient in creating liquidity in on-balance sheet items whereas large banks ndash involved in non-traditional activities ndash are found to be most efficient in off-balance sheet liquidity creation Additionally the liquidity efficiency of small banks is more resilient during the 2007ndash08 financial crisis relative to other banks

Keywords banks technical efficiency liquidity creation diversification

Jel Codes G21 G28 G32

16 Liquidity and Economic Activity Conference

PAPER 12 Richard Anderson Lindenwood University John Duca Federal Reserve Bank of Dallas and Barry Jones Department of Economics State University of New York lsquoA BROAD MONETARY SERVICES (LIQUIDITY) INDEX AND ITS LONG-TERM LINKS TO ECONOMIC ACTIVITYrsquo

Abstract Liquid assets play a crucial role in economic activity as the medium in which payments are received and are made lsquoSudden stopsrsquo in financial markets ndash during which liquid assets are hoarded ndash are periods when economic activity slows abruptly Further it is the sine qua non of financial intermediation to alter the measured relative and absolute quantities of liquid assets In this way the observed quantities of liquid assets reflect both the path of past economic activity and anticipations of future activity The quantities must be regarded as arising endogenously within an intertemporal general equilibrium model of the economy a la Tobin (1958) and Merton (1971) Economic modeling and analysis traditionally proceeds by combining relatively high-dimension lists of specific assets into lower-dimension lsquomonetary aggregatesrsquo The defining characteristic of the assets included is that all are available to facilitate

the exchange of goods and services at a transaction cost less than infinity That is all included assets may be sold or used as collateral for the purchase and sale of goods and services and thereby provide liquidity services which can be tracked by measured opportunity costs of foregone interest which may not in practice reflect all transactions costs The last caveat particularly applies to household holdings of mutual fund assets outside of money market funds Our study contributes to the literature in two key ways First it expands a conventional Divisia measure of money services to account for the liquidity provided by such mutual fund assets Second it then explores the long-run connections between economic activity and monetary aggregates constructed as index numbers from 1929 to 2016 finding that the inclusion of mutual fund liquidity services results in a Divisia measure of money that has a much more stable velocity

We thank Emil Mihalov and Tyler Atkinson for research assistance The views expressed are those of the authors and do not necessarily reflect those of the Federal Reserve Bank of Dallas or the Federal Reserve System Any errors are our own

17Liquidity and Economic Activity Conference

PAPER 13 John W Keating University of Kansas and A Lee Smith KC Federal Reserve BanklsquoTHE OPTIMAL MONETARY INSTRUMENT AND THE (MIS) USE OF GRANGER CAUSALITYrsquo

Abstract Is it better to use an interest rate or a monetary aggregate to conduct monetary policy Operating procedures designed around interest rates are overwhelmingly preferred by central bankers This paper re-examines the question in light of a New Keynesian DSGE (dynamic stochastic general equilibrium) model Our calibrated model is very similar to many others in the literature except we follow Belongia and Ireland (2014) by augmenting the standard framework with a simple model of financial intermediation We assume banks operate in competitive markets maximising profits subject to financial market disturbances

We use this model to examine the welfare consequences of alternative choices for monetary policy instrument For the interest rate policy we employ the specification from Clarida Gali and Gertler (2000) in which the central bank reacts to expected future output and inflation gaps A gap is defined as the percent deviation from target We compare this rule to a k-percent rule for each monetary aggregate We consider three alternative aggregates determined within our model the monetary base the simple sum measure of money and the Divisia measure

Welfare results are striking. While the interest rate dominates the monetary base, both simple sum and Divisia k-percent rules outperform the interest rate; in fact, the Divisia rule is the best overall. This is an interesting finding because it contradicts the view of most central bankers that interest rates are superior. Note that if we replaced k-percent money growth rules with rules allowing an appropriate reaction to macroeconomic variables, as in the Clarida, Gali and Gertler (2000) interest rate rule, the welfare benefits of simple sum and Divisia money would be even greater.

Next we study the performance of Granger Causality tests in the context of data generated from our model. For this we assume the Clarida, Gali and Gertler (2000) interest rate rule characterises monetary policy. While it is not optimal from a welfare perspective, this rule serves a positive (descriptive) purpose: research suggests their framework provides a fairly good description of actual Fed behaviour.

For each of our four potential monetary instruments we test for Granger Causality with respect to output and then with respect to prices. We find the interest rate Granger-causes both variables at extremely high significance levels. The same result is obtained for the monetary base. Simple sum money also Granger-causes prices at a highly significant level, but only causes output at the 10% level. The test results for Divisia are weakest of all: Divisia fails to Granger-cause output, and the evidence for prices is significant only at the 10% level.
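For concreteness, tests of this kind are routine to run; a minimal sketch (ours, not the authors' code) on simulated stand-in series:

```python
# Granger causality tests of the kind described above, on toy data.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
T = 400
money = rng.normal(size=T)                              # candidate instrument growth
output = 0.3 * np.roll(money, 1) + rng.normal(size=T)   # toy DGP: money leads output

# statsmodels tests whether the SECOND column Granger-causes the FIRST.
data = pd.DataFrame({"output": output, "money": money})
res = grangercausalitytests(data[["output", "money"]].values, maxlag=4)
# Each lag reports an F-test; small p-values reject 'money does not cause output'.
```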

What do we learn from this investigation? First, a quantity aggregate used as the monetary instrument may have significant welfare benefits compared with an interest rate. Second, a weighted monetary aggregate (Divisia) performs best of all the candidate instruments we considered. Third, Granger Causality tests may be a poor method for selecting the best monetary policy instrument. In our model, Granger Causality tests suggest at best a weak effect of Divisia on inflation and no effect on output; thus, if instrument choice were based solely on Granger Causality test results, an inferior policy instrument would be selected.

Keywords: monetary policy instrument, monetary aggregates, Granger Causality, Divisia aggregates

JEL Codes: C43, C32, E37, E44, E52


PAPER 14 Jane Binner (University of Birmingham), Logan Kelly (University of Wisconsin) and Jon Tepper (Nottingham Trent University), 'ON THE ROBUSTNESS OF SLUGGISH STATE-BASED NEURAL NETWORKS FOR PROVIDING USEFUL INSIGHT INTO THE NEW KEYNESIAN PHILLIPS CURVE'

Abstract: The New Keynesian Phillips curve implies that the output gap, the deviation of actual output from its natural level due to nominal rigidities, drives the dynamics of inflation relative to expected inflation and lagged inflation. This paper exploits the empirical success of the New Keynesian Phillips curve in explaining the USA's inflation dynamics using a new nonlinear model of the output gap. We estimate the output gap using the artificial intelligence method of multi-recurrent neural networks (MRNs), a type of neural network that is able to establish variable sensitivity to recent and more historic temporal information through the use of layer- and self-recurrent links, forming a so-called sluggish state-based memory. The MRN is able to robustly latch onto structural interactions amongst inflation, oil prices and real output in the USA to produce a set of interesting impulse responses. We present our empirical results using monthly data spanning 1960–2016, and contrast our new nonlinear models of the output gap with traditional measures in fitting the New Keynesian Phillips curve, providing useful insights for inflation dynamics and monetary policy analysis in the USA.
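A minimal sketch (ours, not the authors' MRN architecture) of the 'sluggish' state idea: the memory is a convex mix of its past value and the new hidden activation, so older information decays slowly.

```python
import numpy as np

rng = np.random.default_rng(0)
W_in = rng.normal(size=(8, 3)) * 0.5    # input weights: 3 inputs -> 8 hidden units
W_rec = rng.normal(size=(8, 8)) * 0.1   # layer-recurrent weights

def sluggish_step(x_t, s_prev, alpha=0.7):
    """alpha sets how sluggishly the state forgets (alpha=0: plain RNN step)."""
    h_t = np.tanh(W_in @ x_t + W_rec @ s_prev)   # layer-recurrent contribution
    return alpha * s_prev + (1 - alpha) * h_t    # self-recurrent, slow-moving state

s = np.zeros(8)
for x_t in rng.normal(size=(5, 3)):   # e.g. [inflation, oil price, output] at each t
    s = sluggish_step(x_t, s)
```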

PAPER 15 Rakesh Bissoondeeal (Aston University), Michael Karaglou (Aston University) and Jane Binner (University of Birmingham), 'STRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UK'

Abstract: We investigate whether or not monetary aggregates are important in determining output. In addition to the official Simple Sum measure of money, we employ the weighted Divisia aggregate. We use data-driven procedures to identify breaks in the data. We find that monetary aggregates, particularly Divisia, are important in determining output, but the results are sensitive to the time period employed. There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them. The results also suggest that the recovery from the financial crisis could have been faster if money had not been hoarded.
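The abstract does not name its break-detection procedure; as one illustration of a data-driven approach (an assumption on our part, not the authors' tool), least-squares change-point detection via the ruptures package:

```python
import numpy as np
import ruptures as rpt  # assumed library choice, for illustration only

rng = np.random.default_rng(1)
y = np.r_[rng.normal(0.0, 1.0, 150), rng.normal(2.0, 1.0, 150)]  # toy series, one mean shift

algo = rpt.Binseg(model="l2").fit(y)   # binary segmentation, least-squares cost
print(algo.predict(n_bkps=1))          # [detected break index, series length]
```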

Keywords: monetary aggregates, Divisia, IS curve, structural breaks

JEL Codes: C22, E52


PAPER 16 Costas Milas (University of Liverpool) and Michael Ellington (University of Liverpool), 'IDENTIFYING AGGREGATE LIQUIDITY SHOCKS IN CONJUNCTION WITH MONETARY POLICY SHOCKS: AN APPLICATION USING UK DATA'

Abstract: We propose a new identification scheme for aggregate liquidity shocks, in harmony with conventional monetary policy shocks, in a partially identified structural VAR model. We fit a Bayesian time-varying parameter VAR model to UK data from 1955 to 2016. The transmission mechanism of aggregate liquidity shocks changes substantially through time, with the magnitude of these shocks increasing during recessions. We provide statistically significant evidence in favour of asymmetric contributions of these shocks during the implementation of Quantitative Easing relative to the 2008 recession. Aggregate liquidity shocks explain 32% and 47% of the variance in real GDP and inflation, respectively, at business cycle frequency during the Great Recession. (Preliminary draft: please do not cite.)
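Generically (our notation, not the paper's exact specification), a time-varying parameter VAR lets the coefficients drift as a random walk, which is what allows the transmission mechanism to change through time:

\[
y_t = X_t' \beta_t + \varepsilon_t, \quad \varepsilon_t \sim N(0, \Sigma_t), \qquad
\beta_t = \beta_{t-1} + \eta_t, \quad \eta_t \sim N(0, Q),
\]

with $X_t$ collecting an intercept and lags of $y_t$, and the innovation covariance $\Sigma_t$ itself typically allowed to evolve (stochastic volatility).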

Keywords: liquidity shocks, time-varying parameter VAR, money growth

JEL Codes: E32, E47, E52, E58

PAPER 17 Makram El Shagi (Henan University, China) and Logan Kelly (University of Wisconsin), 'STRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UK'

Abstract: We investigate whether or not monetary aggregates are important in determining output. In addition to the official Simple Sum measure of money, we employ the weighted Divisia aggregate. We use data-driven procedures to identify breaks in the data. We find that monetary aggregates, particularly Divisia, are important in determining output, but the results are sensitive to the time period employed. There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them. The results also suggest that the recovery from the financial crisis could have been faster if money had not been hoarded.

Keywords: monetary aggregates, Divisia, IS curve, structural breaks

JEL Codes: C22, E52


PAPER 18 Soumya Suvra Bhadury (National Council of Applied Economic Research (NCAER), New Delhi, India) and Taniya Ghosh (Indira Gandhi Institute of Development Research (IGIDR), Mumbai, India), 'HAS MONEY LOST ITS RELEVANCE? DETERMINING A SOLUTION TO THE EXCHANGE RATE DISCONNECT PUZZLE IN THE SMALL OPEN ECONOMIES'

Abstract: This study uses monthly data from 2000 to 2015 and utilises an open-economy structural vector autoregression model to identify the monetary policy shocks responsible for exchange rate fluctuations in the economies of India, Poland and the UK. Following Kim and Roubini (J. Monet. Econ. 45(3): 561–586, 2000), our model is shown to fit these small open economies well, estimating theoretically correct and significant responses of prices, output and the exchange rate to a monetary policy tightening. The importance of the monetary policy shock is determined by examining the variance decomposition of forecast errors, impulse response functions and out-of-sample forecasts. Following Barnett (J. Econometrics 14 (September): 11–48, 1980), we adopt a superior monetary measure, namely the aggregation-theoretic Divisia monetary aggregate, in our model. The significance of adopting precisely measured money in the exchange rate model follows from the comparison between models with no money, simple-sum monetary aggregates and Divisia monetary measures. A bootstrap Granger causality test establishes a strong causal link from money, especially Divisia money, to the exchange rate. Additionally, the empirical results provide three important findings. The first is that the estimated responses of output, prices, money and the exchange rate to monetary policy shocks are significant in models that adopt monetary aggregates. The second is that Divisia money helps monetary policy explain more of the fluctuation of the exchange rate. The third supports the inclusion of Divisia money for better out-of-sample forecasting of the exchange rate.
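For readers unfamiliar with the toolkit: forecast error variance decompositions of the sort reported above can be computed from any estimated VAR. A reduced-form sketch (ours) on stand-in data; the paper's identification is structural, not the plain Cholesky ordering used here:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(2)
df = pd.DataFrame(rng.normal(size=(200, 3)), columns=["output", "price", "exrate"])

res = VAR(df).fit(maxlags=2)
fevd = res.fevd(12)        # 12-step-ahead FEVD, Cholesky-orthogonalised shocks
print(fevd.decomp[2])      # variance shares of 'exrate' attributed to each shock
```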

Keywords: monetary policy, monetary aggregates, Divisia, structural VAR, exchange rate overshooting, liquidity puzzle, price puzzle, exchange rate disconnect puzzle, forward discount bias puzzle

JEL Codes: C32, E41, E51, E52, F31, F41, F47


PAPER 19 William A. Barnett and Jinan Liu (University of Kansas), 'USER COST OF CREDIT CARD SERVICES UNDER INTERTEMPORAL NON-SEPARABILITY'

Abstract: This paper examines the user cost of monetary assets and credit card services under the assumption of intertemporal nonseparability and interest rate risk. Barnett and Su (2016) derived theory permitting the inclusion of credit card transaction services in Divisia monetary aggregates. The risk adjustment in their theory is based on the consumption capital asset pricing model (CCAPM) under intertemporal separability. The risk adjustment by their method is expected to be small, as has been the case in prior studies of CCAPM risk adjustment of asset returns. The equity premium puzzle focusses on that downward bias in the CCAPM risk adjustment. But credit card interest rates, aggregated over consumers, are much more volatile than interest rates on monetary assets. While the known downward bias of CCAPM risk adjustments is of little concern with Divisia monetary aggregates containing only low-risk monetary assets, that downward bias cannot be ignored once credit card services are included. We believe that extending to intertemporal nonseparability will provide a more plausible risk adjustment, as has been emphasised by Barnett and Wu (2015).

In this paper, we extend the credit-card-augmented Divisia monetary quantity aggregates and user costs to the case of risk aversion and intertemporal nonseparability in consumption. Our results are for the 'representative consumer', aggregated over all consumers. While credit card interest-rate risk may be low for many consumers, the volatility of credit card interest rates for the representative consumer is high, as reflected by the high volatility of the Federal Reserve's data on credit card interest rates aggregated over consumers. The relationship between that volatility and the theory of aggregation over consumers under risk is beyond the scope of this research, but is a serious matter meriting future research.

To implement our theory, we introduce a pricing kernel into our model, in accordance with the approach advocated by Barnett and Wu (2015). We assume that the pricing kernel is a linear function of the rate of return on a well-diversified wealth portfolio. We find that the risk adjustment of the credit-card-services user cost to its certainty-equivalence level can be measured by its beta. That beta depends upon the covariance between the interest rates on credit card services and the rate of return on the wealth portfolio of the consumer, in a manner analogous to the standard CAPM adjustment.
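In reduced form (our notation, the CAPM analogue the abstract appeals to rather than the paper's exact derivation), the adjustment is

\[
\pi^{a}_{c,t} = \pi^{CE}_{c,t} + \beta_{c}\, \lambda_t, \qquad
\beta_{c} = \frac{\operatorname{Cov}(r_{c,t},\, r_{w,t})}{\operatorname{Var}(r_{w,t})},
\]

where $\pi^{CE}_{c,t}$ is the certainty-equivalent user cost of credit card services, $r_{c,t}$ the credit card interest rate, $r_{w,t}$ the return on the wealth portfolio and $\lambda_t$ the market price of risk.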

Credit card services' risk premiums depend on their market portfolio risk exposure, which is measured by the beta of the credit card interest rates. The larger the beta, through risk exposure to the wealth portfolio, the larger the risk adjustment. If the beta were very small, then the user-cost risk adjustment would be very small; in that case, the unadjusted Divisia monetary index would be a good proxy, even with high volatility of credit card interest rates. One method of introducing intertemporal nonseparability is to assume habit formation; we explore that possibility. We are currently conducting research on the empirical implementation of the theory proposed in this paper. We believe that under intertemporal nonseparability, we will be able to generate a more accurate credit-card-augmented Divisia monetary index to explain the available empirical data.

Keywords: Divisia index, monetary aggregation, intertemporal nonseparability, credit card services, risk adjustment

JEL Codes: C43, D81, E03, E40, E41, E44, E51, G12


PAPER 20 Per Hjertstrand (Research Institute of Industrial Economics (Institutet för Näringslivsforskning), Stockholm), Gerald A. Whitney (University of New Orleans) and James L. Swofford (University of South Alabama), 'PANEL DATA TESTS OF INDEX NUMBERS AND REVEALED PREFERENCE RANKINGS'

Abstract: For weakly separable blockings of goods from panel data, we construct aggregates using index numbers. We examine how well these aggregates "fit" the data by investigating how close they come to solving revealed preference conditions.

Samuelson (1983) discussed the relationship of index number theory to revealed preference analysis, saying 'Index number theory is shown to be merely an aspect of the theory of revealed preference…'. Revealed preference has been used to test whether data on prices and quantities can be rationalised by a well-behaved utility function that is weakly separable in some subset of the goods. Weak separability allows for a separation of a subset of goods into a sub-utility, or aggregator, function; this is a necessary condition for the existence of an economic aggregate. The existence of an economic aggregate is, in turn, a justification for the use of superlative index numbers. A superlative index is exact for an aggregator function that can provide a second-order approximation to an arbitrary aggregator function.

Varian's (1983) revealed preference conditions for weak separability do not rely on a particular functional form. The solution values to these conditions can be interpreted as levels of utility. These solution values, while not unique, are consistent with the preferences revealed by the data: if period t is preferred to period s, then the utility level assigned to t is greater than or equal to that assigned to s. Since indexes need not mirror preferences, this property is not guaranteed to hold after aggregation.

Barnett and Choi's (2008) definition spans all superlative index numbers. We consider aggregates based on two superlative index numbers, the Fisher and the Walsh, and on the non-superlative Paasche and Laspeyres indexes, which are exact for a first-order approximation to an aggregator function. Because of its widespread use by central banks to construct monetary aggregates, we also consider the simple sum aggregate and the index number version of the simple sum, the Dutot index.

We investigate how close aggregates come to solving the revealed preference conditions for weak separability when it is known that a solution exists. We use the revealed preference solution values to check how well the indexes reflect the preference orderings in the data, by comparing the direction of change between adjacent periods, computing preference orderings implied by transitivity, and then comparing the rankings revealed by the aggregates to the rankings from the weak separability conditions. We calculate how much the aggregates need to be perturbed in order to satisfy weak separability, and test whether the aggregates and solution values are equal in distribution.
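An illustration of the ranking comparison (our construction, not the authors' procedure): agreement between the period ordering implied by an aggregate index and the ordering implied by the revealed-preference solution values, measured over all period pairs.

```python
import numpy as np
from itertools import combinations

def ranking_agreement(index_levels, utility_levels):
    """Share of period pairs on which the two series order the periods alike."""
    agree, total = 0, 0
    for s, t in combinations(range(len(index_levels)), 2):
        total += 1
        if np.sign(index_levels[t] - index_levels[s]) == np.sign(
            utility_levels[t] - utility_levels[s]
        ):
            agree += 1
    return agree / total

fisher = np.array([1.00, 1.03, 1.01, 1.08])     # toy Fisher-index levels
varian_u = np.array([0.50, 0.61, 0.58, 0.70])   # toy Varian solution utilities
print(ranking_agreement(fisher, varian_u))      # 1.0: orderings fully agree
```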

Using panel data, we find that as the number of goods increases, the superiority of the superlative indexes manifests itself. This is consistent with Fisher's (1922) critique of the Dutot index and Barnett's (1980) critique of simple sum monetary aggregates.

Keywords: superlative index numbers, aggregation, preference orderings, weak separability

JEL Code: C43


PAPER 21 Sajid Chaudhry (University of Birmingham), Jane Binner (University of Birmingham), James Swofford (University of South Alabama) and Andrew Mullineux (University of Birmingham), 'SCOTLAND AS AN OPTIMUM CURRENCY AREA'

Abstract: The June 2016 UK referendum on continued EU membership, in which the people of Scotland voted to remain while the rest of the UK voted to leave, once again makes the issue of whether Scotland is an optimal currency area very topical. England voted strongly to leave, whilst Scotland backed remain by 62% to 38%. The Scottish government published its draft bill on a second independence referendum in October 2016. The move does not mean another referendum will definitely be held, but it does raise the possibility that Scotland might choose independence and staying in the EU without the rest of the UK. If Scotland charts a course of independence from the rest of the UK, then it would likely either issue its own currency or join or form another currency area. In this paper we test the microeconomic foundations of a common currency area for Scotland, for the UK, and for the rest of the UK without Scotland. We find that the UK without Scotland meets the microeconomic criteria for a common currency area, while the UK and Scotland alone have some small violations of these conditions. We also find differences between the UK-less-Scotland and Scotland economies in loan data. With respect to further research into the development of monetary aggregation theory, we recommend that this might profitably be directed towards exploring more sophisticated aggregation procedures, as suggested by Barnett (forthcoming), or towards incorporating risk-bearing assets into the money measures.

Keywords: Scottish independence, common currency areas, microeconomic foundations

PAPER 22 Victor J. Valcarcel (University of Texas at Dallas), 'INTEREST RATE PASS-THROUGH, DIVISIA USER COSTS OF MONETARY ASSETS AND THE FEDERAL FUNDS RATE'

Abstract: Evidence of substantial pass-through in short-term rates and other rates of financial and monetary assets has typically been rejected by the data. In a first, this paper investigates level and volatility transmission among the Federal Funds rate and the user costs of various monetary assets, which include instruments of both public debt (e.g., t-bills) and private debt (e.g., commercial paper). Results suggest substantial and time-varying pass-through. Higher degrees of bi-directional pass-through occur between the Federal Funds rate and the user costs of more liquid assets, both in levels and in volatilities. Federal Funds rate spillovers propagate faster onto more liquid rates as well. These findings have important implications for monetary transmission, not only across the term structure but along markets of varying liquidity.
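As a stylised reference (our notation), immediate level pass-through can be read as the slope $\theta_i$ in a regression such as

\[
\Delta u_{i,t} = \alpha_i + \theta_i\, \Delta f\!f_t + \epsilon_{i,t},
\]

where $u_{i,t}$ is the Divisia user cost of monetary asset $i$ and $f\!f_t$ the Federal Funds rate; $\theta_i = 1$ is full pass-through, and an analogous relation between conditional volatilities captures the volatility transmission studied here.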

Keywords: user cost of money, federal funds rate, volatility spillovers, VAR, monetary transmission, interest rate channel

JEL Codes: E30, E31, E65

© University of Birmingham 2017. Printed on a recycled grade paper containing 100% post-consumer waste.

Edgbaston, Birmingham, B15 2TT, United Kingdom

www.birmingham.ac.uk

Designed and printed by

'Triple-crown' accredited

Page 15: Liquidity and Economic Activity - University of Birmingham · 2017-05-22 · Financial Services Indices Liquidity and Economic Activity In honour of Oswald Distinguished Professor

15Liquidity and Economic Activity Conference

PAPER 11 Iftekhar Hasan Gabelli School of Business Fordham University and Jean-Loup Soula Strasbourg University LaRGE Research Centre lsquoTECHNICAL EFFICIENCY IN BANK LIQUIDITY CREATIONrsquo

Abstract This paper generates an optimum bank liquidity creation benchmark by tracing an efficient frontier in liquidity creation (bank intermediation) and questions why some banks are more efficient than others in such activities Evidence reveals that medium size banks are most correlated to efficient frontier irrespective of their business models Small (large) banks ndash focused on traditional banking activities ndash are found to be the most (least) efficient in creating liquidity in on-balance sheet items whereas large banks ndash involved in non-traditional activities ndash are found to be most efficient in off-balance sheet liquidity creation Additionally the liquidity efficiency of small banks is more resilient during the 2007ndash08 financial crisis relative to other banks

Keywords banks technical efficiency liquidity creation diversification

Jel Codes G21 G28 G32

16 Liquidity and Economic Activity Conference

PAPER 12 Richard Anderson Lindenwood University John Duca Federal Reserve Bank of Dallas and Barry Jones Department of Economics State University of New York lsquoA BROAD MONETARY SERVICES (LIQUIDITY) INDEX AND ITS LONG-TERM LINKS TO ECONOMIC ACTIVITYrsquo

Abstract Liquid assets play a crucial role in economic activity as the medium in which payments are received and are made lsquoSudden stopsrsquo in financial markets ndash during which liquid assets are hoarded ndash are periods when economic activity slows abruptly Further it is the sine qua non of financial intermediation to alter the measured relative and absolute quantities of liquid assets In this way the observed quantities of liquid assets reflect both the path of past economic activity and anticipations of future activity The quantities must be regarded as arising endogenously within an intertemporal general equilibrium model of the economy a la Tobin (1958) and Merton (1971) Economic modeling and analysis traditionally proceeds by combining relatively high-dimension lists of specific assets into lower-dimension lsquomonetary aggregatesrsquo The defining characteristic of the assets included is that all are available to facilitate

the exchange of goods and services at a transaction cost less than infinity That is all included assets may be sold or used as collateral for the purchase and sale of goods and services and thereby provide liquidity services which can be tracked by measured opportunity costs of foregone interest which may not in practice reflect all transactions costs The last caveat particularly applies to household holdings of mutual fund assets outside of money market funds Our study contributes to the literature in two key ways First it expands a conventional Divisia measure of money services to account for the liquidity provided by such mutual fund assets Second it then explores the long-run connections between economic activity and monetary aggregates constructed as index numbers from 1929 to 2016 finding that the inclusion of mutual fund liquidity services results in a Divisia measure of money that has a much more stable velocity

We thank Emil Mihalov and Tyler Atkinson for research assistance The views expressed are those of the authors and do not necessarily reflect those of the Federal Reserve Bank of Dallas or the Federal Reserve System Any errors are our own

17Liquidity and Economic Activity Conference

PAPER 13 John W Keating University of Kansas and A Lee Smith KC Federal Reserve BanklsquoTHE OPTIMAL MONETARY INSTRUMENT AND THE (MIS) USE OF GRANGER CAUSALITYrsquo

Abstract Is it better to use an interest rate or a monetary aggregate to conduct monetary policy Operating procedures designed around interest rates are overwhelmingly preferred by central bankers This paper re-examines the question in light of a New Keynesian DSGE (dynamic stochastic general equilibrium) model Our calibrated model is very similar to many others in the literature except we follow Belongia and Ireland (2014) by augmenting the standard framework with a simple model of financial intermediation We assume banks operate in competitive markets maximising profits subject to financial market disturbances

We use this model to examine the welfare consequences of alternative choices for monetary policy instrument For the interest rate policy we employ the specification from Clarida Gali and Gertler (2000) in which the central bank reacts to expected future output and inflation gaps A gap is defined as the percent deviation from target We compare this rule to a k-percent rule for each monetary aggregate We consider three alternative aggregates determined within our model the monetary base the simple sum measure of money and the Divisia measure

Welfare results are striking While the interest rate dominates the monetary base both simple sum and Divisia k-percent rules outperform the interest rate In fact the Divisia rule is overall best This is an interesting finding because it contradicts the view of most central bankers that interest rates are superior Note that if we replaced k-percent money growth rules with rules allowing an appropriate reaction to macroeconomic variables

as in the Clarida Gali and Gertler (2000) interest rate rule the welfare benefits of simple sum and Divisia money would be even greater

Next we study the performance of Granger Causality tests in the context of data generated from our model For this we assume the Clarida Gali and Gertler (2000) interest rate rule characterises monetary policy While it is not optimal from a welfare perspective this rule is used for normative purposes Research suggests their framework provides a fairly good description of actual Fed behaviour

For each of our four potential monetary instruments we test for Granger Causality with respect to output and then with respect to prices We find the interest rate Granger Causes both variables at extremely high significance levels The same result is obtained for monetary base Simple sum money also Granger Causes prices at a highly significant level but only causes output at the 10 level The test results for Divisia are weakest of all Divisia fails to Granger Cause output and the evidence for prices is only at the 10 level

What do we learn from this investigation First a quantity aggregate as monetary instrument may have significant welfare benefits compared to an interest rate Second a weighted monetary aggregate (Divisia) performs best of all the candidate instruments we considered Third Granger Causality tests may be a poor method for selecting the best monetary policy instrument In our model Granger Causality tests suggest at best a weak effect of Divisia on inflation and no effect output Thus if instrument choice is based solely on Granger Causality test results an inferior policy instrument will be selected

Keywords monetary policy instrument monetary aggregates Granger Causality Divisia aggregates Jel Codes C43 C32 E37 E44 E52

18 Liquidity and Economic Activity Conference

PAPER 14Jane Binner University of Birmingham Logan Kelly University of Wisconsin and Jon Tepper Nottingham Trent UniversitylsquoON THE ROBUSTNESS OF SLUGGISH STATE-BASED NEURAL NETWORKS FOR PROVIDING USEFUL INSIGHT INTO THE NEW KEYNESIAN PHILLIPS CURVErsquo

Abstract The New Keynesian Phillips curve implies that the output gap the deviation of the actual output from its natural level due to nominal rigidities drives the dynamics of inflation relative to expected inflation and lagged inflation This paper exploits the empirical success of the New Keynesian Phillips curve in explaining the USArsquos inflation dynamics using a new nonlinear model of the output gap We estimate the output gap using the artificial intelligence method of multi-recurrent neural networks (MRNs) a type of neural network that is able to establish variable sensitivity to recent and more historic temporal information through the use of layer- and self-recurrent links to form a so-called sluggish state-based memory The MRN is able to robustly latch onto structural interactions amongst inflation oil prices and real output in the USA to produce a set of interesting impulse responses We present our empirical results using monthly data spanning 1960ndash2016 and contrast our new nonlinear models of the output gap with that of traditional measures in fitting the New Keynesian Phillips curve to provide useful insights for inflation dynamics and monetary policy analysis in the USA

PAPER 15Rakesh Bissoondeeal Aston University Michael Karaglou Aston University Jane Binner University of BirminghamlsquoSTRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UKrsquo

Abstract We investigate whether or not monetary aggregates are important in determining output In addition to the official Simple Sum measure of money we employ the weighted Divisia aggregate We use data-driven procedures to identify breaks in the data We find that monetary aggregates particularly Divisia are important in determining output but the results are sensitive to the time period employed There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them The results also suggest that the recovery from the financial crisis could have been faster if money was not being hoarded

Keywords monetary aggregates Divisia IS curve structural breaks

JEL Codes C22 E52

19Liquidity and Economic Activity Conference

PAPER 16Costas Milas University of Liverpool and Michael Ellington University of LiverpoollsquoIDENTIFYING AGGREGATE LIQUIDITY SHOCKS IN CONJUNCTION WITH MONETARY POLICY SHOCKS AN APPLICATION USING UK DATArsquo

Abstract We propose a new identification scheme for aggregate liquidity shocks in harmony with conventional monetary policy shocks in a partially identified structural VAR model We fit a Bayesian time-varying parameter VAR model to UK data from 1955 to 2016 The transmission mechanism of aggregate liquidity shocks changes substantially throughout time with the magnitude of these shocks increasing during recessions We provide statistically significant evidence in favour of asymmetric contributions of these shocks during the implementation of Quantitative Easing relative to the 2008 recession Aggregate liquidity shocks explain 32 and 47 of the variance in real GDP and inflation at business cycle frequency during the Great Recession respectively Preliminary Draft Please Do Not Cite

Keywords liquidity shocks time-varying parameter VAR money growth

JEL Codes E32 E47 E52 E58

PAPER 17Makram El Shagi Henan University China and Logan Kelly University of WisconsinlsquoSTRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UKrsquo

Abstract We investigate whether or not monetary aggregates are important in determining output In addition to the official Simple Sum measure of money we employ the weighted Divisia aggregate We use data-driven procedures to identify breaks in the data We find that monetary aggregates particularly Divisia are important in determining output but the results are sensitive to the time period employed There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them The results also suggest that the recovery from the financial crisis could have been faster if money was not being hoarded

Keywords monetary aggregates Divisia IS curve structural breaks

JEL Codes C22 E52

20 Liquidity and Economic Activity Conference

PAPER 18Soumya Suvra Bhadury National Council of Applied Economic Research (NCAER) New Delhi India and Taniya Ghosh Indira Gandhi Institute of Development Research (IGIDR) Mumbai India lsquoHAS MONEY LOST ITS RELEVANCE DETERMINING A SOLUTION TO THE EXCHANGE RATE DISCONNECT PUZZLE IN THE SMALL OPEN ECONOMIESrsquo

Abstract This study uses monthly data from 2000 to 2015 and utilises an open-economy structural vector autoregression model to identify the monetary policy shock responsible for exchange rate fluctuations in the economies of India Poland and UK Following Kim and Roubini (J Monet Econ 45(3)561ndash586 2000) our model is shown to fit these small open economies well by estimating theoretically correct and significant responses of price output and exchange rate to the monetary policy tightening The importance of monetary policy shock is determined by examining the variance decomposition of forecast error impulse response function and out-of-sample forecast Following Barnett (J Econ 14 (September)11ndash48 1980) we adopt a superior monetary measure namely

aggregationndashtheoretic Divisia monetary aggregate in our model The significance of adopting precisely measured money in the exchange rate model follows the comparison between models with no-money simple-sum monetary aggregates and Divisia monetary measures The bootstrap Granger causality test establishes a strong causal link from money especially Divisia money to exchange rate Additionally the empirical results provide three important findings The first finding suggests that the estimated responses of output prices money and exchange rate to monetary policy shocks are significant in models that adopt monetary aggregates The second finding indicates that Divisia money facilitate monetary policy to explain more of the fluctuation of exchange rate The third result supports the inclusion of Divisia money for better out-of-sample forecasting of exchange rate

Keywords monetary policy monetary aggregates divisia structural VAR exchange rate overshooting liquidity puzzle price puzzle exchange rate disconnect puzzle forward discount bias puzzle

JEL Codes C32 E41 E51 E52 F31 F41 F47

21Liquidity and Economic Activity Conference

PAPER 19William A Barnett and Jinan Liu University of Kansas lsquoUSER COST OF CREDIT CARD SERVICES UNDER INTERTEMPORAL NON-SEPARABILITYrsquo

Abstract This paper examines the user cost of monetary assets and credit card services under the assumption of intertemporal nonseparability and interest rate risk Barnett and Su (2016) derived theory permitting inclusion of credit card transaction services into Divisia monetary aggregates The risk adjustment in their theory is based on CCAPM under intertemporal separability The risk adjustment by their method is expected to be small as has been the case in prior studies of CCAPM risk adjustment of asset returns The equity premium puzzle focusses on that downward bias in the CCAPM risk adjustment But credit card interest rates aggregated over consumers are much more volatile than interest rates on monetary assets While the known downward bias of CCAPM risk adjustments are of little concern with Divisia monetary aggregates containing only low-risk monetary assets that downward bias cannot be ignored once credit card services are included We believe that extending to intertemporal nonseparability will provide a more plausible risk adjustment as has been emphasised by Barnett and Wu (2015)

In this paper we extend the credit-card-augmented Divisia monetary quantity aggregates and user costs to the case of risk aversion and intertemporal nonseparability in consumption Our results are for the lsquorepresentative consumerrsquo aggregated over all consumers While credit card interest-rate risk may be low for many consumers the volatility of credit card interest rates for the representative consumer is high as reflected by the high volatility of the Federal Reserversquos data on credit card interest rates aggregated over consumers The relationship between that volatility and the theory of aggregation over consumers under risk is beyond the scope

of this research but is a serious matter meriting future research

To implement our theory we introduce a pricing kernel into our model in accordance with the approach advocated by Barnett and Wu (2015) We assume that the pricing kernel is a linear function of the rate of return on a well-diversified wealth portfolio We find that the risk adjustment of the credit-card-services user cost to its certainty equivalence level can be measured by its beta That beta depends upon the covariance between the interest rates on credit card services and on the wealth portfolio of the consumer in a manner analogous to the standard CAPM adjustment

Credit card servicesrsquo risk premiums depend on their market portfolio risk exposure which is measured by the beta of the credit card interest rates The larger the beta through risk exposure to the wealth portfolio the larger the risk adjustment If the beta were very small then the user-cost risk adjustment would be very small In that case the unadjusted Divisia monetary index would be a good proxy even with high volatility of credit card interest rates One method of introducing intertemporal nonseparability is to assume habit formation We explore that possibility We are currently conducting research on empirical implementation of the theory proposed in this paper We believe that under intertemporal nonseparablity we will be able to generate a more accurate credit-card-augmented Divisia monetary index to explain the available empirical data

Keywords divisia index monetary aggregation intertemporal nonseparability credit card services risk adjustment

JEL Codes C43 D81 E03 E40 E41 E44 E51 G12

22 Liquidity and Economic Activity Conference

PAPER 20Per Hjertstrand Research Institute of Industrial EconomicsInstitutet foumlr Naumlringslivsforskning Stockholm Gerald A Whitney University of New Orleans and James L Swofford University of South AlabamalsquoPANEL DATA TESTS OF INDEX NUMBERS AND REVEALED PREFERENCE RANKINGSrsquo

Abstract For weakly separable blockings of goods from panel data we construct aggregates using index numbers We examine how well these aggregates ldquofitrdquo the data by investigating how close they come to solving revealed preference conditions

Samuelson (1983) discussed the relationship of index number theory to revealed preference analysis saying lsquoIndex number theory is shown to be merely an aspect of the theory of revealed preferencehelliprsquo Revealed preference has been used to test whether data on prices and quantities can be rationalised by a well-behaved utility function that is weakly separable in some subset of two goods Weak separability allows for a separation of a subset of goods into a sub-utility or aggregator function This is a necessary condition for the existence of an economic aggregate The existence of an economic aggregate is a justification for the use of superlative index numbers Superlative indexes are exact if they can provide a second order approximation to a particular aggregator function

Varianrsquos (1983) revealed preference conditions for weak separability do not rely on a particular functional form The solution values to these conditions can be interpreted as levels of utility These solution values while not unique are consistent with the preferences revealed by the data If period t is preferred to period s then the utility level assigned to t is greater or equal to that assigned to s Since indexes need not mirror

preferences this property is not guaranteed to hold after aggregation

Barnett and Choirsquos (2008) definition spans all superlative index numbers We consider aggregates based on two superlative index numbers the Fisher and Walsh and the non-superlative Paasche and Laspeyres indexes that are exact for a first order approximation to an aggregator function Because of its wide-spread use by central banks to construct monetary aggregates we also consider the simple sum aggregate and the index number version of the simple sum the Dutot index

We investigate how close aggregates come to solving the revealed preference conditions for weak separability when it is known that a solution exists We use the revealed preference solution values to check how well the indexes reflect the preference orderings in the data by comparing the direction of change between adjacent periods computing preference orderings implied by transitivity and then comparing the rankings revealed by the aggregates to the rankings from the weak separability conditions We calculate how much the aggregates need to be perturbed in order to satisfy weak separability and test whether the aggregates and solution values are equal in distribution

Using panel data we find as the number of goods increases the superiority of the superlative indexes manifests themselves This is consistent with the critique of the Dutot index by Fisher (1922) and Barnettrsquos (1980) critique of simple sum monetary aggregates

Keywords superlative index numbers aggregation preference orderings weak separability

Jel Code C43

23Liquidity and Economic Activity Conference

PAPER 21Sajid Chaudhry University of Birmingham Jane Binner University of Birmingham James Swofford University of South Alabama and Andrew Mullineux University of BirminghamlsquoSCOTLAND AS AN OPTIMUM CURRENCY AREArsquo

Abstract The June UK referendum on continued EU membership where the people of Scotland voted to remain while the rest of the UK voted to leave once again makes the issue of whether Scotland is an optimal currency area very topical England voted strongly to remain in Europe whilst Scotland backed remain by 62 to 38 The Scottish government published its draft bill on a second independence referendum this October The move does not mean another referendum will definitely be held but this does raise the possibility that Scotland might choose independence and staying in the EU without the rest of the UK If Scotland charts a course of independence from the rest of the UK then they would likely either issue their own currency or join or form another currency area In this paper we test the microeconomic foundations of a common currency area for Scotland UK and the rest of the UK without Scotland We find that the UK without Scotland meets the microeconomic criteria for a common currency area while the UK and Scotland alone have some small violations of these conditions We also find differences in the UK less Scotland and Scotland economies in loan data With respect to further research into the development of the monetary aggregation theory we recommend that this might be profitably directed towards exploring more sophisticated aggregation procedures as suggested by Barnett (forthcoming) or towards incorporating risk bearing assets into the money measures

Keywords Scottish independence common currency areas microeconomic foundations

PAPER 22Victor J Varcarcel University of Texas at DallaslsquoINTEREST RATE PASS-THROUGH DIVISIA USER COSTS OF MONETARY ASSETS AND THE FEDERAL FUNDS RATErsquo

Abstract Evidence of substantial pass-through in short-term rates and other rates of financial and monetary assets has been typically rejected by the data In a first this paper investigates level and volatility transmission among the Federal Funds rate and the user costs of various monetary assets which include both instruments of public debt (eg t-bills) and private debt (eg commercial paper) Results suggest substantial and time varying pass-through Higher degrees of bi-directional pass-through occurs between the Federal Funds rate and the user cost of more liquid assets both in levels and volatilities Federal Funds rate spillovers propagate faster onto more liquid rates as well These findings have important implications for monetary transmission not only across the term structure but along markets of varying liquidity

Keywords user cost of money federal funds rate volatility spillovers VAR monetary transmission interest rate channel

JEL Codes E30 E31 E65

1525

5 copy

Uni

vers

ity o

f Birm

ingh

am 2

017

Prin

ted

on a

recy

cled

gra

de p

aper

con

tain

ing

100

pos

t-co

nsum

er w

aste

Edgbaston Birmingham B15 2TT United Kingdom

wwwbirminghamacuk

Designed and printed by

lsquoTriple-crownrsquo accredited

Page 16: Liquidity and Economic Activity - University of Birmingham · 2017-05-22 · Financial Services Indices Liquidity and Economic Activity In honour of Oswald Distinguished Professor

16 Liquidity and Economic Activity Conference

PAPER 12 Richard Anderson Lindenwood University John Duca Federal Reserve Bank of Dallas and Barry Jones Department of Economics State University of New York lsquoA BROAD MONETARY SERVICES (LIQUIDITY) INDEX AND ITS LONG-TERM LINKS TO ECONOMIC ACTIVITYrsquo

Abstract Liquid assets play a crucial role in economic activity as the medium in which payments are received and are made lsquoSudden stopsrsquo in financial markets ndash during which liquid assets are hoarded ndash are periods when economic activity slows abruptly Further it is the sine qua non of financial intermediation to alter the measured relative and absolute quantities of liquid assets In this way the observed quantities of liquid assets reflect both the path of past economic activity and anticipations of future activity The quantities must be regarded as arising endogenously within an intertemporal general equilibrium model of the economy a la Tobin (1958) and Merton (1971) Economic modeling and analysis traditionally proceeds by combining relatively high-dimension lists of specific assets into lower-dimension lsquomonetary aggregatesrsquo The defining characteristic of the assets included is that all are available to facilitate

the exchange of goods and services at a transaction cost less than infinity That is all included assets may be sold or used as collateral for the purchase and sale of goods and services and thereby provide liquidity services which can be tracked by measured opportunity costs of foregone interest which may not in practice reflect all transactions costs The last caveat particularly applies to household holdings of mutual fund assets outside of money market funds Our study contributes to the literature in two key ways First it expands a conventional Divisia measure of money services to account for the liquidity provided by such mutual fund assets Second it then explores the long-run connections between economic activity and monetary aggregates constructed as index numbers from 1929 to 2016 finding that the inclusion of mutual fund liquidity services results in a Divisia measure of money that has a much more stable velocity

We thank Emil Mihalov and Tyler Atkinson for research assistance The views expressed are those of the authors and do not necessarily reflect those of the Federal Reserve Bank of Dallas or the Federal Reserve System Any errors are our own

17Liquidity and Economic Activity Conference

PAPER 13 John W Keating University of Kansas and A Lee Smith KC Federal Reserve BanklsquoTHE OPTIMAL MONETARY INSTRUMENT AND THE (MIS) USE OF GRANGER CAUSALITYrsquo

Abstract Is it better to use an interest rate or a monetary aggregate to conduct monetary policy Operating procedures designed around interest rates are overwhelmingly preferred by central bankers This paper re-examines the question in light of a New Keynesian DSGE (dynamic stochastic general equilibrium) model Our calibrated model is very similar to many others in the literature except we follow Belongia and Ireland (2014) by augmenting the standard framework with a simple model of financial intermediation We assume banks operate in competitive markets maximising profits subject to financial market disturbances

We use this model to examine the welfare consequences of alternative choices for monetary policy instrument For the interest rate policy we employ the specification from Clarida Gali and Gertler (2000) in which the central bank reacts to expected future output and inflation gaps A gap is defined as the percent deviation from target We compare this rule to a k-percent rule for each monetary aggregate We consider three alternative aggregates determined within our model the monetary base the simple sum measure of money and the Divisia measure

Welfare results are striking While the interest rate dominates the monetary base both simple sum and Divisia k-percent rules outperform the interest rate In fact the Divisia rule is overall best This is an interesting finding because it contradicts the view of most central bankers that interest rates are superior Note that if we replaced k-percent money growth rules with rules allowing an appropriate reaction to macroeconomic variables

as in the Clarida Gali and Gertler (2000) interest rate rule the welfare benefits of simple sum and Divisia money would be even greater

Next we study the performance of Granger Causality tests in the context of data generated from our model For this we assume the Clarida Gali and Gertler (2000) interest rate rule characterises monetary policy While it is not optimal from a welfare perspective this rule is used for normative purposes Research suggests their framework provides a fairly good description of actual Fed behaviour

For each of our four potential monetary instruments we test for Granger Causality with respect to output and then with respect to prices We find the interest rate Granger Causes both variables at extremely high significance levels The same result is obtained for monetary base Simple sum money also Granger Causes prices at a highly significant level but only causes output at the 10 level The test results for Divisia are weakest of all Divisia fails to Granger Cause output and the evidence for prices is only at the 10 level

What do we learn from this investigation First a quantity aggregate as monetary instrument may have significant welfare benefits compared to an interest rate Second a weighted monetary aggregate (Divisia) performs best of all the candidate instruments we considered Third Granger Causality tests may be a poor method for selecting the best monetary policy instrument In our model Granger Causality tests suggest at best a weak effect of Divisia on inflation and no effect output Thus if instrument choice is based solely on Granger Causality test results an inferior policy instrument will be selected

Keywords monetary policy instrument monetary aggregates Granger Causality Divisia aggregates Jel Codes C43 C32 E37 E44 E52

18 Liquidity and Economic Activity Conference

PAPER 14Jane Binner University of Birmingham Logan Kelly University of Wisconsin and Jon Tepper Nottingham Trent UniversitylsquoON THE ROBUSTNESS OF SLUGGISH STATE-BASED NEURAL NETWORKS FOR PROVIDING USEFUL INSIGHT INTO THE NEW KEYNESIAN PHILLIPS CURVErsquo

Abstract The New Keynesian Phillips curve implies that the output gap the deviation of the actual output from its natural level due to nominal rigidities drives the dynamics of inflation relative to expected inflation and lagged inflation This paper exploits the empirical success of the New Keynesian Phillips curve in explaining the USArsquos inflation dynamics using a new nonlinear model of the output gap We estimate the output gap using the artificial intelligence method of multi-recurrent neural networks (MRNs) a type of neural network that is able to establish variable sensitivity to recent and more historic temporal information through the use of layer- and self-recurrent links to form a so-called sluggish state-based memory The MRN is able to robustly latch onto structural interactions amongst inflation oil prices and real output in the USA to produce a set of interesting impulse responses We present our empirical results using monthly data spanning 1960ndash2016 and contrast our new nonlinear models of the output gap with that of traditional measures in fitting the New Keynesian Phillips curve to provide useful insights for inflation dynamics and monetary policy analysis in the USA

PAPER 15Rakesh Bissoondeeal Aston University Michael Karaglou Aston University Jane Binner University of BirminghamlsquoSTRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UKrsquo

Abstract We investigate whether or not monetary aggregates are important in determining output In addition to the official Simple Sum measure of money we employ the weighted Divisia aggregate We use data-driven procedures to identify breaks in the data We find that monetary aggregates particularly Divisia are important in determining output but the results are sensitive to the time period employed There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them The results also suggest that the recovery from the financial crisis could have been faster if money was not being hoarded

Keywords monetary aggregates Divisia IS curve structural breaks

JEL Codes C22 E52

19Liquidity and Economic Activity Conference

PAPER 16Costas Milas University of Liverpool and Michael Ellington University of LiverpoollsquoIDENTIFYING AGGREGATE LIQUIDITY SHOCKS IN CONJUNCTION WITH MONETARY POLICY SHOCKS AN APPLICATION USING UK DATArsquo

Abstract We propose a new identification scheme for aggregate liquidity shocks in harmony with conventional monetary policy shocks in a partially identified structural VAR model We fit a Bayesian time-varying parameter VAR model to UK data from 1955 to 2016 The transmission mechanism of aggregate liquidity shocks changes substantially throughout time with the magnitude of these shocks increasing during recessions We provide statistically significant evidence in favour of asymmetric contributions of these shocks during the implementation of Quantitative Easing relative to the 2008 recession Aggregate liquidity shocks explain 32 and 47 of the variance in real GDP and inflation at business cycle frequency during the Great Recession respectively Preliminary Draft Please Do Not Cite

Keywords liquidity shocks time-varying parameter VAR money growth

JEL Codes E32 E47 E52 E58

PAPER 17Makram El Shagi Henan University China and Logan Kelly University of WisconsinlsquoSTRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UKrsquo

Abstract We investigate whether or not monetary aggregates are important in determining output In addition to the official Simple Sum measure of money we employ the weighted Divisia aggregate We use data-driven procedures to identify breaks in the data We find that monetary aggregates particularly Divisia are important in determining output but the results are sensitive to the time period employed There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them The results also suggest that the recovery from the financial crisis could have been faster if money was not being hoarded

Keywords monetary aggregates Divisia IS curve structural breaks

JEL Codes C22 E52

20 Liquidity and Economic Activity Conference

PAPER 18Soumya Suvra Bhadury National Council of Applied Economic Research (NCAER) New Delhi India and Taniya Ghosh Indira Gandhi Institute of Development Research (IGIDR) Mumbai India lsquoHAS MONEY LOST ITS RELEVANCE DETERMINING A SOLUTION TO THE EXCHANGE RATE DISCONNECT PUZZLE IN THE SMALL OPEN ECONOMIESrsquo

Abstract This study uses monthly data from 2000 to 2015 and utilises an open-economy structural vector autoregression model to identify the monetary policy shock responsible for exchange rate fluctuations in the economies of India Poland and UK Following Kim and Roubini (J Monet Econ 45(3)561ndash586 2000) our model is shown to fit these small open economies well by estimating theoretically correct and significant responses of price output and exchange rate to the monetary policy tightening The importance of monetary policy shock is determined by examining the variance decomposition of forecast error impulse response function and out-of-sample forecast Following Barnett (J Econ 14 (September)11ndash48 1980) we adopt a superior monetary measure namely

aggregationndashtheoretic Divisia monetary aggregate in our model The significance of adopting precisely measured money in the exchange rate model follows the comparison between models with no-money simple-sum monetary aggregates and Divisia monetary measures The bootstrap Granger causality test establishes a strong causal link from money especially Divisia money to exchange rate Additionally the empirical results provide three important findings The first finding suggests that the estimated responses of output prices money and exchange rate to monetary policy shocks are significant in models that adopt monetary aggregates The second finding indicates that Divisia money facilitate monetary policy to explain more of the fluctuation of exchange rate The third result supports the inclusion of Divisia money for better out-of-sample forecasting of exchange rate

Keywords monetary policy monetary aggregates divisia structural VAR exchange rate overshooting liquidity puzzle price puzzle exchange rate disconnect puzzle forward discount bias puzzle

JEL Codes C32 E41 E51 E52 F31 F41 F47

21Liquidity and Economic Activity Conference

PAPER 19William A Barnett and Jinan Liu University of Kansas lsquoUSER COST OF CREDIT CARD SERVICES UNDER INTERTEMPORAL NON-SEPARABILITYrsquo

Abstract This paper examines the user cost of monetary assets and credit card services under the assumption of intertemporal nonseparability and interest rate risk Barnett and Su (2016) derived theory permitting inclusion of credit card transaction services into Divisia monetary aggregates The risk adjustment in their theory is based on CCAPM under intertemporal separability The risk adjustment by their method is expected to be small as has been the case in prior studies of CCAPM risk adjustment of asset returns The equity premium puzzle focusses on that downward bias in the CCAPM risk adjustment But credit card interest rates aggregated over consumers are much more volatile than interest rates on monetary assets While the known downward bias of CCAPM risk adjustments are of little concern with Divisia monetary aggregates containing only low-risk monetary assets that downward bias cannot be ignored once credit card services are included We believe that extending to intertemporal nonseparability will provide a more plausible risk adjustment as has been emphasised by Barnett and Wu (2015)

In this paper we extend the credit-card-augmented Divisia monetary quantity aggregates and user costs to the case of risk aversion and intertemporal nonseparability in consumption. Our results are for the 'representative consumer', aggregated over all consumers. While credit card interest-rate risk may be low for many consumers, the volatility of credit card interest rates for the representative consumer is high, as reflected by the high volatility of the Federal Reserve's data on credit card interest rates aggregated over consumers. The relationship between that volatility and the theory of aggregation over consumers under risk is beyond the scope of this research, but it is a serious matter meriting future research.

To implement our theory, we introduce a pricing kernel into our model, in accordance with the approach advocated by Barnett and Wu (2015). We assume that the pricing kernel is a linear function of the rate of return on a well-diversified wealth portfolio. We find that the risk adjustment of the credit-card-services user cost to its certainty-equivalence level can be measured by its beta. That beta depends upon the covariance between the interest rates on credit card services and on the wealth portfolio of the consumer, in a manner analogous to the standard CAPM adjustment.

Credit card services' risk premiums depend on their market portfolio risk exposure, which is measured by the beta of the credit card interest rates. The larger the beta, through risk exposure to the wealth portfolio, the larger the risk adjustment. If the beta were very small, then the user-cost risk adjustment would be very small; in that case the unadjusted Divisia monetary index would be a good proxy, even with high volatility of credit card interest rates. One method of introducing intertemporal nonseparability is to assume habit formation, and we explore that possibility. We are currently conducting research on empirical implementation of the theory proposed in this paper. We believe that under intertemporal nonseparability we will be able to generate a more accurate credit-card-augmented Divisia monetary index to explain the available empirical data.
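
Schematically, the CAPM-analogous adjustment described above can be written as follows (our notation, not the paper's: e_t is the credit card interest rate, r_Wt the return on the wealth portfolio, phi_t the price of market risk, and pi-tilde the certainty-equivalent user cost):

% Schematic CAPM-style risk adjustment of the credit-card-services user cost.
\begin{align*}
  \beta_t &= \frac{\mathrm{Cov}(e_t,\, r_{Wt})}{\mathrm{Var}(r_{Wt})}
      && \text{(risk exposure of the credit card rate)} \\
  \pi_t &= \tilde{\pi}_t + \beta_t\, \phi_t
      && \text{(user cost = certainty equivalent + beta-scaled premium)}
\end{align*}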

Keywords: Divisia index, monetary aggregation, intertemporal nonseparability, credit card services, risk adjustment

JEL Codes: C43, D81, E03, E40, E41, E44, E51, G12


PAPER 20: Per Hjertstrand, Research Institute of Industrial Economics (Institutet för Näringslivsforskning), Stockholm, Gerald A Whitney, University of New Orleans, and James L Swofford, University of South Alabama, 'PANEL DATA TESTS OF INDEX NUMBERS AND REVEALED PREFERENCE RANKINGS'

Abstract: For weakly separable blockings of goods from panel data, we construct aggregates using index numbers. We examine how well these aggregates "fit" the data by investigating how close they come to solving revealed preference conditions.

Samuelson (1983) discussed the relationship of index number theory to revealed preference analysis, saying 'Index number theory is shown to be merely an aspect of the theory of revealed preference…'. Revealed preference has been used to test whether data on prices and quantities can be rationalised by a well-behaved utility function that is weakly separable in some subset of two goods. Weak separability allows for a separation of a subset of goods into a sub-utility, or aggregator, function. This is a necessary condition for the existence of an economic aggregate, and the existence of an economic aggregate is a justification for the use of superlative index numbers. Superlative indexes are exact if they can provide a second-order approximation to a particular aggregator function.

Varian's (1983) revealed preference conditions for weak separability do not rely on a particular functional form. The solution values to these conditions can be interpreted as levels of utility. These solution values, while not unique, are consistent with the preferences revealed by the data: if period t is preferred to period s, then the utility level assigned to t is greater than or equal to that assigned to s. Since indexes need not mirror preferences, this property is not guaranteed to hold after aggregation.
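
For concreteness, the solution values discussed above satisfy Afriat-type inequalities. A standard statement of Varian's (1983) necessary and sufficient conditions for weak separability, in our notation (x_t, p_t are the remaining goods and their prices, y_t, q_t the candidate separable block, U_t, V_t utility and sub-utility levels, and lambda_t, mu_t > 0 multipliers), is:

% Varian's (1983) Afriat-type conditions for weak separability (notation ours).
\begin{align*}
  V_s &\le V_t + \mu_t\, q_t'(y_s - y_t)
      && \forall\, s, t \\
  U_s &\le U_t + \lambda_t\, p_t'(x_s - x_t) + \frac{\lambda_t}{\mu_t}\,(V_s - V_t)
      && \forall\, s, t
\end{align*}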

Barnett and Choi's (2008) definition spans all superlative index numbers. We consider aggregates based on two superlative index numbers, the Fisher and the Walsh, and on the non-superlative Paasche and Laspeyres indexes, which are exact for a first-order approximation to an aggregator function. Because of its widespread use by central banks to construct monetary aggregates, we also consider the simple sum aggregate and the index number version of the simple sum, the Dutot index.

We investigate how close aggregates come to solving the revealed preference conditions for weak separability when it is known that a solution exists. We use the revealed preference solution values to check how well the indexes reflect the preference orderings in the data by comparing the direction of change between adjacent periods, computing preference orderings implied by transitivity, and then comparing the rankings revealed by the aggregates to the rankings from the weak separability conditions. We calculate how much the aggregates need to be perturbed in order to satisfy weak separability, and we test whether the aggregates and solution values are equal in distribution.
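
One way to operationalise the ranking comparison described above is to correlate the period ordering implied by an aggregate with the ordering implied by the revealed preference solution values. A minimal sketch, assuming SciPy is available and using synthetic stand-in series; Kendall's tau as the agreement measure is our illustrative choice, not necessarily the authors':

import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(1)

# Hypothetical per-period levels: V_sol are solution values from the weak
# separability conditions; idx_fisher is a superlative (Fisher) aggregate.
V_sol = np.cumsum(rng.normal(0.01, 0.02, size=60))
idx_fisher = V_sol + rng.normal(0, 0.005, size=60)   # tracks preferences closely

# Agreement between the preference ordering and the aggregate's ordering.
tau, pval = kendalltau(V_sol, idx_fisher)
print(f"Kendall tau = {tau:.3f} (p = {pval:.3g})")

# Direction-of-change agreement between adjacent periods.
agree = np.mean(np.sign(np.diff(V_sol)) == np.sign(np.diff(idx_fisher)))
print(f"adjacent-period direction agreement: {agree:.2%}")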

Using panel data, we find that as the number of goods increases, the superiority of the superlative indexes manifests itself. This is consistent with the critique of the Dutot index by Fisher (1922) and Barnett's (1980) critique of simple sum monetary aggregates.

Keywords: superlative index numbers, aggregation, preference orderings, weak separability

JEL Code: C43


PAPER 21: Sajid Chaudhry, University of Birmingham, Jane Binner, University of Birmingham, James Swofford, University of South Alabama, and Andrew Mullineux, University of Birmingham, 'SCOTLAND AS AN OPTIMUM CURRENCY AREA'

Abstract: The June 2016 UK referendum on continued EU membership, in which the people of Scotland voted to remain while the rest of the UK voted to leave, once again makes the issue of whether Scotland is an optimal currency area very topical. England voted strongly to leave, whilst Scotland backed remain by 62% to 38%. The Scottish government published its draft bill on a second independence referendum in October 2016. The move does not mean another referendum will definitely be held, but it does raise the possibility that Scotland might choose independence, staying in the EU without the rest of the UK. If Scotland charts a course of independence from the rest of the UK, then it would likely either issue its own currency or join or form another currency area. In this paper we test the microeconomic foundations of a common currency area for Scotland, for the UK, and for the rest of the UK without Scotland. We find that the UK without Scotland meets the microeconomic criteria for a common currency area, while the UK and Scotland alone have some small violations of these conditions. We also find differences between the UK-less-Scotland and Scotland economies in loan data. With respect to further research into the development of monetary aggregation theory, we recommend that this might profitably be directed towards exploring more sophisticated aggregation procedures, as suggested by Barnett (forthcoming), or towards incorporating risk-bearing assets into the money measures.

Keywords: Scottish independence, common currency areas, microeconomic foundations

PAPER 22: Victor J Varcarcel, University of Texas at Dallas, 'INTEREST RATE PASS-THROUGH, DIVISIA USER COSTS OF MONETARY ASSETS AND THE FEDERAL FUNDS RATE'

Abstract: Evidence of substantial pass-through in short-term rates and other rates of financial and monetary assets has typically been rejected by the data. In a first, this paper investigates level and volatility transmission among the Federal Funds rate and the user costs of various monetary assets, which include both instruments of public debt (e.g., t-bills) and private debt (e.g., commercial paper). Results suggest substantial and time-varying pass-through. Higher degrees of bi-directional pass-through occur between the Federal Funds rate and the user cost of more liquid assets, both in levels and in volatilities. Federal Funds rate spillovers also propagate faster onto more liquid rates. These findings have important implications for monetary transmission, not only across the term structure but along markets of varying liquidity.
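
To illustrate the kind of level pass-through exercise described above, a minimal sketch fits a two-variable VAR and reads off the impulse response of a user-cost series to a Federal Funds rate shock. It assumes statsmodels is available; the data, lag order and parameter values are synthetic placeholders rather than the paper's specification:

import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(2)

# Synthetic monthly data: a persistent policy rate and a user-cost series
# that partially absorbs policy-rate movements with a one-month delay.
T = 240
ffr = np.zeros(T)
uc = np.zeros(T)
for t in range(1, T):
    ffr[t] = 0.95 * ffr[t - 1] + rng.normal(scale=0.10)
    uc[t] = 0.80 * uc[t - 1] + 0.50 * ffr[t - 1] + rng.normal(scale=0.10)

data = pd.DataFrame({"ffr": ffr, "user_cost": uc})
res = VAR(data).fit(2)

# Orthogonalised impulse responses: the ffr -> user_cost path summarises
# the degree and speed of level pass-through.
irf = res.irf(12)
print(irf.orth_irfs[:, 1, 0])   # response of user_cost to a ffr shock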

Keywords: user cost of money, federal funds rate, volatility spillovers, VAR, monetary transmission, interest rate channel

JEL Codes: E30, E31, E65

© University of Birmingham 2017. Printed on a recycled grade paper containing 100% post-consumer waste.

Edgbaston, Birmingham, B15 2TT, United Kingdom

www.birmingham.ac.uk


Page 17: Liquidity and Economic Activity - University of Birmingham · 2017-05-22 · Financial Services Indices Liquidity and Economic Activity In honour of Oswald Distinguished Professor

17Liquidity and Economic Activity Conference

PAPER 13 John W Keating University of Kansas and A Lee Smith KC Federal Reserve BanklsquoTHE OPTIMAL MONETARY INSTRUMENT AND THE (MIS) USE OF GRANGER CAUSALITYrsquo

Abstract Is it better to use an interest rate or a monetary aggregate to conduct monetary policy Operating procedures designed around interest rates are overwhelmingly preferred by central bankers This paper re-examines the question in light of a New Keynesian DSGE (dynamic stochastic general equilibrium) model Our calibrated model is very similar to many others in the literature except we follow Belongia and Ireland (2014) by augmenting the standard framework with a simple model of financial intermediation We assume banks operate in competitive markets maximising profits subject to financial market disturbances

We use this model to examine the welfare consequences of alternative choices for monetary policy instrument For the interest rate policy we employ the specification from Clarida Gali and Gertler (2000) in which the central bank reacts to expected future output and inflation gaps A gap is defined as the percent deviation from target We compare this rule to a k-percent rule for each monetary aggregate We consider three alternative aggregates determined within our model the monetary base the simple sum measure of money and the Divisia measure

Welfare results are striking While the interest rate dominates the monetary base both simple sum and Divisia k-percent rules outperform the interest rate In fact the Divisia rule is overall best This is an interesting finding because it contradicts the view of most central bankers that interest rates are superior Note that if we replaced k-percent money growth rules with rules allowing an appropriate reaction to macroeconomic variables

as in the Clarida Gali and Gertler (2000) interest rate rule the welfare benefits of simple sum and Divisia money would be even greater

Next we study the performance of Granger Causality tests in the context of data generated from our model For this we assume the Clarida Gali and Gertler (2000) interest rate rule characterises monetary policy While it is not optimal from a welfare perspective this rule is used for normative purposes Research suggests their framework provides a fairly good description of actual Fed behaviour

For each of our four potential monetary instruments we test for Granger Causality with respect to output and then with respect to prices We find the interest rate Granger Causes both variables at extremely high significance levels The same result is obtained for monetary base Simple sum money also Granger Causes prices at a highly significant level but only causes output at the 10 level The test results for Divisia are weakest of all Divisia fails to Granger Cause output and the evidence for prices is only at the 10 level

What do we learn from this investigation First a quantity aggregate as monetary instrument may have significant welfare benefits compared to an interest rate Second a weighted monetary aggregate (Divisia) performs best of all the candidate instruments we considered Third Granger Causality tests may be a poor method for selecting the best monetary policy instrument In our model Granger Causality tests suggest at best a weak effect of Divisia on inflation and no effect output Thus if instrument choice is based solely on Granger Causality test results an inferior policy instrument will be selected

Keywords monetary policy instrument monetary aggregates Granger Causality Divisia aggregates Jel Codes C43 C32 E37 E44 E52

18 Liquidity and Economic Activity Conference

PAPER 14Jane Binner University of Birmingham Logan Kelly University of Wisconsin and Jon Tepper Nottingham Trent UniversitylsquoON THE ROBUSTNESS OF SLUGGISH STATE-BASED NEURAL NETWORKS FOR PROVIDING USEFUL INSIGHT INTO THE NEW KEYNESIAN PHILLIPS CURVErsquo

Abstract The New Keynesian Phillips curve implies that the output gap the deviation of the actual output from its natural level due to nominal rigidities drives the dynamics of inflation relative to expected inflation and lagged inflation This paper exploits the empirical success of the New Keynesian Phillips curve in explaining the USArsquos inflation dynamics using a new nonlinear model of the output gap We estimate the output gap using the artificial intelligence method of multi-recurrent neural networks (MRNs) a type of neural network that is able to establish variable sensitivity to recent and more historic temporal information through the use of layer- and self-recurrent links to form a so-called sluggish state-based memory The MRN is able to robustly latch onto structural interactions amongst inflation oil prices and real output in the USA to produce a set of interesting impulse responses We present our empirical results using monthly data spanning 1960ndash2016 and contrast our new nonlinear models of the output gap with that of traditional measures in fitting the New Keynesian Phillips curve to provide useful insights for inflation dynamics and monetary policy analysis in the USA

PAPER 15Rakesh Bissoondeeal Aston University Michael Karaglou Aston University Jane Binner University of BirminghamlsquoSTRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UKrsquo

Abstract We investigate whether or not monetary aggregates are important in determining output In addition to the official Simple Sum measure of money we employ the weighted Divisia aggregate We use data-driven procedures to identify breaks in the data We find that monetary aggregates particularly Divisia are important in determining output but the results are sensitive to the time period employed There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them The results also suggest that the recovery from the financial crisis could have been faster if money was not being hoarded

Keywords monetary aggregates Divisia IS curve structural breaks

JEL Codes C22 E52

19Liquidity and Economic Activity Conference

PAPER 16Costas Milas University of Liverpool and Michael Ellington University of LiverpoollsquoIDENTIFYING AGGREGATE LIQUIDITY SHOCKS IN CONJUNCTION WITH MONETARY POLICY SHOCKS AN APPLICATION USING UK DATArsquo

Abstract We propose a new identification scheme for aggregate liquidity shocks in harmony with conventional monetary policy shocks in a partially identified structural VAR model We fit a Bayesian time-varying parameter VAR model to UK data from 1955 to 2016 The transmission mechanism of aggregate liquidity shocks changes substantially throughout time with the magnitude of these shocks increasing during recessions We provide statistically significant evidence in favour of asymmetric contributions of these shocks during the implementation of Quantitative Easing relative to the 2008 recession Aggregate liquidity shocks explain 32 and 47 of the variance in real GDP and inflation at business cycle frequency during the Great Recession respectively Preliminary Draft Please Do Not Cite

Keywords liquidity shocks time-varying parameter VAR money growth

JEL Codes E32 E47 E52 E58

PAPER 17Makram El Shagi Henan University China and Logan Kelly University of WisconsinlsquoSTRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UKrsquo

Abstract We investigate whether or not monetary aggregates are important in determining output In addition to the official Simple Sum measure of money we employ the weighted Divisia aggregate We use data-driven procedures to identify breaks in the data We find that monetary aggregates particularly Divisia are important in determining output but the results are sensitive to the time period employed There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them The results also suggest that the recovery from the financial crisis could have been faster if money was not being hoarded

Keywords monetary aggregates Divisia IS curve structural breaks

JEL Codes C22 E52

20 Liquidity and Economic Activity Conference

PAPER 18Soumya Suvra Bhadury National Council of Applied Economic Research (NCAER) New Delhi India and Taniya Ghosh Indira Gandhi Institute of Development Research (IGIDR) Mumbai India lsquoHAS MONEY LOST ITS RELEVANCE DETERMINING A SOLUTION TO THE EXCHANGE RATE DISCONNECT PUZZLE IN THE SMALL OPEN ECONOMIESrsquo

Abstract This study uses monthly data from 2000 to 2015 and utilises an open-economy structural vector autoregression model to identify the monetary policy shock responsible for exchange rate fluctuations in the economies of India Poland and UK Following Kim and Roubini (J Monet Econ 45(3)561ndash586 2000) our model is shown to fit these small open economies well by estimating theoretically correct and significant responses of price output and exchange rate to the monetary policy tightening The importance of monetary policy shock is determined by examining the variance decomposition of forecast error impulse response function and out-of-sample forecast Following Barnett (J Econ 14 (September)11ndash48 1980) we adopt a superior monetary measure namely

aggregationndashtheoretic Divisia monetary aggregate in our model The significance of adopting precisely measured money in the exchange rate model follows the comparison between models with no-money simple-sum monetary aggregates and Divisia monetary measures The bootstrap Granger causality test establishes a strong causal link from money especially Divisia money to exchange rate Additionally the empirical results provide three important findings The first finding suggests that the estimated responses of output prices money and exchange rate to monetary policy shocks are significant in models that adopt monetary aggregates The second finding indicates that Divisia money facilitate monetary policy to explain more of the fluctuation of exchange rate The third result supports the inclusion of Divisia money for better out-of-sample forecasting of exchange rate

Keywords monetary policy monetary aggregates divisia structural VAR exchange rate overshooting liquidity puzzle price puzzle exchange rate disconnect puzzle forward discount bias puzzle

JEL Codes C32 E41 E51 E52 F31 F41 F47

21Liquidity and Economic Activity Conference

PAPER 19William A Barnett and Jinan Liu University of Kansas lsquoUSER COST OF CREDIT CARD SERVICES UNDER INTERTEMPORAL NON-SEPARABILITYrsquo

Abstract This paper examines the user cost of monetary assets and credit card services under the assumption of intertemporal nonseparability and interest rate risk Barnett and Su (2016) derived theory permitting inclusion of credit card transaction services into Divisia monetary aggregates The risk adjustment in their theory is based on CCAPM under intertemporal separability The risk adjustment by their method is expected to be small as has been the case in prior studies of CCAPM risk adjustment of asset returns The equity premium puzzle focusses on that downward bias in the CCAPM risk adjustment But credit card interest rates aggregated over consumers are much more volatile than interest rates on monetary assets While the known downward bias of CCAPM risk adjustments are of little concern with Divisia monetary aggregates containing only low-risk monetary assets that downward bias cannot be ignored once credit card services are included We believe that extending to intertemporal nonseparability will provide a more plausible risk adjustment as has been emphasised by Barnett and Wu (2015)

In this paper we extend the credit-card-augmented Divisia monetary quantity aggregates and user costs to the case of risk aversion and intertemporal nonseparability in consumption Our results are for the lsquorepresentative consumerrsquo aggregated over all consumers While credit card interest-rate risk may be low for many consumers the volatility of credit card interest rates for the representative consumer is high as reflected by the high volatility of the Federal Reserversquos data on credit card interest rates aggregated over consumers The relationship between that volatility and the theory of aggregation over consumers under risk is beyond the scope

of this research but is a serious matter meriting future research

To implement our theory we introduce a pricing kernel into our model in accordance with the approach advocated by Barnett and Wu (2015) We assume that the pricing kernel is a linear function of the rate of return on a well-diversified wealth portfolio We find that the risk adjustment of the credit-card-services user cost to its certainty equivalence level can be measured by its beta That beta depends upon the covariance between the interest rates on credit card services and on the wealth portfolio of the consumer in a manner analogous to the standard CAPM adjustment

Credit card servicesrsquo risk premiums depend on their market portfolio risk exposure which is measured by the beta of the credit card interest rates The larger the beta through risk exposure to the wealth portfolio the larger the risk adjustment If the beta were very small then the user-cost risk adjustment would be very small In that case the unadjusted Divisia monetary index would be a good proxy even with high volatility of credit card interest rates One method of introducing intertemporal nonseparability is to assume habit formation We explore that possibility We are currently conducting research on empirical implementation of the theory proposed in this paper We believe that under intertemporal nonseparablity we will be able to generate a more accurate credit-card-augmented Divisia monetary index to explain the available empirical data

Keywords divisia index monetary aggregation intertemporal nonseparability credit card services risk adjustment

JEL Codes C43 D81 E03 E40 E41 E44 E51 G12

22 Liquidity and Economic Activity Conference

PAPER 20Per Hjertstrand Research Institute of Industrial EconomicsInstitutet foumlr Naumlringslivsforskning Stockholm Gerald A Whitney University of New Orleans and James L Swofford University of South AlabamalsquoPANEL DATA TESTS OF INDEX NUMBERS AND REVEALED PREFERENCE RANKINGSrsquo

Abstract For weakly separable blockings of goods from panel data we construct aggregates using index numbers We examine how well these aggregates ldquofitrdquo the data by investigating how close they come to solving revealed preference conditions

Samuelson (1983) discussed the relationship of index number theory to revealed preference analysis saying lsquoIndex number theory is shown to be merely an aspect of the theory of revealed preferencehelliprsquo Revealed preference has been used to test whether data on prices and quantities can be rationalised by a well-behaved utility function that is weakly separable in some subset of two goods Weak separability allows for a separation of a subset of goods into a sub-utility or aggregator function This is a necessary condition for the existence of an economic aggregate The existence of an economic aggregate is a justification for the use of superlative index numbers Superlative indexes are exact if they can provide a second order approximation to a particular aggregator function

Varianrsquos (1983) revealed preference conditions for weak separability do not rely on a particular functional form The solution values to these conditions can be interpreted as levels of utility These solution values while not unique are consistent with the preferences revealed by the data If period t is preferred to period s then the utility level assigned to t is greater or equal to that assigned to s Since indexes need not mirror

preferences this property is not guaranteed to hold after aggregation

Barnett and Choirsquos (2008) definition spans all superlative index numbers We consider aggregates based on two superlative index numbers the Fisher and Walsh and the non-superlative Paasche and Laspeyres indexes that are exact for a first order approximation to an aggregator function Because of its wide-spread use by central banks to construct monetary aggregates we also consider the simple sum aggregate and the index number version of the simple sum the Dutot index

We investigate how close aggregates come to solving the revealed preference conditions for weak separability when it is known that a solution exists We use the revealed preference solution values to check how well the indexes reflect the preference orderings in the data by comparing the direction of change between adjacent periods computing preference orderings implied by transitivity and then comparing the rankings revealed by the aggregates to the rankings from the weak separability conditions We calculate how much the aggregates need to be perturbed in order to satisfy weak separability and test whether the aggregates and solution values are equal in distribution

Using panel data we find as the number of goods increases the superiority of the superlative indexes manifests themselves This is consistent with the critique of the Dutot index by Fisher (1922) and Barnettrsquos (1980) critique of simple sum monetary aggregates

Keywords superlative index numbers aggregation preference orderings weak separability

Jel Code C43

23Liquidity and Economic Activity Conference

PAPER 21Sajid Chaudhry University of Birmingham Jane Binner University of Birmingham James Swofford University of South Alabama and Andrew Mullineux University of BirminghamlsquoSCOTLAND AS AN OPTIMUM CURRENCY AREArsquo

Abstract The June UK referendum on continued EU membership where the people of Scotland voted to remain while the rest of the UK voted to leave once again makes the issue of whether Scotland is an optimal currency area very topical England voted strongly to remain in Europe whilst Scotland backed remain by 62 to 38 The Scottish government published its draft bill on a second independence referendum this October The move does not mean another referendum will definitely be held but this does raise the possibility that Scotland might choose independence and staying in the EU without the rest of the UK If Scotland charts a course of independence from the rest of the UK then they would likely either issue their own currency or join or form another currency area In this paper we test the microeconomic foundations of a common currency area for Scotland UK and the rest of the UK without Scotland We find that the UK without Scotland meets the microeconomic criteria for a common currency area while the UK and Scotland alone have some small violations of these conditions We also find differences in the UK less Scotland and Scotland economies in loan data With respect to further research into the development of the monetary aggregation theory we recommend that this might be profitably directed towards exploring more sophisticated aggregation procedures as suggested by Barnett (forthcoming) or towards incorporating risk bearing assets into the money measures

Keywords Scottish independence common currency areas microeconomic foundations

PAPER 22Victor J Varcarcel University of Texas at DallaslsquoINTEREST RATE PASS-THROUGH DIVISIA USER COSTS OF MONETARY ASSETS AND THE FEDERAL FUNDS RATErsquo

Abstract Evidence of substantial pass-through in short-term rates and other rates of financial and monetary assets has been typically rejected by the data In a first this paper investigates level and volatility transmission among the Federal Funds rate and the user costs of various monetary assets which include both instruments of public debt (eg t-bills) and private debt (eg commercial paper) Results suggest substantial and time varying pass-through Higher degrees of bi-directional pass-through occurs between the Federal Funds rate and the user cost of more liquid assets both in levels and volatilities Federal Funds rate spillovers propagate faster onto more liquid rates as well These findings have important implications for monetary transmission not only across the term structure but along markets of varying liquidity

Keywords user cost of money federal funds rate volatility spillovers VAR monetary transmission interest rate channel

JEL Codes E30 E31 E65

1525

5 copy

Uni

vers

ity o

f Birm

ingh

am 2

017

Prin

ted

on a

recy

cled

gra

de p

aper

con

tain

ing

100

pos

t-co

nsum

er w

aste

Edgbaston Birmingham B15 2TT United Kingdom

wwwbirminghamacuk

Designed and printed by

lsquoTriple-crownrsquo accredited

Page 18: Liquidity and Economic Activity - University of Birmingham · 2017-05-22 · Financial Services Indices Liquidity and Economic Activity In honour of Oswald Distinguished Professor

18 Liquidity and Economic Activity Conference

PAPER 14Jane Binner University of Birmingham Logan Kelly University of Wisconsin and Jon Tepper Nottingham Trent UniversitylsquoON THE ROBUSTNESS OF SLUGGISH STATE-BASED NEURAL NETWORKS FOR PROVIDING USEFUL INSIGHT INTO THE NEW KEYNESIAN PHILLIPS CURVErsquo

Abstract The New Keynesian Phillips curve implies that the output gap the deviation of the actual output from its natural level due to nominal rigidities drives the dynamics of inflation relative to expected inflation and lagged inflation This paper exploits the empirical success of the New Keynesian Phillips curve in explaining the USArsquos inflation dynamics using a new nonlinear model of the output gap We estimate the output gap using the artificial intelligence method of multi-recurrent neural networks (MRNs) a type of neural network that is able to establish variable sensitivity to recent and more historic temporal information through the use of layer- and self-recurrent links to form a so-called sluggish state-based memory The MRN is able to robustly latch onto structural interactions amongst inflation oil prices and real output in the USA to produce a set of interesting impulse responses We present our empirical results using monthly data spanning 1960ndash2016 and contrast our new nonlinear models of the output gap with that of traditional measures in fitting the New Keynesian Phillips curve to provide useful insights for inflation dynamics and monetary policy analysis in the USA

PAPER 15Rakesh Bissoondeeal Aston University Michael Karaglou Aston University Jane Binner University of BirminghamlsquoSTRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UKrsquo

Abstract We investigate whether or not monetary aggregates are important in determining output In addition to the official Simple Sum measure of money we employ the weighted Divisia aggregate We use data-driven procedures to identify breaks in the data We find that monetary aggregates particularly Divisia are important in determining output but the results are sensitive to the time period employed There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them The results also suggest that the recovery from the financial crisis could have been faster if money was not being hoarded

Keywords monetary aggregates Divisia IS curve structural breaks

JEL Codes C22 E52

19Liquidity and Economic Activity Conference

PAPER 16Costas Milas University of Liverpool and Michael Ellington University of LiverpoollsquoIDENTIFYING AGGREGATE LIQUIDITY SHOCKS IN CONJUNCTION WITH MONETARY POLICY SHOCKS AN APPLICATION USING UK DATArsquo

Abstract We propose a new identification scheme for aggregate liquidity shocks in harmony with conventional monetary policy shocks in a partially identified structural VAR model We fit a Bayesian time-varying parameter VAR model to UK data from 1955 to 2016 The transmission mechanism of aggregate liquidity shocks changes substantially throughout time with the magnitude of these shocks increasing during recessions We provide statistically significant evidence in favour of asymmetric contributions of these shocks during the implementation of Quantitative Easing relative to the 2008 recession Aggregate liquidity shocks explain 32 and 47 of the variance in real GDP and inflation at business cycle frequency during the Great Recession respectively Preliminary Draft Please Do Not Cite

Keywords liquidity shocks time-varying parameter VAR money growth

JEL Codes E32 E47 E52 E58

PAPER 17Makram El Shagi Henan University China and Logan Kelly University of WisconsinlsquoSTRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UKrsquo

Abstract We investigate whether or not monetary aggregates are important in determining output In addition to the official Simple Sum measure of money we employ the weighted Divisia aggregate We use data-driven procedures to identify breaks in the data We find that monetary aggregates particularly Divisia are important in determining output but the results are sensitive to the time period employed There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them The results also suggest that the recovery from the financial crisis could have been faster if money was not being hoarded

Keywords monetary aggregates Divisia IS curve structural breaks

JEL Codes C22 E52

20 Liquidity and Economic Activity Conference

PAPER 18Soumya Suvra Bhadury National Council of Applied Economic Research (NCAER) New Delhi India and Taniya Ghosh Indira Gandhi Institute of Development Research (IGIDR) Mumbai India lsquoHAS MONEY LOST ITS RELEVANCE DETERMINING A SOLUTION TO THE EXCHANGE RATE DISCONNECT PUZZLE IN THE SMALL OPEN ECONOMIESrsquo

Abstract This study uses monthly data from 2000 to 2015 and utilises an open-economy structural vector autoregression model to identify the monetary policy shock responsible for exchange rate fluctuations in the economies of India Poland and UK Following Kim and Roubini (J Monet Econ 45(3)561ndash586 2000) our model is shown to fit these small open economies well by estimating theoretically correct and significant responses of price output and exchange rate to the monetary policy tightening The importance of monetary policy shock is determined by examining the variance decomposition of forecast error impulse response function and out-of-sample forecast Following Barnett (J Econ 14 (September)11ndash48 1980) we adopt a superior monetary measure namely

aggregationndashtheoretic Divisia monetary aggregate in our model The significance of adopting precisely measured money in the exchange rate model follows the comparison between models with no-money simple-sum monetary aggregates and Divisia monetary measures The bootstrap Granger causality test establishes a strong causal link from money especially Divisia money to exchange rate Additionally the empirical results provide three important findings The first finding suggests that the estimated responses of output prices money and exchange rate to monetary policy shocks are significant in models that adopt monetary aggregates The second finding indicates that Divisia money facilitate monetary policy to explain more of the fluctuation of exchange rate The third result supports the inclusion of Divisia money for better out-of-sample forecasting of exchange rate

Keywords monetary policy monetary aggregates divisia structural VAR exchange rate overshooting liquidity puzzle price puzzle exchange rate disconnect puzzle forward discount bias puzzle

JEL Codes C32 E41 E51 E52 F31 F41 F47

21Liquidity and Economic Activity Conference

PAPER 19William A Barnett and Jinan Liu University of Kansas lsquoUSER COST OF CREDIT CARD SERVICES UNDER INTERTEMPORAL NON-SEPARABILITYrsquo

Abstract This paper examines the user cost of monetary assets and credit card services under the assumption of intertemporal nonseparability and interest rate risk Barnett and Su (2016) derived theory permitting inclusion of credit card transaction services into Divisia monetary aggregates The risk adjustment in their theory is based on CCAPM under intertemporal separability The risk adjustment by their method is expected to be small as has been the case in prior studies of CCAPM risk adjustment of asset returns The equity premium puzzle focusses on that downward bias in the CCAPM risk adjustment But credit card interest rates aggregated over consumers are much more volatile than interest rates on monetary assets While the known downward bias of CCAPM risk adjustments are of little concern with Divisia monetary aggregates containing only low-risk monetary assets that downward bias cannot be ignored once credit card services are included We believe that extending to intertemporal nonseparability will provide a more plausible risk adjustment as has been emphasised by Barnett and Wu (2015)

In this paper we extend the credit-card-augmented Divisia monetary quantity aggregates and user costs to the case of risk aversion and intertemporal nonseparability in consumption Our results are for the lsquorepresentative consumerrsquo aggregated over all consumers While credit card interest-rate risk may be low for many consumers the volatility of credit card interest rates for the representative consumer is high as reflected by the high volatility of the Federal Reserversquos data on credit card interest rates aggregated over consumers The relationship between that volatility and the theory of aggregation over consumers under risk is beyond the scope

of this research but is a serious matter meriting future research

To implement our theory we introduce a pricing kernel into our model in accordance with the approach advocated by Barnett and Wu (2015) We assume that the pricing kernel is a linear function of the rate of return on a well-diversified wealth portfolio We find that the risk adjustment of the credit-card-services user cost to its certainty equivalence level can be measured by its beta That beta depends upon the covariance between the interest rates on credit card services and on the wealth portfolio of the consumer in a manner analogous to the standard CAPM adjustment

Credit card servicesrsquo risk premiums depend on their market portfolio risk exposure which is measured by the beta of the credit card interest rates The larger the beta through risk exposure to the wealth portfolio the larger the risk adjustment If the beta were very small then the user-cost risk adjustment would be very small In that case the unadjusted Divisia monetary index would be a good proxy even with high volatility of credit card interest rates One method of introducing intertemporal nonseparability is to assume habit formation We explore that possibility We are currently conducting research on empirical implementation of the theory proposed in this paper We believe that under intertemporal nonseparablity we will be able to generate a more accurate credit-card-augmented Divisia monetary index to explain the available empirical data

Keywords divisia index monetary aggregation intertemporal nonseparability credit card services risk adjustment

JEL Codes C43 D81 E03 E40 E41 E44 E51 G12

22 Liquidity and Economic Activity Conference

PAPER 20Per Hjertstrand Research Institute of Industrial EconomicsInstitutet foumlr Naumlringslivsforskning Stockholm Gerald A Whitney University of New Orleans and James L Swofford University of South AlabamalsquoPANEL DATA TESTS OF INDEX NUMBERS AND REVEALED PREFERENCE RANKINGSrsquo

Abstract For weakly separable blockings of goods from panel data we construct aggregates using index numbers We examine how well these aggregates ldquofitrdquo the data by investigating how close they come to solving revealed preference conditions

Samuelson (1983) discussed the relationship of index number theory to revealed preference analysis saying lsquoIndex number theory is shown to be merely an aspect of the theory of revealed preferencehelliprsquo Revealed preference has been used to test whether data on prices and quantities can be rationalised by a well-behaved utility function that is weakly separable in some subset of two goods Weak separability allows for a separation of a subset of goods into a sub-utility or aggregator function This is a necessary condition for the existence of an economic aggregate The existence of an economic aggregate is a justification for the use of superlative index numbers Superlative indexes are exact if they can provide a second order approximation to a particular aggregator function

Varianrsquos (1983) revealed preference conditions for weak separability do not rely on a particular functional form The solution values to these conditions can be interpreted as levels of utility These solution values while not unique are consistent with the preferences revealed by the data If period t is preferred to period s then the utility level assigned to t is greater or equal to that assigned to s Since indexes need not mirror

preferences this property is not guaranteed to hold after aggregation

Barnett and Choirsquos (2008) definition spans all superlative index numbers We consider aggregates based on two superlative index numbers the Fisher and Walsh and the non-superlative Paasche and Laspeyres indexes that are exact for a first order approximation to an aggregator function Because of its wide-spread use by central banks to construct monetary aggregates we also consider the simple sum aggregate and the index number version of the simple sum the Dutot index

We investigate how close aggregates come to solving the revealed preference conditions for weak separability when it is known that a solution exists We use the revealed preference solution values to check how well the indexes reflect the preference orderings in the data by comparing the direction of change between adjacent periods computing preference orderings implied by transitivity and then comparing the rankings revealed by the aggregates to the rankings from the weak separability conditions We calculate how much the aggregates need to be perturbed in order to satisfy weak separability and test whether the aggregates and solution values are equal in distribution

Using panel data we find as the number of goods increases the superiority of the superlative indexes manifests themselves This is consistent with the critique of the Dutot index by Fisher (1922) and Barnettrsquos (1980) critique of simple sum monetary aggregates

Keywords superlative index numbers aggregation preference orderings weak separability

Jel Code C43

23Liquidity and Economic Activity Conference

PAPER 21Sajid Chaudhry University of Birmingham Jane Binner University of Birmingham James Swofford University of South Alabama and Andrew Mullineux University of BirminghamlsquoSCOTLAND AS AN OPTIMUM CURRENCY AREArsquo

Abstract The June UK referendum on continued EU membership where the people of Scotland voted to remain while the rest of the UK voted to leave once again makes the issue of whether Scotland is an optimal currency area very topical England voted strongly to remain in Europe whilst Scotland backed remain by 62 to 38 The Scottish government published its draft bill on a second independence referendum this October The move does not mean another referendum will definitely be held but this does raise the possibility that Scotland might choose independence and staying in the EU without the rest of the UK If Scotland charts a course of independence from the rest of the UK then they would likely either issue their own currency or join or form another currency area In this paper we test the microeconomic foundations of a common currency area for Scotland UK and the rest of the UK without Scotland We find that the UK without Scotland meets the microeconomic criteria for a common currency area while the UK and Scotland alone have some small violations of these conditions We also find differences in the UK less Scotland and Scotland economies in loan data With respect to further research into the development of the monetary aggregation theory we recommend that this might be profitably directed towards exploring more sophisticated aggregation procedures as suggested by Barnett (forthcoming) or towards incorporating risk bearing assets into the money measures

Keywords Scottish independence common currency areas microeconomic foundations

PAPER 22Victor J Varcarcel University of Texas at DallaslsquoINTEREST RATE PASS-THROUGH DIVISIA USER COSTS OF MONETARY ASSETS AND THE FEDERAL FUNDS RATErsquo

Abstract Evidence of substantial pass-through in short-term rates and other rates of financial and monetary assets has been typically rejected by the data In a first this paper investigates level and volatility transmission among the Federal Funds rate and the user costs of various monetary assets which include both instruments of public debt (eg t-bills) and private debt (eg commercial paper) Results suggest substantial and time varying pass-through Higher degrees of bi-directional pass-through occurs between the Federal Funds rate and the user cost of more liquid assets both in levels and volatilities Federal Funds rate spillovers propagate faster onto more liquid rates as well These findings have important implications for monetary transmission not only across the term structure but along markets of varying liquidity

Keywords user cost of money federal funds rate volatility spillovers VAR monetary transmission interest rate channel

JEL Codes E30 E31 E65

1525

5 copy

Uni

vers

ity o

f Birm

ingh

am 2

017

Prin

ted

on a

recy

cled

gra

de p

aper

con

tain

ing

100

pos

t-co

nsum

er w

aste

Edgbaston Birmingham B15 2TT United Kingdom

wwwbirminghamacuk

Designed and printed by

lsquoTriple-crownrsquo accredited

Page 19: Liquidity and Economic Activity - University of Birmingham · 2017-05-22 · Financial Services Indices Liquidity and Economic Activity In honour of Oswald Distinguished Professor

19Liquidity and Economic Activity Conference

PAPER 16Costas Milas University of Liverpool and Michael Ellington University of LiverpoollsquoIDENTIFYING AGGREGATE LIQUIDITY SHOCKS IN CONJUNCTION WITH MONETARY POLICY SHOCKS AN APPLICATION USING UK DATArsquo

Abstract We propose a new identification scheme for aggregate liquidity shocks in harmony with conventional monetary policy shocks in a partially identified structural VAR model We fit a Bayesian time-varying parameter VAR model to UK data from 1955 to 2016 The transmission mechanism of aggregate liquidity shocks changes substantially throughout time with the magnitude of these shocks increasing during recessions We provide statistically significant evidence in favour of asymmetric contributions of these shocks during the implementation of Quantitative Easing relative to the 2008 recession Aggregate liquidity shocks explain 32 and 47 of the variance in real GDP and inflation at business cycle frequency during the Great Recession respectively Preliminary Draft Please Do Not Cite

Keywords liquidity shocks time-varying parameter VAR money growth

JEL Codes E32 E47 E52 E58

PAPER 17Makram El Shagi Henan University China and Logan Kelly University of WisconsinlsquoSTRUCTURAL CHANGES AND THE ROLE OF MONEY IN THE UKrsquo

Abstract We investigate whether or not monetary aggregates are important in determining output In addition to the official Simple Sum measure of money we employ the weighted Divisia aggregate We use data-driven procedures to identify breaks in the data We find that monetary aggregates particularly Divisia are important in determining output but the results are sensitive to the time period employed There is a strong correlation between the significance of monetary aggregates and the importance the central bank attaches to them The results also suggest that the recovery from the financial crisis could have been faster if money was not being hoarded

Keywords monetary aggregates Divisia IS curve structural breaks

JEL Codes C22 E52

20 Liquidity and Economic Activity Conference

PAPER 18Soumya Suvra Bhadury National Council of Applied Economic Research (NCAER) New Delhi India and Taniya Ghosh Indira Gandhi Institute of Development Research (IGIDR) Mumbai India lsquoHAS MONEY LOST ITS RELEVANCE DETERMINING A SOLUTION TO THE EXCHANGE RATE DISCONNECT PUZZLE IN THE SMALL OPEN ECONOMIESrsquo

Abstract This study uses monthly data from 2000 to 2015 and utilises an open-economy structural vector autoregression model to identify the monetary policy shock responsible for exchange rate fluctuations in the economies of India Poland and UK Following Kim and Roubini (J Monet Econ 45(3)561ndash586 2000) our model is shown to fit these small open economies well by estimating theoretically correct and significant responses of price output and exchange rate to the monetary policy tightening The importance of monetary policy shock is determined by examining the variance decomposition of forecast error impulse response function and out-of-sample forecast Following Barnett (J Econ 14 (September)11ndash48 1980) we adopt a superior monetary measure namely

aggregationndashtheoretic Divisia monetary aggregate in our model The significance of adopting precisely measured money in the exchange rate model follows the comparison between models with no-money simple-sum monetary aggregates and Divisia monetary measures The bootstrap Granger causality test establishes a strong causal link from money especially Divisia money to exchange rate Additionally the empirical results provide three important findings The first finding suggests that the estimated responses of output prices money and exchange rate to monetary policy shocks are significant in models that adopt monetary aggregates The second finding indicates that Divisia money facilitate monetary policy to explain more of the fluctuation of exchange rate The third result supports the inclusion of Divisia money for better out-of-sample forecasting of exchange rate

Keywords monetary policy monetary aggregates divisia structural VAR exchange rate overshooting liquidity puzzle price puzzle exchange rate disconnect puzzle forward discount bias puzzle

JEL Codes C32 E41 E51 E52 F31 F41 F47

21Liquidity and Economic Activity Conference

PAPER 19William A Barnett and Jinan Liu University of Kansas lsquoUSER COST OF CREDIT CARD SERVICES UNDER INTERTEMPORAL NON-SEPARABILITYrsquo

Abstract This paper examines the user cost of monetary assets and credit card services under the assumption of intertemporal nonseparability and interest rate risk Barnett and Su (2016) derived theory permitting inclusion of credit card transaction services into Divisia monetary aggregates The risk adjustment in their theory is based on CCAPM under intertemporal separability The risk adjustment by their method is expected to be small as has been the case in prior studies of CCAPM risk adjustment of asset returns The equity premium puzzle focusses on that downward bias in the CCAPM risk adjustment But credit card interest rates aggregated over consumers are much more volatile than interest rates on monetary assets While the known downward bias of CCAPM risk adjustments are of little concern with Divisia monetary aggregates containing only low-risk monetary assets that downward bias cannot be ignored once credit card services are included We believe that extending to intertemporal nonseparability will provide a more plausible risk adjustment as has been emphasised by Barnett and Wu (2015)

In this paper we extend the credit-card-augmented Divisia monetary quantity aggregates and user costs to the case of risk aversion and intertemporal nonseparability in consumption Our results are for the lsquorepresentative consumerrsquo aggregated over all consumers While credit card interest-rate risk may be low for many consumers the volatility of credit card interest rates for the representative consumer is high as reflected by the high volatility of the Federal Reserversquos data on credit card interest rates aggregated over consumers The relationship between that volatility and the theory of aggregation over consumers under risk is beyond the scope

of this research but is a serious matter meriting future research

To implement our theory we introduce a pricing kernel into our model in accordance with the approach advocated by Barnett and Wu (2015) We assume that the pricing kernel is a linear function of the rate of return on a well-diversified wealth portfolio We find that the risk adjustment of the credit-card-services user cost to its certainty equivalence level can be measured by its beta That beta depends upon the covariance between the interest rates on credit card services and on the wealth portfolio of the consumer in a manner analogous to the standard CAPM adjustment

Credit card servicesrsquo risk premiums depend on their market portfolio risk exposure which is measured by the beta of the credit card interest rates The larger the beta through risk exposure to the wealth portfolio the larger the risk adjustment If the beta were very small then the user-cost risk adjustment would be very small In that case the unadjusted Divisia monetary index would be a good proxy even with high volatility of credit card interest rates One method of introducing intertemporal nonseparability is to assume habit formation We explore that possibility We are currently conducting research on empirical implementation of the theory proposed in this paper We believe that under intertemporal nonseparablity we will be able to generate a more accurate credit-card-augmented Divisia monetary index to explain the available empirical data

Keywords divisia index monetary aggregation intertemporal nonseparability credit card services risk adjustment

JEL Codes C43 D81 E03 E40 E41 E44 E51 G12

22 Liquidity and Economic Activity Conference

PAPER 20Per Hjertstrand Research Institute of Industrial EconomicsInstitutet foumlr Naumlringslivsforskning Stockholm Gerald A Whitney University of New Orleans and James L Swofford University of South AlabamalsquoPANEL DATA TESTS OF INDEX NUMBERS AND REVEALED PREFERENCE RANKINGSrsquo

Abstract For weakly separable blockings of goods from panel data we construct aggregates using index numbers We examine how well these aggregates ldquofitrdquo the data by investigating how close they come to solving revealed preference conditions

Samuelson (1983) discussed the relationship of index number theory to revealed preference analysis saying lsquoIndex number theory is shown to be merely an aspect of the theory of revealed preferencehelliprsquo Revealed preference has been used to test whether data on prices and quantities can be rationalised by a well-behaved utility function that is weakly separable in some subset of two goods Weak separability allows for a separation of a subset of goods into a sub-utility or aggregator function This is a necessary condition for the existence of an economic aggregate The existence of an economic aggregate is a justification for the use of superlative index numbers Superlative indexes are exact if they can provide a second order approximation to a particular aggregator function

Varianrsquos (1983) revealed preference conditions for weak separability do not rely on a particular functional form The solution values to these conditions can be interpreted as levels of utility These solution values while not unique are consistent with the preferences revealed by the data If period t is preferred to period s then the utility level assigned to t is greater or equal to that assigned to s Since indexes need not mirror

preferences this property is not guaranteed to hold after aggregation

Barnett and Choirsquos (2008) definition spans all superlative index numbers We consider aggregates based on two superlative index numbers the Fisher and Walsh and the non-superlative Paasche and Laspeyres indexes that are exact for a first order approximation to an aggregator function Because of its wide-spread use by central banks to construct monetary aggregates we also consider the simple sum aggregate and the index number version of the simple sum the Dutot index

We investigate how close aggregates come to solving the revealed preference conditions for weak separability when it is known that a solution exists. We use the revealed preference solution values to check how well the indexes reflect the preference orderings in the data by comparing the direction of change between adjacent periods, computing preference orderings implied by transitivity, and then comparing the rankings revealed by the aggregates to the rankings from the weak separability conditions. We calculate how much the aggregates need to be perturbed in order to satisfy weak separability, and we test whether the aggregates and solution values are equal in distribution.
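As an illustration of the direction-of-change comparison (a minimal sketch under assumed inputs; the series names are hypothetical, and this is not the authors' code):

import numpy as np

def direction_agreement(aggregate, utility):
    """Fraction of adjacent-period moves on which an aggregate index and the
    revealed-preference solution utilities agree in sign (up, down, or flat)."""
    da = np.sign(np.diff(np.asarray(aggregate, dtype=float)))
    du = np.sign(np.diff(np.asarray(utility, dtype=float)))
    return float(np.mean(da == du))

# Hypothetical example: a Fisher aggregate tracking the solution utilities.
fisher = [100.0, 102.5, 101.9, 104.0]
utils = [1.00, 1.04, 1.01, 1.07]
print(direction_agreement(fisher, utils))  # 1.0: all three moves agree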

Using panel data, we find that as the number of goods increases, the superiority of the superlative indexes manifests itself. This is consistent with Fisher's (1922) critique of the Dutot index and Barnett's (1980) critique of simple-sum monetary aggregates.

Keywords: superlative index numbers, aggregation, preference orderings, weak separability

JEL Code: C43


PAPER 21: Sajid Chaudhry, University of Birmingham; Jane Binner, University of Birmingham; James Swofford, University of South Alabama; and Andrew Mullineux, University of Birmingham. 'SCOTLAND AS AN OPTIMUM CURRENCY AREA'

Abstract: The June 2016 UK referendum on continued EU membership, in which the people of Scotland voted to remain while the rest of the UK voted to leave, once again makes the issue of whether Scotland is an optimum currency area very topical. England voted strongly to leave, whilst Scotland backed remain by 62% to 38%. The Scottish government published its draft bill on a second independence referendum the following October. The move does not mean another referendum will definitely be held, but it does raise the possibility that Scotland might choose independence, remaining in the EU without the rest of the UK. If Scotland charts a course of independence from the rest of the UK, then it would likely either issue its own currency or join or form another currency area. In this paper we test the microeconomic foundations of a common currency area for Scotland, for the UK, and for the rest of the UK without Scotland. We find that the UK without Scotland meets the microeconomic criteria for a common currency area, while the UK and Scotland alone have some small violations of these conditions. We also find differences between the UK-less-Scotland and Scotland economies in loan data. With respect to further research into the development of monetary aggregation theory, we recommend that this might be profitably directed towards exploring more sophisticated aggregation procedures, as suggested by Barnett (forthcoming), or towards incorporating risk-bearing assets into the money measures.

Keywords: Scottish independence, common currency areas, microeconomic foundations

PAPER 22: Victor J Varcarcel, University of Texas at Dallas. 'INTEREST RATE PASS-THROUGH, DIVISIA USER COSTS OF MONETARY ASSETS AND THE FEDERAL FUNDS RATE'

Abstract: Evidence of substantial pass-through between short-term rates and other rates on financial and monetary assets has typically been rejected by the data. In a first, this paper investigates level and volatility transmission among the Federal Funds rate and the user costs of various monetary assets, which include instruments of both public debt (e.g., T-bills) and private debt (e.g., commercial paper). Results suggest substantial and time-varying pass-through. Higher degrees of bi-directional pass-through occur between the Federal Funds rate and the user costs of more liquid assets, both in levels and in volatilities. Federal Funds rate spillovers also propagate faster onto more liquid rates. These findings have important implications for monetary transmission, not only across the term structure but also along markets of varying liquidity.
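A minimal sketch of the kind of level-transmission exercise described, on synthetic data (the variable names and the built-in pass-through coefficient are our assumptions, not the paper's estimates); the same design can be refit to squared residuals or realised volatilities for the volatility dimension:

import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(42)
T = 400
ffr = np.zeros(T)
uc = np.zeros(T)
for t in range(1, T):
    ffr[t] = 0.9 * ffr[t - 1] + rng.normal(scale=0.1)
    # assumed pass-through from the funds rate into one monetary-asset user cost
    uc[t] = 0.5 * uc[t - 1] + 0.4 * ffr[t - 1] + rng.normal(scale=0.1)

data = pd.DataFrame({"ffr": ffr, "user_cost": uc})
res = VAR(data).fit(maxlags=6, ic="aic")
fevd = res.fevd(12)
fevd.summary()  # forecast-error variance shares as a rough gauge of pass-through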

Keywords: user cost of money, federal funds rate, volatility spillovers, VAR, monetary transmission, interest rate channel

JEL Codes: E30, E31, E65

15255 © University of Birmingham 2017. Printed on a recycled grade paper containing 100% post-consumer waste.

Edgbaston, Birmingham, B15 2TT, United Kingdom

www.birmingham.ac.uk

Designed and printed by

'Triple-crown' accredited