Since the advent of Markov chain Monte Carlo (MCMC) methods in the early 1990s, Bayesian methods have been proposed for a large and growing number of applications. One of the main advantages of Bayesian inference is its ability to deal with many different sources of uncertainty, including uncertainty about data, models, parameters and parameter restrictions, in a unified and coherent framework. This book contributes to this literature by collecting a set of carefully evaluated contributions grouped into two topics in financial economics. The first three papers address macro-finance issues for the real economy, including the elasticity of factor substitution (ES) in the Cobb-Douglas production function, the effects of the components of government public spending, and quantitative easing and monetary policy. The last three contributions focus on cryptocurrency and stock market predictability. All of these topics are central to the current economic discussion, and their importance has only been further emphasized by the COVID-19 crisis.
This thesis is an attempt to obtain further insight into the role of spatial and dynamic linkages in the field of economics, given the crucial need for a better understanding of the fundamental processes behind the spatial and temporal correlation patterns observable in economic data. To date, most theoretical economic models and econometric studies have treated units of analysis as isolated entities, ignoring the spatial characteristics of the data and the potential role of space in modulating the economic evolution of countries, regions, municipalities, etc. In this regard, the essence of spatial economic analysis is that space matters. This implies that what happens in one economic unit of analysis is linked to what happens in neighboring economic units. In a spatial economic modeling framework, the spatial dimension and geographical arrangement of interacting economic agents are key drivers of economic processes and their final outcomes. Recognizing the wide range of interconnections between interacting agents in economics requires accommodating such interdependence in the modeling process; in order to verify models of social and spatial interaction, these spatial effects need to be accounted for explicitly. Failure to take into account spatial dependence and spatial heterogeneity in econometric models leads to major estimation problems, because the coefficient estimates will be biased, inconsistent and/or inefficient. A distinct and innovative feature of this research is the use of static and dynamic spatial panel data estimation techniques for the empirical testing and validation of the theoretical models developed in the different chapters. This methodological approach is particularly appropriate for the analysis of economic phenomena from an integrated space-time perspective because it allows spillover, feedback and diffusion effects among the study units to be modeled.
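The spatial-lag building block behind the models described above can be sketched numerically. The following is a hypothetical illustration (the four-region chain and all numbers are invented, not from the thesis): a contiguity matrix is row-normalized so that each region's spatial lag is the average outcome of its neighbors, which is the term that spatial autoregressive and Durbin specifications add to an ordinary regression.

```python
import numpy as np

# Hypothetical example: four regions on a line (1-2-3-4), adjacency matrix A.
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

# Row-normalize so each row sums to one: the spatial lag becomes the
# average of the neighboring regions' outcomes.
W = A / A.sum(axis=1, keepdims=True)

y = np.array([2.0, 4.0, 6.0, 8.0])  # e.g. illustrative regional growth rates
spatial_lag = W @ y                  # neighbor-average outcome for each region

print(spatial_lag)  # [4. 4. 6. 6.]
```

In a spatial Durbin model this lag (and the lags of the regressors) enter the estimating equation, which is why the choice of `W` matters so much for inference.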
Frequentist Spatial Econometrics modeling tools are complemented with Bayesian Spatial Econometrics and Relative Importance metrics in order to gain knowledge about the type of connectivity structures and the underlying spatial processes behind the observable data, and to carry out inference on the relevance of the different factors explaining disparities among spatial units over time. The structure of this thesis consists of four self-contained chapters. Chapter 1 analyzes the volatility-regional growth nexus in a sample of European regions. To that end, a model of stochastic neoclassical growth with spatial interdependence is developed. In this framework, the economic growth rate of a particular region is affected not only by its own degree of volatility but also by the output fluctuations experienced by the remaining regions. In order to investigate the empirical validity of this result, the link between volatility and economic growth is examined in a sample of 272 European regions over the period 1991-2011 using a variety of static spatial panel specifications including spatial fixed effects. The results suggest the existence of a robust negative link between volatility and growth. Chapter 2 investigates regional development dynamics in a sample of 254 NUTS 2 European Union regions over the period 2000-2010. To that end, a new version of the Regional Lisbon Index (RLI) is proposed, containing changes with respect to the index developed by Dijkstra. The RLI combines employment, education and R&D indicators. Targets for these indicators are related to an action and economic development plan for the EU regions and have been incorporated into European Regional Policy programming. The analysis of regional development is based on the estimation of the spatial Durbin model. Different specifications of the spatial weights matrix describing the spatial arrangement are compared by means of Spatial Bayesian Econometrics techniques.
The salient finding of this chapter is that the main drivers of the RLI growth rate are technological capital, infrastructures and employment growth. Chapter 3 analyzes unemployment differentials in 241 European regions during the period 2000-2011. To that end, a theoretical model with substantive spatial interactions among regions is developed. The solution implies a Dynamic Spatial Durbin Model specification including regional and institutional level factors as explanatory variables. In conjunction with dynamic spatial panel estimates, relative importance metrics are used to quantify the effect of regional disequilibrium, equilibrium and national level factors. Relative importance analysis suggests that during the pre-crisis period unemployment disparities were mainly driven by regional level equilibrium factors. Nevertheless, labor market institutions are of major importance in explaining the increasing disparities during 2009-2011. Chapter 4 looks into the nature of fiscal policy interactions among local governments in Spain. This study extends traditional spatial spillover models of government spending by including dynamic effects in order to test the relevance of the incremental budget hypothesis stemming from political science research. The theoretical model developed in this study points to an empirical specification including simultaneous and lagged endogenous interactions among the sample of municipalities, as well as exogenous interaction effects. To that end, a Dynamic Spatial Durbin panel data model is used to quantify the relevance of spatial spillovers and diffusion effects over time. Using annual data for a sample of 1230 Spanish municipalities from 2000 to 2012, it is observed that there are significant positive simultaneous spatial spillovers in different government expenditure categories, and that the incremental hypothesis stemming from political science has greater explanatory power than spatial spillovers.
The first paper sheds light on the informational content of high-frequency data and daily data. I assess the economic value of the two model families by comparing their performance in forecasting asset volatility through the Value at Risk metric. In running the comparison, this paper introduces two key assumptions: jumps in prices and a leverage effect in volatility dynamics. Findings suggest that high-frequency data models do not exhibit superior performance over daily data models. In the second paper, building on Majewski et al. (2015), I propose an affine discrete-time model, labeled VARG-J, which is characterized by a multifactor volatility specification. In the VARG-J model, volatility experiences periods of extreme movements through a jump factor modeled as an Autoregressive Gamma Zero process. Estimation under the historical measure is carried out by quasi-maximum likelihood and the Extended Kalman Filter. This strategy makes it possible to filter out both volatility factors by introducing a measurement equation that relates realized volatility to latent volatility. The risk premia parameters are calibrated using call options written on the S&P 500 Index. The results clearly illustrate the important contribution of the jump factor to the pricing performance for options and the economic significance of the volatility jump risk premia. In the third paper, I analyze whether there is empirical evidence of contagion at the bank level, measuring the direction and size of contagion transmission between European markets. In order to understand and quantify contagion transmission in the banking market, I estimate the econometric model of Aït-Sahalia et al. (2015), in which contagion is defined as the within- and between-country transmission of shocks and asset returns are modeled directly as a Hawkes jump-diffusion process. The empirical analysis indicates clear evidence of contagion from Greece to other European countries as well as self-contagion in all countries.
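The Value-at-Risk comparison described in the first paper can be illustrated with a minimal backtest sketch. Everything below is an illustrative assumption (simulated Gaussian returns, a flat volatility forecast, the 5% Gaussian quantile), not the paper's models: the idea is simply to count VaR violations and compare the empirical hit rate with the nominal level, the logic behind standard unconditional-coverage backtests.

```python
import numpy as np

# Hypothetical data: simulated daily returns and a (deliberately naive)
# constant volatility forecast standing in for a model's output.
rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.01, size=1000)
sigma_hat = np.full(1000, 0.01)

alpha = 0.05
var_forecast = -1.645 * sigma_hat        # 5% one-sided Gaussian VaR

violations = returns < var_forecast      # days the loss exceeded the VaR
hit_rate = violations.mean()
print(f"violations: {violations.sum()}, hit rate: {hit_rate:.3f} (nominal {alpha})")
```

A well-calibrated model should produce a hit rate close to the nominal 5%; systematic over- or under-shooting is what the paper's comparison of daily and high-frequency models measures.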
This Editorial evaluates 14 invaluable and interesting articles in the Special Issue "Applied Econometrics" of the Journal of Risk and Financial Management (JRFM). The topics covered include recovering historical inflation data from postage stamp prices, FHA loans in foreclosure proceedings through distinguishing sources of interdependence in competing risks, information in earnings forecasts, nonlinear time series modeling, a systemic approach to management control through determining factors, economic freedom and FDI versus economic growth, efficient cash use of the Taiwan dollar, financial health prediction in companies from post-Communist countries, the influence of the misery index on U.S. Presidential elections, multivariate Student versus Gaussian regression models in finance, financial derivatives markets and economic development, income inequality and economic growth in middle-income countries, abnormal returns, mis-measured risk, network effects, and risk spillovers in stock returns.
The technological innovations in information processing and increased storage capability have made it possible to collect very large data sets in various fields of economics and finance. Researchers, companies, and governments look for ways to exploit this rich information. This special issue collects 11 papers that present state-of-the-art techniques for dealing with many predictors, many regressors, or many instruments. It grew out of the CIREQ conference on high-dimensional models in econometrics organized by Marine Carrasco and Sílvia Gonçalves in Montreal, Canada, on May 4–5, 2012.
From a global perspective, this dissertation illustrates the consequences of choosing a particular balance between completeness and manageability in terms of model building, both in the field of macroeconomics and econometrics. Each of the three chapters shows that there are potentially dramatic consequences of taking into account, in a manageable way, additional and – with respect to the question at hand – essential layers of reality. In particular, in terms of econometric theory, Chapter 1 demonstrates that considerably more precise estimates within a dynamic factor model are obtainable by employing simple two-step estimators taking into account additional features of the data-generating process. Chapter 2, furthermore, considers a macroeconomic model featuring labor market frictions. It highlights the important consequences for equilibrium allocations and optimal monetary policy when altering the central aspect of the wage determination mechanism, so that it is consistent with empirical evidence. Finally, Chapter 3 presents an empirical investigation, studying the effects of fiscal policy on the macroeconomy. In this regard, it demonstrates the importance of allowing for particular features of the information structure as well as of distinguishing certain subcomponents of the fiscal variables, which might have different macroeconomic effects as implied by economic theory. As a result, we can illustrate that while at a certain level of abstraction, the findings of different approaches in the literature seem to be in conflict with each other, at another level the antagonism vanishes. More specifically, Chapter 1 considers efficient estimation of dynamic factor models. A simple two-step estimation procedure is suggested to obtain efficient estimates in the presence of both heteroskedasticity and autocorrelation. 
Interestingly, with respect to the factors only potential heteroskedasticity has to be taken into account, whereas for the loadings the relevant aspect is just autocorrelation. While, as we show, the feasible two-step PC-GLS estimator is asymptotically as efficient as the estimator that (locally) maximizes the full approximate likelihood function, small-sample gains may be obtained by iterating the two-step estimator. This is indeed reflected in the results of our extensive Monte Carlo investigation. Moreover, we also document the superior performance of the two-step PC-GLS estimator compared to standard PC. Chapter 2 studies optimal monetary policy using a simple New Keynesian model featuring labor market frictions, heterogeneous wage setting, and markup shocks. Replacing the typically used uniformly rigid wage by a form of wage heterogeneity consistent with the data has profound effects on the policy implications of this model. In particular, the sizable short-run inflation-unemployment trade-off present in the original setup disappears. This occurs despite the fact that the original setup is only slightly changed and even though the model still features a rigid economy-wide average wage. As an overall rigid real wage is typically employed to address the so-called unemployment volatility puzzle, I follow suggestions in the literature with respect to an alternative mechanism and introduce markup shocks as additional driving forces into the model. While a short-run inflation-unemployment trade-off indeed arises in this setup, optimal policy is nevertheless characterized by an overriding focus on inflation stabilization.
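The first, principal-components step of the two-step procedure can be sketched as follows. This is a generic PC illustration under invented data (names like `F_hat`, `L_hat` are ours, not the chapter's notation); a GLS second step would then reweight using the estimated idiosyncratic variances (heteroskedasticity) and residual autocorrelation, which is exactly the extra structure the chapter exploits.

```python
import numpy as np

# Hypothetical data-generating process: T observations of N series driven
# by r latent factors plus idiosyncratic noise.
rng = np.random.default_rng(4)
T, N, r = 200, 50, 2
F = rng.normal(size=(T, r))              # latent factors
L = rng.normal(size=(N, r))              # loadings
X = F @ L.T + rng.normal(size=(T, N))    # observed panel

# PC step: factor estimates are the leading left singular vectors of X,
# normalized so that F_hat'F_hat / T = I.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
F_hat = np.sqrt(T) * U[:, :r]
L_hat = X.T @ F_hat / T                  # implied loading estimates

# Idiosyncratic variance estimates: these would feed the GLS reweighting.
resid = X - F_hat @ L_hat.T
sigma2_hat = resid.var(axis=0)
print(sigma2_hat.mean())
```

The point of the two-step idea is that `sigma2_hat` (and, for the loadings, the serial correlation of `resid`) carries information that plain PC ignores.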
In light of the conflicting empirical results concerning the effects of fiscal policy on the macroeconomy, and the potentially important role of fiscal policy anticipation in this regard, Chapter 3 investigates the response of private consumption to fiscal shocks within an SVAR framework, explicitly taking into account fiscal foresight. A new empirical approach is suggested, designed to align the information sets of the private agents and the econometrician, which allows us to avoid the problems of standard VARs. A simulation experiment based on a theoretical model featuring (imperfect) fiscal foresight documents the ability of the approach, in contrast to a standard VAR, to correctly capture macroeconomic dynamics. This result is robust even to deviations from the underlying informational assumptions of the expectation-augmented VAR. The subsequent application to real-life data indicates that it is indeed important in empirical work to allow for anticipation of fiscal policy. Moreover, it shows that it is crucial to distinguish subcomponents of total government expenditure which might have different macroeconomic effects according to economic theory. By distinguishing government defense and non-defense spending, it is possible to reconcile the results of the narrative and SVAR approaches to the study of fiscal policy effects.
Lawrence R. Klein (1920-2013) played a major role in the construction and further dissemination of econometrics from the 1940s. Considered one of the main developers and practitioners of macroeconometrics, Klein's influence is reflected in his application of econometric modelling "to the analysis of economic fluctuations and economic policies", for which he was awarded the Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel in 1980. The purpose of this paper is to give an account of Klein's image of econometrics, focusing on his early period as an econometrician (1944-1950), and more specifically on his period as a Cowlesman (1944-1947). However short this period might appear, it contains a set of fundamental publications and events which were decisive for Klein's conception of econometrics and which formed his unique way of doing econometrics. At least four features characterise this uniqueness. First, Klein was the only Cowlesman who carried the macroeconometric programme beyond the 1940s, even after the Cowles Commission had abandoned it. Second, his pluralistic approach in terms of economic theory allowed him to use not only the Walrasian framework endorsed by the Cowles Commission, and especially by T.C. Koopmans, but also the Marxian and Keynesian frameworks, enriching the process of model specification and motivating economists of different stripes to make use of the nascent econometrics. Third, Klein differentiated himself from the rigid methodology prized at Cowles; while the latter promoted the use of highly sophisticated methods of estimation, Klein was convinced that institutional reality and economic intuition would contribute more to econometrics than the sophistication of these statistical techniques. Last but not least, Klein never gave up what he regarded as the political objective of econometrics: economic planning and social reform.
This dissertation develops several econometric techniques to address three issues in financial economics, namely, constructing a real estate price index, estimating structural break points, and estimating integrated variance in the presence of market microstructure noise and the corresponding microstructure noise function. Chapter 2 develops a new methodology for constructing a real estate price index that utilizes all transaction price information, encompassing both single-sales and repeat-sales. The method is less susceptible to specification error than standard hedonic methods and is not subject to the sample selection bias involved in indexes that rely only on repeat sales. The methodology employs a model design that uses a sale pairing process based on the individual building level, rather than the individual house level as is used in the repeat-sales method. The approach extends ideas from repeat-sales methodology in a way that accommodates much wider datasets. In an empirical analysis of the methodology, we fit the model to the private residential property market in Singapore between Q1 1995 and Q2 2014, covering several periods of major price fluctuation and changes in government macroprudential policy. The index is found to perform much better in out-of-sample prediction exercises than either the S&P/Case-Shiller index or the index based on standard hedonic methods. In a further empirical application, the recursive dating method of Phillips, Shi and Yu (2015a, 2015b) is used to detect explosive behavior in the Singapore real estate market. Explosive behavior in the new index is found to arise two quarters earlier than in the other indices. Chapter 3, based on the Girsanov theorem, obtains the exact finite sample distribution of the maximum likelihood estimator of structural break points in a continuous time model. 
The exact finite sample theory suggests that, in empirically realistic situations, there is a strong finite sample bias in the estimator of structural break points. This property is shared by the least squares estimators of both the absolute structural break point and the fractional structural break point in discrete-time models. A simulation-based method based on the indirect estimation approach is proposed to reduce the bias in both continuous-time and discrete-time models. Monte Carlo studies show that the indirect estimation method achieves substantial bias reductions. However, since the binding function has a slope less than one, the variance of the indirect estimator is larger than that of the original estimator. Chapter 4 develops a novel panel data approach to estimating integrated variance and testing for microstructure noise using high-frequency data. Under weak conditions on the underlying efficient price process and the nature of high-frequency noise contamination, we employ nonparametric kernel methods to estimate a model that accommodates a very general formulation of the effects of microstructure noise. The methodology pools information in the data across different days, leading to a panel model form that enhances efficiency in estimation and produces a convenient approach to testing the linear noise effect that is conventional in existing procedures. Asymptotic theory is developed for the nonparametric estimates and test statistics.
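The microstructure-noise problem that Chapter 4 addresses can be demonstrated with a small simulation. This is a generic textbook-style sketch under invented parameters, not the chapter's estimator: an efficient log-price is contaminated with i.i.d. noise, and realized variance computed at the highest frequency is dominated by the noise, whereas sparse (e.g. 5-minute) sampling gets much closer to the true integrated variance. Noise-robust estimators are designed to correct exactly this distortion.

```python
import numpy as np

# Hypothetical one-day simulation on a one-second grid.
rng = np.random.default_rng(1)
n = 23400
true_iv = 0.0001                                     # daily integrated variance
efficient = np.cumsum(rng.normal(0, np.sqrt(true_iv / n), n))
observed = efficient + rng.normal(0, 0.0005, n)      # add microstructure noise

def realized_variance(prices, step):
    """Sum of squared log-returns sampled every `step` observations."""
    r = np.diff(prices[::step])
    return np.sum(r ** 2)

rv_fast = realized_variance(observed, 1)     # every tick: noise-dominated
rv_sparse = realized_variance(observed, 300) # ~5-minute sampling: closer to IV
print(rv_fast, rv_sparse)
```

Plotting `realized_variance` against the sampling interval gives the familiar volatility signature plot; the pooling-across-days idea in the chapter uses many such days jointly rather than one at a time.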
This article studies the effectiveness of formative assessment techniques in an econometrics course. A large-scale project with extensive formative assessment was included in the course, incorporating both summative and formative assessment. The specific assignment requires students to learn all the steps needed to turn raw data obtained from the Interuniversity Consortium for Political and Social Research (ICPSR), using the statistical software SPSS, into a viable thesis worthy of undergraduate conference presentations and publications. Learning gains from implementing this project with extensive formative assessment are measured by changes in student course grades.
Defence date: 9 July 2015 ; Examining Board: Prof. Peter Hansen, Supervisor, EUI; Prof. Juan Dolado, EUI; Prof. Christian Brownlees, Universitat Pompeu Fabra; Dr. Christiane Baumeister, University of Notre Dame. ; This thesis comprises three essays. The first two chapters address topics in commodity markets and their interaction with derivative and other asset markets. The third essay deals with the effects to and from fiscal policy that arise from the structure of the relationship between central and regional governments. Finance and applied econometrics constitute the common thread of these articles. The first two take a financial economics and financial econometrics perspective, while the third addresses a topic of public finance with an empirical approach. The first chapter offers an explanation for volatile oil prices. Using information from options and futures, I document economically large jump tail premia in the crude oil market which can be related to investors' "fear". These premia vary substantially over time and significantly forecast crude oil futures and spot returns. The results suggest that oil futures prices overshoot (undershoot) in the presence of upside (downside) tail fears in order to allow for smaller (larger) risk premia thereafter. The second essay relates the comovement of stock and commodity prices to the increased participation of financial investors in commodity futures markets. I present a partial equilibrium model in which demand for futures by financial investors transmits stock market shocks into commodity prices via a time-varying risk premium. Empirically, I find that commodity index investors react systematically to stock market shocks by adjusting their commodity risk exposure. In the third chapter, joint with Abián García Rodríguez, we investigate the relationship between fiscal decentralization - the share of government spending and taxation carried out at the subnational level - and fiscal policy effects.
Using a cross-section of countries, we document a positive relationship between decentralization and the effectiveness of fiscal policy as measured by the size of fiscal multipliers. We also present a case study for the decentralization process in Spain and find that it had a positive impact on output growth.
The foremost place of R. Frisch in the modern development of economics, and especially econometrics, is assessed. Some of his achievements are briefly described: land rent theory, the marginal utility of income, the bunch map 'technique', model building, planning, programming, and Pareto optima. Some remarkable facts of Frisch's life are also reported, e.g. his short stay in a concentration camp during the war and his crusade against Norway joining the E.E.C., on the grounds that the country had better remain outside as an example of democracy and social justice.
This dissertation consists of three chapters that concern risk management and financial econometrics. Fannie Mae and Freddie Mac's implicit government guarantee is widely argued to cause irresponsible risk taking. Despite moral-hazard concerns, this paper presents evidence that Fannie Mae and Freddie Mac (the GSEs) managed home price risks during the 2000-2006 housing boom more effectively than private insurers. Mortgage origination data reveal that the GSEs were selecting loans with an increasingly higher percentage of down payments, or lower loan-to-value ratios (LTVs), in boom areas than in other areas. Furthermore, the decline of LTVs in boom areas stems entirely from the segment insured by the GSEs only, and none of the decline stems from the segment co-insured by private mortgage insurers. Private mortgage insurers also did not lower their exposure to home price risks along other dimensions, including the percentage of high-LTV GSE loans they insured. To quantify how the GSEs' portfolios would have performed under alternative home price scenarios, I build an insurance valuation model based on competing-risk hazard regressions, a calibrated Hull and White term-structure model, and forecasts of prepayment and default speeds. I find that the GSEs' risk management would have been sufficient for the historically average 32% mean reversion but insufficient for the realized 95% mean reversion between 2006 and 2011. My results highlight that post-crisis reform of the mortgage insurance industry should carefully consider additional factors besides moral hazard, such as mortgage insurers' future home price assumptions. The second chapter studies high-dimensional time series, with application to estimating the mean-variance frontier. One persistent challenge in macroeconomics and finance is how to draw inference from data with a large cross-section but a short time series. Financial econometric techniques are typically designed for long time series and small cross-sections.
However, financial data typically have a large cross-section and a short time series (large-N small-T). One particular consequence of large-N small-T data is the underestimation of risk in the mean-variance frontier. We propose a correction for the finite-sample bias when the underlying returns follow a high-dimensional linear time series. Our algorithm first corrects for the bias in the eigenvalues of the asset return covariance matrix, and then estimates the contribution of each leading factor to the mean-variance frontier. A cross-validation method is employed to select the optimal number of leading factors. Performance of the proposed methods is examined through extensive simulation studies. The third chapter studies how expected home prices affect borrowers' default behavior. One of the penalties mortgage defaulters face is being locked out of the mortgage market and missing home price appreciation. I find that this penalty deters some borrowers from defaulting. Higher future home price growth implies a lower ex-ante default probability. Furthermore, high-credit-score borrowers react more to past home price declines and future home price appreciation than low-credit-score borrowers. This suggests that high-credit-score borrowers are more likely to be strategic defaulters. A model is built to study the effect of changing the cooling-off period. In areas with high expected home price appreciation, a longer cooling-off period amplifies the impact of each foreclosure. In areas with low expected home price appreciation, a longer cooling-off period reduces the number of foreclosures.
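The eigenvalue bias motivating the second chapter's correction is easy to exhibit in simulation. This is a generic illustration under invented data, not the chapter's algorithm: when the true covariance matrix is the identity, every eigenvalue is 1, yet with T barely larger than N the sample eigenvalues spread far from 1, which in turn overstates the attainable mean-variance frontier.

```python
import numpy as np

# Hypothetical large-N small-T setting: true covariance is the identity.
rng = np.random.default_rng(2)
N, T = 100, 120
returns = rng.normal(size=(T, N))        # i.i.d. standard normal returns

sample_cov = np.cov(returns, rowvar=False)
eigvals = np.linalg.eigvalsh(sample_cov)

# All true eigenvalues are 1.0, but the sample spectrum is badly dispersed.
print(f"sample eigenvalue range: [{eigvals.min():.2f}, {eigvals.max():.2f}]")
```

The dispersion follows the Marchenko-Pastur law; shrinking the sample eigenvalues back toward their true values is the first step of the bias correction described above.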
Defence date: 4 April 2016 ; Examining Board: Prof. Evi Pappa, EUI, Supervisor; Prof. Agustín Bénétrix, Trinity College Dublin; Prof. Christian Brownlees, Universitat Pompeu Fabra; Prof. Peter Hansen, EUI. ; The thesis consists of three essays in the fields of international finance and applied econometrics. The first chapter analyzes the co-movement of market premia for rare adverse events, addressing the important issue of contagion. The second chapter studies the impact of rare adverse events on estimates of the risk-aversion coefficient and on the household's portfolio composition. This chapter shows that the threat of a rare disaster justifies the household's positive bond holdings. Finally, the last chapter studies whether information contained in the foreign yield curve, but not in the domestic yield curve, helps to predict the future dynamics of domestic yields. The first chapter proposes a novel approach to assessing volatility contagion across equity markets. More specifically, I decompose the variance risk premia of three major stock indices into crash and non-crash risk components and analyse their cross-market correlations. I find that crash-risk premia exhibit higher correlations than non-crash risk premia, implying the existence of volatility contagion. This suggests that investors believe that equity returns will be more highly correlated across countries during market crashes than during more normal times. The main result of the analysis holds when I apply other measures of co-movement, as well as when I allow correlation to be time-varying. Moreover, I document that crash premia constitute a large portion of the overall variance risk premia, highlighting the importance of crash risks. Unlike the existing literature, my approach to testing for the existence of volatility contagion does not rely on short periods of financial distress, but allows crash-risk premia to be computed in tranquil times.
The second chapter assesses the impact of the Peso problem on econometric estimates of the risk-aversion coefficient. Rietz (1988) and subsequently Barro (2006) showed that the introduction of crash risk allows the canonical general equilibrium framework to generate data-consistent equity premia even under low risk aversion of the representative agent. They argue that the original data used to calibrate these models suffer from a Peso problem (i.e., the sample does not contain a crash state). To the best of my knowledge, the impact of this Peso problem on the estimation of the risk-aversion coefficient has not to date been evaluated. This chapter seeks to remedy this. I find that crash states that are internalized by economic agents, but not realized in the sample, generate only a small bias in the estimates of the risk-aversion coefficient. I also show that the introduction of the crash state has a strong bearing on the household's portfolio composition. In fact, under the internalized crash-state scenario, households exhibit positive bond holdings even in a frictionless environment. In the third chapter, co-authored with Andrew Meldrum and Peter Spencer, we show, using data on government bonds in Germany and the US, that 'overseas unspanned factors' - constructed from the components of overseas yields that are uncorrelated with domestic yields - have significant explanatory power for subsequent domestic bond returns. This result is remarkably robust, holding for different sample periods as well as out of sample. By adding our overseas unspanned factors to simple dynamic term structure models, we show that shocks to those factors have large and persistent effects on domestic yield curves. Dynamic term structure models that omit information about foreign bond yields are therefore likely to be mis-specified.
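The construction of "overseas unspanned factors" described in the third chapter can be sketched as follows. This is a hypothetical illustration with simulated data (all variable names are ours): project foreign yields on the domestic curve by OLS and take principal components of the residuals, i.e. the part of foreign yields uncorrelated with domestic yields.

```python
import numpy as np

# Hypothetical simulated data: 3 domestic yield factors, 5 foreign yields
# that load on them plus independent variation.
rng = np.random.default_rng(3)
T = 200
domestic = rng.normal(size=(T, 3))
foreign = domestic @ rng.normal(size=(3, 5)) + 0.5 * rng.normal(size=(T, 5))

# OLS projection of foreign yields on the domestic curve (with a constant).
X = np.column_stack([np.ones(T), domestic])
beta, *_ = np.linalg.lstsq(X, foreign, rcond=None)
residuals = foreign - X @ beta

# Leading principal component of the residuals = first unspanned factor.
resid_centered = residuals - residuals.mean(axis=0)
_, _, vt = np.linalg.svd(resid_centered, full_matrices=False)
unspanned = resid_centered @ vt[0]

# By construction the unspanned factor is (sample-)uncorrelated with
# every domestic factor.
corr = np.corrcoef(np.column_stack([unspanned, domestic]), rowvar=False)[0, 1:]
print(np.round(corr, 10))
```

The predictive exercise in the chapter then asks whether such factors forecast subsequent domestic bond returns, which is informative precisely because they are orthogonal to the domestic curve by construction.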