
Theses - Economics


Recent Submissions

Now showing 1 - 20 of 91
  • Item (Open Access)
    Identification and Estimation with Deconfounded Instruments
    Tien, Christian
    The primary contribution of this research is the introduction of a novel methodology, called common confounding (CC), for identifying and estimating the causal effects of endogenous (treatment) variables on an outcome variable with partially endogenous instrumental variables. A crucial estimation step called deconfounding recovers variation in the instruments which is unassociated with some observed variables called proxies, and consequently with any unobserved variables that explain the association between the instruments and proxies. These unobserved variables are called common confounders of the instruments and proxies. If the instruments are excluded and exogenous conditional on the unobserved common confounders, the deconfounding step recovers excluded and exogenous instrument variation, whereas conditioning on the proxies as covariates generally would not. In this sense, the deconfounding step discards more instrument variation than conditioning on the same proxies as covariates would. While this discarding of instrument variation naturally incurs a cost in terms of estimation precision, deconfounding may permit the identification and estimation of causal effects with instruments that violate exclusion or exogeneity due to such common confounders. The deconfounding step is at the heart of the identification proofs and estimation theory developed in this research. Although the linear model, with its simplicity, is often used for illustrative purposes, all identification frameworks in this thesis are semiparametric, featuring a low-dimensional causal effect or parameter of interest and nonparametric nuisance functions. This research explores three principal frameworks for instrument deconfounding: moment-based in Chapter 1, nonparametric with index sufficiency in Chapter 2, and nonparametric bridge functions in Chapter 3.
The moment-based framework allows for probabilistic deconfounding, where the deconfounding step can be interpreted as a probabilistic construction of hypothetical exogenous instruments from partially endogenous observed categorical instruments. In the index-sufficient and bridge function frameworks, point identification results for structural causal quantities are provided under two sets of standard parametric assumptions in instrumental variable (IV) approaches: the linear separability of a disturbance in the outcome model [Newey and Powell, 2003] and first-stage strict monotonicity [Imbens and Newey, 2009]. In Chapter 3, no further parametric assumptions are imposed compared to traditional IV, such that unlike in Chapters 1 and 2, a standard exclusion and exogeneity assumption is required for the proxies with respect to the treatment variable. The setting in Chapter 3 illustrates transparently how the CC approach bridges the identification assumptions of nonparametric IV and proximal learning (PL) [Cui et al., 2020]. The CC approach was in part inspired by recent advances in the proximal learning literature [Miao et al., 2018]. Proximal learning extends the literature on nonclassical measurement error models with mismeasured confounders [Hu, 2008, Kuroki and Pearl, 2014], which, similar to mixture models [Hall et al., 2003, Allman et al., 2009, Bonhomme et al., 2016], feature independence conditions between some observed variables conditional on the unobserved variable. Unlike all identification results in mixture models, identification in the CC approach does not require the conditional independence of three observed variables. Instead, under minimal exogeneity and exclusion requirements, the crucial deconfounding step extracts excluded and exogenous variation in the instruments, which is then used to identify some average structural effects of an endogenous treatment on an outcome variable. 
In Chapters 1 and 2, the identification theorems are complemented by comprehensive estimation theory. In the moment-based framework with nonparametric nuisances in Chapter 1, under some regularity conditions, a doubly robust and Neyman orthogonal score can be constructed and utilised for debiased machine learning. The estimation of typical causal quantities in the index-sufficient framework in Chapter 2 motivates novel semiparametric estimation theory, combining debiasing with respect to sequentially dependent nuisance functions [Singh, 2021, Chernozhukov et al., 2022] and strong identification subject to nuisance functions, which are defined as solutions to possibly ill-posed inverse problems [Bennett et al., 2022]. An extensive simulation in Chapter 1 sheds light on some expected behaviours of estimators based on deconfounded instruments, while two empirical applications in Chapters 1 and 2 demonstrate the appeal of this approach in practical settings with partially endogenous instruments. In the empirical application of Chapter 1, the CC approach is used with local linear moments to estimate the average treatment effect of substance use on antisocial behaviour in adolescents with peer behaviour instruments. In the empirical application of Chapter 2, pre-college GPA measures are deconfounded with respect to the unobserved common confounder ability and used to infer the causal effect of obtaining a BA degree on net worth later in life in a linear model. This thesis equips applied researchers with comprehensive identification and estimation frameworks for the more nuanced CC approach to instrument exclusion and exogeneity. Deconfounding enables the use of partially endogenous instruments to identify new causal effects and can be employed to challenge instrument exogeneity assumptions in previous applied research.
The novel identification and estimation approach developed in this thesis disrupts the conventional dichotomy of exogenous versus endogenous instruments, a well-known limiting factor for instrumental variable methods.
  • Item (Open Access)
    Essays on the Macroeconomics of Labor Markets
    Freund, Lukas [0000-0002-0972-1061]
    This thesis contributes to the study of firms and labor markets in macroeconomics. The first two chapters argue that analyzing how workers with varied skills are organized in production – opening the “black box” of firms – provides new insights about macroeconomic outcomes and trends. The first chapter develops an analytically tractable theory of team production and formation. The second chapter demonstrates how to empirically discipline this model using micro data and shows that it explains why firms play a prominent role in the rise of wage inequality observed in Germany over recent decades. The final, co-authored chapter focuses on the business-cycle dynamics of labor markets and examines how firms’ behavioral adjustments to uncertainty shocks transmit to aggregate unemployment. The first chapter formulates a tractable theory of the firm as a “team assembly” technology. Most production processes are too complex for any one individual to know how to perform all required tasks well, so firms assemble groups of workers with specialized skills. Workers evidently also differ in their level of talent, and the most successful firms appear to employ the most talented employees. I formulate a theory that parsimoniously captures and connects these observations. I show that an important feature of team production is complementarity across coworkers’ talents: the marginal contribution of one employee’s talent to output is higher when matched with other talented workers. This creates an incentive for positive assortative matching based on talent. While search frictions prevent perfect sorting, deepening skill specialization reinforces complementarities, leading to an equilibrium in which some firms have “superstar teams” of talented workers, while other firms employ less productive workforces.
The second chapter is a quantitative application of the theory developed in the first chapter, showing that this model helps explain the so-called "firming up of inequality": most of the rise in wage inequality in advanced economies since the 1980s has occurred between, rather than within, firms. Suggestive evidence indicates that skill specialization has intensified since the mid-1980s. I develop a theory-guided strategy for inferring the strength of coworker talent complementarities from matched employer-employee data on wages and labor market histories. Applying this method to administrative panel data for Germany, I find that complementarities as well as talent sorting have strengthened since the mid-1980s, aligned with the theory’s predictions. According to exercises conducted using an estimated version of the model, this mechanism explains a quantitatively significant share of the observed increase in the between-firm share of wage inequality. It also contributes to elevated firm-level productivity dispersion. The third chapter – which is based on joint work with Wouter Den Haan and Pontus Rendahl published in the Journal of Monetary Economics – examines how shifts in firms’ demand for labor due to aggregate uncertainty shocks affect unemployment at business-cycle frequency. Empirically, unemployment tends to worsen in times of heightened economic uncertainty. One possible explanation is that uncertainty may raise firms’ option value of adopting a “wait and see” approach compared to posting vacancies. This mechanism is intuitively appealing, influential in the literature, and commonly viewed as having been formalized in the canonical search-and-matching model. Our first contribution is to prove that option-value effects actually play no role in the standard model. As this model assumes free entry, the expected value of vacancy posting is, and will always remain, zero. Hence, there is no point in waiting. 
Second, and constructively, we show that when the mass of entrepreneurs is finite and there is heterogeneity in firm-specific productivity, a rise in perceived uncertainty robustly increases the option value of waiting and reduces job creation. The paper thus reconciles common intuitions, theory, and evidence about the labor market effects of uncertainty shocks.
  • Item (Open Access)
    Essays in Spatial Economics, Trade, and Climate Change
    Salgado Baptista, Diogo
    In this thesis, I study topics on climate change, trade, and agglomeration. I utilize modern quantitative spatial modelling tools and structural estimation techniques to investigate how space and spatial frictions affect economic outcomes, with a particular focus on climate change. In Chapter 1, I quantify the relative importance of three fundamental sources of agglomeration benefits for the spatial distribution of the US economy: access to ideas, labor, and goods. I develop a quantitative spatial model to structurally recover industry-location productivities and estimate agglomeration spillover parameters for 26 tradable industries. I decompose the observed spatial distribution of employment into the separate contributions of the different forces and simulate counterfactual economies in which they are absent. Knowledge spillovers have the most sizable effect on aggregate economic activity: eliminating them generates a 26 percent reduction in an index of spatial concentration. This is followed by access to goods, with a 20 percent reduction, and then access to labor, with 6 percent. The relative importance of each force depends crucially on the type of industry and spatial range considered. In joint work with John Spray and Filiz Unsal, Chapter 2 develops a quantitative spatial general equilibrium model with heterogeneous households and locations to study households’ vulnerability to food insecurity from climate shocks. In the model, households endogenously respond to negative climate shocks by increasing off-farm labor, importing additional food and temporarily migrating to earn additional income to ensure sufficient calories. Because these coping strategies are most effective when trade and migration costs are low, remote households are more vulnerable to climate shocks. Poorer households are also more vulnerable because more of their income and consumption is derived from the agricultural sector.
We calibrate the model to 77 districts in Nepal and estimate the impact of historical climate shocks in 2011-2022 on food consumption and welfare. We estimate that, on an annual basis, floods, landslides, and storms combined generated GDP losses of 2 percent and welfare losses of 1.5 percent for the average household, and increased the rate of undernourishment by almost 7 percent. In counterfactual simulations, we show the role of better access to migration and trade in building resilience to climate shocks. Chapter 3 quantifies the welfare impact of climate variability and assesses the role of trade integration as a climate adaptation strategy. Climate change involves not only changes in mean climatic conditions, but also in the degree of climate variability, i.e. how much weather fluctuates year-to-year around the mean. In this chapter, I measure the effect of weather fluctuations on agricultural yields by combining weather data from a 0.5°×0.5° grid with agricultural production series for 23 different crops in 1961-2022. I then use climate projections from general circulation models to assess the change in climate variability in the future under an RCP8.5 emissions scenario. I calibrate a quantitative trade model with multiple sectors for all 54 African countries and quantify the impact of productivity shocks from weather fluctuations on household consumption. According to the results, climate variability is expected to generate annual consumption-equivalent losses of 1.31 percent in 2015-2100 for the average African household and of 6.7 percent among the top five most affected countries. If all African countries were assigned trade costs equivalent to the 90th percentile of trade openness, the welfare impact of future climate variability would be reduced by 27 percent, on average. However, differences in comparative advantage in agriculture affect countries' exposure to climate variability, leading to heterogeneous welfare effects across space.
For instance, households among the top five countries that benefit the most from trade integration see decreases of more than 75 percent in welfare losses, while those among the bottom five experience a 50 percent average increase in them.
  • Item (Open Access)
    Essays in Microeconomic Theory
    Langtry, Alastair [0000-0003-1709-9265]
    This thesis provides a series of essays on network theory and political economy. The work on networks is largely focused on social interactions within communities, and how they impact the consumption of status goods and the provision of public goods. The two chapters on political economy provide novel mechanisms for explaining lobbying behaviour and extensions of property rights, respectively. The first chapter examines how social networks affect the provision of public goods within a community. Here, networks spread information about whether people contribute to a public good. This mechanism can generate incentives for cooperative behaviour without repeated interactions. The chapter finds a critical threshold in network connectivity at which the level of public good provision changes sharply. This threshold is common to everyone, even though people are heterogeneous in terms of how costly they find it to provide public goods and in their network position. The second chapter examines a model of reference dependent choice where reference points are determined by social comparisons. An increase in the strength of social comparisons, even by only a few agents, increases consumption and decreases welfare for everyone. Strikingly, a higher marginal cost of consumption can increase welfare. In a labour market, social comparisons with co-workers create a big fish in a small pond effect, inducing incomplete labour market sorting. Further, it is the skilled workers with the weakest social networks who are induced to give up income to become the big fish. The third chapter also studies social comparisons. It adapts ideas from social identity theory to set out a new framework for modelling conspicuous consumption.
Notably, this approach can explain two stylised facts about conspicuous consumption that initially seem at odds with one another, and to date have required different families of models to explain each: (1) people consume more visible goods when their neighbours’ incomes rise, but (2) consume less visible goods when incomes of those with the same race in a wider geographic area rise. The first fact is typically explained by ‘Keeping up with the Joneses’ models, and the second by signalling models. The fourth chapter views lobbying as a contest between the government and many different special interest groups. The government fights lobbying by interest groups with its own political capital. In this world, a government wants to ‘sell protection’ -- give favourable treatment in exchange for contributions -- to certain interest groups. It does this in order to build its own ‘war chest’ of political capital, which improves its position in fights with other interest groups. And it does so until it wins all remaining contests with certainty. This stands in contrast to existing models that often view lobbying as driven by information or agency problems. The fifth chapter presents a new rationale for a self-interested economic elite voluntarily extending property rights. When agents make endogenous investment decisions, there is a commitment problem. Ex post, the elite face strong incentives to expropriate investments from the non-elite (who don’t have property rights), which dissuades investment. Extending property rights to new groups can resolve this problem, even for those not given property rights, by making public good provision more attractive to the elite. Unlike other models of franchise extensions, extending property rights in my model does not involve the elite ceding control to others. Rather, it changes the incentives they face. Chapter three is joint work with Christian Ghiglino (we both contributed equally). The rest are my own work.
A version of chapter two is published under the same title as Langtry, A., 2023. American Economic Journal: Microeconomics, 15(3), pp.474-500. A version of chapter four is published under the same title as Langtry, A., 2024. Journal of Public Economics, 231, p.105068.
  • Item (Open Access)
    Essays in Experimental Economics
    Barak-Halatova, Darija
    This thesis has three chapters and employs experimental methods to study human behaviour. Chapter 1 presents a framed interactive online experiment conducted on Amazon Mechanical Turk (MTurk) during the first COVID-19 lockdown. The experiment studies social distancing using a repeated game where subjects interact in groups of five on a fixed network. We find that both the network position and the disease contagiousness matter for social distancing. Furthermore, fines for non-adherence to social distancing are effective at promoting this behaviour, whereas informational nudges have a limited impact. Finally, we find that political ideology is strongly correlated with subjects’ distancing choices. Instrumental variable (IV) analysis suggests that ideology in the US may be causally related to one’s propensity to practice social distancing, with political conservatives exhibiting lower adherence. Chapter 2 builds upon the methods and findings of Chapter 1. Here, we study the effectiveness of contact tracing and quarantine programs aimed at containing the spread of infectious diseases, and further investigate the role of political ideology in driving behaviour in the US. Using a dynamic game, we investigate the relative effectiveness of mandatory and optional tracing and quarantine regimes. We find that a system where both contact tracing and quarantine are mandatory works best. However, even a fully optional program is better than no program at all, while making only one element mandatory has no added benefit. Interestingly, none of the programs have a consistent long-run effect on economic activity. Using a pre-registered IV, we find that political ideology is causally related to behaviour. Political conservatives are less likely to reduce their economic activity and to sign up for the contact tracing programs. Using simulations, we show that this heterogeneity may be important for the welfare of ideologically homogeneous groups.
Chapter 3 presents a laboratory experiment studying stated beliefs and play in one-shot normal-form games. The experiment relies on 15 games, characterised by an explicit strategic structure, i.e. strategic complementarity or substitutability. We find that subjects’ stated beliefs about the actions of their opponents are generally inconsistent with standard models of strategic thinking. Moreover, we observe that the shape of stated beliefs is systematically affected by the strategic structure of the game as well as the number of opponents faced by the player. Furthermore, about 35-40% of subjects' decisions in the experiment are not consistent in the sense of expected payoff maximisation. Our data does not support the hypothesis that the inconsistencies are driven by ‘mistakes’ committed in identifying the best response to one's own stated beliefs, or by a lack of precision in stating one's true beliefs. Rather, it suggests that strategic uncertainty and risk are the key drivers behind the inconsistency. JEL codes: C91, C99, D01, D83, D91, I12, I18.
  • Item (Open Access)
    Essays on Cross-Sectional and Network Dependence
    Liu, Weiguang [0000-0001-8813-2726]
    Cross-sectional dependence is a common phenomenon in economic data. It has attracted increasing attention recently and poses new challenges. This dissertation consists of three chapters that deal with several important econometric problems that arise when cross-sectional dependence is present. Chapter 1: We apply Stein’s method to investigate the normal approximation for both non-degenerate and degenerate U-statistics with cross-sectionally dependent underlying processes in the Wasserstein metric. We show that the convergence rates depend on the mixing rates, the sparsity of the cross-sectional dependence, and the moments of the kernel functions. Conditions are derived for central limit theorems to hold as corollaries. Chapter 2: We apply the limiting distribution theory for degenerate U-statistics, discussed in the previous chapter, to kernel-smoothing-based nonparametric specification tests, allowing for cross-sectional dependence in the underlying processes. This theory is then used to generalise the classical Fama-MacBeth regression test. Chapter 3: We generalise the tapering estimators for high-dimensional covariance matrices to allow for more complex and practical dependence structures, whilst allowing for measurement errors in the observations of the structure. We establish the convergence rate of such estimators under weak conditions on the measurement errors and we argue that it is often beneficial to include auxiliary structural information even if it is measured with errors.
  • Item (Open Access)
    On the Local Economic and Political Consequences of Controversial Policies
    Savu, Alexandru [0000-0002-3299-3065]
    This thesis explores several questions in political economy that have come to the forefront of public discourse over the last fifteen years. Since the great financial crisis of 2008, the political landscape in Europe has been marked by major, controversial developments, such as the implementation of severe austerity measures throughout the continent, and growing nationalistic tendencies in many countries, perhaps best represented by the United Kingdom’s withdrawal from the European Union. That said, despite the magnitude of such developments, their economic and political consequences remain critically understudied, with several pressing questions unaddressed in the existing literature, such as: What are the long-lasting effects of austerity on the socio-political beliefs of those affected? Is there a link between austerity exposure and growing support for ideologies outside the mainstream such as right-wing populism? Do those who implement austerity suffer an electoral penalty for doing so, and what factors potentially mitigate this penalty? And perhaps more fundamentally, how do people learn and update their political values and beliefs in times of great uncertainty? To be sure, part of the reason why the process of tackling such questions has been sluggish is that, empirically, providing reasonable estimates for the causal effects of any policy change is extremely challenging, seeing that appropriate counterfactual scenarios are almost impossible to come by (e.g., how would a society’s political preferences have evolved had austerity not been implemented?). And without credible estimates, informed policy adjustments become more difficult still. My thesis aims to make a contribution by providing much-needed empirical evidence on the economic and political consequences of several noteworthy policies and developments that took place over the last fifteen years.
In particular, one common element (and indeed the key element) linking all three chapters in this dissertation is my focus on local effects - that is, exploring how various local outcomes (chiefly, public spending, vote shares and turnout) respond to a number of policy changes described below. By using local areas (municipalities and constituencies) as the main units of observation in my analyses, I am able to construct better counterfactuals, and therefore provide more convincing causal estimates, which serve the broader purpose of addressing some of the aforementioned questions. In Chapter 1, The Local Political Economy of Austerity: Lessons from Hospital Closures, I focus on the question regarding austerity’s political effects, and argue that accounting for how local political agents respond to centrally-implemented austerity can help us better understand austerity feasibility, as well as its effects on public finances. Exploiting geographic variation in austerity exposure created by a highly impactful 2011 reform in Romania whereby sixty-seven public hospitals were discontinued, I document a significant increase in local "voter-friendly" infrastructure spending in the policy’s catchment areas. This effect can be explained by an electoral mechanism, whereby such changes are implemented by local politicians affiliated with the national politicians responsible for austerity in order to recover from the policy’s negative electability effects. Overall, my results suggest that accounting for the electorally-driven responses of local governments can contribute to our understanding of the broader economic and political effects of austerity. Building on these findings, in Chapter 2, Austerity, Turnout and Populism: The Case of Local Fiscal Rules, my co-author Salvatore Lattanzio and I explore the political effects of austerity further by homing in on the link between austerity exposure and populism support.
Once more exploiting local geographic variation in austerity exposure created this time by a set of fiscal rules implemented in Italy in 2013, we document a marked rise in support for Italy’s radical-right parties - the Northern League and the Brothers of Italy - in the affected municipalities. This result directly adds credible causal evidence to an ongoing debate on the economic roots of right-wing populism, and has broader implications for the politics of fiscal rules and austerity more broadly. Finally, in Chapter 3, Do Labels Polarise? Theory and Evidence from the Brexit Referendum, my co-author Su-Min Lee and I investigate how people learn and update their preferences regarding major policies such as Brexit. Our focus in this article is on social learning and contextual effects, and we hypothesize that people’s political beliefs are directly affected by those held by individuals in their geographic vicinity. Our main contribution in this work is to offer credible causal evidence corroborating this hypothesis. To do so, we once again exploit local geographic variation. In this case, we use the binary classification of local constituencies as "Leave" ("Remain") depending on whether the local Leave vote-share recorded in the 2016 referendum was above (below) fifty percent, and document a two percentage-point decrease in the anti-Brexit Liberal Democrat vote share in "Leave"-labelled relative to "Remain"-labelled constituencies, mirrored by an increase for the Conservatives. More broadly, these results constitute novel evidence for contextual information signals causally contributing to geographical polarisation - once more, a highly debated subject over the past fifteen years.
  • Item (Open Access)
    Analysing the One Certainty that Rules Us All: Trade (and Economic) Policy Uncertainty
    Hong, Tacye
    My PhD thesis consists of four papers that analyse uncertainty, specifically trade policy uncertainty, by examining how best to measure it and how it affects firms’ international and domestic operations. The first paper, “Improving the Trade Policy Uncertainty Index”, studies how best to measure trade policy uncertainty. I show that Baker et al.’s (2016) and Caldara et al.’s (2020) newspaper-based Trade Policy Uncertainty (TPU) indices misclassify and omit a substantial number of articles. I use a new set of search terms to construct an improved TPU index for the U.S. from 1987 to 2023 and provide a detailed mapping between major trade policy events in the U.S. and the new index. The second paper, “The Effects of Trade Policy Uncertainty on Exporters and Multinational Firms”, is a theoretical paper examining the effects of trade policy uncertainty on the entry and exit of exporters and multinational enterprises (MNEs) in foreign markets. In this paper, I use a two-country DSGE model, where export and MNE continuation costs are lower than entry costs, to analyse the macroeconomic effects of a shock to trade policy uncertainty. In the third paper, “Panic at the Costco: Buffer Stock under Uncertainty”, I build a Small Open Economy model where a distributor keeps buffer stock inventories for the household and producers, and inventory is driven by a preference shifter that depends on the available stock à la Görtz et al. (2022). The fourth paper, “The Volatility of Economic Policy Uncertainty”, is joint work with Prof. Paul Kattuman, where we analyze the volatility of uncertainty, measured using Baker et al.’s (2016) newspaper-based Economic Policy Uncertainty index, in a T-GARCH framework. We then examine the spillovers in both the level and volatility of economic policy uncertainty across countries from 1997 to 2023.
  • Item (Open Access)
    Essays on Search and Screening Frictions
    Mylius, Felix
    Frictions form an integral part of the job search. Search and screening costs make the matching process between workers and firms expensive, and asymmetric information can significantly contribute to mismatch: workers cannot observe who else applies to a vacancy, whereas firms cannot observe what other jobs their candidates have applied to. Moreover, even upon screening their candidates, firms only get an imperfect signal of the candidate's skill level. I consider some of these frictions in my theoretical frameworks in each chapter of my thesis. The first chapter studies whether workers can benefit from higher application costs. While this might sound counterintuitive since applicants are harmed by paying more, they, in fact, benefit from reduced congestion: if costs are negligible, workers apply excessively, overloading firms with applications. Thus, with higher costs, firms receive fewer applications from uninterested applicants, which allows them to focus their screening efforts on interested candidates. This means that higher costs increase the chance that a worker matches with their preferred firm. When characterising the conditions under which workers benefit from cost increases, I show that what matters is not only how many applications a cost increase directly deters, but also whether application strategies are strategic substitutes or complements. In the case of strategic complements, cost increases are more impactful. My findings provide a novel perspective on application costs, challenging the conventional view that higher costs would harm applicants. The fact that lower search barriers make workers apply to firms they are less interested in also plays a central role in the second chapter, albeit in a different context. I address the puzzle that job referrals remain as important in labour markets today as they have been over past decades.
Since the internet has made searching and applying for jobs much easier, one would expect people to rely less on social contacts to find a job nowadays. To resolve this puzzle, I explore a new channel that makes referred candidates stand out among all applicants: a higher likelihood of accepting a job offer. This trait becomes particularly advantageous whenever firms face large uncertainty over whether their candidates would accept a job offer. If search barriers vanish and workers apply to more firms, a referred candidate expects to face more competitors. On the other hand, if workers apply to more firms, they are, on average, less interested in each firm they apply to, which makes referred candidates stand out more. This means the chances of getting a job offer through a referral can increase as search barriers decrease. My third chapter is based on a joint project with Dr Keith Chan from the Hong Kong University of Science and Technology. We study the impact of economic downturns on the skill composition of unemployed workers. Since screening is imperfect, this question is fundamental for firms that are considering posting vacancies, and numerous papers have studied it both theoretically and empirically. However, their results contradict each other, so there is no consensus on the net impact of shocks. To reconcile these conflicting findings, we provide a tractable framework for understanding (i) what factors drive the skill composition of unemployed workers and (ii) how productivity shocks affect those drivers. To provide a comprehensive analysis, we allow productivity shocks to change in both frequency and severity. We find that changes in either dimension induce effects in opposite directions, making the net change in the average skill of unemployed workers ambiguous. Moreover, we characterise the circumstances under which the net impact is positive or negative.
  • ItemOpen Access
    Essays on the Economics of Debt, Default and Housing Markets
    Hannon, Andrew
This thesis is composed of three chapters on different topics in macro-finance. Thematically, they are linked by their focus on household credit constraints - be they exogenous regulatory constraints, as in Chapter 1, or endogenous constraints, as in Chapters 2 and 3. The first chapter is joint work with Juan Castellanos Silvan and Gonzalo Paz-Pardo. We propose a joint model of the aggregate housing and rental markets in which both house prices and rents are determined endogenously. The key feature of the model is that households choose their housing tenure status (renters, homeowners, or landlords) depending on their age, wealth, and income. We show how the reliance on heterogeneous household landlords generates an upward-sloping supply curve for rented accommodation. With the model in hand, we study the 2015 introduction in Ireland of macroprudential policies that limited the loan-to-value (LTV) and loan-to-income (LTI) ratios of newly originated mortgages. The introduction of stringent LTV and LTI limits mitigates house price growth, but increases rents and reduces homeownership rates. As a result, middle-income households are negatively affected. The second chapter stays with the theme of housing markets but introduces endogenous default. The main innovation is to consider delinquency as an important stage of default. In some European countries, mortgage delinquency rates are much higher than foreclosure rates. The stock of delinquent mortgages peaked at 9% of GDP in the Eurozone periphery, and the average length of a delinquency spell was over 10 months. This fact has been largely neglected in the macro-finance literature. This chapter provides a framework for understanding why high levels of persistent mortgage delinquency can emerge as an equilibrium outcome during a housing market crisis. Banks tolerate delinquency because the gain from foreclosing is less than the option value of continuing with the delinquent loan. 
By nesting a straightforward game between debt-distressed households and banks within a quantitative macro-housing model, the option to enter delinquency is shown to significantly attenuate (by roughly half) the consumption drop during a crisis. Importantly, I show that the ability of households to gain insurance through delinquency is significantly affected by the degree of recourse available to banks upon foreclosure. The model features realistic lifecycle dynamics, tenure choice between renting and owning, endogenous liquidity in the housing market and defaultable, long-term debt. The third and final chapter leaves housing markets behind but sticks with the theme of default, credit risk, and interest rates. Using Brazilian administrative credit registry data covering the universe of consumer loans originated by banks in the country from 2013 to 2019, the chapter documents high borrowing interest rates, which vary systematically with individual characteristics. In particular, even after controlling for several observable individual attributes - such as income, debt, occupation, and default probabilities - low-income individuals pay higher interest rates than high-income borrowers. We quantitatively analyze a consumer credit market with the characteristics observed in Brazil in a model with endogenous default, where consumers face idiosyncratic income and expenditure shocks. We perform counterfactual analyses to assess the impact of different financial reforms on borrowing rates, consumption inequality, consumption insurance, and welfare. We find that reforms aiming to reduce intermediation costs and bank market power have sizeable average and distributional welfare implications.
  • ItemOpen Access
    Essays on the Transmission of Monetary and Macroprudential Policies
    Patozi, Alba
Financial markets both influence and contain important information on the transmission of a range of macroeconomic policies. This dissertation explores the link between financial markets and the transmission of monetary and macroprudential policies. The first chapter, co-authored with Kristina Bluwstein, evaluates the effect of macroprudential policy announcements on systemic risk. We construct a new dataset of macroprudential policy announcements for the United Kingdom and estimate their effect on systemic risk using a high-frequency identification approach. First, by examining a sample of the largest UK-listed banks, we identify macroprudential policy announcement shocks that were unanticipated by financial markets. Second, we study the effects of market-based macroprudential policy surprises in a local projection. We find that a perceived macroprudential policy tightening contributes to a substantial reduction in systemic risk in the short run, with effects persisting for several months. The reduction is mostly attributed to the reaction in equity and bond markets. The second chapter estimates the sensitivity of green firms to monetary policy. I document an upward trend in environmental performance among publicly listed companies over the last decade. I then evaluate the implications of firms becoming ‘greener’ for the transmission of monetary policy to asset prices, credit risk and firm-level investment. I show that green firms, with high environmental scores, are considerably less affected by monetary policy shocks than their brown counterparts, with low environmental scores. Moreover, I find that the dependence of monetary policy responses on firm-level greenness is not due to intrinsic differences in firms’ characteristics or differences in firms’ social and governance performance, but can be attributed to investors’ preferences for sustainable investing. 
The third chapter examines the impact of preferences for sustainable investing on the transmission of monetary policy. I consider a stylized theoretical framework where investors derive additional utility from their holdings of green assets and demonstrate two key findings. Firstly, investors’ preferences for sustainable investing dampen the semi-elasticity of green asset prices to monetary policy shocks. Secondly, contractionary monetary policy shocks result in a tilt of investors’ portfolios towards green assets. Empirical evidence supports both predictions. Specifically, I find that green firms held by index funds with ESG mandates exhibit a lower sensitivity to monetary policy shocks compared to brown firms. Additionally, I find that the share of green assets in the portfolios of institutional investors does indeed rise in response to higher interest rates. Moreover, by analysing mutual fund flow data, I uncover evidence of an "active" portfolio rebalancing channel among institutional investors.
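The local-projection step used in the first chapter can be illustrated with a minimal sketch on synthetic data. Everything here is hypothetical - the shock series, sample size, and the geometrically decaying impulse response are made-up numbers, not the chapter's actual specification or data:

```python
import numpy as np

# Synthetic data: y responds to an observed shock with impulse response 0.8**h.
rng = np.random.default_rng(1)
T, H = 5000, 6
shock = rng.standard_normal(T)
irf_true = 0.8 ** np.arange(H)

y = np.zeros(T)
for j in range(H):
    y[j:] += irf_true[j] * shock[: T - j]   # y_t = sum_j b_j * shock_{t-j}
y += 0.1 * rng.standard_normal(T)           # measurement noise

# Local projections: for each horizon h, regress y_{t+h} on shock_t;
# the slope coefficient estimates the impulse response at horizon h.
irf_hat = np.empty(H)
for h in range(H):
    X = np.column_stack([np.ones(T - h), shock[: T - h]])
    coef, *_ = np.linalg.lstsq(X, y[h:], rcond=None)
    irf_hat[h] = coef[1]

print(np.round(irf_hat, 2))  # close to the true response [1.0, 0.8, 0.64, ...]
```

Because the shocks are serially uncorrelated by construction, each horizon-by-horizon regression recovers the true response; in applied work the regressions would also include controls and robust standard errors.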
  • ItemEmbargo
    Essays in Macroeconomics and Finance
    Xu, Yiming
This thesis contains four chapters at the intersection of macroeconomics and finance, specifically on the implications of leasing for the efficiency of capital and finance allocation. I focus on operating leases, which account for a significant proportion of the overall productive physical assets used by US firms (from the asset side) and are important external financing sources for firms (from the liability side). However, they were treated as off-balance-sheet items before the 2019 lease accounting rule changes in ASC 842 – hence they are important sources of “unmeasured” assets and liabilities. The first chapter argues that leasing is an important mechanism for mitigating credit-constraint-induced capital misallocation, a channel that has been widely overlooked in the macro-finance literature. This chapter demonstrates and quantifies this novel channel through a dynamic general equilibrium model featuring heterogeneous firms, collateral constraints, and an explicit buy-versus-lease decision. Furthermore, it provides guidance on the empirical measurement of capital misallocation: ignoring leased capital and the mitigation effect can result in significant overestimation of the level and cyclicality of measured capital misallocation. Strong empirical evidence is documented to support the model implications. The second chapter studies leasing’s distinctive role in enhancing total factor productivity (TFP) by facilitating entry and technology adoption. As a form of more collateralizable financing, leasing provides financing for the physical capital required in a more productive sector and raises the expected payoff of entering this sector. This chapter analytically characterizes this extensive-margin role of leasing in terms of efficiency gains. Quantitatively, it shows that TFP gains of approximately 5% can be achieved from leasing along this channel. 
The next chapter disentangles the sources of capital misallocation in the US when leased capital is explicitly factored in. Using the method of David and Venkateswaran (2019), this chapter obtains new estimates for various sources of lease-adjusted misallocation – including adjustment costs, uncertainty, and firm-specific factors that are either correlated with productivity or permanent to firms. Moreover, this chapter finds that the reduction in overall measured misallocation when adjusting for leases is largely explained by the effect of lease adjustment through the latter two firm-specific factors. Instead of studying the “true” production side, the final chapter focuses on firms’ liability (financing) side and examines the allocation efficiency of finance. A large overestimation of measured finance misallocation (Whited and Zhao, 2021) is documented when lease-induced debt is ignored among US manufacturing firms. Appropriately adjusting for lease-induced debt leads to a large reduction in inefficiency, measured in real value-added. Leasing improves the allocation of finance by raising the total amount of finance as well as by alleviating inefficient debt-equity combinations across firms. Finally, this chapter finds that factoring in lease-induced debt lowers both the level and dispersion of finance costs, consistent with the mitigating effect of lease adjustment on finance allocation efficiency.
  • ItemOpen Access
    Trade Shocks and Trade Policy: Firm Export Behaviour and Competition in International Markets
    Prayer, Thomas
This dissertation explores the effects of international trade policy cooperation on firm export behaviour and competition in international markets. It is motivated by three facts. First, firms are increasingly exposed to international trade. World trade has grown by an average of 5% over the last 30 years, and the world trade-to-GDP ratio has increased by more than a third over this period, to around 52% in 2020. Second, firms are subject to a large and growing number of trade policy interventions. The number of trade agreements notified to the WTO, for example, has increased more than tenfold over the past 30 years, to well over 300. And third, international trade agreements do more than just influence the behaviour of the firms they benefit: they shape the competitive environment of the markets they affect. The first chapter argues that mutual recognition agreements have both direct and indirect effects on firms’ export behaviour. These agreements make it easier for international firms to demonstrate that their products meet the minimum requirements of a market, and so streamline market access. I build a multi-country model of trade with oligopolistic competition and variable markups that explicitly models the conformity assessment sector and allows firm decisions to be non-separable across markets. The model highlights three separate direct effects of mutual recognition agreements on firms: an effect on firms’ fixed costs, an effect on firms’ marginal costs, and an effect on the price of conformity assessment. It shows that each of these direct effects also has an indirect effect on the intensity of competition in final goods markets. I simulate my model with standard parameters from the literature and show that mutual recognition agreements benefit firms which are affected by their direct effects, but hurt firms which are exposed only to their indirect effects. 
I also show that both of these effects are particularly relevant for the most productive firms, and that the effects of mutual recognition agreements without rules of origin are essentially the same for firms based in signatory countries and third-country firms. The second chapter estimates the effects of mutual recognition agreements on disaggregated firm exports. I build on the model presented in the first chapter and develop a new empirical approach which focuses on the experience of third-country firms to circumvent reverse causality concerns and distinguish between the direct and indirect effects of mutual recognition agreements. I then compile a new dataset on the product coverage and implementation timeline of thirteen mutual recognition agreements and combine it with the universe of firm exports for thirteen emerging and developing economies. My results show that firms which benefit from cost reductions as a result of a mutual recognition agreement export around 15% more, while firms which are primarily exposed to the agreement’s broader effects on a market’s competitive environment export up to 15% less. Both the direct and indirect effects of mutual recognition agreements also matter for firms’ extensive margin decisions, export volumes, export prices, the frequency and variety of firm exports, and firms’ import behaviour, as well as for their export performance in unaffected markets. The third chapter explores firms’ markup decisions in international markets and highlights the implications of differences in the intensity of competition between different sets of firms. This chapter is joint work with Meredith Crowley and Lu Han and develops a new multi-country model of trade with variable markups in which firms from the same origin compete more fiercely with each other than with firms from other origins. 
This gives rise to a rich oligopolistic structure in which an exporter’s markup adjustment after a trade policy shock depends on two market share reallocation effects: (1) an across-origin reallocation effect, which captures changes in the overall competitive pressure faced by all firms from a given origin in the destination market, and (2) a within-origin reallocation effect, which captures changes in the competitive pressure an exporter faces from its compatriot firms from the same origin. To explore the implications of this model empirically, we combine data on trade agreements and tariffs with detailed administrative customs datasets for eleven emerging and developing economies. We find that the two reallocation effects move in opposite directions after a bilateral tariff liberalisation: while firms face less competition from other origins and the preferred origin as a whole gains market share, each exporting firm faces more competitive pressure from its compatriots due to additional entry and loses market share within its origin. Overall, our results suggest that the within-origin reallocation effect dominates and exporters reduce their markups in response to a bilateral trade liberalisation.
  • ItemOpen Access
    Essays in Modern Macroeconomics
    Wales, Daniel
This PhD thesis consists of a short introduction followed by three papers. Each paper examines a different topic within the broad area of modern monetary and international macroeconomics. The first paper, Product Quality, Measured Inflation and Monetary Policy, written in collaboration with Alex Rodnyansky and Alejandro van der Ghote, fills a gap in the New Keynesian literature, which has largely ignored product quality adjustments. This paper proposes a tractable model of a New Keynesian (NK) economy where, in addition to the standard price and quantity channels, firms can endogenously adjust the quality of their products in response to shocks. This new model, featuring endogenous product quality changes subject to adjustment costs, nests the canonical New Keynesian model, which is frequently used as the starting point for policy analysis by central banks. In this framework, endogenous product quality choices imply a steeper slope for the Phillips curve than in the traditional NK model: for a positive productivity shock that lowers marginal costs, quality-adjusted prices decline by more because firms are simultaneously able to increase the quality of their products. Allowing firms to adjust product quality also amplifies the economy’s response to productivity shocks. Following a positive productivity shock, the natural real interest rate decreases by more as households look to smooth a larger increase in consumption, which is boosted by a rise in both the quantity and quality of the goods they consume. As a result, monetary policy responds by altering the nominal interest rate by more for a given productivity shock. Model misspecification of imperfectly observable quality adjustments matters more for macroeconomic stabilization than the mismeasurement of those adjustments. 
With no misperception of product quality by the monetary authority, the principles for optimal monetary policy are nonetheless unchanged, as the product quality extensions to the canonical NK model preserve divine coincidence. My second PhD paper, The Impact of Large-Scale Asset Purchases on Wealth Inequality, examines the relationship between monetary policy and household wealth inequality through changes in the size and composition of the central bank’s balance sheet. I focus on the impact on household wealth inequality through the financial portfolio rebalancing channel of monetary policy transmission. I construct a theoretical model with multiple assets (of differing liquidity), banks and heterogeneous agents, who experience idiosyncratic labor productivity shocks. The model is carefully calibrated so that it reproduces the levels of wealth inequality observed in the US Survey of Consumer Finances. I use the model to replicate the changes in the Federal Reserve’s balance sheet which arose in the aftermath of the 2007/2008 financial crisis. This shows that an expansion of the central bank’s balance sheet can materially alter the distribution of wealth, causing inequality to increase, while even extreme changes in the composition of the central bank’s balance sheet (for example, through maturity extension) have little effect. This arises because central bank purchases of longer-term assets cause households to hold additional liquid financial wealth. Liquid financial assets are unevenly distributed in the population, and hence wealth inequality measures increase. When the model is calibrated to match the Federal Reserve’s Large-Scale Asset Purchases (LSAPs) from 2008 until 2014, wealth inequality increases by 3.8%, as measured by the Gini coefficient, suggesting this channel leads to a significant increase in wealth inequality. The final PhD paper, The Rise of Harrod-Balassa-Samuelson, begins by documenting two stylised facts. 
Firstly, over the past 70 years the positive cross-country relationship between aggregate consumer prices and real output per capita has strengthened (i.e. the Harrod-Balassa-Samuelson effect has risen), as demonstrated using data from the Penn World Tables. Secondly, border frictions have increased over the same time frame, with international borders effectively becoming wider and the Law of One Price (LOOP) failing increasingly often. I construct my own dataset of city-level relative prices using national sources across five continents to document this increasing failure of the LOOP. I then use a two-country endowment model with a domestic distribution services sector to generate an equilibrium failure of the LOOP. An increase in the relative size of the distribution services sector can simultaneously explain both stylised facts, while the standard explanation (a higher share of non-traded goods) can only explain the first. Furthermore, I extend the model to include production by monopolistically competitive firms, before solving and calibrating the model to closely replicate the two stylised facts.
  • ItemOpen Access
    Expectations in Financial Markets
    Kalsi, Harkeerit
Uncertainty pervades financial markets. How financial market participants form expectations when faced with uncertainty is therefore central to the study of financial markets. This thesis contains three chapters, each highlighting the role of expectations formation in determining outcomes in financial markets. The first chapter, co-authored with Nicholas Vause and Nora Wegner, builds a theoretical model of self-fulfilling fire sales motivated by the dash for cash of March 2020. Investment funds fear being hit by a future liquidity shock and can choose to preemptively liquidate their bond holdings. However, funds face uncertainty about how many other funds will choose to preemptively liquidate. If funds wait to see whether the liquidity shock crystallises, they risk selling their bonds into a depressed market if other funds have already chosen to liquidate. This creates the risk of a self-fulfilling fire sale, where funds choose to preemptively liquidate because they expect that other funds will liquidate. Following the global games literature (Carlsson & van Damme 1993, Morris & Shin 1998), we derive the probability of a self-fulfilling fire sale and extend the model to include a central bank providing a market backstop. The central bank chooses the quantity of assets it is willing to purchase and the discount (relative to the bond return) that it charges to purchase them. We show that if the central bank can credibly commit to (i) setting its discount low enough and (ii) setting the quantity of asset purchases high enough, then it can eliminate self-fulfilling fire sales. Moreover, it can achieve this without having to purchase any bonds. The aggressive policy works via expectations: it makes the pessimistic beliefs that drive the fire sale impossible for funds to rationalise, because funds know that the central bank stands ready to provide liquidity via asset purchases if needed. 
Whereas in the first chapter I assume agents can costlessly absorb all available information, in the second chapter (solo-authored) I follow the rational inattention literature (Sims 2003) and relax the assumption that information is costlessly obtained. This enables me to examine whether fragilities can build up in the financial system simply because agents pay insufficient attention to each other. I build a model where bank values are interdependent within a financial network. However, banks cannot costlessly observe other banks’ values. Instead, banks must choose to pay attention to developments in the value of other banks. Because paying attention incurs a cost, banks may optimally choose not to allocate significant attention to certain banks. In the model, banks that believe they have a higher value choose to supply more credit. Therefore, if inattentiveness causes banks to incorrectly infer their own value, their credit supply decisions will be distorted relative to the optimum. I show that banks that are moderately important for determining values in the network cause the greatest distortion in credit supply: other banks do not deem them important enough to pay high levels of attention to, yet they still matter for determining bank values. This suggests that we should be more cautious about dismissing all but the most interconnected banks as unimportant for financial stability. Thus far, agents have known the objective probability distributions relevant for decision-making. In the third chapter, co-authored with Harjoat Bhamra and Raman Uppal, we follow Knight (1921) and distinguish between risk (known probabilities) and Knightian uncertainty (unknown probabilities). We argue that geopolitical uncertainty can often be viewed as Knightian uncertainty rather than risk, and our objective is to examine the effects of this uncertainty. 
We do this by constructing a dynamic stochastic equilibrium production model of a world economy with two countries. Each country is characterised by a traded and a non-traded goods sector and a representative investor with Stochastic Differential Utility who is averse to Knightian uncertainty. We model geopolitical uncertainty as a loss of confidence in the correct model for the shocks to the efficiency units of capital where the investors cannot assign probabilities to the alternative models for the shocks. We solve this model in closed form and show how uncertainty operates by reducing households’ perceived expected returns on capital which, in turn, distorts portfolio and consumption choice decisions. We then examine the implications of these distortions for trade flows, exchange rates, growth, and the level of social welfare. We show that our model can match stylised facts of the UK economy following the Brexit referendum.
  • ItemOpen Access
    Essays in volatility modelling
    Ding, Yashuang
This thesis concerns novel developments in volatility modelling. We first derive the diffusion limits of two recently proposed (discrete-time) volatility models. Subsequently, we propose a new model that allows for conditional heteroskedasticity in the volatility of asset returns and incorporates current return information into the volatility nowcast and forecast. Our model can capture most stylised facts of asset returns even with Gaussian innovations and is simple to implement. Moreover, we show that our model converges weakly to a GARCH-type diffusion as the length of the discrete time intervals between observations goes to zero. Finally, we generalise our model and propose a new class of volatility models in which the time-varying volatility of volatility can be modelled directly. We also derive statistical properties of this class of models. Empirical evidence shows that this class of models provides better fits, as well as more accurate volatility and VaR forecasts, than other common GARCH-type models.
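The stylised facts referred to above - heavy tails and volatility clustering arising even with Gaussian innovations - can be seen in a plain GARCH(1,1) simulation. This is a generic textbook benchmark, not the thesis's own model, and the parameter values are illustrative (chosen so the unconditional variance is 1):

```python
import numpy as np

# GARCH(1,1): r_t = sqrt(h_t) * z_t,  h_t = omega + alpha*r_{t-1}^2 + beta*h_{t-1}
# Illustrative parameters; persistence alpha + beta = 0.95.
rng = np.random.default_rng(0)
omega, alpha, beta = 0.05, 0.10, 0.85
T = 20000
r = np.empty(T)
h = omega / (1.0 - alpha - beta)          # start at the unconditional variance
for t in range(T):
    r[t] = np.sqrt(h) * rng.standard_normal()
    h = omega + alpha * r[t] ** 2 + beta * h

kurtosis = np.mean(r ** 4) / np.mean(r ** 2) ** 2
acf_sq = np.corrcoef(r[:-1] ** 2, r[1:] ** 2)[0, 1]

print(kurtosis)  # above 3: tails heavier than the Gaussian innovations
print(acf_sq)    # positive: squared returns are autocorrelated (volatility clustering)
```

Even though every innovation z_t is Gaussian, the time-varying conditional variance generates unconditional excess kurtosis and persistent dependence in squared returns, which is the sense in which such models "capture most stylised facts" of returns.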
  • ItemOpen Access
    Essays in Monetary Economics and International Finance
    Ostry, Daniel
    The rapid rise of U.S. interest rates over the past year has important implications for firms' borrowing costs and investment decisions, for global financial stability and the U.S. dollar exchange rate, and for investors' pricing of the U.S. safety premium across asset classes. These three themes are, respectively, the subject of the three chapters of this thesis. They are bound together by a focus on the role of (U.S.) monetary policy for financial intermediation and asset pricing. Chapter 1, Firm Financial Conditions and the Transmission of Monetary Policy, is co-authored with Thiago Ferreira and John Rogers. We investigate how a firm's investment response to monetary policy depends on its financial conditions, finding a major role for its excess bond premium (EBP)--the component of its credit spread in excess of default risk. Strikingly, while monetary policy easings compress credit spreads more for firms with higher EBPs--i.e. for firms faced with tighter financial conditions--it is lower-EBP firms that invest more. We rationalize these findings with a model in which lower-EBP firms have flatter marginal benefit curves for capital, reflecting their more-resilient investment prospects. Consistent with our model, we show that the pass-through of monetary policy to aggregate investment depends on the distribution of firm EBPs, which varies over time. Chapters 2 and 3 study different aspects of the relationship between monetary policy and the U.S. dollar's safe-haven status. In Chapter 2, entitled Tails of Foreign Exchange at Risk (FEaR), I investigate this relationship during periods of severe stress in global financial markets (disasters). I first develop a model in which the unwinding of carry trades by speculators and a flight-to-liquidity by hedgers jointly determine exchange rate dynamics in disasters. 
Reflecting these two forces, the dollar experiences an amplified appreciation against high-interest-rate currencies in disasters, and a dampened depreciation, or even an appreciation, against low-interest-rate ones. I then test these predictions by assessing the relative importance of interest differentials and Treasury liquidity premia for explaining the tails of both the exchange rate distribution and the distribution of speculators' and hedgers' portfolio positions. Overall, my analysis quantifies the extent of, and substantiates a mechanism for, the dollar's safe-haven status in disasters. In Chapter 3, U.S. Risk and Treasury Convenience, which is co-authored with Giancarlo Corsetti, Simon Lloyd and Emile Marin, we investigate whether U.S. monetary policy has eroded the U.S.'s safety premium over time and across asset classes. We first document that, over the past two decades, investors in equity markets revised-up their assessment of U.S. risk relative to other advanced economies, driven by perceptions of greater long-run risk. Analytically, we use a no-arbitrage framework to link U.S. relative long-run risk, which we infer from bond and equity premia, to long-run exchange-rate risk and the convenience (liquidity) premium on long-maturity U.S. Treasuries. Taking theory to the data, we find that an increase in U.S. long-run risk leads to a persistent fall in the long-run convenience of U.S. Treasuries, in line with a (perceived) worsening of U.S. fundamentals. Further, we show that expansionary (unconventional) U.S. monetary policy induces both an increase in U.S. long-run risk and a decrease in the Treasury premium. Overall, our results suggest that the rise and fall, respectively, of U.S. long-run risk and long-maturity Treasury convenience yields over the past 20 years may be two sides of the same coin and may be the consequence of easy U.S. monetary policy.
  • Item (Open Access)
    Essays on Production Structure and Economic Integration
    Smitkova, Lidia
    In this dissertation, I present three chapters that study the linkages between the structural makeup of economies and the processes of trade and financial liberalization. In the first chapter I examine the role of trade and external deficits in explaining the patterns of structural change in twenty developed and developing economies between 1965 and 2000. First, for each country, I break down the time series of the manufacturing value added share into a secular trend and a trade-induced deviation from that trend. I show that national differences are in large part due to trade. Second, I investigate changes in sectoral productivity, trade costs and trade deficits as the driving forces behind the patterns in the data. To do this I build a multi-sector Eaton and Kortum (2002) model and simulate the effects of different shocks on the manufacturing value added shares in the sample. While calibrating the model, I develop a novel method of identifying trade-cost and productivity shocks, which makes use of symmetry restrictions on sectoral trade-cost shocks. I calibrate the model at a two-digit level of disaggregation, which permits me to study not only changes in the manufacturing share, but also its composition at a sub-sectoral level. I find that open-economy forces are responsible for 32% of the observed change in the manufacturing shares in my sample, and for 39% if the composition of the manufacturing sector is taken into account. Focusing on individual shocks, I show that for the aggregate manufacturing share, trade-cost and aggregate trade-deficit shocks played the biggest role, whereas productivity shocks mattered more in driving the composition of manufacturing. In the second chapter, I study financial liberalization between economies that differ in their overall competitiveness. 
I first show that if firms compete oligopolistically, then competitiveness --- relatively low aggregate unit costs of production --- is a feature of an economy with a fatter-tailed productivity distribution and relatively more very large --- 'superstar' --- firms. Embedding this setup in a two-country model with heterogeneous agents and non-homothetic saving behaviour, I show that if home is the more competitive economy, then: (1) it enjoys a higher aggregate profit rate than foreign; (2) its autarkic interest rate is lower than foreign's; (3) if the two economies undergo financial liberalization, capital flows from home to foreign; (4) if one of the sectors is non-tradable, the capital inflows push up wages in foreign, leading to further losses of competitiveness and to current account overshooting. In the third chapter, I calibrate the quantitative version of the model developed in Chapter 2 to eight European economies on the eve of the Global Financial Crisis. I show that the competitiveness gap can explain 27% of the variation in the current account imbalances incurred in the period. I conclude by discussing policies for rebalancing.
  • Item (Open Access)
    Essays on the behaviour of political and financial markets
    Auld, Thomas
    This thesis considers the behaviour of, and the relationships between, financial and prediction markets around elections. We begin by reviewing the literature. There are many small studies of individual elections and events, particularly of the 2016 UK European Union referendum, but we find no studies that consider multiple events or that present theories applying in a general setting. We believe this is a gap in the literature. Chapter 1 begins with a study of the Brexit referendum. Using a flexible prior and Bayesian updating, we demonstrate a major violation of semi-strong market efficiency in both the betting and currency markets on the night following the vote: prices appear to have taken a full three hours to reflect the information contained in the publicly available results of the referendum. Chapter 2 presents a model linking the prices of financial assets and of binary options in the prediction markets in the overnight session following an election. Starting from basic assumptions, we find that prices in both markets should be cointegrated. Under risk neutrality the relationship is linear; departures from this assumption result in a non-linear cointegrating relationship. We test the theory on three recent elections. Strong support for the theory is found for two events. The linear cointegrating model fits the data from the night of the EU referendum remarkably well, whereas departures from risk neutrality are needed to explain the behaviour observed on the night of the 2016 US presidential election. Chapter 3 considers pricing relationships in the weeks and months leading up to an election. Again using economic assumptions, we derive a relationship between asset price returns and changes in the prices of betting-market binary options linked to an election result. This model is extended to equities using the ubiquitous Fama–French five-factor model. 
The result is a six-factor characteristic model, where the additional factor is related to political risk. We test the model on six recent elections. Using daily data, we find strong support for the theory for four events and weak evidence for one; the remaining election does not appear to be informative for asset prices. Interesting relationships are also uncovered between firm characteristics and political sensitivity, by exploring the political factor loadings of the different equities under study. The main contributions of this thesis are, first, using a flexible Bayesian approach to demonstrate beyond reasonable doubt that any 'bubble' in opinion for Remain continued well into the night of the EU referendum, and, second, presenting pricing models of prediction and financial markets that apply in general settings and have strong support in the data. We also show that on nights after elections, betting markets lead financial markets on the scale of minutes to tens of minutes. This is consistent with, and an extension of, the conclusion of the existing literature that prediction markets have superior forecasting ability. Whether this lead–lag relationship also occurs at other times prior to political events is an open research question.
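The cointegration logic here can be illustrated with a toy simulation (entirely synthetic data and parameters, not the thesis's estimates): when a common latent trend drives both a prediction-market binary option price and an exchange rate, an Engle-Granger-style two-step regression recovers the linear relationship, and the regression residual is stationary even though each raw series is not.

```python
import random

random.seed(0)

# Synthetic illustration: a latent random-walk "election probability" drives
# both the betting-market binary option price p and the exchange rate s, so
# the two series share a stochastic trend: s_t = a + b*p_t + stationary error.
n = 2000
prob, level = [], 0.0
for _ in range(n):
    level += random.gauss(0, 0.02)
    prob.append(level)
p = [q + random.gauss(0, 0.02) for q in prob]                 # noisy option price
s = [1.30 + 0.15 * q + random.gauss(0, 0.005) for q in prob]  # exchange rate

def ols_slope(x, y):
    """Slope of an OLS regression of y on x (with intercept)."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

# Step 1: estimate the cointegrating regression s = a + b*p.
b = ols_slope(p, s)
a = sum(s) / n - b * sum(p) / n
resid = [si - a - b * pi for si, pi in zip(s, p)]

# Step 2: the residual should be mean-reverting while each raw series has a
# unit root. A crude diagnostic: the AR(1) slope of x_{t+1} on x_t.
ar1 = lambda x: ols_slope(x[:-1], x[1:])

print(f"cointegrating slope b = {b:.3f}")           # close to the true 0.15
print(f"AR(1) of residual     = {ar1(resid):.3f}")  # far below 1: stationary
print(f"AR(1) of s            = {ar1(s):.3f}")      # near 1: unit root
```

In practice one would use a proper ADF test on the residual rather than the crude AR(1) slope, but the two-step structure is the same.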
  • Item (Open Access)
    Essays on Networks and Industrial Organization
    Kalbfuss, Joerg
    Chapter I: Cohesive Anarchy --- In a conflict, a small force may overcome a larger one if the latter fails to coordinate. To understand how incentives drive such failures, I study Tullock contests between a cohesive faction and a group embedded in a network. The group's strength equals the sum of its members' efforts, where links measure their pairwise complementarity or interference. I characterize equilibria for general networks, and find that only a few members are likely to contribute. Furthermore, the prize and a network measure of interconnectedness jointly improve the internalization of spillovers by the group, so its performance varies with the stakes -- escalation induces cohesion. Chapter II: Spectral Oligopolies --- We study how demand interactions incentivize multiproduct oligopolists to reduce their variable production costs. Our starting point is an equivalence between multimarket competition over dependent products and single-market competition over independent bundles, from which we derive three insights. First, heterogeneous innovativeness begets a core-fringe separation. While strong innovators dominate clusters of complementary markets through demand-side economies of scope, others retreat towards niches whose isolation attenuates the impact of investments. Second, the translation of innovative advantage into profits is strongest in 'contractor' graphs, the antipodes of expanders. These graphs comprise clusters which are densely linked internally but sparsely linked to one another, yet synergize well both individually and collectively. Consequently, the parts and the whole make for attractive and relatively independent investment targets. Third, as demand interactions scale up, both market concentration and consumers' share of the surplus rise under broad conditions, so market-share-based indices of concentration tend to suggest losses for consumers when the opposite is the case. 
We construct a generalized Herfindahl index which overcomes this limitation. Chapter III: Dominant Firms --- Many consumer industries evolve into partial oligopolies in which firms with and without market power coexist. I develop a framework of dynamic competition which explains this pattern. Starting from a market with a continuum of firms, companies stochastically adjust their product offerings through the accumulation of thin-tailed innovations. In conjunction with discrete-choice-founded demand, disruption becomes possible: if the spread of tastes relative to the volatility of innovations falls below a threshold, a firm occasionally separates from the continuum with an outstanding product. As a result, this dominant firm accrues a positive market share and profit margin until a future innovator supplants it. During these cycles of disruptive turnover, the fringe provides a 'seedbed': since incumbents emerge as frontrunners of the ongoing race of innovations, the qualities of their products increase in the number of candidate firms from which they are drawn. Through this mechanism, the fringe's measure is a first-order generator of surplus over time even if its output is not.
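For intuition on the contest technology underlying Chapter I, here is a minimal sketch of the standard two-player Tullock contest (a textbook special case, not the networked-group model of the chapter): with prize V and efforts x and y, player 1 wins with probability x/(x+y), and iterating the closed-form best responses converges to the well-known symmetric equilibrium effort V/4.

```python
import math

V = 1.0  # prize at stake

def best_response(y, V=1.0):
    # Player 1 maximizes V * x / (x + y) - x over x >= 0.
    # FOC: V * y / (x + y)**2 = 1  =>  x = sqrt(V * y) - y (when positive).
    return max(math.sqrt(V * y) - y, 0.0)

# Iterate best responses from an arbitrary starting effort; the fixed point
# of the best-response map is the symmetric Nash equilibrium.
x = 0.1
for _ in range(50):
    x = best_response(x, V)

print(round(x, 4))  # 0.25, i.e. the known equilibrium effort V/4
```

At the equilibrium each player expends a quarter of the prize in effort, so half the prize is dissipated in total; the chapter's networked version generalizes this technology by letting group strength aggregate linked members' efforts.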