2016 Conference - Session 7

Session 7 - Friday, March 18, 2:00 - 3:30pm

A.7: Influences of Wally Oates: Extensions of Fiscal Federalism and Environmental Regulatory Design and their Implications for Benefit Cost Analysis

Chair: George Parsons, University of Delaware

Presentations:

1.    A Review of the Contributions of Wallace E. Oates and their Implications for Benefit Cost Analysis, Al McGartland* and David A. Evans,* US Environmental Protection Agency

We provide an overview of Wallace E. Oates’ contributions and their implications for policy analysis and benefit-cost analysis of the provision of public goods. Oates made seminal contributions to both public finance and environmental economics. In public finance he explored the relationships among different levels of government in a federal system in providing public goods of varying scales. This work in fiscal federalism considered public good spillovers across jurisdictions, mobility, the allocation and form of taxation, and the capitalization of the benefits and costs of programs into asset values. He was the first to show that the effects of government policies can be capitalized into property values, findings highly relevant to federalism arguments, environmental economics, and benefit-cost analysis more broadly. In addition to applying the insights of fiscal federalism, he expanded our understanding of the performance of various approaches to environmental regulation. He helped us all think about the economics of many environmental policy issues, including whether green subsidies perform as well as pollution taxes, the pros and cons of emission charges versus pollution standards, the implications of environmental federalism, and the distributional consequences of environmental policy. We argue that his insights help identify which benefits and costs are particularly relevant given the objectives of a regulation, show how theory can limit the scope of options to be evaluated with benefit-cost analysis, clarify the role of decision makers at various levels of government, and emphasize the importance of accounting for long-run changes in benefit-cost analysis.

2.     Economy-Wide and Sectoral Climate Policies and how they Interact: Results from EMF 24, Allen Fawcett,* US Environmental Protection Agency

The Energy Modeling Forum 24 (EMF 24) study, originally published in The Energy Journal (Vol. 35, Special Issue 1, 2014), focused on the interactions between different climate policy architectures and advanced energy technology availabilities in the U.S. The study included a set of policy scenarios designed to compare economy-wide market-based and sectoral regulatory approaches to potential U.S. climate policy, and compared results across seven different models. This presentation highlights some of the key insights from the study, in particular the relative cost effectiveness of economy-wide carbon pricing policies, sectoral policies, and combined policies; the impact of alternative cost metrics; and the importance of baseline assumptions.

3.    Environmental Valuation across Time: the Implicit Price of Water Quality through the Recent Recession, Patrick J. Walsh,* Charles Griffiths, Dennis Guignet, Heather Klemick

The recent recession caused large swings in home sales prices around the country, which has raised some concern about hedonic property analyses that rely on home sales data. The theory underlying hedonic analysis assumes the market is in equilibrium, so non-equilibrium behavior could affect the validity and interpretation of the results of a hedonic analysis. Boyle et al. (2012) examine this issue and identify several variables that can help detect non-equilibrium conditions. Bin et al. (2014) investigate the hedonic analysis of water quality over a period that spans the recent recession. Their results indicate that even during the bust, when home sales were declining rapidly, the value of water quality was actually higher than normal, implying that the recession did not crowd out people’s WTP for environmental goods. Our paper builds on Bin et al. (2014) by analyzing multiple housing markets during the recent recession. Our data cover over 200,000 home sales across 14 counties in Maryland, with a focus on water quality. The recession had a very different effect across counties, which allows us to further explore the impact of the recession and potential non-equilibrium behavior on environmental valuation. We split the data into several different phases of the housing market cycle and compare across phases. Results indicate that although there is still evidence of positive WTP for water quality during bust periods, there is also significantly more variation in the estimates during those periods, generally resulting in less significant coefficients. While the recession does not appear to be crowding out environmental values, as some past literature has suggested, there is still some concern with non-equilibrium behavior.
Our analysis is similar to previous analyses of the Tiebout model of local finance, which holds that individuals seek out neighborhoods that reflect their demands for local services, which should be capitalized into both property prices and local tax rates. Wallace Oates proposed some of the earliest empirical tests of the Tiebout model (Oates, 1969) and published several papers (and book chapters) related to it. Although he demonstrated violations of the pure Tiebout model, he found that it has several important implications for behavior (Oates, 1981). Our paper indicates that local capitalization is likely a dynamic process that can change over time, and may suggest caution when evaluating the Tiebout hypothesis using limited data.
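As a purely illustrative sketch of the phase-by-phase comparison described above (all data are simulated; the coefficients, sample sizes, and noise levels are invented and not taken from the Maryland study), one can estimate a log-price hedonic separately for a tight "boom" market and a thin, noisy "bust" market and compare the precision of the water-quality coefficient:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_sales(n, noise_sd, rng):
    """Synthetic home sales: log price depends on size and water clarity."""
    sqft = rng.uniform(1.0, 3.0, n)        # size in 1000s of square feet
    waterq = rng.uniform(0.0, 5.0, n)      # hypothetical water clarity index
    log_price = 11.0 + 0.40 * sqft + 0.03 * waterq + rng.normal(0, noise_sd, n)
    return sqft, waterq, log_price

def hedonic_ols(sqft, waterq, log_price):
    """OLS of log price on characteristics; return the water-quality
    coefficient (the implicit price) and its standard error."""
    X = np.column_stack([np.ones_like(sqft), sqft, waterq])
    beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)
    resid = log_price - X @ beta
    sigma2 = resid @ resid / (X.shape[0] - X.shape[1])
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta[2], np.sqrt(cov[2, 2])

# Boom phase: many sales, tight pricing; bust phase: fewer sales, noisier pricing.
boom = hedonic_ols(*simulate_sales(5000, 0.05, rng))
bust = hedonic_ols(*simulate_sales(800, 0.15, rng))
print(f"boom: coef={boom[0]:.4f} se={boom[1]:.4f}")
print(f"bust: coef={bust[0]:.4f} se={bust[1]:.4f}")
```

With fewer sales and more pricing noise in the bust phase, the water-quality coefficient remains positive but its standard error grows, mirroring the pattern of less significant coefficients during bust periods.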

B.7: Applying Behavioral Insights in Benefit-Cost Analysis

Chair: Sharon Brown-Hruska, NERA Economic Consulting and Tulane University

Presentations:

1.     How Much Relevance Does Reality Imply? (Re)Considering the Endowment Effect, Timothy Brennan,* UMBC and Resources for the Future

The endowment effect (that someone’s willingness to pay for a gain or to accept a loss depends on whether that person treats a good or level of income as something they already have) figures in debates about how to conduct benefit-cost analysis. Because the endowment effect is often seen as a threat to conventional ways of doing economics, discussions of it are often polarized. Social practices indicate the reality and significance of the endowment effect; advertising and Buddhism are useful, if divergent, examples.

The reality of an endowment effect is consistent with neoclassical principles and thus does not by itself explain WTP/WTA differences, risk-preferring behavior below the endowment, or kinks in a demand curve at the endowment point. The examples above suggest that changing people’s minds regarding their endowment takes more than reframing a question (WTP vs. WTA) or redistributing coffee mugs to seminar students. Differences in responses to WTP vs. WTA questions may well be based on respondents’ differing interpretations of the endowment, but if so, WTP and WTA involve different questions, not reframings of the same question, and the different responses are thus consistent with (economic) rationality. A similar form of the effect (not selling an item at a price above what one would be willing to pay to obtain it) could reflect either real options or a specific history that makes something more valuable once owned.

The reality of an endowment effect, however, requires taking seriously what is being asked of winners and losers when a change in policy is being considered. If the pre-policy setting, especially a long-standing one, indicates perceived endowments, BCA should be based on winners’ WTP and losers’ WTA, creating a status quo bias in policy, just as in behavior.

2.     Reference Dependence and the Choice of Welfare Measure: WTP or WTA When Beneficiaries Pay the Costs and When They Do Not, Jack Knetsch,* Simon Fraser University

In spite of the dictates of standard theory, which call for the WTA measure to assess losses, and reductions of losses, in people’s welfare, nearly all benefit and cost assessments, regardless of their sign or nature, continue in practice to be measured by people’s willingness to pay. Unfamiliarity, skepticism over the available evidence of large disparities between people’s valuations of many gains and losses, and the lack of accepted estimation methods for WTA values are undoubtedly partly responsible for this seeming violation of accepted principles. However, the absence of widely accepted theory-based criteria for choosing the most accurate and useful measure in particular cases also seems to be a major contributor. This lack has led to a variety of seemingly plausible assertions, such as, "WTP, rather than WTA, is of course the preferred approach to monetization when a proposed public policy will compel people (taxpayers, workers, investors, consumers) to bear the costs of implementing it." Such suggestions may often be at odds with their intent. This paper reports on analyses of choice-of-measure criteria when valuations are subject to reference dependence and exhibit significant WTA/WTP disparities. These include criteria for what appear to be the more problematic cases, in which the same people benefiting from a change are those responsible for bearing its costs: while many analysts may accept WTA valuations of environmental damages caused by a foreign private firm, the same may not extend to similar losses imposed by a local government-owned (i.e., taxpayer-owned) enterprise. The suggested criteria are illustrated with several value-of-statistical-life (VSL) estimates arising in differing contexts. To the extent that these improved choice-of-measure criteria can improve the welfare outcomes of proposed projects and policy changes, they should reduce the bias stemming from current practice.

3.     The Losses from Lemons, David Simpson,* US Environmental Protection Agency

George Akerlof’s seminal paper “The Market for Lemons” added “asymmetric information” to the list of market failures that already included imperfect competition, public goods, and externalities. In cost-benefit analysis we calculate Harberger triangles to inform antitrust policy and estimate the marginal external damage from pollution to compute Pigovian taxes. In contrast to the extensive literature on estimating the benefits of correcting other market failures, however, I am aware of little work estimating the “losses from lemons.” I take up this topic using the simple, illustrative two-type model Akerlof introduced.

Social losses arise when an asset that would be more valuable to a potential buyer than it is to its current owner does not change hands. This can happen if the buyer is uncertain about the quality of the asset offered for sale and the seller is unable to signal its true quality. Using incentive compatibility conditions, I derive a simple expression for the “loss from lemons” and bound it. The bound is only a small fraction of the value of the asset itself. While I confine my attention to the simple model, the finding may prove to be general. Intuitively, losses arise because sales are not made, and sales are not made when would-be sellers have reservation prices that are not much less than what potential buyers would pay; that is, when the efficiency loss from not selling is small. This suggests that legitimate societal concern over markets compromised by asymmetric information may arise more from an offense to our sense of equity than from efficiency concerns.

While I do not conduct statistical work, I motivate my analysis and results by reference to research on markets for potentially contaminated real estate.
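The two-type mechanics can be illustrated numerically. The sketch below is not the paper's derivation; it assumes, as in Akerlof's original example, that buyers value quality at 3/2 of sellers' valuations, and simply scans a hypothetical parameter grid to see how large the loss from unmade sales can get relative to average asset value:

```python
import numpy as np

def lemons_loss(q_low, q_high, p_high, markup=1.5):
    """Welfare loss in a two-type lemons market.

    Sellers value an asset of quality q at q; buyers value it at markup*q.
    Under full information every asset trades. Under asymmetric information
    buyers pay at most markup * E[quality of goods offered], so high-quality
    sellers withhold their assets whenever that price falls below q_high.
    """
    e_q = (1 - p_high) * q_low + p_high * q_high
    if markup * e_q >= q_high:
        return 0.0  # pooling: both types still trade, no loss
    # Only lemons trade: the surplus (markup - 1) * q_high per
    # high-quality asset is forgone.
    return p_high * (markup - 1.0) * q_high

# Scan hypothetical parameters: loss as a fraction of average asset value.
worst = 0.0
for p_high in np.linspace(0.01, 0.99, 99):
    for q_high in np.linspace(1.1, 5.0, 40):
        q_low = 1.0
        loss = lemons_loss(q_low, q_high, p_high)
        avg_value = (1 - p_high) * q_low + p_high * q_high
        worst = max(worst, loss / avg_value)
print(f"max loss / average asset value on this grid: {worst:.3f}")
```

Under these assumed numbers the worst case on the grid stays below (markup − 1) times the average asset value, since the lost surplus per high-quality asset can never exceed the buyer-seller valuation gap.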

4.     Rational Benefit Assessment for an Irrational World, W. Kip Viscusi,* Vanderbilt Law School and Ted Gayer, Brookings Institution

Behavioral economists have identified certain biases in decision-making that lead people to make decisions that harm themselves, but there is insufficient guidance for estimating benefits in the presence of such behavioral failures. This gap in principles and standards for benefit-cost analysis has led government agencies at times to adopt arbitrary and excessive benefit valuations. This article describes an approach to incorporating behavioral market failures into benefit estimation, first by advocating a higher level of scrutiny before applying behavioral findings from narrow contexts to broader populations subject to regulation, and then by comparing the outcomes of the self-harming behavior to a policy reference point in which people are assumed to be fully informed and to act fully rationally in their own self-interest. This approach, which is grounded in systematic, well-documented, and context-specific findings of behavioral failings, would reduce instances of agencies assuming that behavioral findings in some contexts provide sufficient rationale for overriding consumer preferences in other contexts. It would also establish a consistent approach to government policy, for example by creating symmetry between advancing policies that seek to discourage consumption of products for which consumers underestimate the health risks and fostering accurate risk beliefs to address erroneous individual choices based on risk overestimation.

C.7: The Effect on Benefit Estimates of Discarding Scientific Input Data

Chair: Dima Yazji Shamoun, Mercatus Center at George Mason University

Discussants: Kerry Krutilla, Indiana University School of Public & Environmental Affairs and George Gray, The George Washington University

Presentations:

1.     The Effect on Benefit Estimates of Discarding Data from Human Chamber Studies, Richard Belzer,* Regulatory Checkbook

Since the 1980s, several controlled chamber studies of human volunteers have been performed by USEPA and industry; the most recent study related to ozone is Schelegle et al. (2009). A common research protocol is used, including standard clinical pulmonary function tests. Schelegle et al. (2009) reported statistically significant decrements in pulmonary function at 70 ppb ozone. Two to four maneuvers were conducted for each test; in accordance with the standard clinical protocol, however, each test was represented by a single value, and the other maneuver data were discarded prior to statistical analysis. In this paper, the discarded data are simulated based on reported results and alternative values for inter-maneuver variance. Differences between concentrations are shown to be sensitive to the missing data, and benefit estimates are analogously affected. REFERENCE: Schelegle et al. 2009. 6.6-hour inhalation of ozone concentrations from 60 to 87 parts per billion in healthy humans. Am J Respir Crit Care Med 180:265-272.
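To see why the discarded maneuvers can matter, consider a hypothetical simulation (not the authors' actual procedure) in which each test's recorded value is assumed to be the best of k maneuvers. The gap between the recorded values and the full maneuver data then grows with the assumed inter-maneuver standard deviation:

```python
import numpy as np

rng = np.random.default_rng(42)

def protocol_vs_all(n_subjects, k_maneuvers, inter_sd, rng):
    """Compare the protocol summary (best of k maneuvers) with the mean
    of all maneuvers, for a given inter-maneuver standard deviation.
    All quantities are hypothetical, in litres of FEV1."""
    true_fev1 = rng.normal(4.0, 0.5, n_subjects)
    maneuvers = true_fev1[:, None] + rng.normal(0, inter_sd,
                                                (n_subjects, k_maneuvers))
    recorded = maneuvers.max(axis=1)      # single value kept per test
    return recorded.mean() - maneuvers.mean()  # upward selection gap

for sd in (0.05, 0.10, 0.20):
    gap = protocol_vs_all(10_000, 3, sd, rng)
    print(f"inter-maneuver SD {sd:.2f} L -> selection gap {gap:.3f} L")
```

Because taking the best of several noisy maneuvers selects on the noise, the recorded values sit above the full-data mean by an amount proportional to the inter-maneuver variability, which is exactly the sensitivity the simulation exercise probes.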

2.     The Effect on Benefit Estimates of Discarding Data from Observational Epidemiology Studies, R. Jeffrey Lewis,* ExxonMobil Biomedical Sciences, Inc.

Since the 1980s, several observational epidemiology studies have been performed by various academic and government research teams to estimate pulmonary function decrements in various subpopulations from differences in ambient air concentrations of various pollutants. The current clinical pulmonary function protocol (Miller et al. 2005), or a predecessor, was adopted or adapted. In each case, multiple “maneuvers” were performed for each test, a single value representing that test was recorded and used for statistical analysis, and the other maneuver data were discarded. In this paper, the discarded data will be simulated based on reported results from an observational epidemiology dataset (where data can be obtained) and alternative values for inter-maneuver variance. Differences between concentrations are shown to be sensitive to the missing data, and benefit estimates are analogously affected. REFERENCE: Miller et al. 2005. Standardisation of spirometry. Eur Respir J 26:319-338.

D.7: Evaluating the Impacts of U.S. Border Enforcement Activities: Methodology Roundtable Discussion

Chair: Seth Renkema, US Customs and Border Protection

Panelists to include:

1.   Joseph Cordes, The George Washington University

2.   Mary (Katie) Foreman, Econometrica, Inc.

3.   Alan Fox, U.S. International Trade Commission

4.   Bryan Roberts, Institute for Defense Analyses

5.   John Whitley, Institute for Defense Analyses

E.7: Benefit-Cost Analysis of Research and Emerging Issues

Chair: Jan Lewandrowski, USDA

Discussant: Andrew Estrin, US Food and Drug Administration

Presentations:

1.     The Economic Benefits of Genomics Research: New Assays for Foodborne Pathogens, Brian Morrison,* Industrial Economics, Incorporated; Amelia Geggel; and Mary McGee

The Genomics Research and Development Initiative (GRDI) coordinates Canada’s federal science departments and agencies in the field of genomics research. Its long-term goals are to protect and improve human health, develop new treatments for chronic and infectious diseases, protect the environment, and sustainably manage agricultural and natural resources, thereby supporting the health and economic welfare of all Canadians.

The GRDI has funded genomics research at a number of federal departments and agencies, including Health Canada, since 1999. Scientists at Health Canada have achieved success on a variety of research initiatives. As is often the case, however, it can be difficult to quantify the long-term implications of emerging science, and measuring its benefits in economic terms is a challenge. Developing a better understanding of these benefits is an important consideration in Treasury Board decisions concerning future funding of genomics research.

This presentation summarizes Health Canada’s first step in attempting to quantify the economic benefits of its genomics research programs. It presents a case study of an emerging GRDI success story: the identification and validation of whole genome sequencing techniques that facilitate the tracking of Campylobacter and Listeria, pathogens that are important causes of food poisoning. The study examines the potential benefits of new Campylobacter and Listeria assays in improving the ability of food safety agencies to detect and trace the sources of these pathogens. These benefits are quantified with respect to potential reductions in the incidence of illness and death attributable to consumption of contaminated food, and valued in accordance with Treasury Board guidance on the economic analysis of changes in health risks.

2.     Quantifying Breakeven Price Distribution in Stochastic Techno-Economic Analysis, Wallace E. Tyner,* Xin Zhao, and Guolin Yao, Purdue University

Techno-economic analysis (TEA) is a well-established modeling process in which benefit-cost analysis (BCA) is used to evaluate the economic feasibility of emerging technologies. Most previous TEA studies focused on creating reliable cost estimates but returned deterministic net present values (NPV) and deterministic breakeven prices. Deterministic results, however, cannot convey the considerable uncertainties embedded in techno-economic variables such as capital investment, conversion technology yield, and output prices. The breakeven price is the most important indicator in TEA because it is independent of scale and communicates results effectively. Interpreted stochastically, the deterministic breakeven price is the price at which there is a 50 percent probability of earning more or less than the stipulated rate of return. For an investment under relatively high uncertainty, investors are unlikely to finance a project with a 50 percent probability of loss, so the point-estimate breakeven price does not represent the threshold at which investment would occur. In this study, we introduce stochastic techno-economic analysis, which incorporates Monte Carlo simulation into traditional TEA. A case of cellulosic biofuel production via a fast pyrolysis and hydroprocessing pathway is used to illustrate the method of modeling stochastic TEA and quantifying the breakeven price distribution. The input uncertainties are propagated to the outputs so that probability distributions of NPV, IRR, and the breakeven price are derived. Two methods, a mathematical method and a programming method, are developed to quantify the breakeven price distribution in a way that can account for future price trends and uncertainty. We analyze two scenarios, one assuming constant real future output prices and the other assuming that future prices follow an increasing trend with stochastic disturbances. We demonstrate that the breakeven price distributions derived using our methods are consistent with the corresponding NPV distributions in terms of percentile values and the probability of gain or loss.
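A minimal Monte Carlo sketch of this kind of stochastic TEA, with entirely hypothetical plant parameters (not the study's data) and a closed-form breakeven price for each draw, might look like this:

```python
import numpy as np

rng = np.random.default_rng(7)

def breakeven_prices(n_draws, years=20, rate=0.10):
    """Monte Carlo breakeven fuel prices for a stylised plant.

    NPV(p) = -capex + sum_t (p*yield - opex) / (1+rate)^t, so each draw's
    breakeven price solves NPV(p*) = 0 in closed form. All parameter
    ranges below are invented for illustration.
    """
    capex = rng.triangular(400e6, 500e6, 650e6, n_draws)    # $ plant cost
    fuel_yield = rng.triangular(30e6, 38e6, 42e6, n_draws)  # gal/yr
    opex = rng.triangular(40e6, 50e6, 65e6, n_draws)        # $/yr
    annuity = (1 - (1 + rate) ** -years) / rate             # PV of $1/yr
    return (capex / annuity + opex) / fuel_yield            # $/gal

p_star = breakeven_prices(100_000)
median_p = np.median(p_star)
# Consistency check: NPV > 0 at price p exactly when p exceeds that draw's
# breakeven price, so at the median breakeven price ~50% of draws profit.
share_profitable = (p_star < median_p).mean()
print(f"median breakeven price: ${median_p:.2f}/gal")
print(f"P(NPV > 0 at median price): {share_profitable:.3f}")
```

The final check illustrates the consistency claim: any percentile of the breakeven price distribution corresponds to the same probability of gain or loss in the NPV distribution evaluated at that price.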

3.     Cost-Benefit Analysis of Research, Development and Innovation Infrastructures: An Exploratory Evaluation Framework, Emanuela Sirtori,* CSIL Centre for Industrial Studies; Massimo Florio; and Chiara Pancotti

Policy makers have growing expectations of research, development and innovation (RDI) infrastructures as an essential component of technological and scientific progress, and hence economic growth. The stakes associated with their selection and evaluation are therefore high. Cost-benefit analysis of RDI infrastructures is a new field. The intangible nature of some benefits and the uncertainty associated with the achievement of research results have often discouraged the use of proper CBA for RDI infrastructures. The new Guide for the CBA of investment projects adopted by the European Commission (2014) gives some instructions for appraising RDI projects, but admits that, owing to a lack of experience and best practices, the methodological framework still needs improvement. Our paper aims to fine-tune and expand the appraisal techniques recommended by the European Commission, in order to provide policy makers, researchers and project analysts with practical suggestions on how to perform a proper socio-economic analysis of RDI infrastructure projects.

We break down benefits into two broad classes: i) use benefits, accruing to different categories of infrastructure users such as scientists, firms, students and general public visitors, and ii) non-use benefits, denoting the social value of the discovery potential of the RDI infrastructure regardless of its actual or future use. We argue that the social value of discovery can be estimated with contingent valuation techniques. Another significant feature of our approach is the stochastic nature of the CBA model, intended to deal with uncertainty and the risk of optimism bias in the estimates. The methodological approach laid out in our paper has already been tested in two case studies and will be discussed in a workshop involving the European Commission, the European Investment Bank, the European Strategy Forum on Research Infrastructures and several other stakeholders.