2016 Conference - Session 5

Session 5 - Friday, March 18, 9:00 - 10:30am

A.5: Assessing Costs and Benefits of EPA Regulations

Chair: Ann Ferris, US Environmental Protection Agency

Discussant: Anne Smith, NERA Economic Consulting


1.     The Costs of the New U.S. Ozone Standard, Alan Krupnick* and Josh Linn, Resources for the Future

High concentrations of ground-level ozone, commonly known as smog, pose serious threats to a large and diverse swath of the U.S. population. The U.S. EPA has recently lowered the limit from 75 parts per billion (ppb) to 70 ppb, citing adverse health effects that occur at levels lower than the previous limit and providing future health benefits to many more people. The costs of meeting pollution standards have always been contentious, and the case of ozone is no exception. The new limit will impose additional costs on the U.S. economy, the estimates of which were hotly contested before the rule was finalized, with industry estimates exceeding EPA's by at least five times.

Most of the cost controversy centered on differences in how EPA and industry valued the mitigation measures needed to meet the tighter standard. To calculate the cost of the alternative standard, EPA had to value the cost of almost half the needed NOx reductions using "unknown" technologies. It is here that industry's estimates wildly diverge from EPA's.

Our analysis indicates that EPA’s cost estimates are likely to be closer to the mark than industry's. In part this is because EPA assumed more realistic emissions reductions. It is also because many policies—such as cap-and-trade programs and gasoline taxes—can achieve emissions reductions at relatively low cost, but industry critics ignored the efficiencies of these market-based options.

2.     A Cost-Effectiveness Analysis of Agricultural Greenhouse Gas Mitigation Measures in Denmark, Alex Dubgaard,* University of Copenhagen

This presentation describes a cost-effectiveness analysis (CEA) of agricultural GHG mitigation measures in Denmark. The agricultural CEA is part of an appraisal at the national level of measures to realize a policy goal of a 40 percent reduction in total Danish GHG emissions by 2020 compared to 1990. A total of 31 agricultural GHG mitigation measures are included in the assessment. The applied approach bears a certain resemblance to a cost-benefit analysis in the sense that the CEA is conducted on a net cost basis, where ancillary benefits associated with GHG mitigation are subtracted from the costs of implementing the measures.

Particular focus is placed on the methods used to estimate implementation costs and ancillary benefits. These estimates should reflect the welfare economic costs of GHG mitigation in terms of changes in consumption possibilities for Danish society. This implies that cost estimates at factor prices must be converted to the consumer price level, which is done through multiplication by a so-called standard conversion factor – specified as 1.325 by the Danish Ministry of Finance. Also, the calculations incorporate estimated tax distortion costs – specified by the Danish Ministry of Finance as 20 percent of the tax revenue. The ancillary benefits comprise reductions in nitrate and ammonia emissions. Using a shadow price approach these benefits are evaluated at the estimated marginal social costs of abatement under existing Danish policy programs to reduce nitrate leaching and ammonia evaporation.
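The conversion described above is simple enough to sketch numerically. In the hypothetical sketch below, only the 1.325 standard conversion factor and the 20 percent tax-distortion rate come from the abstract; the cost and benefit figures and the function name are illustrative assumptions.

```python
# Sketch of the welfare-economic cost adjustment described in the abstract.
# The 1.325 conversion factor and 20 percent tax-distortion rate are the
# Danish Ministry of Finance values cited above; all other numbers are
# hypothetical.

STANDARD_CONVERSION_FACTOR = 1.325  # factor prices -> consumer price level
TAX_DISTORTION_RATE = 0.20          # share of tax revenue lost to distortion

def net_welfare_cost(cost_factor_prices, tax_revenue_change, ancillary_benefits):
    """Net welfare-economic cost of a mitigation measure (same currency units)."""
    consumer_price_cost = cost_factor_prices * STANDARD_CONVERSION_FACTOR
    distortion_cost = TAX_DISTORTION_RATE * tax_revenue_change
    return consumer_price_cost + distortion_cost - ancillary_benefits

# Hypothetical measure: 10m DKK at factor prices, 2m DKK of tax revenue
# affected, 3m DKK in nitrate/ammonia ancillary benefits.
net_cost = net_welfare_cost(10e6, 2e6, 3e6)
print(net_cost)  # net cost, to be divided by tonnes of CO2e abated
```

Dividing the net cost by the tonnes of CO2e abated gives the cost-effectiveness figure on which a ranking of the 31 measures would be based.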

The CEA identified 11 agricultural GHG mitigation measures which can be considered as cost-effective at the national level. Together these measures represent a GHG reduction potential equal to about 25 percent of the targeted reduction in total Danish GHG emissions by 2020.

3.     The Role of Health Co-Benefits in EPA Regulatory Impact Analyses, Scott Bloomberg,* NERA Economic Consulting

In this presentation, I will review the health co-benefits presented in EPA's regulatory impact analysis (RIA) for the Clean Power Plan. I will discuss how health co-benefits are calculated (in general and in this RIA), from the reductions in PM2.5 and ozone precursor emissions through the monetization of changes in health outcomes. My review will highlight interesting questions regarding the use of already-regulated pollutants as a basis for co-benefits in regulatory impact analyses.

B.5: Addressing Uncertainty in Benefit-Cost Analysis

Chair: Aaron Kearsley, US Food and Drug Administration


1.     Understanding the Uncertainty of an Effectiveness-cost Ratio in Education: A Bayesian Approach, Yilin Pan,* Center for Benefit-cost Studies in Education, Columbia University

Despite wide-ranging support for the message that both effectiveness and cost should be taken into account in program selection, it is still unclear whether it is sufficient to compare alternatives based only on a single, scalar efficiency measure, i.e., one cost-effectiveness ratio estimate. The ratio estimate conveys information about what happened, one time, in the specific evaluation settings. However, if the program is replicated, it is almost impossible to obtain the same cost-effectiveness ratio due to measurement error, time-to-time and site-to-site variability, or other factors that contribute to uncertainty. Therefore, compared to a single cost-effectiveness ratio estimate that tells what happened, more useful information for practitioners would be 1) the best guess for what to anticipate in terms of the trade-off between effectiveness and cost, and 2) the comparative worst-case and best-case scenarios. The underlying methodological challenge is to identify a probability distribution of an efficiency measure. Given the need to bridge the gap between what happened and what is likely to happen, this paper explores how to apply Bayesian inference to cost-effectiveness analysis so as to capture the uncertainty of a ratio-type efficiency measure. The first part of the paper summarizes the characteristics of the evaluation data commonly available in educational research, discusses the properties of the ratio, and proposes two estimators. The second section synthesizes two sources of uncertainty and reviews the conventional quantitative methods that address the uncertainty of a ratio under each perspective. The third part proposes two Bayesian models that differ in their assumptions about site-level variability, and demonstrates the estimation, presentation, and interpretation of the results using a comparison of two high school dropout prevention programs: New Chance and JOBSTART. The last section summarizes the strengths and limitations of the Bayesian method and lists directions for future research.
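The move from a single ratio to a distribution can be illustrated with a minimal Monte Carlo sketch. This shows the general idea only, not the paper's Bayesian models; every distribution and parameter below is a hypothetical stand-in.

```python
# Minimal Monte Carlo sketch of the idea above: instead of a single
# cost-effectiveness ratio, simulate a distribution over plausible
# replications and report a best guess plus worst/best-case scenarios.
# All distributions and parameters are illustrative, not the paper's models.
import random
import statistics

random.seed(0)

def simulate_ce_ratios(eff_mean, eff_sd, cost_mean, cost_sd, n=10_000):
    """Draw (cost / effectiveness) ratios under normal sampling uncertainty."""
    ratios = []
    while len(ratios) < n:
        eff = random.gauss(eff_mean, eff_sd)
        cost = random.gauss(cost_mean, cost_sd)
        if eff > 0:  # discard non-positive effectiveness draws
            ratios.append(cost / eff)
    return ratios

ratios = simulate_ce_ratios(eff_mean=0.05, eff_sd=0.02,
                            cost_mean=1500.0, cost_sd=300.0)
ratios.sort()
print("median cost per unit effect:", round(statistics.median(ratios)))
print("5th-95th percentile:", round(ratios[len(ratios) // 20]),
      "to", round(ratios[-len(ratios) // 20]))
```

The median plays the role of the "best guess," while the tails of the simulated distribution stand in for the worst-case and best-case scenarios the abstract calls for.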

2.     Attitudes toward Catastrophic Risks, Christoph Rheinberger,* European Chemicals Agency; and Nicolas Treich, Toulouse School of Economics

Catastrophic risks, such as those posed by natural disasters, financial collapse, and industrial accidents, have met with growing policy interest. Economists have recently devoted much attention to modeling climate catastrophes. In doing so, they typically start from the premise of a representative agent who seeks to maximize expected utility over an uncertain consumption path and thereby faces the risk of a catastrophe. In a famous paper, Martin Weitzman (2009, Rev. Econ. Stat. 91, p. 9) puts it this way: “The basic idea is that a society trading off a decreased probability of its own catastrophic demise against the cost of lowering the probability of that catastrophe is facing a decision problem conceptually analogous to how a person might make a tradeoff between decreased consumption as against a lower probability of that person’s own individually catastrophic end.” This means standard economics presumes that society should be catastrophe averse in the very same way the representative agent is risk averse with regard to aggregate consumption. In this paper, we introduce an alternative framework. We conceptualize catastrophes as social risks that bear a small chance of many people dying together. We characterize the catastrophic potential of a risk by the spread in the distribution of fatalities within the population at threat; our social planner therefore cares about the coincidence of fatalities in each possible state of the world. Our main objective is then to explore defensible attitudes toward catastrophe: How do we behave in the face of a looming catastrophe? And how should we behave in order to optimally protect ourselves against catastrophes? We collect insights from decision theory, behavioral economics, psychology, social choice, and risk management studies to reflect on these questions.
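The notion of catastrophic potential as spread in the fatality distribution can be made concrete with a toy example: two risks with identical expected fatalities but very different spreads. All numbers are hypothetical.

```python
# Toy illustration of the framing above: two social risks with the same
# expected number of fatalities but different "catastrophic potential",
# measured here (purely for illustration) by the spread of the fatality
# distribution across equally likely states of the world.
import statistics

# Risk A: many small, dispersed accidents; fatalities are steady.
risk_a = [100, 100, 100, 100]   # fatalities in each equally likely state

# Risk B: same expected fatalities, but one state is catastrophic.
risk_b = [0, 0, 0, 400]

assert statistics.mean(risk_a) == statistics.mean(risk_b) == 100
print("spread A:", statistics.pstdev(risk_a))  # zero: no catastrophic potential
print("spread B:", statistics.pstdev(risk_b))  # large: people die together
```

A catastrophe-averse social planner in the paper's sense would rank Risk B as worse than Risk A, even though an expected-fatality calculation cannot distinguish them.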

3.     Nuclear War as a Global Catastrophic Risk: Analysis Issues, James Scouras,* Johns Hopkins University Applied Physics Laboratory

This paper explores challenges in applying both risk analysis and benefit-cost analysis to evaluate measures intended to reduce the risk of nuclear war. As with many other global catastrophic risks, large uncertainties in both the likelihood and the consequences of nuclear war, as well as in the benefits and costs of measures intended to reduce either dimension of risk, complicate the evaluation of mitigation strategies. Moreover, nuclear war has unique characteristics that set it apart from natural catastrophes and even from other anthropogenic catastrophes. In particular, there is a critical linkage between the likelihood of nuclear war and its anticipated consequences. The strategy of mutual assured destruction exploits this linkage by maintaining the specter of horrific consequences in order to keep the likelihood of large-scale nuclear war low. Also, nuclear strategy intentionally maintains uncertainty about the potential for smaller nuclear wars to escalate into larger ones, thereby reinforcing the taboo against nuclear war at any scale. However, nuclear strategy may be changing as we face the possibility of nuclear war arising from non-state actors and new nuclear states, against which traditional deterrence may be more prone to failure.

4.     Uncertainty in Estimates of Benefits for BCA, George Gray,* The George Washington University

The tools of human health risk assessment are often used to estimate benefits for benefit-cost analysis (BCA).   The benefits addressed by risk assessment may include reductions in morbidity, mortality, or other environmental effects.  However, current practice in risk assessment grew up to address regulatory questions focused on standard setting (e.g., pesticide residues, soil cleanup standards, air quality standards) and not BCA.  Science policy judgments are made in the face of the many uncertainties involved in risk assessment.  Regulatory risk assessment, in general, uses conservative science policy approaches to serve the needs of standard setting and similar decisions.  However, these science policy choices may not be appropriate for use in BCA.  This presentation will detail the interplay of science, science policy and analysis in risk assessment and identify specific cases in which current practice fails to meet the needs of practitioners of BCA.

C.5: Discounting Methods

Chair: Ali Gungor, US Coast Guard

Discussant: Richard Zerbe, University of Washington


1.     Hyperbolic Discounting in Benefit-Cost Analysis, Charles Moss,* University of Florida; Troy Schmitz, Arizona State University; Dwayne Haynes and Andrew Schmitz, University of Florida

We revisit our earlier studies, Schmitz, Haynes, and Schmitz (2013) and Schmitz and Haynes (2015), the latter of which emphasized the role of interest rates in discounting. Both used the 2004 U.S. Tobacco Buyout as a case study. The 2015 study improved on the 2013 study by including present value calculations in benefit-cost ratios over two distinct periods. We further this analysis by applying hyperbolic discounting to the individual components of a given benefit-cost ratio, within a general equilibrium framework. Importantly, we use hyperbolic discounting to account for cases where the benefits and/or costs of a policy may not be realized until sometime in the future, which is an extension of its traditional use in modeling consumers’ motivation to constrain their own future choices (Laibson, 1997; Diamond and Kőszegi, 2003; Dasgupta and Maskin, 2005). This analysis can be extended to other markets where the long-term impacts of policies are evaluated.
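The contrast between a constant-rate discount factor and a hyperbolic one can be sketched as follows. The functional form 1/(1 + kt) is the standard hyperbolic discount function; the parameter values are illustrative only, not those used in the paper.

```python
# Exponential vs. hyperbolic discounting. Exponential discounting applies a
# constant annual rate; hyperbolic discounting (1 / (1 + k*t)) discounts the
# near future steeply but the far future gently. Parameters are illustrative.

def exponential_df(t, r=0.05):
    """Standard discount factor at constant annual rate r."""
    return 1.0 / (1.0 + r) ** t

def hyperbolic_df(t, k=0.05):
    """Hyperbolic discount factor with curvature parameter k."""
    return 1.0 / (1.0 + k * t)

for t in (1, 10, 50):
    print(t, round(exponential_df(t), 3), round(hyperbolic_df(t), 3))
# At long horizons the hyperbolic factor exceeds the exponential one, so
# far-future benefits or costs of a policy weigh more heavily in the ratio.
```

Applying the two discount functions to the separate benefit and cost streams of a policy like the tobacco buyout is what drives the difference in the resulting benefit-cost ratios.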

2.     Declining Discount Rates, Hurdle Rates, and Intergenerational Equity in Policy Analysis, Daniel Wilmoth,* SBA Office of Advocacy

Some economists have argued that uncertainty about the appropriate discount rate implies that policies should be evaluated using a discount rate that declines with time. The implications of a declining discount rate for intergenerational equity are explored by investigating the relationship between declining discount rates and compensation criteria. Under some circumstances, the use of a declining discount rate corresponds to switching between two criteria so that the criterion most favorable to future generations is always applied. Under other circumstances, net benefits under a declining discount rate may be positive although neither criterion is satisfied. These issues make the use of declining discount rates objectionable, and an alternative method for addressing uncertainty about the appropriate discount rate is developed. The private sector addresses similar uncertainty through the use of hurdle rates, and the simultaneous use of hurdle rates from each end of the probability distribution is shown to be both more equitable and more reliable than the use of declining discount rates. The use of such hurdle rates corresponds broadly to the analyses currently performed by federal agencies in the US, where regulatory impacts are discounted using rates of both three percent and seven percent. However, those values were not chosen to address the general uncertainty analyzed here, and their suitability as hurdle rates is discussed.
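A minimal sketch of the dual-hurdle-rate test, using the familiar 3 and 7 percent rates mentioned above. The cash-flow stream and the exact form of the decision rule are illustrative assumptions, not the paper's proposal in detail.

```python
# Sketch of the dual-hurdle-rate idea: discount a stream of net benefits at
# rates from each end of the plausible distribution (here the 3 and 7 percent
# used in US regulatory analysis) and require a positive NPV under both.

def npv(net_benefits, rate):
    """Present value of a yearly net-benefit stream (year 0 first)."""
    return sum(b / (1.0 + rate) ** t for t, b in enumerate(net_benefits))

def passes_both_hurdles(net_benefits, low=0.03, high=0.07):
    """A policy clears the test only if it has positive NPV at both rates."""
    return npv(net_benefits, low) > 0 and npv(net_benefits, high) > 0

# Hypothetical policy: 100 upfront cost, then 12 per year of net benefits
# for 20 years.
stream = [-100.0] + [12.0] * 20
print(npv(stream, 0.03), npv(stream, 0.07))
print(passes_both_hurdles(stream))
```

Requiring a pass at both rates is what makes the rule robust to uncertainty about which discount rate is appropriate, rather than letting a favorable rate choice decide the outcome.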

3.     Mazur Discounting and the Private Benefits Paradox, Brian Mannix,* GW Regulatory Studies Center

In recent years, federal regulatory agencies have used risk-free social discount rates to assign large “private benefits” to energy efficiency regulations. The paradox is that the individuals and businesses who experience these benefits reveal, through their choices, that they would prefer not to. The paradox can be resolved by a discounting procedure first suggested by economist Michael Mazur, the author of OMB's original guidance on Regulatory Impact Analysis, shortly before his death in 1989.

D.5: The Regulatory Process, from Design to Analysis to Execution

Chair: Christine Kymn, US Small Business Administration


1.     What Would a Redesigned Regulatory System Look Like? An Agency Theory and Public Choice Perspective, Patrick McLaughlin,* Mercatus Center at George Mason University

Much research on the merits and demerits of the regulatory system takes the current regulatory system as its starting point and suggests specific reforms that address identified problems. Instead, we start from a “constitutional” perspective – that is, if we were building a regulatory system from scratch, what would we build? We use economic principles to consider the benefits and costs of different designs of a regulatory system in a theoretically “greenfield” jurisdiction. We use this thought experiment to develop a “model” regulatory system. A primary focus of our analysis relates to the review of agency theory’s contribution to solving the principal-agent problem that is inherent in delegated lawmaking (such as regulation), which we synthesize with foundational public choice literature on the design of institutions for collective decision-making and bureaucratic behavior. We then consider how the current federal regulatory system compares to our “model” system, and what reforms could get us closer to it. By virtue of this comparison, we highlight several features of the existing regulatory system that can be targeted for reform, including missing elements, redundancies, superfluous elements, misaligned incentives, and failures of oversight.

2.     Complexity and the Regulatory Process, Stephen Jones,* Mercatus Center at George Mason University

Regulatory complexity may be beneficial. As economic complexity mounts, a more complex regulatory code may be necessary to address the wider scope of concern. However, a more complex regulatory code also has greater administration costs and could decrease aggregate compliance rates, because the costs of understanding a complex system of regulations are nonzero. Law and economics scholars, such as Kaplow (1995), Parisi (2001), and Tullock (1995), therefore argue that the legal code should be as complex as it needs to be, but no more. A potential interpretation is that the marginal regulation exerts a negative effect on the regulatory stock by making the stock more complex. Absent a process to compare the marginal effect of a more complex regulatory code with its purported benefit of fitting new conditions, regulation may be oversupplied. We use the RegData regulatory database to construct novel metrics of regulatory complexity. Our results suggest that the regulatory process has no tendency toward such an optimum, consistent with the idea that regulation, in the aggregate, is oversupplied.
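One simple metric in the spirit of RegData is a count of restrictive terms in regulatory text. The term list below mirrors the restriction words RegData counts; the rest of the sketch is a simplified stand-in, not the paper's actual complexity metrics.

```python
# Illustrative restriction count over regulatory text, in the spirit of
# RegData. The five restriction terms are the ones RegData counts; the
# sample text and everything else here is a simplified stand-in.
import re

RESTRICTION_TERMS = ("shall", "must", "may not", "prohibited", "required")

def count_restrictions(text):
    """Count occurrences of restrictive terms in a block of regulatory text."""
    lowered = text.lower()
    return sum(len(re.findall(r"\b" + re.escape(term) + r"\b", lowered))
               for term in RESTRICTION_TERMS)

sample = ("The operator shall file form X and must retain records. "
          "Transfers are prohibited unless approval is required and granted.")
print(count_restrictions(sample))
```

Tracking such counts over time, alongside other structural measures, is one way to ask whether the regulatory stock is growing more complex without a corresponding offset.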

3.     Evaluating the Quality and Use of Regulatory Impact Analysis: The Mercatus Center’s Regulatory Report Card, 2008-13, Jerry Ellig,* Mercatus Center at George Mason University

The Mercatus Center at George Mason University initiated the Regulatory Report Card project in 2009 to assess how well executive branch agencies conduct and use regulatory analysis and identify ways to motivate improvement. Evaluation criteria reflect the regulatory analysis principles articulated in Executive Order 12866 and OMB Circular A-4. Evaluations of 130 economically significant, prescriptive regulations proposed between 2008 and 2013 reveal that the quality and use of analysis are low on average and highly variable. Agencies rarely make provisions for retrospective review when they issue regulations. Factors associated with better analysis include presence of a presidentially-appointed OIRA administrator rather than an acting administrator and high-impact regulations with benefits or costs exceeding $1 billion. Administrations of both parties tolerate worse analysis from agencies that are more likely to share their policy preferences. “Midnight regulations” and regulations left for the next administration to finalize have lower-quality analysis. The quality of analysis is also correlated with statutory constraints on agency decision-making criteria. There is no significant difference in the quality of analysis based on which party controls the presidency, the existence of statutory or judicial deadlines, or general constraints on agency decision-making authority, such as requirements that the agency must issue a new regulation or a statute prescribing the form, stringency, or coverage of the regulation. Little of the variability in quality is associated with agency-specific factors. Finally, after controlling for other factors, there is no evidence that civil rights, environmental, financial, security, or safety regulations have lower-quality analysis than economic regulations.

4.     Objections to Regulatory Reform: Counter Arguments, Richard Williams,* Mercatus Center at George Mason University

Despite the fact that there have been no significant changes to the Administrative Procedure Act in 75 years, a number of thoughtful objections have been raised to the dozens of reform bills now working their way through Congress. Looking closely at these objections, we find that many do not merit rejecting meaningful reform. Reform efforts need to start with authorizing legislation, continue into the production of regulations by agencies, and end with the review and modification or elimination of existing regulations. This paper examines common objections to reform efforts and whether those objections withstand scrutiny.

E.5: International Development and Finance

Chair: Gareth Harper, Optimity Advisors

Discussant: Glenn Jenkins, Queen’s University


1.     A Redistribution Mechanism and Network Approach in Microcredit, Can Sever,* University of Maryland

In this paper, I propose a joint-liability mechanism for microcredit based on an income redistribution scheme among peers. With risk-neutral agents, it does not affect expected utilities, but it increases social welfare in egalitarian terms. Assuming outputs are observable but efforts private, the mechanism yields an equilibrium that eliminates the moral hazard problem when effort is costly. Since it acts as an income-smoothing mechanism, it also improves individual utilities when borrowers are risk averse. Extending the environment to a two-period world with market and reinvestment opportunities, it creates a new credit channel and increases expected utilities. Although the social network is key in microlending, there is a lack of theoretical papers applying network tools to microfinance. To address this gap, I incorporate a network approach into the mechanism, considering the role of key players in a social network. Under the redistribution mechanism, if microlending reaches key players in the network, people who are not eligible for microcredit because of their position in the social network may gain access to credit. The mechanism also increases the utilities of existing peers, so welfare rises at both the individual and social levels. The paper concludes with policy recommendations illustrating that ’true’ mechanisms may improve welfare for specific network structures.
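The income-smoothing claim for risk-averse borrowers can be checked in a toy two-borrower example: pooling and equally splitting risky incomes leaves expected income unchanged but raises expected utility under a concave (here logarithmic) utility function. All numbers are hypothetical and not from the paper's model.

```python
# Toy check of the income-smoothing claim: redistribution among peers raises
# expected utility for risk-averse (log-utility) borrowers while leaving
# expected income unchanged. Incomes and probabilities are hypothetical.
import math
from itertools import product

incomes = (50.0, 150.0)  # each borrower's project pays low or high, p = 1/2

def expected_utility(redistribute):
    """Average log utility of borrower 1 across the four equally likely states."""
    total, count = 0.0, 0
    for y1, y2 in product(incomes, incomes):
        own = (y1 + y2) / 2 if redistribute else y1
        total += math.log(own)
        count += 1
    return total / count

print(expected_utility(False), expected_utility(True))
```

Because log utility is concave, smoothing the income draws strictly raises expected utility, which is the basic force behind the mechanism's welfare gains for risk-averse borrowers.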

2.     Poverty Alleviation through Innovation in the Value Chain for Small Ruminants in the Somali Region of Ethiopia, Mikhail Miklyaev,* Cambridge Resources International

The traditional value chain for small ruminants in the Somali Region of Ethiopia is to sell live animals to meat packers located near the country's capital, or to move live animals to the coast for export to the Gulf countries. Trekking the animals to border markets results in tremendous weight loss and animal deaths. The innovation of this project is to slaughter the animals in a modern meat-packing plant in the pastoral region of Faafan village, Somali State, and then export chilled or frozen meat to the Gulf countries. Until now, the security situation in the region has prevented investors from setting up such a facility. With USAID assistance such a plant has been built and is operating successfully. The financial feasibility of the facility is essential to the project's success, and it has proven highly profitable, with a net present value at a 12% discount rate about equal to the initial investment cost. The main purpose of this analysis, however, is to estimate the economic returns and the net benefits created for all project stakeholders, namely: the smallholder livestock producers, the livestock traders, the private operator, the labor employed by the facility, and the Government of Ethiopia. An integrated investment appraisal shows that the initial benefits to the herders of the pastoral region are at least three times those of the meat-packing plant. Given the large supply of live animals and the profitability of this first facility, other investors are expected to enter, and the resulting competition for live animals will further benefit the animal-producing pastoralists of the region.
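The feasibility claim, an NPV at a 12 percent discount rate roughly equal to the initial investment, can be sketched back-of-envelope style. The cash-flow figures and horizon below are hypothetical, chosen only to illustrate the calculation.

```python
# Back-of-envelope sketch of the feasibility claim above. Only the 12%
# discount rate comes from the abstract; the investment, annual cash flow,
# and horizon are hypothetical illustrations.

def npv(rate, cash_flows):
    """NPV of a cash-flow stream, with the year-0 outlay first."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

initial_investment = 5.0                       # US$ million, hypothetical
flows = [-initial_investment] + [1.47] * 15    # 15 years of net operating cash
value = npv(0.12, flows)
print(round(value, 2), "vs initial investment of", initial_investment)
# With these illustrative flows the NPV comes out close to the initial
# outlay, matching the "highly profitable" characterization in the abstract.
```

The stakeholder analysis in the abstract then allocates these present-value gains across herders, traders, the operator, labor, and the government.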

3.     A Cost-Benefit Analysis of Local Production of Ready to Use Therapeutic Foods in Uganda, Glenn Jenkins,* Queen’s University and Eastern Mediterranean University

The prevalence of malnutrition, vitamin-A deficiency, and anemia is high in Uganda. Of children under 5 years of age, 33 percent are stunted and 5 percent wasted. The rate of anemia among women and children is as high as 50 percent. The aim of this study is to identify whether a 5-year off-take contract would provide a sufficient incentive for the private sector to establish a factory for the production of Ready to Use Therapeutic Food (RUTF). At present, this RUTF is largely imported from Europe. The baseline analysis revealed that the financial incentive for the investment would exist if the off-take price is at the RUTF's world price level. A properly structured deal would also result in significant benefits to more than 4,000 HIV/AIDS-infected farmers supplying peanuts to the factory. The Government of Uganda would also benefit by US$1.54 million over the 10-year life of the project. The critical challenge to RUTF production in Uganda is controlling the level of aflatoxins associated with the peanut input.