2017 Conference - Session 8

Session 8 - Friday, March 17, 3:45 - 5:15pm

A.8: Measuring the Effectiveness of Regulations

Chair: Sofie Miller, The George Washington University

Presentations:

1. Retrospective Evaluation of Chemical Regulations; Susan Dudley*, The George Washington University

OECD countries rely on regulatory tools to manage potential risks from exposure to targeted chemicals. While it is standard practice to analyze and estimate how proposed regulations might affect regulated entities, consumers, citizens, etc., before they are issued, regulators have devoted much less analysis to evaluating the impacts of their regulations once they are in effect. This background paper for the OECD Workshop on Socioeconomic Impact Assessment of Chemicals Management draws on experience in OECD countries, primarily the United States, to examine the practices used to understand the likely impacts of regulations aimed at reducing chemical risks both before and after they are issued.

2. Improving Retrospective Review; Reeve Bull*, Administrative Conference of the United States

This presentation will offer a brief history of retrospective review efforts in the United States, identify some of the limitations of past regulatory lookback initiatives, and explore potential reforms designed to produce a more robust, enduring system of retrospective review.  It will begin with a quick analysis of past regulatory lookback efforts, which date to the Carter Administration.  It will explain how each of these programs relied on agencies’ self-review of regulations, which caused such efforts to falter.

The presentation will then explore potential improvements.  Reform proposals tend to fall into two categories: (1) an independent body overseeing retrospective reviews and (2) enhanced stakeholder participation.  There have been several permutations of reforms involving expanded independent oversight, including expanding OIRA’s role in retrospective review, creating a new entity housed in either the executive or legislative branch, and periodically convening a “regulatory improvement commission.”  A number of possible reforms involving enhanced stakeholder participation have also been put forward.  These range from implementing an improved process for soliciting public comments to allowing groups of stakeholders to propose alternative regulatory regimes.  The presenter recently published an article in the Administrative Law Review exploring the latter proposal, and the talk will examine how this might work in practice.

Finally, the presentation will offer a comparative perspective, examining recent retrospective review initiatives in the European Union and various British Commonwealth nations.  It will explore the REFIT and evaluation initiatives in the EU, which seek to design new regulations to facilitate subsequent retrospective review.  It will also explore various regulatory budgeting proposals that have emerged in the UK, Canada, and Australia, all of which aim to provide strong incentives for retrospective review by placing a ceiling on overall regulatory costs.

3. Building Retrospective Evaluation Capacity to Enhance EPA's Prospective Analyses; Nicholas Hart*, The George Washington University

Since the inception of the USEPA, considerable emphasis has been placed on the use of policy analysis tools that aim to prospectively inform environmental policy decisions, including cost-benefit analysis and risk assessment used for regulatory actions.  However, compared to the amount of such ex ante analysis conducted at the USEPA before a decision is reached, relatively little evaluation of these same environmental policies is produced after implementation to inform future policy development or to modify existing policies. This research discusses EPA's history of producing retrospective program evaluations, including retrospective regulatory reviews.  The presentation will outline key factors affecting EPA's ability to expand its production of program evaluation and highlight the potential benefits of more widespread application.

4. Improving the Elicitation of Professional Judgements for Use in Regulatory Benefits Analysis; Jennifer Baxter*, Margaret Black, and Henry Roman (Industrial Economics, Inc.); Arturo Rios (U.S. Coast Guard)

In order to estimate the benefits of federal regulations, analysts require quantitative estimates of the effectiveness of regulatory requirements. Developing or obtaining such estimates is often the most difficult component of benefit-cost analyses, as certain key parameters needed to estimate effectiveness may be uncertain. One option for filling data gaps, addressing inconsistencies in the literature, or adjusting estimates derived from one context (e.g., a training program focused on motor vehicle safety) for use in another context (e.g., a training program focused on maritime safety) is the use of formal, structured expert elicitation. However, this tool is resource intensive, and such elicitations often require a year or more to complete. For regulations that are unlikely to result in large costs or benefits or raise major concerns about equity, a substantial investment of time and resources in formal, structured expert elicitation may not be appropriate. To address this issue, the U.S. Coast Guard explored options for developing a streamlined approach for obtaining professional judgements that draws on the best practices of formal, structured elicitation. Its goal is to enhance the quality, reproducibility, and transparency of the process and its results, while accounting for time and resource constraints. In this presentation, we describe a pilot study implementing this streamlined approach and discuss the results, lessons learned, and suggestions for improvements.
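
As a purely illustrative sketch of one building block such a streamlined protocol could borrow from formal elicitation (equal-weight linear pooling of expert quantile judgements), the following Python snippet combines hypothetical judgements about a regulatory effectiveness parameter. The quantile format, the lognormal fit, and all numbers are assumptions made for illustration, not the Coast Guard's pilot design.

    import numpy as np
    from scipy import stats

    # Hypothetical judgements: each expert supplies a median (p50) and a 95th
    # percentile (p95) for the effectiveness parameter (e.g., fractional casualty
    # reduction attributable to a requirement). Values are invented for illustration.
    experts = {
        "expert_A": {"p50": 0.10, "p95": 0.25},
        "expert_B": {"p50": 0.15, "p95": 0.30},
        "expert_C": {"p50": 0.08, "p95": 0.40},
    }

    rng = np.random.default_rng(0)
    draws_per_expert = 10_000
    samples = []

    for name, q in experts.items():
        # Fit a lognormal to the two quantiles: ln(X) ~ Normal(mu, sigma).
        mu = np.log(q["p50"])
        sigma = (np.log(q["p95"]) - mu) / stats.norm.ppf(0.95)
        samples.append(rng.lognormal(mean=mu, sigma=sigma, size=draws_per_expert))

    # Equal-weight linear opinion pool: concatenate equal-sized samples per expert.
    pooled = np.concatenate(samples)
    print("Pooled median estimate:", np.median(pooled))
    print("Pooled 90% interval:", np.percentile(pooled, [5, 95]))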

B.8: Defining the Boundaries for Benefit-Cost Analysis

Chair: John Mendeloff, University of Pittsburgh

Presentations:

1. 'Three Basic Postulates' Revisited: A Sufficient Statistics Approach to BCA; Don Kenkel*, Cornell University

In his seminal paper, Harberger (1971) advocates a framework for applied welfare economics that uses the demand price and supply price as measures of benefits and costs. The framework is now conventional and guides the current practice of BCA, for example in regulatory impact analyses of federal regulations. However, an important line of current empirical economics research no longer focuses on estimating the demand curves and supply curves that underlie the calculation of consumer and producer surplus. Instead, a typical study estimates a reduced-form equation that shows the impact of a policy variable on some socially desirable outcome such as better health or more schooling. Reduced-form research emphasizes clean identification of the causal treatment effect of the policy. Reduced-form research is often less informative about whether the policy improves the allocation of resources, i.e., whether the social benefits from more of the desirable outcome are worth the social opportunity costs. Reduced-form studies do not necessarily discuss the market or individual optimizing failures that mean the pre-policy equilibrium is inefficient. Also, a BCA is often not conducted at all or is relegated to back-of-the-envelope calculations. In this paper, I discuss the link between reduced-form econometric research, the sufficient statistic approach to welfare economics, and BCA. Chetty (2009) derives formulas that show the social benefits of policies in terms of high-level elasticities that can be estimated through reduced-form empirics. The sufficient statistics/reduced-form approach also captures the social opportunity costs of policies. I discuss other links between the sufficient statistics approach and the current practice of BCA, for example the parallels between the multipliers in Chetty’s approach and the shadow prices commonly used in BCAs. In the final sections, I discuss the advantages of more structural econometric estimation to extrapolate policy effects and conduct ex ante BCA of new policies.
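
As a minimal textbook illustration of the sufficient-statistic logic (not the paper's own derivation), consider the marginal welfare effect of raising a tax t on a good x in an otherwise undistorted market:

    \frac{dW}{dt} \;=\; t \, \frac{dx}{dt}

Since demand falls as the tax rises (dx/dt < 0), welfare falls, and the loss depends only on the pre-existing wedge t and the reduced-form behavioral response dx/dt; the full demand and supply curves underlying consumer and producer surplus never need to be recovered. Chetty (2009) extends this logic to richer settings with multiple behavioral margins and non-tax policies.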

2. Bridging the Partial and General Equilibrium Divide; Scott Farrow*, University of Maryland Baltimore County

Advances in theoretical and computable general equilibrium modeling have brought their conceptual foundations in line with standard microeconomic constructs. This has reduced the theoretical and empirical gap between welfare measurements using a partial or a general equilibrium approach. However, the separation of the partial and general equilibrium literatures lingers in many applications, a gap this manuscript seeks to bridge. The now shared conceptual foundations, the importance of functional specification, the role of common price movements, and closure rules are discussed. The continuing US Government exclusion of secondary effects from welfare measures in some applications is questioned.

3. Wellville or Funville: You Got A Problem?; Clark Nardinelli*, U.S. Food and Drug Administration

The economic analysis of a public policy should start with the identification of the market, government, or behavioral failure that caused the problem, thereby making the policy potentially effective and welfare-enhancing. In the market failure part of a benefit-cost analysis, we identify what is broken; without a failure of some kind, nothing is broken that a policy intervention can fix. One curious feature of many benefit-cost analyses, however, is that the market, government, or behavioral failure causing a problem is given short shrift in the benefit-cost analysis. Many analyses simply identify a failure in broad terms and move quickly on to the problem the policy will solve. This reluctance to fully define and explain what is broken arises not because market or other failures are difficult to find; rather, it arises because failures are too easy to find. The economist seeking justifications for policy effectiveness has an almost limitless catalogue of failures to choose from. The cornucopia of failures available to the analyst has the perverse effect of truncating the identification and analysis of the failure. In this paper, I illustrate the ease of finding failures with a social puzzle, the different behaviors of people in two small towns, Wellville and Funville. I will identify an apparent problem arising from the observed difference and describe the wide array of market, government, and behavioral failures available to explain what we observe.

4. Distinction between Benefit-Cost Analysis and Cost-Benefit Analysis in Law and Economics; Scott Farrow*, University of Maryland Baltimore County on behalf of Richard Zerbe, University of Washington

The tension between deontological and economic thinking is old and pervasive. This tension lessens when the sphere of economics is more carefully delineated. This article aims to distinguish between cost-benefit analysis (CBA) and benefit-cost analysis (BCA). BCA recognizes rights and moral sentiments as values insofar as they are reflected in the willingness to pay to obtain them and the willingness to accept payment in return for giving them up. CBA is often limited to analyzing only the monetary, fair market value of property. BCA provides a more accurate measure of well-being, drops the CBA grounding in the potential compensation test (PCT), and reflects moral sentiments in valuation.

C.8: Applications in Energy and the Environment

Chair: Martha Rogers, Brattle Group

Presentations:

1. Cost Efficiency of Payment Systems for Forest Carbon Sequestration Incorporating Spatial and Temporal Heterogeneities; Seong-Hoon Cho*, University of Tennessee

Concern is growing about climate change and its threats to human health, the environment, and ecosystems. Establishing new or expanding existing forest areas through afforestation, reforestation, and mitigation of deforestation by providing incentives to landowners can be an effective policy tool for offsetting greenhouse gases. Many studies have focused on the efficiency of different incentive payment approaches intended to account for the spatial variations in the benefits of forest-based ecosystem services and the opportunity costs of forestland. Although spatial heterogeneity has received much attention, few, if any, studies have explicitly examined the potential for payment programs for ecosystem services that account for both spatial and temporal heterogeneity to improve cost efficiency. The objective of this research is to assess the spatial and temporal heterogeneities in the costs of supplying forest-based carbon storage to help identify the spatial targeting of incentive payments in different time periods. We developed a case study based on one of the 179 Bureau of Economic Analysis economic areas, which consists of 17 Tennessee counties and 1 Kentucky county, over three time periods (i.e., 1992-2001, 2001-2006, and 2006-2011). Our empirical results show that there are spatial and temporal heterogeneities in the cost efficiency of carbon storage. These differences are driven by spatial and temporal variation in how forestland changes respond to changes in the net return to forestland, which is reflected in the transition probabilities of sustaining existing forests and of afforesting non-forested lands. The cost-efficiency maps for each of the three periods can be used as a reference for spatial targeting of incentive payments in different periods. For example, policymakers can use the cost-efficiency maps to anticipate regional budget allocations and to predict how impact areas vary under a hypothetical budget scenario in different periods.
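
The following stylized Python sketch illustrates the kind of cost-efficiency comparison the abstract describes: payment cost per additional ton of carbon, varying by sub-region and period through payment-induced changes in forest transition probabilities. Every payment level, carbon coefficient, and probability below is hypothetical rather than taken from the study.

    # Stylized cost-efficiency calculation ($ per additional ton of carbon) for an
    # incentive payment, varying by sub-region and period. All numbers are hypothetical.
    payment_per_acre = 25.0    # annual incentive payment ($/acre)
    carbon_per_acre = 1.2      # extra carbon stored on retained or afforested land (tons/acre/yr)

    # Hypothetical payment-induced increases in the probability that an acre stays in
    # (or converts to) forest, by sub-region and period.
    delta_transition = {
        ("subregion_A", "1992-2001"): 0.040,
        ("subregion_A", "2001-2006"): 0.025,
        ("subregion_B", "1992-2001"): 0.015,
        ("subregion_B", "2001-2006"): 0.030,
    }

    for (region, period), dp in delta_transition.items():
        added_carbon = dp * carbon_per_acre            # expected extra tons per enrolled acre
        cost_per_ton = payment_per_acre / added_carbon
        print(f"{region}, {period}: ${cost_per_ton:,.0f} per ton of carbon")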

2. Ambiguity Aversion and the Expected Cost of Rare Energy Disasters: An Application to Nuclear Power Accidents; Romain Bizet* and François Lévêque, MINES ParisTech

Assessing the risks of rare disasters due to the production of energy is paramount when making energy policy decisions. Yet, the costs associated with these risks are most often not calculable due to the high uncertainties that characterize their potential consequences. In this paper, we propose a non-Bayesian method for the calculation of the expected cost of rare energy disasters that accounts for the ambiguity that characterizes their probabilities of occurrence. Ambiguity is defined here as the existence of multiple and conflicting sources of information regarding the probabilities associated with these events. In other words, this method generalizes cost-benefit analysis to situations of uncertainty characterized by ambiguous probability distributions: it provides a rational way of taking into account the existence of multiple probability distributions (such as public beliefs or probabilistic risk assessments) that are often associated with rare energy disasters.

We then apply this method to the particular case of nuclear accidents in new builds. Our results suggest that the upper bound of the expected cost of such accidents is 1.7€/MWh, which is consistent with most recent estimates. This expected cost may rise to 7€/MWh when the macroeconomic shock caused by a nuclear accident is taken into account. Our numerical results suggest that, even under maximum pessimism, the expected costs of nuclear accidents remain small when compared to the total levelized cost of electricity (LCOE) of nuclear new builds. Another policy implication of this paper is that public perceptions as well as technical expertise ought to be taken into account by policy-makers in cost-benefit analysis when looking at particular risks such as nuclear accidents. The method we propose makes it possible to combine these two sources of information, and it could also be used to assess other catastrophic risks, such as oil spills or dam failures.
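
A minimal Python sketch of the non-Bayesian logic described above, evaluating the expected accident cost under each candidate probability distribution and reporting the most pessimistic value, is shown below. The probabilities, damage cost, and plant output are entirely hypothetical, and the paper's actual decision criterion and parameterization may differ.

    # Illustrative maxmin-style upper bound on the expected cost of a rare accident,
    # expressed per MWh. All probabilities, damages, and output figures are hypothetical.
    candidate_distributions = {
        # annual accident frequency implied by different information sources
        "probabilistic_risk_assessment": 1e-5,
        "public_beliefs": 1e-4,
    }
    damage_cost_eur = 400e9      # hypothetical total damage of one accident (EUR)
    annual_output_mwh = 8.0e6    # hypothetical annual output of one reactor (MWh)

    expected_costs = {
        source: p * damage_cost_eur / annual_output_mwh
        for source, p in candidate_distributions.items()
    }
    for source, cost in expected_costs.items():
        print(f"{source}: {cost:.2f} EUR/MWh")
    print(f"Upper bound under maximal pessimism: {max(expected_costs.values()):.2f} EUR/MWh")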

3. The Effect of Water Quality Characterization on Recreation Demand Model Results; William Wheeler*, U.S. Environmental Protection Agency

The three main characterizations of water quality seen in the economic literature are direct measures of water quality, water quality indexes (WQIs), which condense multiple water quality measures into a single number, and designated uses, which are intended to signal whether or not a water body is achieving the quality levels necessary for different uses. A tension in the literature is that stated preference studies of water quality benefits seem to rely more frequently on indexes or designated uses to characterize water quality, while revealed preference analyses tend to use direct measurements of water quality. We are not aware of any work that systematically investigates how the use of these different water quality metrics affects estimated willingness-to-pay (WTP).

This paper attempts to determine which water quality metric best explains behavior, using lake water quality and recreator behavior data from the Iowa Lakes Study. That study matched detailed water quality data on 21 parameters for 129 Iowa lakes with survey data (from 3,859 households) on use of these lakes. Direct water quality measurements, water quality index values calculated from those measurements, and the achievement of designated uses are included in separate repeated-choice mixed logit travel cost recreation demand models.

Preliminary results indicate that using the direct measures of water quality and using the water quality index imply similar WTP for water quality improvements. For example, the per-trip value of a one-unit change in Total Nitrogen (TN) or Total Phosphorus (TP) is between $0.01 and $0.06 whether estimated with the direct parameters or with the WQI. We also find that survey respondents prefer lakes with better designated uses, especially lakes described as high quality, but respondents do not appear to value achievement of the water quality criteria necessary to attain those uses. The results are somewhat puzzling and deserve further research.
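
For readers less familiar with travel-cost models, per-trip willingness to pay in this kind of (mixed) logit specification is typically recovered as a ratio of coefficients (a generic illustration, not the paper's exact specification):

    WTP_{\text{per trip}} \;=\; -\,\frac{\partial U/\partial q}{\partial U/\partial c} \;=\; -\,\frac{\beta_q}{\beta_c}

where U is site utility, q is the water quality measure (direct parameter, WQI value, or designated-use indicator), and c is travel cost; because \beta_c is negative, the ratio is positive for a valued improvement.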

D.8: Issues in Valuing Outcomes in BCAs of Social Programs

Chair: Anne Gordon, Mathematica Policy Research

Presentations:

1. Reducing the Double Counting of Benefits: A Bayesian Approach to Model Multiple Outcomes in Education Evaluations; Yilin Pan*, Columbia University; and Wenjie Joanna Zhang, George Washington University

Education interventions usually involve multiple goals and objectives. For example, a reading program aims not only to increase students’ knowledge and skills in reading, but also to foster students’ non-cognitive skills and promote shared societal values. The multiplicity of objectives complicates the aggregation of multiple outcomes in cost-benefit analysis. A solid cost-benefit analysis should rest on a good estimate of the impact on each outcome conditional on the other outcomes in the model. However, in evaluation studies (including experimental and quasi-experimental designs, either parametric or non-parametric), the effectiveness of an intervention on different outcomes is usually estimated separately. The outcomes of interest, as dependent variables, are plugged into the estimation model one at a time. Therefore, the reported impact on each outcome is marginal effectiveness rather than conditional effectiveness. When marginal effectiveness is used for cost-benefit analysis, the total benefit can be overestimated because the joint benefits induced by the correlation of multiple outcomes are double counted. To address this issue, this paper proposes a Bayesian approach that models the multiple outcomes of an intervention as a multivariate normal joint distribution defined by a mean vector and a variance-covariance matrix. The dependency of these outcomes is accounted for in the correlation matrix, and each element of the mean vector is modeled as a regression. Using data from the Head Start Impact Study, the paper also demonstrates the application, estimation, and interpretation of the Bayesian model.
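
A schematic version of the model described above (the notation is ours, not the authors'): for student i with K outcomes,

    \mathbf{y}_i \sim \mathcal{N}_K\left(\boldsymbol{\mu}_i,\, \boldsymbol{\Sigma}\right), \qquad
    \mu_{ik} = \alpha_k + \tau_k T_i + \mathbf{x}_i'\boldsymbol{\beta}_k, \qquad
    \boldsymbol{\Sigma} = \mathrm{diag}(\boldsymbol{\sigma})\,\mathbf{R}\,\mathrm{diag}(\boldsymbol{\sigma})

where T_i indicates assignment to the intervention, each element of the mean vector is its own regression, and the correlation matrix \mathbf{R} captures the dependence among outcomes so that joint benefits are not counted twice when the effects are monetized.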

2. Private Investments in Public Preschools: the Use of CBA in Determining the Feasibility of Pay-for-Success Contracts; Judy Temple*, University of Minnesota

In the last five years, private investors in the U.S. have contributed well over $100 million to expand state and local provision of promising, cost-effective preventive interventions. Through the use of social impact financing combined with Pay-for-Success contracts, investors agree to finance social program expansion with the understanding that states, cities, or counties will pay them back in later years. These payments to the investors are made possible when cost-effective preventive interventions generate future government cost savings. In the fall of 2016, the U.S. Department of Education sponsored a grants competition to help support feasibility studies of Pay-for-Success contracting for preschool programs. The funded applications will be made publicly available in late 2016. This paper investigates the use of benefit-cost analysis in promoting this type of private investment in public services and provides some assessment and guidance on how economic evaluation is and can be used to identify the taxpayer benefits arising from expansions in preschool programming. The use of benefit-cost analysis to promote early childhood investments is discussed in detail in Temple and Reynolds’ 2015 paper in the Journal of Benefit-Cost Analysis. The proposed presentation extends this discussion to new developments across the U.S. since that paper was published.
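
The feasibility arithmetic behind a Pay-for-Success contract can be sketched as a comparison between the present value of projected government cost savings and the outcome payments owed to investors. The Python snippet below is a toy version of that calculation; every dollar figure, the discount rate, and the horizon are hypothetical.

    # Toy Pay-for-Success feasibility check from the government payer's perspective.
    # Every figure below is hypothetical.
    discount_rate = 0.03
    annual_govt_savings = 1_500_000   # projected savings from reduced remediation/special education ($/yr)
    years = 15
    success_payment = 12_500_000      # total owed to investors if outcome targets are met ($)

    # Present value of the projected stream of government cost savings.
    pv_savings = sum(
        annual_govt_savings / (1 + discount_rate) ** t for t in range(1, years + 1)
    )
    print(f"PV of government savings: ${pv_savings:,.0f}")
    print(f"Outcome payment owed:     ${success_payment:,.0f}")
    print("Repayment covered by savings:", pv_savings > success_payment)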

3. The Impact of Job Satisfaction on Subjective Well-Being; Tapas Ray*, U.S. Centers for Disease Control and Prevention

Objective: The benefits of being employed and, at the same time, satisfied with one’s job are estimated in terms of workers' subjective (evaluative and hedonic) well-being. While earned income is considered a determinant of well-being and the impact of the work environment on job satisfaction is established, the potential effect of income earners’ job satisfaction on their well-being has not been recognized. To address this gap, we examined the association between job satisfaction and well-being. We also estimate the income equivalent of the benefit of job satisfaction.

Methodology: We analyzed responses from 177,701 U.S. respondents to the 2013 Gallup-Healthways Well-Being survey. This survey is part of Gallup’s Daily Tracking Survey of 1,000 U.S. adults, ages 18 and over, and features questions on various political, economic, health, and well-being topics (Gallup Daily Methodology, 2013). Following Kahneman and Deaton (2010) and Deaton and Stone (2013), we measured subjective well-being in terms of current and future life evaluation (evaluative well-being); daily positive emotional experiences (hedonic experiences) in terms of feelings of happiness, smiles, and enjoyment; and daily negative experiences in terms of sadness, anger, worry, and stress. We used job satisfaction as a binary explanatory variable. After controlling for demographic characteristics and health and socioeconomic factors, we estimated the marginal effect of job satisfaction on workers’ subjective well-being. Following Fujiwara et al. (2011, 2014), we also measure the benefit of job satisfaction in terms of an income equivalent (EV).
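
One common way to express such an income equivalent in the well-being valuation literature (shown here as a generic illustration; the paper's exact specification may differ) is to model well-being as WB = \dots + \beta_{JS} JS + \beta_y \ln y + \dots and solve for the income change EV that yields the same well-being gain as moving from an unsatisfied to a satisfied job:

    \beta_{JS} \;=\; \beta_y \,\ln\!\left(\frac{y + EV}{y}\right)
    \quad\Longrightarrow\quad
    EV \;=\; y\left(e^{\beta_{JS}/\beta_y} - 1\right)

where y is the worker's income; a large \beta_{JS} relative to \beta_y implies a large income equivalent.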

Results: We found a significant and positive relationship between job satisfaction and subjective well-being, both in terms of higher life evaluation scores and higher odds of positive hedonic experiences. After controlling for covariates, and compared with unsatisfied workers, satisfied workers had higher current (13%) and future (7%) life evaluation scores, were twice as likely to experience positive emotional feelings of happiness and enjoyment, and reported lower levels of sadness. The income equivalent of job satisfaction is high.

4. Project Finance through Impact Investment in the Social Sector; Jay Mackinnon* and Bahman Kashi, Limestone Analytics

In this paper, we discuss some of the questions that emerge from the study of impact investment in the social sector, particularly in the form of impact bonds.

We will begin with a discussion of where impact bonds sit in the spectrum of performance-based contracts, and what features differentiate traditional grants, outcomes-based aid, results-based finance, impact bonds, and conditional cash transfers. We will then move to a discussion of how the two major varieties of impact bonds, SIBs (social impact bonds) and DIBs (development impact bonds), are differentiated. We will discuss why, in theory and in practice, SIBs are relatively easy to assess under a cost-benefit analysis framework, while DIBs have specific challenges inherent in their structure that make cost-effectiveness analysis a more suitable tool for their evaluation.

Our third section will focus on a discussion of the challenges inherent in selecting metrics for performance-based contracts in the social sector. We define characteristics of contracts that change, for better or worse, as one moves from output-focused metrics to impact-focused metrics. We conclude with a broad summary of some of the other areas of study for those interested in the role of impact investment as a project financing mechanism in the social sector. These include the renewed focus that impact investment entails for quantitative rigor in the social sector, the danger of perverse incentives in contracting, the challenges involved in pricing impacts, and the potential transaction and verification costs that might make impact bonds infeasible. These issues will be discussed, among others, in a forthcoming technical report on impact investing in the social sector.

E.8: Issues in Costs and Benefits in Agriculture Policy

Chair: Elisabeth Newcomb, U.S. Food and Drug Administration

Discussant: Sandra Hoffmann, U.S. Department of Agriculture

Presentations:

1. Corn or Cattle? Comparison of Ecosystem Services under Different Land Uses; Haochi Zheng* and Stefano Potter*, University of North Dakota

Tallgrass prairies are among the most productive ecosystems in North America, but 99% of their original extent has been converted, primarily to agriculture. This conversion comes at the expense of the societal value derived from the ecosystem services they provide. The Sheyenne National Grassland of southeastern North Dakota is managed by the USDA Forest Service and represents the largest publicly owned tract of tallgrass prairie remaining in the United States. In this paper, we use various geospatial tools combined with the benefit transfer method to quantify the economic value of ecosystem services, including agricultural production, water regulation, soil erosion, soil organic carbon, and biodiversity, on the Sheyenne grassland and the surrounding private agricultural land. Within a benefit-cost analytical framework, we evaluate the trade-offs between ecosystem service values and commodity values under either cropping or cattle ranching to gain insight into the economic efficiency of different land uses. Our results show that, except for the economic value of crop production, all other ecosystem service values on the Sheyenne National Grassland are much higher than on the surrounding private land. Overall, when both ecosystem service values and commodity values are considered, the grassland is more economically valuable than the cropped private land, which indicates that land use policy should consider ecosystem service values in addition to commodity values when making sustainable land use decisions.
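
A toy illustration of the unit-value benefit transfer underlying this kind of comparison: per-hectare values drawn from the literature are multiplied by land areas and summed across services. Every value in the Python sketch below is hypothetical and chosen only to mirror the qualitative pattern the abstract reports.

    # Toy unit-value benefit transfer: total ecosystem-service plus commodity value per
    # hectare for grassland vs. cropland. Every per-hectare value is hypothetical.
    unit_values = {   # $/hectare/year, by land use and service
        "grassland": {"commodity (grazing)": 90, "water regulation": 140,
                      "erosion control": 90, "carbon storage": 200, "biodiversity": 150},
        "cropland": {"commodity (corn)": 520, "water regulation": 30,
                     "erosion control": 10, "carbon storage": 40, "biodiversity": 15},
    }

    for land_use, services in unit_values.items():
        total = sum(services.values())
        print(f"{land_use}: ${total}/ha/yr across {len(services)} services")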

2. The Impact of Groundwater Depletion on Farm Exit Rates in the High Plains; Bern Dealy*, U.S. Food and Drug Administration

The High Plains Aquifer underlies 175,000 square miles of land in the US and spans eight states. Farming on the High Plains contributes significantly to production in the agriculture industry and thus to the US food supply. Much of the farming on the High Plains relies on the aquifer for irrigation. Water pumped from the aquifer represents more than one-fourth of all water used in US agricultural production (Houston et al., 2013). Heavy pumping and low recharge have depleted the aquifer considerably over the last 60 years. In the long run, as the aquifer continues to deplete, farming in the region is expected to change considerably. Specifically, farms will have to adopt new technologies and farming practices or switch to less water-intensive crops in order to continue operations after irrigation is no longer viable. However, adaptation is costly. In the short term, existing farm establishments may not be able to successfully pivot their operations and may choose instead to exit. Historically, farm exits have been linked closely to the life cycle of farmers and have been offset by a similar number of farm entries (Hoppe and Kolb, 2006). However, evolving resource constraints have the potential to disrupt historical farm transition patterns, generating numerous policy implications at the federal and local levels. This study uses longitudinal data constructed from the USDA Census of Agriculture joined with groundwater and climate data to investigate the impact of groundwater depletion on farm exit rates in the High Plains region of the US. Preliminary results suggest that, even when accounting for heterogeneity between farm operations and their operators, groundwater depletion has a significant impact on the likelihood of farm exit.
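
A minimal sketch of the type of discrete-choice exit model such an analysis might use, fitted to simulated data with statsmodels, is shown below; the variable names, functional form, and data-generating process are ours, not the study's.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(42)
    n = 5_000
    # Simulated covariates; the variable names and effect sizes are invented.
    depletion_ft = rng.gamma(shape=2.0, scale=5.0, size=n)   # decline in saturated thickness (feet)
    operator_age = rng.normal(58, 10, size=n)
    log_farm_acres = np.log(rng.lognormal(6.5, 0.8, size=n))

    # Simulated exit process: deeper depletion and older operators raise exit probability.
    index = -3.0 + 0.05 * depletion_ft + 0.03 * (operator_age - 58)
    exited = rng.binomial(1, 1.0 / (1.0 + np.exp(-index)))

    X = sm.add_constant(np.column_stack([depletion_ft, operator_age, log_farm_acres]))
    result = sm.Logit(exited, X).fit(disp=False)
    print(result.params)   # the coefficient on depletion_ft should come out positive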

**The views expressed in this article are those of the authors and are not intended to represent the opinions of the Food and Drug Administration.

3. Reliable Reduction in Agricultural Runoff under Environmental Uncertainty; Zhiyu Wang*, University of Minnesota

Agricultural runoff is a major source of nonpoint water pollution. Because of environmental variability, a target for water pollution reductions cannot be met with certainty. Most studies use an average target for water pollution reductions over time, but they sidestep a thorough consideration of stochastic water pollution. In this paper, I derive a probabilistic target for water pollution reductions. This target focuses on the reliability of a reduction, where reliability means achieving a given target with a specified probability regardless of environmental variation. I further derive a robust solution that protects against the worst case among all possible probability distributions of water pollution, and I apply it to the Wolf Creek watershed in Iowa. The results demonstrate a positive relationship between abatement costs and reliability levels. Compared with an average reduction over time, achieving a 41% reduction in total nitrogen (TN) with a 70% reliability level would increase abatement costs by $44.9 million. I also examine the Margin of Safety (MOS) in Total Maximum Daily Load (TMDL) requirements. To reduce pollution reliably, the current values of the MOS should change as reduction targets change.
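
Schematically (in our notation, not necessarily the paper's), the reliability target replaces an average constraint with a chance constraint, and the robust version guards against the worst distribution in an ambiguity set \mathcal{P}:

    \min_{a}\; C(a) \quad \text{s.t.} \quad \Pr\big[R(a,\varepsilon) \ge \bar{R}\big] \ge \alpha,
    \qquad \text{and, robustly,} \qquad
    \inf_{P \in \mathcal{P}} \Pr_{P}\big[R(a,\varepsilon) \ge \bar{R}\big] \ge \alpha

where a denotes the abatement actions, C(a) their cost, R(a, \varepsilon) the realized reduction under environmental shock \varepsilon, \bar{R} the reduction target (e.g., 41% of TN), and \alpha the reliability level (e.g., 70%).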