2016 Conference - Session 3

Session 3 - Thursday, March 17, 2:00 - 3:30pm

A.3: “Storm'd at with Shot and Shell”: How Economists and Non-Economists Collaborate During Policy Development (Roundtable Discussion)

Chair: Bradley Brown, US Food and Drug Administration

Panelists:

1. Neil Eisner, Consultant

2. Allen Fawcett, US Environmental Protection Agency

3. Ali Gungor, US Coast Guard

4. Clark Nardinelli, US Food and Drug Administration

5. Stuart Shapiro, Rutgers University

This panel offers practical advice from experienced regulatory analysts and developers on best practices for collaboration between economists and non-economists. The intention of the panel is to provide regulatory analysts and developers with insights and tools for working together to develop policies and the benefit-cost analyses that inform them.

The panel will address issues and best practices for economists working and communicating with non-economists on regulatory impact analyses during policy development. Panelists will be asked to address both the information-gathering phase and the analysis-presenting phase of the regulatory analysis process. Panelists may also be asked to offer advice on specific examples of interactions raised by the moderator and audience.

B.3: Smoking and Vaping: Public Policy towards Cigarettes and E-cigarettes

Chair: Laura Stanley, George Mason University

Presentations:

1.     Valuing the First-Hand Health Benefits of Tobacco Regulations, Amber Jessup,* Department of Health and Human Services

As the U.S. Food and Drug Administration (FDA) implements its new authorities to regulate tobacco products, the Agency has had to confront some relatively unsettled questions in benefit-cost analysis. The primary benefits of tobacco regulations come from reducing mortality and morbidity among people who quit smoking or are deterred from ever starting. Available estimates of first-hand health benefits vary widely, from $20-30 per pack not smoked (Cutler 2002, Sloan et al. 2004, Gruber and Koszegi 2001) to $100-200 per pack (Viscusi and Hersch 2008). Having well-founded monetary estimates of the first-hand health benefits of reducing smoking is important, as FDA’s ability to finalize and implement new regulations depends on how these and other benefits to consumers compare to costs borne by businesses and government. This paper presents new estimates of first-hand health benefits of reducing smoking, incorporating a number of methodological advancements. First, we incorporate recent statistical estimates of expected increases in life expectancy and improvements in health-related quality of life, allowing these to differ by gender and the age at which a regulation changes a person’s smoking status. Second, we incorporate recent draft guidance from the U.S. Department of Health and Human Services on valuing reductions in mortality and improvements in health-related quality of life in terms of quality-adjusted life years. Third, we value the lifetime health benefits of reducing smoking in present-discounted value terms, using both the standard counterfactual in which people smoke until they die and an alternative counterfactual in which they have some probability of quitting on their own. Our estimates also show wide variation due to uncertainties about the values of key variables. However, they are decidedly higher than $20-30 per pack and mostly fall in a $50-100 range. Implications for regulatory impact analysis of tobacco regulations are discussed.
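The present-value logic described in the abstract can be sketched as follows. This is a minimal illustration only: the annual QALY-gain profile, the dollar value per QALY, the discount rate, and the quit probability are all hypothetical placeholders, not the paper's estimates.

```python
# Hypothetical sketch of valuing the health gains from quitting smoking
# in present-discounted, quality-adjusted life year (QALY) terms.
# All parameter values below are illustrative, not the paper's estimates.

def pv_health_benefit(qaly_gain_by_year, value_per_qaly, discount_rate):
    """Present value of a stream of annual QALY gains."""
    return sum(
        value_per_qaly * q / (1 + discount_rate) ** t
        for t, q in enumerate(qaly_gain_by_year)
    )

# Illustrative profile: quitting yields small QALY gains that grow
# over a 40-year horizon.
gains = [0.001 * t for t in range(40)]

# Standard counterfactual: the person would otherwise smoke until death.
standard = pv_health_benefit(gains, 100_000, 0.03)

# Alternative counterfactual: some probability the smoker quits on
# their own, which scales down the benefit attributable to regulation.
p_quit_on_own = 0.25
adjusted = standard * (1 - p_quit_on_own)

print(standard, adjusted)
```

The second counterfactual illustrates why the abstract's estimates vary: the same health gains are worth less to a regulation when some quitting would have happened anyway.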

2.     Risk Beliefs and Preferences for E-Cigarettes, W. Kip Viscusi,* Vanderbilt University

Drawing on evidence from a new nationally representative survey, this article examines several measures of risk beliefs for e-cigarettes. For both lung cancer mortality risks and total smoking mortality risks, respondents believe that e-cigarettes pose risks that are lower than the risks of conventional tobacco cigarettes. However, people greatly overestimate the risk levels of e-cigarettes compared to the actual risk levels. Risk beliefs for conventional cigarettes receive at least a two-thirds informational weight in the formation of e-cigarette risk beliefs. Public perceptions of nicotine levels of e-cigarettes are closer to the beliefs for conventional cigarettes than are their health risk perceptions. Consumers’ desired uses of e-cigarettes are more strongly related to health risk perceptions than perceived e-cigarette nicotine levels. The overestimation of e-cigarette risks establishes a potential role for informational policies.

3.     Optimal Taxes on E-Cigarettes, Kyle Rozema,* Northwestern University School of Law

We study a model of optimal taxation of e-cigarettes, a healthier but addictive substitute for conventional cigarettes. The model we develop has three key features. First, we account for heterogeneity in the population of smokers in terms of their predilection for nicotine addiction and their preferences for e-cigarettes relative to conventional cigarettes. This allows the populations of conventional cigarette smokers, e-cigarette users, and non-nicotine users to emerge endogenously in the model as a function of relative tax-inclusive prices for conventional cigarettes, e-cigarettes, and numeraire consumption. It also allows us to evaluate the distributional impacts of e-cigarette taxation in addition to the direct efficiency costs. Second, we account for the possibility that smokers may only partly internalize the public health gains from switching from conventional cigarettes to e-cigarettes, reflecting behavioral failures. This matters because smokers who only partly internalize the harm from conventional cigarette consumption are less likely to switch to e-cigarettes on their own. Third, our model captures the possibility that e-cigarette consumption by non-smokers may serve as a ‘gateway’ to conventional cigarette consumption, which may offset the potential gains from smokers who switch from conventional cigarettes to e-cigarettes. To calibrate the model, we plan to estimate key elasticities of conventional and e-cigarette take-up using the Nielsen Homescan Consumer Panel dataset. While preliminary, the results from the optimal tax model suggest two policy recommendations. First, even under upper-bound assumptions about the size of the gateway effect, the optimal e-cigarette tax appears to be modest relative to cigarette taxes. Second, to the extent that the gateway effect is nontrivial, policy makers should act quickly to increase e-cigarette taxes.

4.     The Challenges of Estimating the Benefits of Graphic Warning Labels on Cigarettes, Don Kenkel,* Cornell University

Cigarette packs sold in the U.S. currently must show one of four rotating text Surgeon General’s warnings about the health consequences of smoking. The Family Smoking Prevention and Tobacco Control Act of 2009 authorized the Food and Drug Administration (FDA) to require graphic warning labels (GWLs). Similar to labels in Australia, Britain, and Canada, the proposed GWLs would contain graphic images and would cover fifty percent of the front and rear panels of each pack. The FDA’s benefit-cost analysis of the GWL rule is controversial, and the rule itself has been challenged in the courts. In this paper we discuss the challenges of estimating the benefits of GWLs. In particular, we focus on the research design challenges of developing credible estimates of the impact of GWLs on smoking outcomes. The FDA’s analysis and other studies use a quasi-experimental design and compare smoking outcomes before and after the enactment of GWLs. Because GWLs are enacted nationally, these studies use other countries as the untreated control group. We discuss the validity of the assumptions required for this approach, including the comparability of the countries, common pre-trends in smoking, and the absence of other policy changes. We extend previous research to use alternative control groups: adjacent states and provinces in the U.S. and Canada, and synthetic control groups. We also consider evidence from small-scale randomized experiments that gauge people’s immediate reactions to different GWLs. Although these experiments do not raise the same questions of internal validity faced by the quasi-experimental studies, the experimental results might lack external validity and provide unreliable evidence about the impact of GWLs in the real world.
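The quasi-experimental design the abstract describes, comparing smoking outcomes before and after GWL enactment against an untreated control country, amounts to a difference-in-differences calculation. A minimal sketch, with made-up prevalence figures rather than any study's actual data:

```python
# Difference-in-differences sketch of the GWL research design described
# above. Smoking prevalence figures are invented for illustration.

rates = {
    # (group, period): adult smoking prevalence (%)
    ("treated", "before"): 20.0,   # country that enacted GWLs
    ("treated", "after"):  17.0,
    ("control", "before"): 19.0,   # comparison country, no GWLs
    ("control", "after"):  18.0,
}

def did(rates):
    """Change in the treated country minus change in the control country."""
    d_treated = rates[("treated", "after")] - rates[("treated", "before")]
    d_control = rates[("control", "after")] - rates[("control", "before")]
    return d_treated - d_control

effect = did(rates)  # negative => GWLs associated with lower prevalence
print(effect)
```

The validity concerns listed above map directly onto this calculation: if the control country has different pre-trends or enacts its own policies, `d_control` no longer captures what would have happened to the treated country without GWLs.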

C.3: Benefit-Cost Analysis Applied to Infrastructure Issues

Chair: Art Rios, US Coast Guard

Presentations:

1.     Real Options and Cost-Benefit Analysis of Infrastructure: A Simplified Decision-Tree Approach to Value Flexibility, Thomas van der Pol,* CPB Netherlands Bureau for Economic Policy Analysis

Real options theory stresses that flexible investment strategies are often superior to more rigid ones. This also applies to infrastructure investments. However, the value of flexibility is ignored in the deterministic practice of cost-benefit analysis of infrastructure, which is common in the Netherlands and many other countries. This paper argues that a simplified approach to decision-tree analysis has the most potential to bridge the gap between real options theory and the deterministic practice of cost-benefit analysis of infrastructure. The paper elaborates on how simplified decision-tree analysis, a ‘real options light’ method, can help incorporate the key elements of real options theory, such as new information, multiple decision moments, and probabilistic states of the future. This is illustrated with numerical examples and two case studies of Dutch road and flood risk infrastructure. The merits and limitations of simplified decision-tree analysis are discussed and compared with contingent claims and other real options methods, which employ features such as risk differentiation and stochastic assumptions that are often not understood by policy-makers and are difficult to communicate. Simplified decision-tree analysis, in contrast, is easier to understand and communicate, and fits better into the practice of cost-benefit analysis of infrastructure.
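A simplified decision tree of the kind described here can be evaluated by backward induction: chance nodes take probability-weighted expected values, and decision nodes pick the best option available at that point. The sketch below uses hypothetical payoffs and probabilities, not the paper's case-study numbers:

```python
# Backward-induction sketch of a simplified decision tree for an
# infrastructure investment. Payoffs (NPVs) and probabilities are
# hypothetical, for illustration only.

def evaluate(node):
    """Recursively evaluate a decision tree.
    Leaves are numbers (NPVs); decision nodes pick the best child;
    chance nodes take the probability-weighted average of children."""
    if isinstance(node, (int, float)):
        return node
    kind, children = node
    if kind == "decision":
        return max(evaluate(child) for child in children)
    if kind == "chance":
        return sum(p * evaluate(child) for p, child in children)
    raise ValueError(f"unknown node kind: {kind}")

# Decide now between building large immediately and building small with
# the option to expand later, after demand (new information) is revealed.
tree = ("decision", [
    # Build large now: payoff depends on uncertain demand.
    ("chance", [(0.5, 120.0), (0.5, -40.0)]),
    # Build small now, then decide again at a second decision moment.
    ("chance", [
        (0.5, ("decision", [100.0, 60.0])),   # high demand: expand or not
        (0.5, ("decision", [-10.0, 20.0])),   # low demand: expand or not
    ]),
])

print(evaluate(tree))
```

The gap between the flexible strategy's value and the rigid strategy's value is the value of flexibility that a deterministic cost-benefit analysis would miss.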

2.     Value for Funding: Evaluating Infrastructure Financing Alternatives in a Fiscal Context, John Ryan* and Julie Kim, Stanford University Global Projects Center

Public-private partnerships (P3s) are an important alternative source of financing for America’s much-needed investment in public infrastructure. However, P3s are complex transactions and it is often difficult to evaluate their true costs and benefits in comparison to more traditional public-sector procurement methods.

Value for Money (VfM) is currently the standard BCA methodology for P3 comparative evaluation. VfM is a deterministic project-level analysis that can surface a P3’s intrinsic efficiencies and cost savings in project construction and operation.

While a VfM analysis is always necessary, it is frequently not sufficient. When the public sector’s long-term fiscal situation is constrained or stressed (as is now the case for many U.S. state and local governments), an additional level of analysis that explicitly considers the fiscal context is required for a complete picture. Such an analysis needs to consider public-sector factors beyond the project itself. Since many of these factors are highly uncertain over the long term, the analysis is intrinsically stochastic.

Stanford University’s Global Projects Center is developing a new standard methodology, called ‘Value for Funding’ (VfF), to guide comparative evaluation of infrastructure financing alternatives. VfF focuses on a project’s impact on fiscal factors in a stochastic framework. This presentation will introduce the basic VfF methodology and illustrate the concepts with hypothetical examples of the differential fiscal impact of various P3 and traditional alternatives. The presentation will also include a report on empirical research done to date.

3.     Flood Insurance Take-up and Housing Prices: An Empirical Agent-Based Model Approach, Okmyung Bin,* East Carolina University and Tatiana Filatova, University of Twente Faculty of Management and Governance

Floods are one of the most common and widespread natural disasters in the United States, and yet the damage from flood events is usually not covered by homeowner’s insurance policies. Flood coverage is offered federally through the National Flood Insurance Program (NFIP), established by the National Flood Insurance Act of 1968. Under current provisions, if communities choose to adopt minimum floodplain management policies, their residents become eligible for this insurance backed by the federal government. Federally regulated or insured lenders in the United States are mandated to require flood insurance on properties located in areas at high risk of flooding. Despite this mandatory flood insurance requirement, take-up rates have been low and the federal government’s exposure to uninsured property losses from flooding remains substantial. In this paper we employ an empirical adaptive agent-based model to simulate the impacts of the flood insurance requirement on the housing market under a scenario of complete take-up. Our approach combines empirical hedonic analysis with a computational economic framework to examine the capitalization of insurance premiums into housing prices. A bilateral housing market allows us to explore shifts between simulated hedonic equilibria while directly tracing the dynamics of the implicit price of flood risk over time. Results indicate that the flood insurance requirement would lead to decreases in housing prices. The effect is more pronounced for Special Flood Hazard Areas than for less risky areas.

4.     Capturing or Illustrating the Highly Unlikely in a Regulatory Context, Erik Gomez,* Ali Gungor and Rose Odom, United States Coast Guard

Regulation is an intended set of government actions aimed at obtaining a socially optimal outcome. Sometimes a set of rules has the easily quantified goal of reducing or eliminating existing perilous risks, such as ensuring that an aircraft engine meets mechanical standards. Other rules’ goals are far more difficult to quantify because their objectives are abstract, such as reducing unforeseeable dangers like terrorist attacks. So how does one analyze the highly unlikely when standard statistical methodologies seem inadequate at best? Some insights may be garnered from the ‘Black Swan Theory,’ a paradigm guided by the notion that, by their very nature, highly unlikely events are nearly impossible to predict mathematically.

The proposed presentation will provide an overview of a rulemaking project (Dynamic Positioning Systems) to help illustrate the general approach the Coast Guard implemented in a regulatory analysis as it relates to the estimation of highly improbable events. Specifically, it critically evaluates the status-quo practices of benefit estimation, provides a framework for addressing ‘unpredictable’ events, and ultimately considers the validity of the ‘Black Swan’ paradigm in a regulatory context.

D.3: Equity & Efficiency Concerns in Environmental Policy

Chair: Nicholas Mastron, The George Washington University

Presentations:

1.     Income Inequality and Carbon Emissions: Evidence from State-level Data, John Voorheis,* University of Oregon

A wide-ranging literature has suggested that there may be a relationship between economic inequality and environmental degradation, but has reached no consensus on the direction of the effect or a credible identification of causality. I propose a way forward by combining new data with an identification strategy that is new to this literature. I leverage recently available state-level data on carbon emissions and income inequality over the period 1980-2012, combined with a simulated IV strategy, to identify the causal effect of inequality on emissions. I find that increases in income inequality lead to decreases in the level of energy-related CO2 emissions and emissions per capita, concentrated in the electricity generation and transportation sectors. These results suggest that there may be a trade-off between addressing climate change and reducing income inequality.

2.     Spatial Aspects of the Social Costs of Emissions: County, State, and Regional Results for the United States, Jinhyok Heo,* Cornell University and Robert P. Strauss

The Estimating Air Pollution Social Impact Using Regression (EASIUR) model and the Air Pollution Social Cost Accounting (APSCA) model were recently developed as easy-to-use tools for estimating the public health costs (or social costs) of emissions in the United States. The EASIUR model was derived using regressions on a large dataset created by CAMx, a state-of-the-art air quality model, to estimate the social costs per ton of air pollutant emitted. Building upon the EASIUR model, the APSCA model allows one to identify the emission sources affecting a given (receptor) location and to quantify their contributions efficiently and in unprecedented detail. The two models closely reproduce the social costs predicted by the sophisticated CAMx model but without its high computational costs. They currently rely on national estimates of the value of a statistical life (VSL). In this study we propose to use county-level personal income data for 2005 to allow the VSL to vary spatially. Compared to the uniform national VSL, this approach allows one to make detailed statements about the distributional effects of policy measures associated with changes in emissions. More specifically, we are interested in introducing equity measures into air quality policy. Using measures developed to evaluate the vertical and horizontal equity of tax policy, we will quantify the distributional effects of a major air regulation as a proof of concept.
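One common way to let the VSL vary spatially with income, as the abstract proposes, is to scale a national VSL by relative income raised to an income elasticity. The sketch below illustrates that mechanic only; the national VSL, income figures, and elasticity are hypothetical, and the paper's actual adjustment may differ:

```python
# Sketch of letting the value of a statistical life (VSL) vary
# spatially with county income. All figures are hypothetical.

NATIONAL_VSL = 9_000_000     # illustrative national VSL ($)
NATIONAL_INCOME = 45_000     # illustrative national per-capita income ($)
INCOME_ELASTICITY = 1.0      # assumed income elasticity of the VSL

def county_vsl(county_income):
    """Scale the national VSL by relative county income."""
    ratio = county_income / NATIONAL_INCOME
    return NATIONAL_VSL * ratio ** INCOME_ELASTICITY

print(county_vsl(45_000))  # county at national income: national VSL
print(county_vsl(30_000))  # lower-income county: lower value
```

This is exactly the step that makes distributional statements possible: with a spatially varying value, the same ton of emissions imposes different measured damages depending on where its health burden falls.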

3.    Assessing Costs and Benefits from Implementing Real-Time Pricing of Electricity in the Cypriot Power Market, Sener Salci,* Queen's University

This paper analyzes the impacts of real-time electricity pricing (i.e., marginal cost pricing for end consumers) in the Cypriot electricity market on power prices, peak and off-peak capacities, emissions from electricity generation, and renewable energy sources such as wind and solar. We apply the model to the actual electricity market using real market data, such as hourly load demand and power supply data for the island. The results from the model show that dynamic pricing of electricity will increase capacity utilization during off-peak hours, decrease peak capacity, reduce power prices (and costs) in Cyprus for poor off-peak users, reduce emissions from electricity generation, and increase the use of wind resources on the island. With the introduction of renewables such as wind and solar, we find that peak capacity decreases further, so that capacity credits from solar and wind represent a greater share of peak demand. We find that there is a potential gain from smart metering even with a small consumer response, and a larger gain with higher program participation. Therefore, the country should switch to smart metering, shift away from average pricing of electricity, and let market participants react to changes in electricity prices. Whether the costs of such programs outweigh the benefits depends on the range of demand elasticities and the level of program participation. Based on our benefit estimates from dynamic pricing, we also recommend that the relevant authorities give customers accurate expectations about their bill savings from such programs, so that the programs yield benefits high enough to cover the cost of implementation.

 

E.3: Improving the Cost-Effectiveness of Hazard Mitigation Aid (Roundtable Discussion)

Chair: Brian Mannix, GW Regulatory Studies Center

Panelists to Include:

1. Frits Bos, CPB Netherlands Bureau for Economic Policy Analysis

2. Joseph Cordes, The George Washington University

3. Sarah Lane, Millennium Challenge Corporation 

4. Adam Rose, University of Southern California

It is always a good idea to construct buildings and infrastructure to be resistant to damage from the natural hazards to which they may be exposed: hurricanes, floods, earthquakes, and tsunamis. In practice, many of these investment decisions are made in the wake of a natural disaster, when aid money is available, the risks are obvious to all, and reconstruction is proceeding urgently. Benefit-cost analysis has been used successfully in the U.S. and elsewhere to ensure that hazard mitigation funds are directed to projects with positive net benefits, but there are challenges in extending these methods to areas of the world where the availability of hazard data is more limited. This panel will explore the state of the art, and exchange ideas for advancing and improving the use of benefit-cost analysis for hazard mitigation in the context of international aid for development and disaster relief. Audience participation in the discussion will be encouraged.