2018 Conference - Session 8

SESSION 8 | Friday, March 16 | 3:45-5:15 PM

A. 8 Soda Taxes: Benefits, Costs, and Distributional Issues

Chair: Clark Nardinelli, U.S. Food and Drug Administration

Discussant: Elizabeth Botkins, U.S. Food and Drug Administration

Presenters:

  1. Regressive Sin Taxes, with an Application to the Optimal Soda Tax; Benjamin Lockwood, University of Pennsylvania
  1. How Well Targeted Are Soda Taxes?; Rachel Griffith, Institute for Fiscal Studies (IFS)
  1. The Pass-Through of the Tax on SSBs in Boulder, CO; David Frisvold, University of Iowa

B. 8: Costs and Deregulation

Chair: Randall Lutter, University of Virginia

Discussant: Brian Mannix, The George Washington University

Presenters:

  1. Deregulatory Cost-Benefit Analysis; Caroline Cecot, Antonin Scalia Law School, George Mason University

Cost-benefit analysis ('CBA') was once considered a tool for implementing conservative regulatory policies, in part because benefits, which could justify increasing the stringency of regulations, were difficult to monetize. As monetization methods have advanced, CBA has shed some of its conservative associations, and its framework is gaining acceptance as an essential part of reasoned agency decisionmaking. This Article argues that when CBA is deployed thoughtfully, the analysis, as a limit on irrational government action, constrains deregulation as much as regulation. This stabilizing effect of CBA across presidential administrations stems from the additional difficulty of recalculating and explaining costs and benefits to support a new policy. In fact, CBA constraints might prove most effective in preserving some of the Obama Administration's regulations, at least those supported by well-conducted CBAs, from the regulatory rollbacks envisioned by the Trump Administration. The Article discusses three implications of the stabilizing influence of CBA in an age of analysis. First, agencies will face incentives to conduct more analysis, because regulations unsupported by CBA, or supported by incomplete or underdeveloped CBA, will be easier to adjust in light of policy preferences. In particular, agencies will (1) ground regulation in CBA when possible; (2) conduct more complete CBA; and (3) rely less on unquantified benefits to justify rulemaking. Second, because a regulation's stringency depends on the value of costs and benefits, stakeholders will face incentives to promote research into these valuations. Third, as CBA becomes known as a constraint on deregulation as well as regulation, agencies will face pressure to engage in retrospective review.
The Article also considers the desirability of such a stabilizing influence, given concerns about democracy and accountability, agency bias toward rulemaking, and the use of alternative methods of deregulation, such as nonenforcement of agency regulations. Overall, the Article contends that, by encouraging rational decisionmaking and permitting only evidence-based changes in regulatory policy, CBA promotes predictability and plays a desirable stabilizing role in regulatory policy across presidential administrations.

  1. Understanding Regulatory Burden from Business’ Perspective; Stuart Shapiro, Rutgers University; Debra Borie-Holtz, Rutgers University

Much of the literature on the cost of regulations is either theoretical, grounded in traditional microeconomics, or consists of large-scale statistical studies (with the exception of some literature in the law and society tradition, which focuses mostly on the motivations for compliance or for exceeding compliance). This literature has fed political rhetoric that either dramatizes the negative effects of regulation ("job-killing regulations") or minimizes them. In this paper we take a closer look at how businesses actually experience and perceive regulatory burden. We do this via two methods. First, we conducted a survey of more than 250 manufacturing businesses in the U.S. Midwest, asking them both about the burden of regulation and about their attitudes toward government regulation of the economy. Manufacturing in the Midwest has been at the center of much of the debate about regulation and the U.S. economy, and the region played a crucial role in the 2016 election, in which regulation was a central issue. Second, we conducted eight in-depth interviews with such businesses, going into much more detail about how they respond to regulation and how it affects their business decisions. These interviews, conducted with businesses of varying sizes in various industries, were intended to add context to the survey and to the broader debate on regulatory impact. Our results are still preliminary, but we can say that businesses generally hold a much more nuanced attitude toward regulation than the rhetoric of the regulatory debate would indicate. Some predictions of theory are borne out: regulation serves as a disincentive for expansion in some cases, and incumbent firms that are complying with regulations generally do not see them as much of a burden. Firm size is a key variable in determining regulatory response. Other findings are more surprising. There is a complex interplay between perception and actual burden, with each reinforcing the other.
At the same time, the perceived burden of regulation does not always match the amount of time or money firms spend on compliance. We will present the preliminary results of the survey and interviews in this paper.

  1. Estimating the Impact of Regulatory Costs on Small Businesses Using Five Key Financial Health Indicators; Ann Czerwonka*, Industrial Economics, Inc. (IEc); Jennifer Baxter, IEc; Arturo Rios, U.S. Coast Guard

Pursuant to the Regulatory Flexibility Act (RFA), as amended by the Small Business Regulatory Enforcement Fairness Act (SBREFA), as well as other relevant executive orders, Federal agencies are required to consider the impacts of their regulations on small entities, including small businesses. Specifically, agencies must consider whether small businesses are disproportionately burdened by regulatory costs relative to their larger counterparts. The methods most agencies currently use for such evaluations rely on simple comparisons of direct compliance costs to business revenues or profits. These methods are not effective at capturing other potential impacts, however, such as increased incidence of technical insolvency and/or business closures that may result from regulatory burdens. To address this gap, we developed a model that relies on readily available financial data for businesses of varying sizes in specific industries (defined by NAICS codes) and applies widely used and respected metrics of entity-specific financial health. By entering anticipated compliance costs, analysts can calculate the changes in metrics that assess a company's financial structure, performance, and solvency, including its debt-to-equity ratio, current ratio, times-interest-earned ratio, Beaver's ratio, and Altman's Z-score. The resulting metrics, with regulatory costs included, are compared to baseline and average target values (the latter of which can be tailored for each industry and size class) as an indicator of the degree to which the compliance requirements will increase a typical company's financial vulnerability and likelihood of bankruptcy. The model builds on existing tools currently employed by regulatory agencies to assess the affordability of financial penalties for non-compliance with existing regulations, and represents a more sophisticated approach to considering the impacts of regulations on small businesses.
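
As a rough sketch of how such a calculation might work (this is not the authors' implementation; all firm figures are invented, and the choice of the private-firm Altman Z' variant is an assumption), an annualized compliance cost can be passed through the five indicators named above:

```python
# Hypothetical sketch: recompute five financial-health metrics for an
# invented small firm after treating an annualized compliance cost as an
# added operating expense. All figures are illustrative assumptions.

def health_metrics(f):
    """Five common indicators from balance-sheet and income-statement items."""
    z = (0.717 * f["working_capital"] / f["total_assets"]      # Altman Z'
         + 0.847 * f["retained_earnings"] / f["total_assets"]  # (private-firm
         + 3.107 * f["ebit"] / f["total_assets"]               #  variant)
         + 0.420 * f["book_equity"] / f["total_liabilities"]
         + 0.998 * f["sales"] / f["total_assets"])
    return {
        "debt_to_equity": f["total_liabilities"] / f["book_equity"],
        "current_ratio": f["current_assets"] / f["current_liabilities"],
        "times_interest_earned": f["ebit"] / f["interest_expense"],
        "beavers_ratio": (f["net_income"] + f["depreciation"])
                         / f["total_liabilities"],
        "altman_z": z,
    }

def with_compliance_cost(f, annual_cost):
    """Treat the compliance cost as an annual expense (taxes ignored)."""
    g = dict(f)
    g["ebit"] -= annual_cost
    g["net_income"] -= annual_cost
    return g

firm = {  # hypothetical small manufacturer, $ thousands
    "total_assets": 2000, "total_liabilities": 1200, "book_equity": 800,
    "current_assets": 600, "current_liabilities": 400,
    "working_capital": 200, "retained_earnings": 300,
    "ebit": 180, "interest_expense": 60, "net_income": 90,
    "depreciation": 50, "sales": 2500,
}

baseline = health_metrics(firm)
stressed = health_metrics(with_compliance_cost(firm, 60))
```

Comparing `baseline` to `stressed` shows how each indicator, and the Z-score in particular, moves toward the industry's distress thresholds as compliance costs rise.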

D. 8: Theoretical Considerations in BCA and Regulatory Analysis

Chair: Daniel Wilmoth, U.S. Small Business Administration

Presenters:

  1. Meta-BCA: Optimizing the Level of Effort for Benefit-Cost Analysis; Stephen Newbold*, U.S. Environmental Protection Agency; Charles Griffiths; Elizabeth Kopits

A benefit-cost analysis should be made as simple as possible, but not simpler. This aphorism can be put into practice by starting a benefit-cost analysis using readily available data, simplified models, and default assumptions, and then adding complications (collecting more data, estimating more regressions, employing more sophisticated simulation models, and the like) only if those complications are worth the cost. This staged approach follows directly from applying the logic of benefit-cost analysis to the conduct of benefit-cost analysis itself, so we call it "meta benefit-cost analysis" (meta-BCA). In this paper we argue that routine application of meta-BCA could reduce the cost of regulatory analyses, increase the overall social net benefits of federal regulations, or both. To illustrate the advantages of this approach, we develop a stylized analytical model of the optimal level of effort to apply to a benefit-cost analysis, conditional on the prior distribution of net benefits and the marginal cost function for increased precision. We then use this model to estimate the potential increase in net social benefits from government regulations if the complexity of BCAs were routinely optimized, compared to a benchmark case in which a common level of effort is applied to all analyses. With the theoretical concepts in place, the next challenge is to translate the principles into practice. To begin this translation, the second half of the paper presents an artificial case study to illustrate how the approach could work in a more realistic setting. Our case study is framed as a generic version of a proposed environmental regulation that would set emission standards for a possibly toxic compound. Although highly stylized, its main features are designed to resemble the kinds of uncertain elements that appear in many real-world benefit-cost analyses conducted by economists in state and federal regulatory agencies on a regular basis.
Our case study comprises three elements: (1) an exposure-response model, (2) estimated marginal willingness-to-pay for reducing the associated health risks, and (3) an estimated abatement cost curve. We first conduct a preliminary, or "stage-0," benefit-cost analysis. This stage also involves using prior probability distributions that describe the uncertainty associated with each of the three case study elements to generate a prior distribution over the expected net social benefits of the proposed rule. Next, we examine the conditions under which additional information would be worth the cost of collecting it in a more detailed "stage-1" analysis for each uncertain element. We also discuss how our proposed approach relates to determinations of "economically significant" regulations made by the Office of Management and Budget, and to break-even analysis, which is often used to supplement benefit-cost analyses when one or more potentially important categories of benefits cannot be reliably monetized.
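
The decision rule at the heart of this staged approach can be illustrated with a toy value-of-information calculation (a sketch under invented numbers, not the authors' model): proceed to the stage-1 analysis only if the expected value of the information exceeds its cost. The expected value of *perfect* information gives an upper bound on what any stage-1 refinement could be worth.

```python
# Toy meta-BCA decision rule with an assumed normal prior on net benefits.
import random

random.seed(0)  # reproducible draws

def evpi(mu, sigma, draws=100_000):
    """Expected value of perfect information when net benefits ~ N(mu, sigma):
    E[max(NB, 0)] - max(E[NB], 0), estimated by Monte Carlo. This bounds
    from above what any finite stage-1 refinement could be worth."""
    samples = (random.gauss(mu, sigma) for _ in range(draws))
    e_max = sum(max(nb, 0.0) for nb in samples) / draws
    return e_max - max(mu, 0.0)

# Hypothetical stage-0 prior over net benefits ($ millions): nearly a coin
# flip on the sign of net benefits, so information is valuable.
value_of_info = evpi(mu=5.0, sigma=40.0)
stage1_cost = 2.0              # assumed cost of the more detailed analysis
do_stage1 = value_of_info > stage1_cost
```

When the prior mean is far from zero relative to its spread, `evpi` is close to zero and the cheap stage-0 analysis suffices; when the sign of net benefits is genuinely uncertain, the extra effort pays for itself.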

  1. A Taxonomy for Improved Regulatory Evaluation; Kerry Krutilla, Indiana University, Bloomington; Keith Belton; John Graham; David Good

A number of regulatory reform proposals have been made to improve ex ante Regulatory Impact Analysis (RIA) and to encourage retrospective review. The Regulatory Accountability Act (RAA), a bipartisan legislative proposal, would require agencies to perform a judicially enforceable cost-benefit test for proposed regulatory actions, hold mandatory public hearings on regulatory proposals expected to have billion-dollar impacts, and comply with information-quality requirements for scientific, technical, and economic evidence. The RAA would also require that an agency incorporate a plan for retrospective review when new significant rules are proposed. The Trump administration has also taken a number of administrative actions that will increase retrospective regulatory evaluation and encourage the replacement of under-performing regulations. The diversity of regulatory scales and structures poses evaluation challenges and may influence the ultimate success of current proposals to improve regulatory assessment. This research explores whether a taxonomy of evaluation-relevant characteristics can be identified to help support regulatory reform. To illustrate the kinds of issues this research is concerned with: some regulations impose large capital costs on an entire industry (e.g., the Mercury and Air Toxics Standards, the Clean Power Plan), and such costs quickly become sunk after the regulation is promulgated. Because the operational cost savings of repealing this kind of regulation are relatively low, the emphasis should be on high-quality ex ante analysis that provides sufficient prior screening. In such cases, the purpose of ex post review would be to assess the accuracy of the ex ante analysis in order to improve future evaluations. Regulations that do not impose high capital costs are better candidates for retrospective review motivated by regulatory rollback. Examples include behavioral requirements or work practice standards (e.g., hours-of-service rules for truckers or track-side workplace standards to improve railroad safety) and certification or procedural requirements (e.g., pilot training requirements). However, these kinds of regulations differ in the level of information available. Traffic safety regulations rely on large crash databases; in contrast, the "incident data" for episodic airline crashes or pipeline accidents contain few observations. This reality raises questions both about how to conduct uncertainty analysis ex ante and about how to evaluate such regulations ex post. For example, is the absence of airline fatalities since 2009 a reflection of FAA regulation, market-oriented actions the industry itself took, both, or neither? This study will clarify and identify key structural differences that should matter for regulatory evaluation, and make recommendations for tailoring evaluations to different regulatory contexts. The study will be based on a review of the accumulating literature on retrospective review and a survey of RIAs for 25 lifesaving regulations issued by the EPA and DOT from 2011 through 2016. Owing to its scope and granular focus, the study will add information to the current literature on regulatory evaluation methods and practices, and provide policy-relevant insight to support the implementation of regulatory reform initiatives.

  1. Better Rules of the Game: Introducing New Global Indicators on Regulatory Governance; Joseph Lemoine*, Global Indicators Group; Melissa Johns

Our paper presents a new database of indicators measuring the extent to which rulemaking processes are transparent and participatory across 185 countries: the Global Indicators of Regulatory Governance. The data look at how citizen engagement happens in practice, including when and how governments open the policy-making process to public input. The data also capture the use of ex ante assessments to determine the possible cost of compliance with a proposed new regulation, the likely administrative burden of enforcing the regulation, and its potential environmental and social impacts. The data show that citizens have more opportunities to participate directly in the rulemaking process in developed economies than in developing ones. Differences are also apparent among regions: rulemaking processes are significantly less transparent and inclusive in Sub-Saharan Africa, the Middle East and North Africa, and South Asia on average than in Organisation for Economic Co-operation and Development (OECD) high-income countries, Europe and Central Asia, and East Asia and the Pacific. In addition, ex ante impact assessments are much more common among higher-income economies than among lower-income ones. And greater citizen engagement in rulemaking is associated with higher-quality regulation, stronger democratic regimes, and less corrupt institutions. The Global Indicators of Regulatory Governance grew out of an increasing recognition of the importance of transparency and accountability in government actions. Following the 2017 data collection effort, the indicators now cover: (i) transparency of rulemaking, (ii) public consultation in rulemaking, (iii) impact assessment, (iv) accessing laws and regulations, (v) reviewing laws and regulations (ex post reviews), and (vi) challenging regulations.
The team also maintains a global database containing documents related to regulatory impact assessment (RIA) issued by or for national governments, as well as publications studying RIA as it is applied by governments worldwide. We will also publish an in-depth study on RIA practices using the dataset: "Global Indicators of Regulatory Governance: Worldwide Practices of Regulatory Impact Assessments" (expected in the second half of October 2017).

E. 8: Food and Agricultural BCA

Chair: Sandra Hoffmann, U.S. Department of Agriculture

Presenters:

  1. Economic Impacts Associated with Direct Marketing Initiatives by U.S. Farmers: A Quantile Decomposition of Sales; Timothy Park, U.S. Department of Agriculture

An emerging agricultural marketing issue is the promotion of direct marketing initiatives designed to expand producer margins. Initiatives at the U.S. Department of Agriculture have advocated for expanded direct marketing efforts, and major food retailers promote direct sales by farmers, claiming that local foods give farmers, ranchers, growers, and producers maximum return on their investment. We develop a model of direct marketing initiatives by farmers and assess the unconditional impact of direct marketing on farm sales, while also uncovering the heterogeneous effects that occur across the distribution of farm sales. Data from the Agricultural Resource Management Survey (from 2008 to 2013) are used to measure sales differences between farmers participating in direct marketing and those who do not use this marketing option. One innovation is to assess the impact of information accessed through the Internet on farm sales: we distinguish between Internet use for farm-related news and information (weather, farming, agricultural markets) and Internet activity devoted to farm-related commerce (purchases, sales, banking, and online record keeping). We use an unconditional quantile regression (UQR) approach to measure the full impact of participation in direct marketing on farm sales at specific quantiles of the sales distribution, and we apply a method to decompose sales differences across the complete sales distribution for these two types of farmers (direct marketers and non-participants). Two broad factors contribute to differentials across the sales distribution. First, the composition effect accounts for differences in the characteristics of the two groups, such as education, experience, farm structure, and input choices; this component uncovers how these characteristics influence differences in the quantiles of the marginal sales distribution.
A second component (the structure effect) is based on differences in the estimated coefficients of the sales model between the two groups of farmers and accounts for differences in the marginal impact of the explanatory variables on sales. The decomposition reveals how much of the sales differential associated with direct marketing is driven by differences in farm experience or farm structure, while the structure component indicates how much of the unexplained gap is related to differing returns to education or farming experience. The factors identified can assist marketing experts and extension professionals in guiding farmers who are considering initiating or expanding direct marketing activities.
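
The composition/structure split described above is easiest to see at the mean, where it reduces to the familiar Oaxaca-Blinder decomposition; the paper's UQR approach applies the analogous split at each quantile. A minimal simulated sketch (all coefficients and data are invented, not the survey estimates):

```python
# Oaxaca-Blinder-style decomposition of a mean sales gap on simulated data.
import numpy as np

rng = np.random.default_rng(42)
n = 5_000

def simulate(beta, x_mean):
    """Draw (X, y) for one group under a linear sales model with noise."""
    X = np.column_stack([np.ones(n), rng.normal(x_mean, 1.0, n)])
    y = X @ beta + rng.normal(0.0, 1.0, n)
    return X, y

# Direct marketers: more of the characteristic AND a higher return to it
X_d, y_d = simulate(beta=np.array([1.0, 0.8]), x_mean=2.0)
# Non-participants
X_n, y_n = simulate(beta=np.array([1.0, 0.5]), x_mean=1.0)

b_d = np.linalg.lstsq(X_d, y_d, rcond=None)[0]
b_n = np.linalg.lstsq(X_n, y_n, rcond=None)[0]

gap = y_d.mean() - y_n.mean()
composition = (X_d.mean(axis=0) - X_n.mean(axis=0)) @ b_n  # characteristics
structure = X_d.mean(axis=0) @ (b_d - b_n)                 # returns
# gap == composition + structure, up to floating-point error
```

The identity `gap = composition + structure` holds exactly for OLS with an intercept; the quantile version replaces the mean regressions with recentered-influence-function (UQR) regressions at each quantile.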

  1. A Dynamic Systems Assessment of Benefits and Costs of Policies to Regulate Antimicrobial Use in U. S. Animal Agriculture; Guillaume Lhermie*, Cornell University; Don Kenkel*, Cornell University; Loren Tauer, Dyson School of Applied Economics and Management, Cornell SC Johnson College of Business; Charles Nicholson, Dyson School of Applied Economics and Management, Cornell SC Johnson College of Business; Yrjo Grohn, Department of Population Medicine and Diagnostic Sciences, College of Veterinary Medicine, Cornell University

Antimicrobials are used in humans and animals to cure bacterial infectious diseases. However, antimicrobial use (AMU) leads unavoidably to the selection of resistant bacteria, constituting a negative externality: antimicrobial resistance (AMR) compromises the efficacy of future antimicrobial treatments and generates high additional public health costs. More specifically, in animal production AMU prevents or limits the damages associated with the occurrence of diseases on farms; antimicrobials can therefore be viewed as inputs into the production functions for livestock products. The potential transfer of resistant bacteria selected after on-farm AMU throughout the production process, via direct contact, the food chain, or the environment, constitutes a threat to public health. Given this threat, some governments have implemented policies to decrease AMU in conventional farming systems (e.g., the European and U.S. bans on subtherapeutic doses used to promote growth and improve feed efficiency). Policies that decrease AMU in farming impose opportunity costs on society because they reduce the current economic efficiency of the production of livestock-based food commodities. In general market equilibrium, the efficiency loss reduces farm profits and/or increases food prices. AMU policies create societal benefits because they slow the development of AMR and maintain the future efficacy of antimicrobials to treat diseases in animals and humans. Thus, one policy tradeoff is between the costs of reduced current farming efficiency and the benefits of improved future farming efficiency and improved public health. We conduct a benefit-cost analysis to compare the opportunity costs of AMU policies to the value of the public health improvements, based on the value of a statistical life and its extensions to value morbidity reductions.
Given the potential differences in short- and long-term outcomes for multiple stakeholders interacting in a complex system, a dynamic assessment of the benefits and costs of AMU policies for farmers, retailers, animal and human health workers, and citizens is challenging yet highly relevant. We develop a systems simulation model that represents the dynamics of AMU in U.S. animal production (poultry, pigs, cattle), the relationship between AMU and AMR, and the impacts of three public policies designed to mitigate AMR. These policies, (1) prohibition of AMU, (2) a tax on AMU, and (3) an exogenously imposed 50% reduction in AMU, are compared to a status-quo baseline scenario. We evaluate the dynamics of benefits and costs associated with the three policies over a ten-year period using various discount rates, and we undertake sensitivity analyses for the price of antimicrobials.
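
The final step, discounting each policy's stream of benefits and costs over the ten-year horizon, can be sketched as follows. The annual figures are invented placeholders, not output of the simulation model; they simply encode the tradeoff the abstract describes (early farm-efficiency losses, later public-health gains).

```python
# Discount hypothetical annual net-benefit streams for three AMU policies.

def npv(stream, rate):
    """Present value of an annual stream whose first element is year 1."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(stream, start=1))

# Hypothetical annual net benefits ($M) relative to the baseline scenario.
policies = {
    "ban":       [-30, -25, -20, -10, 0, 10, 25, 40, 55, 70],
    "tax":       [-10, -8, -6, -4, 0, 5, 12, 20, 28, 35],
    "cut_50pct": [-15, -12, -9, -5, 0, 8, 18, 28, 38, 48],
}

for rate in (0.03, 0.07):
    # higher discount rates penalize the back-loaded health benefits
    ranking = sorted(policies, key=lambda p: npv(policies[p], rate),
                     reverse=True)
```

Because the benefits are back-loaded, the policy ranking can flip with the discount rate, which is why the abstract's sensitivity analysis over rates matters.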

  1. A Comparison of Regulatory Impacts on Corn Farming Between the United States and European Union; Zhoudan Xie*, The George Washington University Regulatory Studies Center; Daniel R. Pérez; Aryamala Prasad

Wide variations in regulatory systems and approaches in the United States (U.S.) and European Union (EU) make it difficult to compare the impact of regulation between the two jurisdictions. Such variations are prominent in regulation affecting the agriculture sector. This study focuses on corn farming to estimate the economic impacts (both costs and benefits) of major environmental and food safety regulations on corn production in the U.S. and EU. Although this study is primarily concerned with U.S. federal and EU-level regulations, two EU member states, France and Spain, are selected as case studies to illustrate the differences in translation and implementation of EU-level regulations at the country level. Using a "typical farm" approach as defined in the study, we demonstrate relative differences in regulatory burden for corn farmers among the U.S., France, and Spain. We begin by identifying and discussing regulations affecting corn farming in four categories: genetically modified (GM) crops, pesticides, fertilizers, and agri-environmental practices. We then quantify the incremental private costs and benefits for corn farmers resulting from the operational requirements associated with each regulation in each country, including a sensitivity analysis. We find that French and Spanish corn farmers face a much higher regulatory burden than U.S. corn farmers, primarily due to significantly higher regulatory costs associated with GM crop and pesticide regulations in the EU.

  1. Socio-Economics of Cassava Production in East Africa; Paul Mwebaze*, Commonwealth Scientific and Industrial Research Organisation (CSIRO); Sarina MacFadyen, CSIRO, Australia; Paul De Barro, CSIRO, Australia; Christopher Omongo, National Crops Resources Research Institute, Kampala, Uganda; Anton Bua, National Crops Resources Research Institute, Kampala, Uganda; Andrew Kalyebi, National Crops Resources Research Institute, Kampala, Uganda; Fred Tairo, Mikocheni Agricultural Research Institute, Dar es Salaam, Tanzania; Donald Kachigamba, Department of Agricultural Research Services, Bvumbwe, Malawi

Cassava is the second most important food crop in Africa after maize. It is a major staple crop for more than 200 million people in East and Central Africa, most of them living in poverty in rural areas. However, its production is undermined by several factors, particularly emerging pests and diseases. We conducted a comprehensive socio-economic study covering Uganda, Tanzania, and Malawi to determine the status of cassava production, with the following specific objectives: (1) What is the present status of cassava production and productivity? (2) What is the current adoption rate of improved cassava production technologies in the study countries? (3) What is the economic impact of the cassava whitefly on smallholder farmers? The primary data for this study were collected from cassava farmers using a pre-tested survey questionnaire administered orally to individual farmers. A total of 1,200 respondents were selected and interviewed using a multi-stage random sampling technique. We employ cost-benefit analysis and a stochastic frontier production model to analyse the costs, returns, and productivity of smallholder cassava producers. We present our results and discuss their implications. Key words: cassava, smallholders, disease, income, food security.

  1. Pollinator Valuation Measures and Policy Analytics; Peyton Ferrier*, U.S. Department of Agriculture; Randal R. Rucker, Montana State University; Walter N. Thurman, North Carolina State University

Concern about the consequences of pollinator loss for the agricultural economy continues to influence debate regarding agricultural policy, pesticide regulation, and land conservation. Valuation studies from the ecology literature often ascribe large dollar values to the contribution of pollinators to the agricultural economy (Bauer & Wing, 2010; Gallai, Salles, Settele, & Vaissière, 2009; Morse & Calderone, 2000; Robinson, Nowogrodzki, & Morse, 1989). These estimates assign a value to the pollination services provided by honey bees and other insects by measuring the value of farm output lost if all pollinators were absent and farmers made no offsetting adjustments to farm operations. Some studies have tried to reconcile these "all-or-nothing" estimates with the more common economic concepts of producer and consumer welfare (Melhim, Daly, & Weersink, 2016; Winfree, Gross, & Kremen, 2011) and to adapt them to applied policy analysis (Gallai & Salles, 2016). Although farms rely on wild or unpaid pollinators, markets for pollination services (i.e., honey bee colony rentals) have been well organized since at least the 1940s (Burgett, Daberkow, & Rucker, 2010; Cheung, 1973; Olmstead & Wooten, 1987; Rucker & Thurman, 2010). Cheung (1973) and Rucker, Thurman, and Burgett (2012) show how pollination service fees, after being adjusted for the value of honey co-products, reveal both the beekeeper's marginal cost of providing pollination services and the farm's marginal valuation of pollination services as an input. Despite the potential for ecosystem disturbances to affect pollination service supply, fluctuations in pollination service fees allow the market to adjust supplies to the changing requirements of farms.
We assess whether commonly cited valuations of pollinators can be reconciled with welfare economic principles and show that the cited pollination valuation methods, by failing to explicitly model farm production, do not acknowledge farms' abilities to substitute other inputs for insect-provided pollination services, plant crop varieties that are less dependent on pollination services, or save costs following early-season realization that yields are low. Valuation measures also often fail to distinguish the average and marginal values of inputs (Winfree et al., 2011; Muth and Thurman, 1995). In some instances (e.g., Calderone, 2012), the value of crops produced with seed requiring pollination is double counted in the valuation measures. In addition to these methodological concerns, we show that widespread concerns regarding the impending scarcity of pollination services are not reflected in current data. Specifically, we use new USDA data from the Cost of Pollination Survey and the Colony Loss Survey to show that in recent years the number of U.S. bee colonies has been rising and that, for most crops, pollination service costs have been relatively stable and small relative to total costs.
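
The average/marginal distinction the authors emphasize can be seen in a toy numerical example (not from the paper; the revenue function and all numbers are invented): with a concave revenue function of the pollination input, the average value per colony, which "all-or-nothing" calculations effectively price, exceeds the marginal value that a competitive rental fee would reflect.

```python
# Toy illustration of average vs. marginal value of a pollination input.

def revenue(colonies, scale=1000.0, elasticity=0.3):
    """Hypothetical concave farm revenue from rented colonies."""
    return scale * colonies ** elasticity

x = 50.0                                  # colonies rented (assumed)
avg_value = revenue(x) / x                # what all-or-nothing logic prices
marginal_value = (revenue(x + 1e-6) - revenue(x)) / 1e-6  # ~ competitive fee
# For a power function, marginal value = elasticity * average value,
# so the all-or-nothing figure overstates the market value per colony.
```

Here the overstatement factor is 1/elasticity, illustrating how valuations that ignore substitution and diminishing returns can ascribe far larger dollar values than the pollination-fee market reveals.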

G. 8 “Economic Feasibility” Under the Safe Drinking Water Act: An Unexpected Opportunity for Regulatory Reform

Chair: Richard Belzer, Regulatory Checkbook

The Safe Drinking Water Act of 1974 directed USEPA (and indirectly, the states) to establish standards for drinking water contaminants that are "economically and technologically feasible." Congress delegated to the Agency the legislative authority to define these terms. Engineering has held sway over the definition of "technological feasibility," but economics has played a negligible role in defining "economic feasibility." The Agency's definition relies on the principle of "affordability," which it has defined as 2.5% of median household income. In May 2016, California promulgated a drinking water standard for hexavalent chromium without making any reasoned determination that it was economically feasible, as California law required. Estimated household-level costs for small water systems exceeded $6,000 per year, an amount that a California Superior Court judge ruled was "on its face, … economically unfeasible for many people." The court vacated the drinking water rule in May 2017, directing the state to revisit the matter and re-propose a new regulation that was economically feasible. The state chose not to appeal the court's ruling and expects to complete this task in 18-24 months. This case provides an opportunity to reconsider first principles. Is "affordability" an appropriate metric for defining "economic feasibility"? What is the proper role of economics, and particularly of benefit-cost analysis? This panel consists of three presentations: (1) a historical tour of how USEPA defined economic feasibility in the 1970s and developed its "affordability" principle; (2) a description of the difficulties that water systems have had (and continue to have) complying with the "affordability" principle; and (3) a proposal for a new definition of "economic feasibility" grounded in economics. A discussant will review these presentations and provide his own, independent insight.

Discussant: W. Kip Viscusi, Vanderbilt University

Presenters:

  1. Redefining ‘Economic Feasibility’ Using Economics: A Reform Proposal That Would Improve Both Efficiency and Equity; Richard Belzer, Regulatory Checkbook
  1. Creation of EPA’s Small Entity Affordability Criterion: A Study of Ignorance Compounded by Bad Economics; David Schnare, Torcastle Law, LLC
  1. Practical Problems with U.S. EPA’s Affordability Guidance; Tracy Mehan, American Water Works Association