Regulation's Limits And Reforms Outline


This is an extract of our Regulation's Limits And Reforms document, which we sell as part of our Legislation & Regulation Outlines collection written by the top tier of Harvard Law School students.


SECTION THREE: REGULATION'S LIMITS AND REFORMS

ISSUES WITH RULES AND STATUTES

Even if an agency, legislature, or the common law promulgates a particular regulation, there may be problems with implementation: (1) information provision is an inexpensive solution, but it may not reduce risk; (2) rules may create regulatory paradoxes; (3) cost-benefit analyses may not be required, and if they are, they have problems of their own.

I. INFORMATION PROVISION

a. General Considerations - information provision is the most popular form of administrative regulation as well as the most popular market solution; yet there are practical problems both with disseminating information and with understanding what has been disseminated.

i. STEPHEN BREYER, REGULATION AND ITS REFORM (1982) - 548

1. Markets for information as a commodity may be skewed, but there are solutions.
a. IP protection. Information is expensive to research but cheap to disseminate. Solution: protect dissemination with patent and copyright laws and regulate labels (e.g. require drug companies to label the generic name of a drug to show consumers a competitor exists).
b. Truthfulness regulation. Companies may lie. Solution: regulate truthfulness (e.g. with the SEC) or allow judicial remedies (e.g. rescinding contracts).
c. Rating systems. Consumers may not be able to evaluate quality. Solution: create a rating system, bar, prescribe information, etc.

ii. Susan Rose-Ackerman, Progressive Law and Economics and the New Administrative Law, 98 YALE L.J. 341 (1988) - 549

1. Risk premiums. In the Chicago model of risk reduction (pay workers more for riskier jobs), workers may not have adequate information regarding risk; firms may keep information secret.
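The Chicago risk-premium model that Rose-Ackerman critiques can be sketched numerically. All the figures below (the VSL, wages, and risk levels) are hypothetical illustrations, not from the source:

```python
# Sketch of the Chicago compensating-differential model: in equilibrium,
#   wage_risky = wage_safe + (annual fatality risk) x VSL.
# All numbers here are hypothetical illustrations.

VSL = 9_000_000  # value of a statistical life, $ (hypothetical)

def compensating_wage(safe_wage, annual_fatality_risk, vsl=VSL):
    """Wage a fully informed worker would demand for the riskier job."""
    return safe_wage + annual_fatality_risk * vsl

# A job with a 1-in-10,000 annual fatality risk vs. a $50,000 safe job:
print(round(compensating_wage(50_000, 1 / 10_000)))  # 50900: a $900 risk premium
```

The model's fragility is visible in the formula itself: if workers underestimate the fatality risk (Rose-Ackerman's point), the premium they demand shrinks in proportion to the underestimate.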

2. Problems with Market Solutions: (1) it may take a long time for workers to observe injuries; (2) workers may not actually observe injuries when they happen; (3) the level of hazard may depend on the interaction between the workplace and workers, e.g. conditions may be worse for workers who smoke; (4) workplace conditions change with new technology.

3. Problems with Provision Solutions: (1) workers may not be able to process percentages or data, and provision may be less effective than OSHA direct action; (2) collective action problems may lead workers and firms to misallocate the value of safety, even with full information.

iii. Albert Nichols & Richard Zeckhauser, "OSHA after a Decade: A Time for Reason," from CASE STUDIES IN REGULATION: REVOLUTION AND REFORM (Leonard W. Weiss & Michael W. Klass eds., 1981) - 551

1. Three levels of information problems. Problems arise in the information-provision process at three points: (1) when information is made available; (2) when it is transmitted to affected parties; and (3) when it is (mis)understood by the parties.

2. Problems with risk premium. Workers can generally tell which industries are safer than others, but not whether metal stamp A or B is safer.

iv. W. KIP VISCUSI, RISK BY CHOICE: REGULATING HEALTH AND SAFETY IN THE WORKPLACE (1983) - 551

1. Risk premiums and information. Workers generally understand health and safety risks across industries and have many sources of information, e.g. news, own injuries, coworkers, etc. Risk perceptions gained on the job have a powerful influence on workers' intention to quit.

2. Using information to avoid risk premiums. This means firms could reduce turnover costs in two ways: (1) by adopting technology whose risk is well known ex ante, or (2) by incentivizing staying with low starting wages but steeply increasing wages to weed out quit-prone workers. The best solution is (3): using technology whose risks workers don't know, combined with option (2), so workers can't exercise the risk-premium strategy.

v. Howard Latin, Good Warnings, Bad Products, and Cognitive Limitations (1994)

1. Final problem with risk premiums. Tension exists between explicit information provision and the information workers and consumers gain from experience, leading to cognitive dissonance.

vi. Cass Sunstein, Speech to HLS (Mar. 26, 2012)

1. Two systems in the human mind (System 1---automatic, intuitive, effortless; System 2---deliberative, calculative, statistical) explain the following heuristics:

a. Defaults: people are more willing to accept default choices.
b. Channel Factors: people are more willing to move away from defaults if they have specific directions.
c. Salience: people are more likely to spot salient factors (System 1) than deal with lots of information (System 2).
d. Complexity: people avoid complexity.
e. Affect Heuristic: people use experiential background to make decisions (e.g. the old baseball scouts in Moneyball).
f. Availability Bias: people are more likely to make decisions based on the information available.

2. By taking advantage of these heuristics and combining them with a systematic cost-benefit approach, agencies can save a tremendous amount of money while accomplishing more.
a. "Plate, not pyramid": the nutrition pyramid did not make as much sense to people as a visual "plate" of serving portions; the plate should increase compliance.
b. Comparison Friction: new fuel labels on cars state annual fuel costs and savings due to fuel costs rather than city v. highway MPG.
c. Smart Disclosure: energy bills that tell consumers what is drawing the most energy allow them to make better choices.
d. These disclosures also let interested app developers create smart-disclosure systems people will actually take advantage of, e.g. MBTA data.

b. Generating Information Collectively - market and legislative approaches to information provision require companies or workers to offer information on their own; but there are structural reasons why these types of solutions do not always work.

i. PETER DORMAN, MARKETS AND MORTALITY: ECONOMICS, DANGEROUS WORK, AND THE VALUE OF HUMAN LIFE (1996) - 557

1. Unions as information providers. Unions are conflicted between safety and other demands, and face structural limits on risk-based advocacy. Options include:
a. demanding contracts with specific risk-reduction policies (e.g. clauses giving workers the right to refuse abnormally dangerous activities);

b. special worker committees to influence the shop floor (but these only have power when the union does);
c. accepting dangerous conditions in exchange for hazard pay (but this is rare, and bargaining generally seeks to eliminate risk, not get paid more for it).

ii. Cass R. Sunstein, Informing America: Risk, Disclosure, and the First Amendment, 20 FLA. ST. U. L. REV. 653 (1993) - 561

1. Problems with information campaigns. Some consumer data show that information campaigns help consumers choose which goods to buy with respect to health and safety. But these campaigns have problems: (1) they are expensive---$1.8 mil./life---and (2) they have bad effects:
a. overload,
b. disclosure requirements that deter disclosure,
c. consumer illiteracy or old age, and
d. public-good issues (collective action problems with responding to information).

c. California's Proposition 65: A Case Study in Information Provision - this case study exemplifies two problems with using information provision to reduce risk: (1) not all, or even the most dangerous, risks are covered, because of background biases; and (2) too many labels creates the opposite problem of excessive alarm followed by dismissal.

i. Nicolle-Wagner v. Deukmejian, 230 Cal. App. 3d 652 (1991) - 567

1. Prop. 65 in California required that no one may "knowingly or intentionally expose any individual to a chemical known to cause cancer" without a "clear and reasonable warning." The implementing law made an exemption for "naturally occurring" carcinogens like tobacco. P claimed there was no basis for this exemption. The court disagreed: one purpose of Prop. 65 was to preserve natural foods in the food supply (cn. 21), and a "clear and reasonable warning" would be impossible if every fruit and vegetable were labeled (cn. 12).

ii. Kip Viscusi, Predicting the Effects of Food Cancer Risk Warnings on Consumers, 43 FOOD DRUG COSM. L.J. 283 (1988)

1. Problem of excessive alarm. A 1/100,000 chance of getting cancer is too low a threshold for requiring labels if individuals are to make meaningful changes. The label ("this is known to cause cancer") does not convey uncertainty, and leads to overstated risk perceptions and bad choices. This could produce excessive alarm followed by dismissal of the warning system.

d. Understanding Information: What Psychologists Might Tell Us - there are several "framing effects" that trigger biases and heuristics, making it psychologically difficult for people to internalize information about risks, especially when that information is ambiguous or causes cognitive dissonance.

i. Jon D. Hanson & Douglas A. Kysar, Taking Behavioralism Seriously: The Problem of Market Manipulation, 74 N.Y.U. L. REV. 630 (1999) - 572

1. People do not behave "rationally," as economists or political scientists would predict.

2. "Framing Effects" push people away from classically expected utility curves.
a. Belief Perseverance: after constructing a hypothesis or explanation for X, people tend to disregard contradictory evidence.
b. Confirmation Bias: if people enter a situation with differing beliefs about Y, ambiguous information about Y may further polarize them.
c. Hypothesis-Based Filtering: people who believe Z will not only view ambiguous evidence as consistent with Z (confirmation), but will also view it as supporting Z.
d. Entity Effect: hypotheses persist even after the evidence that gave rise to them has been completely discredited.
e. Motivated Reasoning: people use cognitive processes to arrive at the conclusion they privately desired to arrive at all along.

3. These lead to recognizable irrationalities.
a. Optimism Bias: people overestimate the chances of good things happening and underestimate risk.
b. Cognitive Dissonance: people downplay information about themselves that contradicts beliefs already held.
c. Illusion of Control: people predict things as though they have control over the outcome, e.g. when asked to guess whether a red chip will be pulled out of a bag 70% full of red chips, people guess "yes" 70% of the time (aiming for 100% correct but achieving only about 58%) instead of 100% of the time (achieving 70% correct).
d. Hindsight Bias: reporting an outcome's occurrence increases its perceived probability of occurring again.
e. Surprising Effect of Reasoning: elaborate theories and careful reasoning may foster overconfidence, especially when the future cannot be predicted from present data.
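The chip-guessing example is an instance of "probability matching," and its expected accuracy can be checked with a short simulation (the 70%-red numbers mirror the example above; the simulation itself is an illustration, not from the source). Matching the draw probability yields p^2 + (1-p)^2 = 58% accuracy, below the 70% earned by always guessing red:

```python
import random

random.seed(0)  # reproducible illustration

def accuracy(guess_red_prob, red_prob=0.7, trials=100_000):
    """Fraction of correct guesses when a red chip is drawn with probability
    red_prob and we guess "red" with probability guess_red_prob."""
    correct = 0
    for _ in range(trials):
        drew_red = random.random() < red_prob
        guessed_red = random.random() < guess_red_prob
        correct += drew_red == guessed_red
    return correct / trials

# Probability matching (guess red 70% of the time) vs. always guessing red:
print(f"{accuracy(0.7):.2f}")  # ~0.58: p^2 + (1-p)^2
print(f"{accuracy(1.0):.2f}")  # ~0.70: the better simple strategy
```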

4. Statistics are especially vulnerable to these biases because of several heuristics:
a. Availability Heuristic: people remember available information and assume it occurs more frequently, e.g. long lines or plane crashes.
b. Representativeness Heuristic: people judge the frequency of X happening by the degree to which X resembles a class, often leading to:
i. (1) failure to consider base rates (e.g. judging a person 50% likely to be a criminal when only 10% of the population is, an overstatement);
ii. (2) the law of small numbers (the incorrect belief that small samples will replicate the overall distribution); and
iii. (3) the gambler's fallacy.
c. Anchoring Heuristic: people adjust up or down from a convenient number.
d. Affect Heuristic: people use experiential information to perceive a risk (e.g. technology and "dread") rather than deliberate, rational thought.
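Base-rate neglect can be made concrete with Bayes' rule; the hit rate and false-alarm rate below are hypothetical numbers chosen for illustration, not from the source:

```python
def posterior(prior, hit_rate, false_alarm_rate):
    """P(class member | cue fired), computed by Bayes' rule."""
    evidence = hit_rate * prior + false_alarm_rate * (1 - prior)
    return hit_rate * prior / evidence

# Hypothetical numbers: a cue fires on 80% of class members but also on
# 20% of non-members, and only 10% of the population are members.
print(round(posterior(0.10, 0.80, 0.20), 3))  # 0.308, far below the intuitive 0.8
```

The representativeness heuristic in effect substitutes the hit rate (0.8) for the posterior (0.308), which is exactly what ignoring the 10% base rate looks like arithmetically.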

5. People actually make decisions with:
a. Status-quo Bias / Endowment Effect: people are more likely to keep what they have rather than risk a new option;
b. Context Effect: irrelevant options (e.g. expensive wine near cheaper bottles) change people's willingness to purchase the "bargain";
c. Elastic Justification: people bias probabilities toward desired outcomes;
d. Time-Variant References: people's short-term willingness to delay rewards in exchange for a higher future reward is less than their long-term willingness to delay the same rewards (e.g. 100 now or 200 in a year? 100. 100 in a year or 200 in two years? 200.);
e. Reciprocal Preferences: A is more willing to sacrifice to hurt B if B is playing the game unfairly;
f. Impulsivity: visceral factors crowd out other goals;
g. Framing.

II. REGULATORY PARADOXES

a. How Regulation Can Fail: Theory - there are several regulatory paradoxes that agencies face when setting standards or attempting to reduce risk, the largest of which is that "overregulation can lead to underregulation."

i. JOHN MENDELOFF, THE DILEMMA OF TOXIC SUBSTANCE REGULATION 461 (1988)

1. Overregulation can lead to underregulation. Overregulation can lead to underregulation when standards are (a) too strict or (b) not extensive enough. If standards were lower, with attention to costs and benefits, they would lead to more extensive and worthwhile regulation. Strictness leads to slow standard-setting (caused also by court appeals, uncertain effects, high burdens of proof, and limited resources). This is made especially worrisome by the "iron triangle," discussed earlier.

2. E.U. v. U.S. This view is supported by Thomas Church & Robert Nakamura, "Beyond Superfund" (1994), in which they argue that the European "little stick" approach has lower transaction costs than the U.S. "big stick," but with greater reliance on negotiation, lower levels of cleanup, and higher costs to the public.

ii. John Mendeloff, Overcoming Barriers to Better Regulation, 18 LAW & SOCIAL INQUIRY 711 (1993) - 736

1. OSHA benefits. OSHA does some good things: (1) detection; (2) deterrence; (3) providing information to employers and (4) to workers; and (5) grabbing attention.

2. OSHA problems. OSHA has some problems: (1) it is responsive to unions and firms, neither of which cares about efficiency; (2) it requires commitment by high-level management to do anything; (3) its analytical staff is small, so it relies heavily on other agencies and regulated industries for information; (4) it lacks data of its own.

3. Need for flexibility. These problems may be adjusted as the agency becomes more flexible, thanks to the combination of less ideology from the White House and deferential courts after Cotton Dust and Chevron.

