Course No.: 9200 710 801
Course ID: 17261
Tu 6:30-9:30 p.m.
Professor Jay Dratler, Jr.
Room 231D (IP Alcove)
Copyright © 2000, 2001, 2002, 2003, 2004, 2005, 2006, 2008 Jay Dratler, Jr.
For permission, see CMI.
ASHCROFT v. AMERICAN CIVIL LIBERTIES UNION (Ashcroft II)
Kennedy, J., delivered the opinion of the Court, in which Stevens, Souter, Thomas, and Ginsburg, JJ., joined. Stevens, J., filed a concurring opinion, in which Ginsburg, J., joined. Scalia, J., filed a dissenting opinion. Breyer, J., filed a dissenting opinion, in which Rehnquist, C. J., and O'Connor, J., joined.
Justice Kennedy delivered the opinion of the Court. [*2788]
This case presents a challenge to a statute enacted by Congress to protect minors from exposure to sexually explicit materials on the Internet, the Child Online Protection Act (COPA). 112 Stat. 2681-736, codified at 47 U.S.C. § 231. We must decide whether the Court of Appeals was correct to affirm a ruling by the District Court that enforcement of COPA should be enjoined because the statute likely violates the First Amendment. * * * [*2789] * * *
COPA is the second attempt by Congress to make the Internet safe for minors by criminalizing certain Internet speech. The first attempt was the Communications Decency Act of 1996 . . . . [In Reno v. ACLU,] the Court held the CDA unconstitutional because it was not narrowly tailored to serve a compelling governmental interest and because less restrictive alternatives were available.
In response to the Court's decision in Reno, Congress passed COPA. COPA imposes criminal penalties of a $50,000 fine and six months in prison for the knowing posting, for "commercial purposes," of World Wide Web content that is "harmful to minors." § 231(a)(1). Material that is "harmful to minors" is defined as:
"Minors" are defined as "any person under 17 years of age." § 231(e)(7). A person acts for "commercial purposes only if such person is engaged in the business of making such communications." "Engaged in the business," in turn,
Since the passage of COPA, Congress has enacted additional laws regulating the Internet in an attempt to protect minors. For example, it has enacted a prohibition on misleading Internet domain names, 18 U.S.C. § 2252B, in order to prevent Web site owners from disguising [*2790] pornographic Web sites in a way likely to cause uninterested persons to visit them. See Brief for Petitioner 7 (giving, as an example, the Web site "whitehouse.com"). It has also passed a statute creating a "Dot Kids" second-level Internet domain, the content of which is restricted to that which is fit for minors under the age of 13. 47 U.S.C. § 941.
Respondents, Internet content providers and others concerned with protecting the freedom of speech, filed suit in the United States District Court for the Eastern District of Pennsylvania. They sought a preliminary injunction against enforcement of the statute. After considering testimony from witnesses presented by both respondents and the Government, the District Court issued an order granting the preliminary injunction. The court first noted that the statute would place a burden on some protected speech. The court then concluded that respondents were likely to prevail on their argument that there were less restrictive alternatives to the statute . . . . [namely,] blocking or filtering technology . . . .
The Government appealed the District Court's decision to the United States Court of Appeals for the Third Circuit. The Court of Appeals affirmed the preliminary injunction, but on a different ground. The court concluded that the "community standards" language in COPA by itself rendered the statute unconstitutionally overbroad. We granted certiorari and reversed, holding that the community-standards language did not, standing alone, make the statute unconstitutionally overbroad. Ashcroft I, 535 U.S., at 585. We emphasized, however, that our decision was limited to that narrow issue. We remanded the case to the Court of Appeals to reconsider whether the District Court had been correct to grant the preliminary injunction. On remand, the Court of Appeals again affirmed the District Court. The Court of Appeals concluded that the statute was not narrowly tailored to serve a compelling Government interest, was overbroad, and was not the least restrictive means available for the Government to serve the interest of preventing minors from using the Internet to gain access to materials that are harmful to them. The Government once again sought review from this Court, and we again granted certiorari.
"This Court, like other appellate courts, has always applied the abuse of discretion standard on the review of a preliminary injunction." Walters v. National Assn. of Radiation Survivors, 473 U.S. 305, 336 (1985) (O'Connor, J., concurring) (internal quotation marks omitted). . . . If the [*2791] underlying constitutional question is close, therefore, we should uphold the injunction and remand for trial on the merits. Applying this mode of inquiry, we agree with the Court of Appeals that the District Court did not abuse its discretion in entering the preliminary injunction. Our reasoning in support of this conclusion, however, is based on a narrower, more specific grounds than the rationale the Court of Appeals adopted. . . . [namely,] the reasons relied on by the District Court . . . .
The District Court, in deciding to grant the preliminary injunction, concentrated primarily on the argument that there are plausible, less restrictive alternatives to COPA. A statute that "effectively suppresses a large amount of speech that adults have a constitutional right to receive and to address to one another . . . is unacceptable if less restrictive alternatives would be at least as effective in achieving the legitimate purpose that the statute was enacted to serve." Reno, 521 U.S. at 874. When plaintiffs challenge a content-based speech restriction, the burden is on the Government to prove that the proposed alternatives will not be as effective as the challenged statute.
In considering this question, a court assumes that certain protected speech may be regulated, and then asks what is the least restrictive alternative that can be used to achieve that goal. The purpose of the test is not to consider whether the challenged restriction has some effect in achieving Congress' goal, regardless of the restriction it imposes. The purpose of the test is to ensure that speech is restricted no further than necessary to achieve the goal, for it is important to assure that legitimate speech is not chilled or punished. For that reason, the test does not begin with the status quo of existing regulations, then ask whether the challenged restriction has some additional ability to achieve Congress' legitimate interest. Any restriction on speech could be justified under that analysis. Instead, the court should ask whether the challenged regulation is the least restrictive means among available, effective alternatives.
In deciding whether to grant a preliminary injunction . . ., a district court must consider whether the plaintiffs have demonstrated that they are likely to prevail on the merits. . . . As the Government bears the burden of proof on the ultimate question of COPA's constitutionality, [*2792] respondents must be deemed likely to prevail unless the Government has shown that respondents' proposed less restrictive alternatives are less effective than COPA. Applying that analysis, the District Court concluded that respondents were likely to prevail. That conclusion was not an abuse of discretion, because on this record there are a number of plausible, less restrictive alternatives to the statute.
The primary alternative considered by the District Court was blocking and filtering software. Blocking and filtering software is an alternative that is less restrictive than COPA, and, in addition, likely more effective as a means of restricting children's access to materials harmful to them. The District Court, in granting the preliminary injunction, did so primarily because the plaintiffs had proposed that filters are a less restrictive alternative to COPA and the Government had not shown it would be likely to disprove the plaintiffs' contention at trial.
Filters are less restrictive than COPA. They impose selective restrictions on speech at the receiving end, not universal restrictions at the source. Under a filtering regime, adults without children may gain access to speech they have a right to see without having to identify themselves or provide their credit card information. Even adults with children may obtain access to the same speech on the same terms simply by turning off the filter on their home computers. Above all, promoting the use of filters does not condemn as criminal any category of speech, and so the potential chilling effect is eliminated, or at least much diminished. All of these things are true, moreover, regardless of how broadly or narrowly the definitions in COPA are construed.
Filters also may well be more effective than COPA. First, a filter can prevent minors from seeing all pornography, not just pornography posted to the Web from America. The District Court noted in its factfindings that one witness estimated that 40% of harmful-to-minors content comes from overseas. COPA does not prevent minors from having access to those foreign harmful materials. That alone makes it possible that filtering software might be more effective in serving Congress' goals. Effectiveness is likely to diminish even further if COPA is upheld, because the providers of the materials that would be covered by the statute simply can move their operations overseas. It is not an answer to say that COPA reaches some amount of materials that are harmful to minors; the question is whether it would reach more of them than less restrictive alternatives. In addition, the District Court found that verification systems may be subject to evasion and circumvention, for example by minors who have their own credit cards. Finally, filters also may be more effective because they can be applied to all forms of Internet communication, including e-mail, not just communications available via the World Wide Web.
* * * Congress . . . unambiguously found that filters are more effective than age-verification requirements. See Commission on Child Online Protection (COPA), Report to Congress, at 19-21, 23-25, 27 (Oct. 20, 2000) (assigning a score for "Effectiveness" of 7.4 for server-based filters and 6.5 for client-based filters, [*2793] as compared to 5.9 for independent adult-ID verification, and 5.5 for credit card verification). Thus, not only has the Government failed to carry its burden of showing the District Court that the proposed alternative is less effective, but also a Government Commission appointed to consider the question has concluded just the opposite. That finding supports our conclusion that the District Court did not abuse its discretion in enjoining the statute.
Filtering software, of course, is not a perfect solution to the problem of children gaining access to harmful-to-minors materials. It may block some materials that are not harmful to minors and fail to catch some that are. Whatever the deficiencies of filters, however, the Government failed to introduce specific evidence proving that existing technologies are less effective than the restrictions in COPA. * * * The Government's burden is not merely to show that a proposed less restrictive alternative has some flaws; its burden is to show that it is less effective. . . . It is not enough for the Government to show that COPA has some effect. Nor do respondents bear a burden to introduce, or offer to introduce, evidence that their proposed alternatives are more effective. The Government has the burden to show they are less so. The Government having failed to carry its burden, it was not an abuse of discretion for the District Court to grant the preliminary injunction.
One argument to the contrary is worth mentioning—the argument that filtering software is not an available alternative because Congress may not require it to be used. That argument carries little weight, because Congress undoubtedly may act to encourage the use of filters. We have held that Congress can give strong incentives to schools and libraries to use them. United States v. Am. Library Ass'n, 539 U.S. 194 (2003). It could also take steps to promote their development by industry, and their use by parents. It is incorrect, for that reason, to say that filters are part of the current regulatory status quo. * * * COPA presumes that parents lack the ability, not the will, to monitor what their children see. By enacting programs to promote use of filtering software, Congress could give parents that ability without subjecting protected speech to severe penalties.
The closest precedent on the general point is our decision in [United States v.] Playboy Entertainment Group,[ Inc., 529 U.S. 803 (2000),] which, like this case, involved a content-based restriction designed to protect minors from viewing harmful materials. The [*2794] choice was between a blanket speech restriction and a more specific technological solution that was available to parents who chose to implement it. Absent a showing that the proposed less restrictive alternative would not be as effective, we concluded, the more restrictive option preferred by Congress could not survive strict scrutiny. . . . In the instant case, too, the Government has failed to show, at this point, that the proposed less restrictive alternative will be less effective. The reasoning of Playboy Entertainment Group, and the holdings and force of our precedents require us to affirm the preliminary injunction. To do otherwise would be to do less than the First Amendment commands. "The starch in our constitutional standards cannot be sacrificed to accommodate the enforcement choices of the Government." Id., at 830 (Thomas, J., concurring).
There are also important practical reasons to let the injunction stand pending a full trial on the merits. First, the potential harms from reversing the injunction outweigh those of leaving it in place by mistake. Where a prosecution is a likely possibility, yet only an affirmative defense is available, speakers may self-censor rather than risk the perils of trial. There is a potential for extraordinary harm and a serious chill upon protected speech. . . . The harm done from letting the injunction stand pending a trial on the merits, in contrast, will not be extensive. No prosecutions have yet been undertaken under the law, so none will be disrupted if the injunction stands. Further, if the injunction is upheld, the Government in the interim can enforce obscenity laws already on the books.
Second, there are substantial factual disputes remaining in the case. As mentioned above, there is a serious gap in the evidence as to the effectiveness of filtering software. . . . For us to assume, without proof, that filters are less effective than COPA would usurp the District Court's factfinding role. By allowing the preliminary injunction to stand and remanding for trial, we require the Government to shoulder its full constitutional burden of proof respecting the less restrictive alternative argument, rather than excuse it from doing so.
Third, and on a related point, the factual record does not reflect current technological reality—a serious flaw in any case involving the Internet. The technology of the Internet evolves at a rapid pace. Yet the factfindings of the District Court were entered in February 1999, over five years ago. * * * More and better filtering alternatives may exist than when the District Court entered its findings. Indeed, we know that after the District Court entered its factfindings, a congressionally appointed commission issued a report that found that filters are more effective than verification screens. . . .
Remand will also permit the District Court to take account of a changed legal landscape. Since the District Court made its factfindings, Congress has passed at least two further statutes that might qualify as less restrictive alternatives to COPA—a prohibition on misleading domain names, and a statute creating a minors-safe "Dot Kids" domain. . . . Remanding for trial will allow the District Court to take into account those additional potential alternatives.
On a final point, it is important to note that this opinion does not hold that Congress is incapable of enacting any regulation of the Internet designed to prevent minors from gaining access to harmful materials. The parties, because of the conclusion of the Court of Appeals that the statute's definitions rendered it unconstitutional, did not devote their attention to the question whether further evidence might be introduced on the relative restrictiveness and effectiveness of alternatives to the statute. On remand, however, the parties will be able to introduce further evidence on this point. This opinion does not foreclose the District Court from concluding, upon a proper showing by the Government that meets the Government's constitutional burden as defined in this opinion, that COPA is the least restrictive alternative available to accomplish Congress' goal.
* * * The judgment of the Court of Appeals is affirmed, and the case is remanded for proceedings consistent with this opinion.
It is so ordered.
Justice Stevens, with whom Justice Ginsburg joins, concurring.
When it first reviewed the constitutionality of the Child Online Protection Act (COPA), the Court of Appeals held that the statute's use of "contemporary community standards" to identify materials that are "harmful to minors" was a serious, and likely fatal, defect. . . . I have already explained at some length why I agree with that holding. See Ashcroft v. American Civil Liberties Union, 535 U.S. 564, 603 (2002) (dissenting opinion) ("In the context of the Internet, . . . community standards become a sword, rather than a shield. If a prurient appeal is [*2796] offensive in a puritan village, it may be a crime to post it on the World Wide Web"). I continue to believe that the Government may not penalize speakers for making available to the general World Wide Web audience that which the least tolerant communities in America deem unfit for their children's consumption, . . . and consider that principle a sufficient basis for deciding this case.
But COPA's use of community standards is not the statute's only constitutional defect. Today's decision points to another: that, as far as the record reveals, encouraging deployment of user-based controls, such as filtering software, would serve Congress' interest in protecting minors from sexually explicit Internet materials as well or better than attempting to regulate the vast content of the World Wide Web at its source, and at a far less significant cost to First Amendment values.
In registering my agreement with the Court's less-restrictive-means analysis, I wish to underscore just how restrictive COPA is. COPA is a content-based restraint on the dissemination of constitutionally protected speech. It enforces its prohibitions by way of the criminal law, threatening noncompliant Web speakers with a fine of as much as $50,000, and a term of imprisonment as long as six months, for each offense. 47 U.S.C. § 231(a). Speakers who "intentionally" violate COPA are punishable by a fine of up to $50,000 for each day of the violation. And because implementation of the various adult-verification mechanisms described in the statute provides only an affirmative defense, § 231(c)(1), even full compliance with COPA cannot guarantee freedom from prosecution. Speakers who dutifully place their content behind age screens may nevertheless find themselves in court, forced to prove the lawfulness of their speech on pain of criminal conviction. . . .
Criminal prosecutions are, in my view, an inappropriate means to regulate the universe of materials classified as "obscene," since "the line between communications which 'offend' and those which do not is too blurred to identify criminal conduct." Smith v. United States, 431 U.S. 291, 316 (Stevens, J., dissenting). . . . COPA's creation of a new category of criminally punishable speech that is "harmful to minors" only compounds the problem. It may be, as Justice Breyer contends, that the statute's coverage extends "only slightly" beyond the legally obscene, and therefore intrudes little into the realm of protected expression. But even with Justice Breyer's guidance, I find it impossible to identify just how far past the already ill-defined territory of "obscenity" he thinks the statute extends. Attaching criminal sanctions to a mistaken judgment about the contours of the novel and nebulous category of "harmful to minors" speech clearly imposes a heavy burden on the exercise of First Amendment freedoms.
COPA's criminal penalties are, moreover, strong medicine for the ill that the statute seeks to remedy. To be sure, our cases have recognized a compelling interest in protecting minors from exposure to sexually explicit materials. See, e.g., Ginsberg v. New York, 390 U.S. 629, 640 (1968). As a parent, grandparent, and great-grandparent, [*2797] I endorse that goal without reservation. As a judge, however, I must confess to a growing sense of unease when the interest in protecting children from prurient materials is invoked as a justification for using criminal regulation of speech as a substitute for, or a simple backup to, adult oversight of children's viewing habits.
Justice Scalia, dissenting.
I agree with Justice Breyer's conclusion that the Child Online Protection Act (COPA) is constitutional. Both the Court and Justice Breyer err, however, in subjecting COPA to strict scrutiny. Nothing in the First Amendment entitles the type of material covered by COPA to that exacting standard of review.
"We have recognized that commercial entities which engage in the sordid business of pandering by deliberately emphasiz[ing] the sexually provocative aspects of [their nonobscene products], in order to catch the salaciously disposed, engage in constitutionally unprotected behavior." United States v. Playboy Entertainment Group, Inc., 529 U.S. 803, 831 (2000) (Scalia, J., dissenting) (internal quotation marks omitted). * * * There is no doubt that the commercial pornography covered by COPA fits this description. The statute applies only to a person who, "as a regular course of such person's trade or business, with the objective of earning a profit," 47 U.S.C. § 231(e)(2)(B), and "with knowledge of the character of the material," § 231(a)(1), communicates material that depicts certain specified sexual acts and that "is designed to appeal to, or is designed to pander to, the prurient interest," § 231(e)(6)(A). Since this business could, consistent with the First Amendment, be banned entirely, COPA's lesser restrictions raise no constitutional concern.
Justice Breyer, with whom the Chief Justice and Justice O'Connor join, dissenting.
* * * Like the Court, I would subject the Act to the most exacting scrutiny[.] * * * Nonetheless, my examination of (1) the burdens the Act imposes on protected expression, (2) the Act's ability to further a compelling interest, and (3) the proposed "less restrictive alternatives" convinces me that the Court is wrong. I cannot accept its conclusion that Congress could have accomplished its statutory objective—protecting children from commercial pornography on the Internet—in other, less restrictive ways.
* * * Unlike the majority, I do not see how it is possible to make this comparative determination without examining both the extent to which the Act regulates protected expression and the nature of the burdens it imposes on that expression. That examination suggests that the Act, properly interpreted, imposes a burden on protected speech that is no more than modest.
The Act's definitions limit the material it regulates to material that does not enjoy First Amendment protection, namely legally obscene material, and very little more. A comparison of this Court's definition of unprotected, "legally obscene," material with the Act's definitions makes this clear.
[Justice Breyer compares the definition of legally obscene—and therefore unprotected—material in Miller v. California with the COPA's definition of the material it regulates.]
Both definitions define the relevant material through use of the critical terms "prurient interest" and "lacks serious literary, artistic, political, or scientific value." [*2799] Insofar as material appeals to, or panders to, "the prurient interest," it simply seeks a sexual response. Insofar as "patently offensive" material with "no serious value" simply seeks that response, it does not seek to educate, it does not seek to elucidate views about sex, it is not artistic, and it is not literary. . . .
The only significant difference between the present statute and Miller's definition consists of the addition of the words "with respect to minors," § 231(e)(6)(A), and "for minors," § 231(e)(6)(C). But the addition of these words to a definition that would otherwise cover only obscenity expands the statute's scope only slightly. That is because the material in question (while potentially harmful to young children) must, first, appeal to the "prurient interest" of, i.e., seek a sexual response from, some group of adolescents or postadolescents (since young children normally do not so respond). And material that appeals to the "prurient interest[s]" of some group of adolescents or postadolescents will almost inevitably appeal to the "prurient interest[s]" of some group of adults as well.
The "lack of serious value" requirement narrows the statute yet further—despite the presence of the qualification "for minors." That is because one cannot easily imagine material that has serious literary, artistic, political, or scientific value for a significant group of adults, but lacks such value for any significant group of minors. Thus, the statute, read literally, insofar as it extends beyond the legally obscene, could reach only borderline cases. And to take the words of the statute literally is consistent with Congress' avowed objective in enacting this law; namely, putting material produced by professional pornographers behind screens that will verify the age of the viewer. * * *
These limitations on the statute's scope answer many of the concerns raised by those who attack its constitutionality. Respondents fear prosecution for the Internet posting of material that does not fall within the statute's ambit as limited by the "prurient interest" and "no serious value" requirements; for example: an essay about a young man's experience with masturbation and sexual shame; "a serious discussion about birth control practices, homosexuality, . . . or the consequences of prison rape"; an account by a 15-year-old, written for therapeutic purposes, of being raped when she was 13; a guide to self-examination for testicular cancer; a graphic illustration of how to use a condom; or any of the other postings of modern literary or artistic works or discussions of sexual identity, homosexuality, sexually transmitted [*2800] diseases, sex education, or safe sex, let alone Aldous Huxley's Brave New World, J. D. Salinger's Catcher in the Rye, or, as the complaint would have it, Ken Starr's report on the Clinton-Lewinsky scandal. . . .
These materials are not both (1) "designed to appeal to, or . . . pander to, the prurient interest" of significant groups of minors and (2) lacking in "serious literary, artistic, political, or scientific value" for significant groups of minors. §§ 231(e)(6)(A), (C). Thus, they fall outside the statute's definition of the material that it restricts, a fact the Government acknowledged at oral argument.
* * * In sum, the Act's definitions limit the statute's scope to commercial pornography. It affects unprotected obscene material. Given the inevitable uncertainty about how to characterize close-to-obscene material, it could apply to (or chill the production of) a limited class of borderline material that courts might ultimately find is protected. But the examples I have just given fall outside that class.
The Act does not censor the material it covers. Rather, it requires providers of the "harmful to minors" material to restrict minors' access to it by verifying age. They can do so by inserting screens that verify age using a credit card, adult personal identification number, or other similar technology. See § 231(c)(1). In this way, the Act requires creation of an internet screen that minors, but not adults, will find difficult to bypass.
I recognize that the screening requirement imposes some burden on adults who seek access to the regulated material, as well as on its providers. The cost is, in part, monetary. The parties agreed that a Web site could store card numbers or passwords at between 15 and 20 cents per number. . . . And verification services provide free verification to Web site operators, while charging users less than $20 per year. [*2801] According to the trade association for the commercial pornographers who are the statute's target, use of such verification procedures is "standard practice" in their online operations. . . .
In addition to the monetary cost, and despite strict requirements that identifying information be kept confidential, see 47 U.S.C. §§ 231(d)(1), 501, the identification requirements inherent in age-screening may lead some users to fear embarrassment. Both monetary costs and potential embarrassment can deter potential viewers and, in that sense, the statute's requirements may restrict access to a site. But this Court has held that in the context of congressional efforts to protect children, restrictions of this kind do not automatically violate the Constitution. And the Court has approved their use. See, e.g., United States v. Am. Library Association, 539 U.S. 194, 209 (2003) (plurality opinion) ("[T]he Constitution does not guarantee the right to acquire information at a public library without any risk of embarrassment"). Cf. Reno, 521 U.S., at 890 (O'Connor, J., concurring in judgment in part and dissenting in part) (calling the age-verification requirement similar to "a bouncer [who] checks a person's driver's license before admitting him to a nightclub").
In sum, the Act at most imposes a modest additional burden on adult access to legally obscene material, perhaps imposing a similar burden on access to some protected borderline obscene material as well.
I turn next to the question of "compelling interest," that of protecting minors from exposure to commercial pornography. No one denies that such an interest is "compelling." . . .
* * * Conceptually speaking, the presence of filtering software is not an alternative legislative approach to the problem of protecting children from exposure to commercial pornography. Rather, it is part of the status quo, i.e., the backdrop against which Congress enacted the present statute. It is always true, by definition, that the status [*2802] quo is less restrictive than a new regulatory law. It is always less restrictive to do nothing than to do something. But "doing nothing" does not address the problem Congress sought to address—namely that, despite the availability of filtering software, children were still being exposed to harmful material on the Internet.
Thus, the relevant constitutional question is not the question the Court asks: Would it be less restrictive to do nothing? Of course it would be. Rather, the relevant question posits a comparison of (a) a status quo that includes filtering software with (b) a change in that status quo that adds to it an age-verification screen requirement. Given the existence of filtering software, does the problem Congress identified remain significant? Does the Act help to address it? These are questions about the relation of the Act to the compelling interest. Does the Act, compared to the status quo, significantly advance the ball? (An affirmative answer to these questions will not justify "[a]ny restriction on speech," as the Court claims, for a final answer in respect to constitutionality must take account of burdens and alternatives as well.)
The answers to these intermediate questions are clear: Filtering software, as presently available, does not solve the "child protection" problem. It suffers from four serious inadequacies that prompted Congress to pass legislation instead of relying on its voluntary use. First, its filtering is faulty, allowing some pornographic material to pass through without hindrance. * * * Because the software relies on key words or phrases to block undesirable sites, it does not have the capacity to exclude a precisely defined category of images. That is to say, in the absence of words, the software alone cannot distinguish between the most obscene pictorial image and the Venus de Milo. No Member of this Court disagreed.
Second, filtering software costs money. Not every family has the $40 or so necessary to install it. By way of contrast, age screening costs less. See [majority opinion] (citing costs of up to 20 cents per password or $20 per user for an identification number).
Third, filtering software depends upon parents willing to decide where their children will surf the Web and able to enforce that decision. As to millions of American families, that is not a reasonable possibility. More than 28 million school age children have both parents or their sole parent in the work force, at least 5 million children are left alone at home without supervision each week, and many of those children will spend afternoons and evenings with friends who may well have access to computers and more lenient parents. See United States v. Playboy Entertainment Group, Inc., 529 U.S. 803, 842 (2000) (Breyer, J., dissenting).
Fourth, software blocking lacks precision, with the result that those who wish to use it to screen out pornography find that it blocks a great deal of material that is valuable. * * * [*2803] . . . Indeed, the American Civil Liberties Union (ACLU), one of the respondents here, told Congress that filtering software "block[s] out valuable and protected information, such as information about the Quaker religion, and web sites including those of the American Association of University Women, the AIDS Quilt, the Town Hall Political Site (run by the Family Resource Center, Christian Coalition and other conservative groups)." Hearing on Internet Indecency before the Senate Committee on Commerce, Science, and Transportation, 105th Cong., 2d Sess., 64 (1998). . . .
Nothing in the District Court record suggests the contrary. . . .
In sum, a "filtering software status quo" means filtering that underblocks, imposes a cost upon each family that uses it, fails to screen outside the home, and lacks precision. Thus, Congress could reasonably conclude that a system that relies entirely upon the use of such software is not an effective system. And a law that adds to that system an age-verification screen requirement significantly increases the system's efficacy. That is to say, at a modest additional cost to those adults who wish to obtain access to a screened program, that law will bring about better, more precise blocking, both inside and outside the home.
The Court's response—that 40% of all pornographic material may be of foreign origin—is beside the point. Even assuming (I believe unrealistically) that all foreign originators will refuse to use screening, the Act would make a difference in respect to 60% of the Internet's commercial pornography. I cannot call that difference insignificant. The upshot is that Congress could reasonably conclude that, despite the current availability of filtering software, a child protection problem exists. It also could conclude that a precisely targeted regulatory statute, adding an age-verification requirement for a narrow range of material, would more effectively shield children from commercial pornography.
Is this justification sufficient? The lower courts thought not. But that is because those courts interpreted the Act as imposing far more than a modest burden. They assumed an interpretation of the statute in which it reached far beyond legally obscene and borderline-obscene material, affecting material that, given the interpretation set forth above, would fall well outside the Act's scope. But we must interpret the Act to save it, not to destroy it. . . . So interpreted, the Act imposes a far lesser burden on access to protected material. Given the modest nature of that burden and the likelihood that the Act will significantly further Congress' compelling objective, the Act may well satisfy the First Amendment's stringent tests. . . . Indeed, it does satisfy the First Amendment unless, of course, there [*2804] is a genuine alternative, "less restrictive" way similarly to further that objective.
* * * Obviously, the Government could give all parents, schools, and Internet cafes free computers with filtering programs already installed, hire federal employees to train parents and teachers on their use, and devote millions of dollars to the development of better software. The result might be an alternative that is extremely effective.
But the Constitution does not, because it cannot, require the Government to disprove the existence of magic solutions, i.e., solutions that, put in general terms, will solve any problem less restrictively but with equal effectiveness. Otherwise, "the undoubted ability of lawyers and judges," who are not constrained by the budgetary worries and other practical parameters within which Congress must operate, "to imagine some kind of slightly less drastic or restrictive an approach would make it impossible to write laws that deal with the harm that called the statute into being." Playboy Entertainment Group, 529 U.S., at 841 (Breyer, J., dissenting). As Justice Blackmun recognized, a "judge would be unimaginative indeed if he could not come up with something a little less 'drastic' or a little less 'restrictive' in almost any situation, and thereby enable himself to vote to strike legislation down." [Citation omitted.] Perhaps that is why no party has argued seriously that additional expenditure of government funds to encourage the use of screening is a "less restrictive alternative."
[T]he majority [also] suggests decriminalizing the statute, noting the "chilling effect" of criminalizing a category of speech. To remove a major sanction, however, would make the statute less effective, virtually by definition.
My conclusion is that the Act, as properly interpreted, risks imposition of minor burdens on some protected material—burdens that adults wishing to view the material may overcome at modest cost. At the same time, it significantly helps to achieve a compelling congressional goal, protecting children from exposure to commercial pornography. There is no serious, practically available "less restrictive" way similarly to further this compelling interest. Hence the Act is constitutional.
The Court's holding raises two more general questions. First, what has happened to the "constructive discourse between our courts and our legislatures" that "is an integral and admirable part of the constitutional design"? [Citation omitted.] After eight years of legislative effort, two statutes, and [*2805] three Supreme Court cases, the Court sends this case back to the District Court for further proceedings. * * *
Moreover, Congress . . . . dedicated itself to the task of drafting a statute that would meet each and every criticism of the predecessor statute that this Court set forth in Reno. It incorporated language from the Court's precedents, particularly the Miller standard, virtually verbatim. . . . And it created what it believed was a statute that would protect children from exposure to obscene professional pornography without obstructing adult access to material that the First Amendment protects. . . . What else was Congress supposed to do?
I recognize that some Members of the Court, now or in the past, have taken the view that the First Amendment simply does not permit Congress to legislate in this area. See, e.g., Ginzburg [v. United States], 383 U.S. 463, 476 (1966) (Black, J., dissenting) ("[T]he Federal Government is without any power whatever under the Constitution to put any type of burden on speech and expression of ideas of any kind"). Others believe that the Amendment does not permit Congress to legislate in certain ways, e.g., through the imposition of criminal penalties for obscenity. There are strong constitutional arguments favoring these views. But the Court itself does not adopt those views. Instead, it finds that the Government has not proved the nonexistence of "less restrictive alternatives." That finding, if appropriate here, is universally appropriate. And if universally appropriate, it denies to Congress, in practice, the legislative leeway that the Court's language seems to promise. If this statute does not pass the Court's "less restrictive alternative" test, what does? If nothing does, then the Court should say so clearly.
Second, will the majority's holding in practice mean greater or lesser protection for expression? I do not find the answer to this question obvious. The Court's decision removes an important weapon from the prosecutorial arsenal. That weapon would have given the Government a choice—a choice other than "ban totally or [*2806] do nothing at all." The Act tells the Government that, instead of prosecuting bans on obscenity to the maximum extent possible (as respondents have urged as yet another "alternative"), it can insist that those who make available material that is obscene or close to obscene keep that material under wraps, making it readily available to adults who wish to see it, while restricting access to children. By providing this third option—a "middle way"—the Act avoids the need for potentially speech-suppressing prosecutions.
That matters in a world where the obscene and the nonobscene do not come tied neatly into separate, easily distinguishable, packages. In that real world, this middle way might well have furthered First Amendment interests by tempering the prosecutorial instinct in borderline cases. At least, Congress might have so believed. And this likelihood, from a First Amendment perspective, might ultimately have proved more protective of the rights of viewers to retain access to expression than the all-or-nothing choice available to prosecutors in the wake of the majority's opinion.
For these reasons, I dissent.