
    Copyright's Digital/Analog Divide

    This Article shows how the substantive balance of copyright law has been overshadowed online by the system of intermediary safe harbors enacted as part of the Digital Millennium Copyright Act (“DMCA”) in 1998. The Internet safe harbors and the system of notice-and-takedown fundamentally changed the incentives of platforms, users, and rightsholders in relation to claims of copyright infringement. These different incentives interact to yield a functional balance of copyright online that diverges markedly from the experience of copyright law in traditional media environments. This Article also explores a second divergence: the DMCA’s safe harbor system is being superseded by private agreements between rightsholders and large commercial Internet platforms made in the shadow of those safe harbors. These agreements relate to automatic copyright filtering systems, such as YouTube’s Content ID, that not only return platforms to their gatekeeping role, but encode that role in algorithms and software.

    The normative implications of these developments are contestable. Fair use and other axioms of copyright law still nominally apply online, but in practice the safe harbors and the private agreements made in their shadow are now far more important determinants of online behavior than whether that conduct is, or is not, substantively in compliance with copyright law. The diminished relevance of substantive copyright law to online expression has benefits and costs that appear fundamentally incommensurable. Compared to the offline world, online platforms are typically more permissive of infringement and more open to new and unexpected speech and new forms of cultural participation. However, speech on these platforms is also more vulnerable to over-reaching claims by rightsholders. There is no easy metric for comparing the value of non-infringing expression enabled by the safe harbors to that which has been unjustifiably suppressed by misuse of the notice-and-takedown system. Likewise, the harm that copyright infringement does to rightsholders is not easy to calculate, nor is it easy to weigh against the many benefits of the safe harbors.

    DMCA-plus agreements raise additional considerations. Automatic copyright enforcement systems have obvious advantages for both platforms and rightsholders; they may also allow platforms to be more hospitable to certain types of user content. However, automated enforcement systems may also place an undue burden on fair use and other forms of non-infringing speech. The design of copyright enforcement robots encodes a series of policy choices made by platforms and rightsholders and, as a result, subjects online speech and cultural participation to a new layer of private ordering and private control. In the future, private interests, not public policy, will determine the conditions under which users get to participate in online platforms that adopt these systems. In a world where communication and expression are policed by copyright robots, the substantive content of copyright law matters only to the extent that those with power decide that it should matter.

    Keywords: Copyright, DMCA, Infringement, Internet, Safe harbors, Enforcement, Fair use, Automation, Algorithms, Robots

    Making Smart Decisions About Surveillance: A Guide for Communities

    California communities are increasingly grappling with whether to deploy new surveillance technologies ranging from drones to license plate readers to facial recognition. This is understandable, since public safety budgets are tight, technology vendors promise the ability to do more with less, and federal agencies or industry sponsors may even offer funding. But surveillance can be both less effective and far more costly to local agencies and to the community at large than initially imagined, leaving communities saddled with long-term bills for surveillance that doesn’t end up making the community safer. Surveillance can also be easily misused, leading to the erosion of community trust, bad press, and even costly lawsuits.

    In the wake of the revelations about the National Security Agency’s rampant warrantless spying and the use of military equipment in Ferguson, Missouri to quell protests, communities are increasingly focused on the need for greater transparency, oversight, and accountability of surveillance and local policing. More than ever, people are aware of how billions of dollars in federal funding and equipment provided directly to law enforcement are circumventing normal democratic processes and preventing communities from thoroughly evaluating the costs and risks of surveillance. As a result, many community leaders and residents are no longer willing to heed local law enforcement’s call to “just trust us.” Instead, leaders and residents want to know when and why surveillance is being considered, what it is intended to do, and what it will really cost, both in dollars and in individual rights, before taking any steps to seek funding or acquire or deploy surveillance technology. They also want to craft robust rules to ensure proper use, oversight, and accountability if surveillance is used.

    Unfortunately, few resources exist to help communities make thoughtful decisions about surveillance. That’s where this document comes in. This first-of-its-kind guide provides step-by-step assistance to help communities ask and answer the right questions about surveillance. It includes case studies highlighting smart approaches and missteps to avoid. Because each community and each type of surveillance may present a different set of issues, there is no one-size-fits-all solution. Instead, this guide gives communities a flexible framework that policymakers, community members, and law enforcement should use to properly evaluate a wide array of surveillance technologies and develop policies that provide transparency, oversight, and accountability. It also includes a Surveillance & Community Safety Ordinance that communities should adopt to ensure that the right process is followed every time.