By: Max Jesse Goldberg, YLS ‘22

Introduction

As individuals and organizations grow increasingly conscious of cybersecurity, an ecosystem of security researchers has emerged to find and address cyber risks. Although security researchers have become vital to our cybersecurity, their work is being frustrated by an unexpected problem: copyright law.

The Need for Independent Security Research

The problem begins with a market failure at the heart of today’s information economy. Governments, firms, and individuals seek to ensure that sensitive digital information remains safe from malicious actors. While software firms are incentivized to provide the security that their customers demand, even the most sophisticated consumers are often unable to precisely discern the level of security that a given product offers. Because producing security and credibly verifying it are both costly, software firms operating in a competitive market may produce products with security levels that are lower than the optimal level for customers. Although many large software firms now offer multimillion-dollar “bug bounties” to security researchers who find unknown “zero-day” vulnerabilities in their products and report them to the firms, these vulnerabilities can fetch even higher prices when sold to governments and other actors for use as hacking tools. When these vulnerabilities fall into the wrong hands, the results can be ruinous.

The market for exploits and the absence of regulation forcing software firms to provide optimal levels of cybersecurity have left a gap that a burgeoning industry of independent security research firms has emerged to fill. Security researchers play an important role in ensuring that consumers receive closer-to-optimal levels of product cybersecurity. They dissect complex cyber systems to reveal vulnerabilities, examine malware to understand how it exploits weak points in information systems, and use that information to monitor and address cyber threats. Unsurprisingly, these firms are becoming increasingly indispensable to the maintenance of cybersecurity across a variety of domains.

Copyright Complications: Apple v. Corellium

Because revealing vulnerabilities in software systems frequently requires independent security researchers to alter, manipulate, or replicate portions of the original software as they search for bugs, their work often intersects with the law of copyright. In general, software firms are loath to grant independent security research firms licenses to use their copyrighted code to find exploits. Even if such licenses could theoretically be negotiated, they are often too cumbersome or costly for security researchers to obtain. Because independent security researchers and the firms that supply the tools they use rely on copying and manipulating code from the original product, they often run the risk of being held liable for copyright infringement.

The issue is neatly illustrated by a recent lawsuit by Apple against Corellium, a start-up whose product emulates several mobile operating systems, including Apple’s iOS. Security researchers who purchase Corellium’s software can use it to analyze and tinker with the emulated operating systems to reveal vulnerabilities. In 2019, based on the start-up’s alleged use of Apple’s copyrighted code and other material to develop its software, Apple brought copyright infringement and Digital Millennium Copyright Act (DMCA) trafficking claims against Corellium in federal district court. Corellium responded with a motion for summary judgment, arguing that its use of Apple’s copyrighted material was justified as fair use—Corellium’s product tweaks iOS and incorporates the start-up’s own code, creating a product that serves a unique and transformative purpose.

In a concise, 18-page order, Federal Judge Rodney Smith agreed with Corellium on the main copyright claims. Apple’s lawsuit claimed that Corellium’s simulated iOS, which contains only the functions necessary for security research, violated Apple’s software copyright. Judge Smith rejected these claims, finding that Corellium’s product made fair use of Apple’s copyrighted material: Corellium’s software serves as a research tool for a small number of security-focused customers rather than competing with Apple’s product for market share, and it is designed to improve the security of Apple’s product. Because Corellium met its burden of showing that its use of Apple’s copyrighted material constituted fair use, Apple lost its copyright claim.

Security researchers cheered the decision as a triumph for Corellium over Apple’s bullying. But although Judge Smith granted summary judgment on the copyright claims on fair use grounds, he allowed Apple’s DMCA claims to proceed.

DMCA Section 1201’s Barriers for Security Research

Apple’s DMCA claims are less straightforward. The crux of the issue is DMCA Section 1201, which provides additional protection for copyrighted digital materials by imposing penalties for circumventing, or hacking past, the technological protections on copyrighted technology or software. Recognizing the particular challenge of preventing the dissemination of copyrighted digital material, Congress designed the DMCA to protect software firms’ ability to include access controls or technological protection measures that shield their software from unauthorized use or dissemination, balancing the needs of copyright owners and consumers. In contrast with ordinary copyright claims, courts have generally not recognized a blanket fair use defense to the DMCA’s anti-circumvention provisions.

While Section 1201 contains several statutory exceptions for security research, none are sufficient to fully protect a company like Corellium. Section 1201(f)’s “reverse engineering” exception applies only where the sole purpose of circumvention is “identifying or analyzing elements . . . necessary to achieve [or] enable interoperability.” Section 1201(g)’s exception for “encryption research” is often unworkable because it requires researchers to make a “good faith effort” to obtain the copyright owner’s authorization before they embark on any testing. Although Section 1201(j)’s exception for “security testing” at first seems tailor-made for work like Corellium’s, it requires the court to conduct two fact-bound inquiries into the circumstances of the use and the maintenance of the information derived from security research, making it difficult to resolve these claims prior to extensive and costly discovery. Finally, while the Copyright Office has a triennial rulemaking process to promulgate regulatory exceptions to Section 1201’s prohibitions on circumvention, the exemptions are neither comprehensive nor permanent.

Judge Smith did not reach these issues in his ruling. After rejecting Corellium’s blanket fair use defense to Apple’s Section 1201 claims, Judge Smith held that Corellium’s arguments about the extent of Apple’s copyright and technological protection measures presented issues of material fact that prevented resolution of the DMCA claims on summary judgment. On this basis, the litigation continues.

Conclusion: Fix the DMCA

Although Judge Smith’s fair use holding represents a major victory for independent security researchers, the court’s failure to resolve the DMCA claims underscores the power that large software firms like Apple wield to prevent researchers from examining their products for bugs and vulnerabilities. While well-funded Silicon Valley darlings like Corellium have the resources to fight back, smaller entities face steeper challenges—not to mention non-profit organizations of ethical hackers who seek to reveal and publicly disclose serious vulnerabilities in widely used software. Unless and until Congress builds more robust protections for security researchers into laws like the DMCA, big tech will continue to wield those laws as a cudgel, with potentially devastating consequences for our cybersecurity.