Hashing It Out: How an Automated Crackdown on Child Pornography Is Shaping the Fourth Amendment
As AI develops and Google improves its recognition techniques, the Fourth Amendment will continue to be tested. Why does this matter? Because protection from improper governmental searches is a line that should not be crossed, and how courts rule on these issues in the next few years will determine a major direction of criminal defense.
The Fourth Amendment only applies to government searches, and under an age-old rule, the government may use the fruits of private searches in investigations and prosecutions, no matter how unreasonable those searches are. However, the “private search doctrine” was forged in an era when private searches were largely case-specific and human-performed. The world has changed: Google and other companies increasingly use automated systems to monitor all of their users for illegal activity. In 2018, Chief Justice John Roberts acknowledged that modern corporations are not typical witnesses: “they are ever alert, and their memory is nearly infallible.”
The internet has enabled an unprecedented rise in the availability of child pornography, and the removal of online child pornography and prosecution of its distributors is sorely needed. Courts have been happy to ratify extensive digital inspections performed by private companies like Google. However, there are reasons to be cautious. The surveillance technology that put the defendant in U.S. v. Miller in prison is systematized and ubiquitous, and the government has easy, constitutional, and, in the case of child sexual abuse material, mandatory access to its output. In its present state, the private search doctrine is broader than it has ever been. It is time to consider whether the benefits are worth the risks.
The Ever-Watching Eye: How Google Spots Illegal Content
Imagine sending an email with an attachment. Unbeknownst to you, as soon as that file is attached, it passes through a “hashing” checkpoint. But what exactly is “hashing,” and how does it work?
A “hash” is a short string of characters derived from a digital file’s contents, similar to a digital fingerprint. Because the hash is computed from the file itself, two identical images will always have the same hash. However, even a slight alteration, like a one-pixel difference, will result in a completely different hash.
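To make the “digital fingerprint” idea concrete, here is a minimal sketch in Python. It uses the standard-library hashlib module and SHA-256 purely as an illustrative stand-in; Google does not publicly disclose the exact hash functions behind its checkpoint:

```python
import hashlib

def file_hash(data: bytes) -> str:
    """Return the hex-encoded SHA-256 digest of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

# Stand-ins for an image file's raw bytes (a real file would be read from disk).
original = b"\x89PNG...image bytes..."
identical_copy = b"\x89PNG...image bytes..."
altered = b"\x89PNG...image bytez..."  # a single byte changed

print(file_hash(original) == file_hash(identical_copy))  # True  -- identical files, identical hashes
print(file_hash(original) == file_hash(altered))         # False -- one tiny change, unrelated hash
```

Determinism (identical input, identical hash) and sensitivity (any change, a new hash) are the two properties the comparison step described below relies on.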
This hashing technology, developed in the late 2000s, has become ubiquitous across the tech industry as a cost-effective and reliable method for screening for illegal content.
How Google’s Hashing System Works
Google employs a two-part process for all files uploaded to its platforms:
- Hash Generation: First, Google scans the file’s properties and creates a unique hash.
- Hash Comparison: Next, this generated hash is compared against a comprehensive list of hashes known to correspond with child pornography.
If a match is found, it indicates that the uploaded file is identical to an image that a Google employee has previously reviewed and confirmed as child pornography.
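Putting the two steps together, the checkpoint amounts to a set-membership test. The sketch below is hypothetical: the database of confirmed hashes and the reporting pipeline are internal to Google and NCMEC, and the hash value shown is a made-up placeholder.

```python
import hashlib

# Hypothetical placeholder for the curated database of hashes that human
# reviewers have previously confirmed correspond to child pornography.
KNOWN_ILLEGAL_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def generate_hash(data: bytes) -> str:
    """Step 1 -- Hash generation: fingerprint the uploaded file."""
    return hashlib.sha256(data).hexdigest()

def is_flagged(data: bytes) -> bool:
    """Step 2 -- Hash comparison: check the fingerprint against the known list."""
    return generate_hash(data) in KNOWN_ILLEGAL_HASHES

if is_flagged(b"...uploaded attachment bytes..."):
    # A match means this exact file duplicates an image a human reviewer has
    # already confirmed -- the point at which reporting duties kick in.
    print("flagged: upload matches a previously confirmed image")
```

Note that no human looks at the flagged file at this stage; the system only asserts that its fingerprint matches one generated from a previously reviewed image.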
Moral and Legal Obligations
Google’s use of hashing checkpoints isn’t mandated by law. Instead, Google has both moral and commercial reasons to actively screen for child pornography on its platforms. However, once its checkpoint flags a file, as it did in Miller’s case, Google incurs a legal duty to take action.
The Cybertip: Reporting Illegal Content
Federal law, specifically 18 U.S.C. § 2258A, requires Google and similar services to report any child pornography they detect by submitting a “cybertip” containing information identifying the account user. These reports are sent to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization charged by § 2258A with processing these tips.
Google sometimes reviews a cybertip before submitting it, but in most cases the tip is sent automatically. In Miller’s case, although the hash values of the attached files matched images that a Google employee had previously identified as illicit, no one at Google actually looked at the images themselves.
Once the cybertip reaches NCMEC, a worker conducts a more in-depth investigation, supplementing the information provided by Google by tracing the user’s public online presence. Like Google, NCMEC does not open the files at this stage.
From Cybertip to Conviction
Though a subpoena usually suffices to obtain an address from an internet service provider, the government needs a warrant to search a Gmail account. A hash-match cybertip usually provides enough evidence to obtain one.
If the search of the account reveals more incriminating emails, as it usually does, the government can obtain another warrant to search the defendant’s house, and yet another to search the devices and hard drives found on the premises.
This chain of events can take considerable time to unfold; in Miller’s case, it culminated in an indictment on six counts related to child pornography.
At trial, the government typically presents the cybertip, the emails provided by Google, subscriber information from the defendant’s ISP, and any other relevant evidence related to the materials. The government’s case will usually rest on this information and Google’s hash match.
The Private Search Doctrine: A Loophole in the Fourth Amendment?
The Fourth Amendment protects your reasonable expectation of privacy from government searches. In most cases, this means the government needs a warrant to search your email or other online accounts. However, as a private party, Google isn’t bound by the same rules, and thanks to the private search doctrine, the government can use what Google finds.
The Logic Behind the Doctrine
The private search doctrine is based on a seemingly simple logic: the Fourth Amendment only applies to government action. Therefore, if a private party conducts an unreasonable search, there has been no government search, and thus no Fourth Amendment violation. Once a private search has been conducted, any continuing expectation of privacy regarding the searched items is considered unreasonable. As a result, the government can review the results of a private search as long as it doesn’t exceed the search’s scope.
Limitations of the Doctrine
However, the private search doctrine isn’t absolute. The Fourth Amendment protects against unreasonable government searches, not just searches conducted by government officials. This means that a search by a private party is unconstitutional when it’s “fairly attributable” to the government.
For example, in U.S. v. Reed, the Ninth Circuit found the search of a motel room by the motel manager unconstitutional because two police officers stood by as lookouts, effectively “indirectly encourag[ing]” the search even though they didn’t instigate it or enter the room themselves.
The Supreme Court’s Perspective
The Supreme Court most recently addressed the private search doctrine in U.S. v. Jacobsen, where FedEx employees inspecting a damaged package noticed a suspicious white powder and notified the police. The police tested the powder and identified it as cocaine. The drug test didn’t exceed the scope of the FedEx inspection because, whether the result was positive or negative, it couldn’t compromise any legitimate interest in privacy: there is no legitimate privacy interest in possessing cocaine, and a negative test would only show that the powder wasn’t cocaine, not reveal any private fact.
The automated hashing systems employed by Google and other tech companies represent a paradigm shift in the scale and scope of private searches. The question remains whether the traditional private search doctrine adequately addresses these new realities or if it’s time to re-evaluate the balance between privacy rights and law enforcement needs in the digital age.
If you are facing charges that involve these issues, contact JC Law to discuss how they may apply to your case.