by Michael
At the time of this writing, six states have criminalized the creation of deepfake pornography of nonconsenting individuals in some way. Hawai'i, Georgia, and New York treat it as something akin to "revenge pornography" and a privacy crime; Wyoming broadly criminalizes deepfakes as a form of identity theft without requiring that the content be sexual in nature; Texas treats it as a general sexual offense; and Virginia places it in the obscenity category.
How To Classify Deepfake Pornography
There's no early consensus as to how this crime ought to be classified, and state legislators can hardly be blamed for that. Deepfake pornography can conceivably relate to any crime involving one's identity or sexual participation. For example, consider a scenario in which a deepfake creator uses a celebrity's appearance to deceive paying customers into believing they're having a one-on-one pornographic video call with that celebrity, when they are actually interacting with a deepfake of the celebrity generated on the fly. If the deepfake creator were criminally charged, their crime could conceivably be considered:
- Identity theft (of the celebrity’s appearance)
- Fraud (for deceiving customers)
- Deceptive Advertising (if advertising for the call featured the image of the celebrity)
- Theft (if the celebrity were a known sex worker and their pornographic content was argued to be their property)
- Electronic solicitation of a minor (if a client were a minor and the video call was two-way)
- Obscenity
- Child Pornography (if the celebrity were a child)
- and many more depending on the details of the fact pattern
Of course, this fact pattern is low-resolution at best, and the exact criminal statutes vary by jurisdiction. But one cannot ignore the underlying truth: there is simply no single best way to classify deepfake pornography of nonconsenting individuals and define it under a statute that encompasses all of its possible definitions.
It seems entirely reasonable, then, to create a new category for deepfake pornography based on elements rather than a concrete, inflexible definition, and to start the discussion of what the elements of such a crime ought to be.
What Should Be Criminalized About Deepfake Pornography?
We might discern the essence of the criminality by examining what each statute has in common. If you read through the criminal deepfake statutes passed so far, you'll notice that each shares a few elements:
Boiled-down Elements
- Intent (save for Georgia) to create or distribute the deepfake content
- The content itself is a deepfake, not a video that actually occurred
- The deepfake has the appearance of a nonconsenting person (save for Hawai’i)
- The deepfake is pornographic or sexual in nature
(Wyoming is excluded here because its statute is broad and isn't directly aimed at deepfakes.)
Each state adds its own frills and definitions to its statute: Virginia, for example, requires intent to coerce, harass, or intimidate the depicted person, while Hawai'i requires intent to create, disclose, or threaten to disclose the content. But the basic element of intent is the same across the states. The four elements above are really the most boiled-down version of a deepfake crime that we have so far.
It strikes me as noteworthy that three of the four boiled-down elements pertain to the deepfake itself, not to the mens rea of its producer or distributor, and not to the potential for harm to the depicted person. This suggests to me that, on the whole, the problem with deepfakes is the content itself, which is fundamentally a value-neutral proposition. In other words, the thing worth criminalizing about deepfake pornography is its very existence, regardless of its surrounding context.
It makes sense, then, that three of the six states have treated deepfake pornography much like their pre-existing revenge pornography crimes: revenge pornography's existence is similarly a crime in and of itself, even though it isn't created in the same manner.
A Model Statute for Deepfake Pornography Criminalization
Texas' statute is already pretty close to the boiled-down elements. It reads:
(b) A person commits an offense if, without the effective consent of the person appearing to be depicted, the person knowingly produces or distributes by electronic means a deep fake video that appears to depict the person with the person's intimate parts exposed or engaged in sexual conduct.
The elements would be:
- Without consent of the depicted person, the defendant has
- knowingly produced or distributed
- by electronic means
- a deepfake video of sexual conduct or of exposed intimate parts
Not bad. It keeps the crime very "chargeable" by removing intent requirements, while still protecting defendants by requiring proof of production or distribution. The statute is centered on the existence of the pornography itself. Were I to write a model statute, I think this would be pretty much it.
Legislative Intent Behind Criminalization
I reached out to Senator Huffman, the state senator who proposed the Texas bill, to ask about the legislative intent behind it. Her office directed me to her comments to the Texas Senate Committee on Criminal Justice when presenting the bill, which begin at 14:15.
I encourage listening to her comments in full; they are succinct. Her stated intention was simple deterrence: by criminalizing the conduct ahead of time, the law would prevent the creation of more victims. Deterrence is reasonable at this point in the technology's life. The technology is just beginning to hit the mainstream, and there are many potential victims whose victimhood could be avoided by early action.
Let’s hope the deterrence works.