Artificial intelligence technology has advanced to the point where it can generate highly realistic images, including content that depicts illegal activities. This new reality leads to an important question: is it illegal to view or produce AI-generated child sexual abuse material (CSAM), commonly referred to as child pornography, if no actual children were involved?
As AI technology becomes mainstream, programs that can generate photos and videos have become increasingly popular. Some users employ AI-generated imagery for illicit purposes, including the creation of child pornography.

AI programs are often used to generate sexual images and videos of fictional characters, while other times, they depict actual people performing sexual acts, referred to as "deepfakes."
Policymakers have been working on passing laws about AI-generated pornography. Notably, while pornographic imagery of fictional adults falls under the First Amendment right to free speech, deepfakes and sexual depictions of children are illegal in some states and in the process of being outlawed in others.
In California, Governor Gavin Newsom signed AB 1831, which criminalizes the creation, distribution, and possession of artificial intelligence-generated child sexual abuse material (CSAM).
This law addresses the rapidly accelerating dangers posed by AI technologies that create disturbing and harmful content resembling actual children. AB 1831 is one of three bills signed to protect Californians from fast-moving AI technology and to advance its responsible use.
Although some states are still in the process of updating their laws to catch up with technology, federal law is quite clear that AI-generated images of CSAM fall under the legal definition of child pornography. Therefore, it's a federal crime to possess, view, create, or distribute such material.
If you're accused of doing any of these things, you could face significant time in federal prison if convicted, even if no actual minors were directly involved in the creation of the images.
What is CSAM?
Child sexual abuse material is defined as any visual depiction of sexually explicit conduct involving a minor. Although the term "child pornography" is still widely used, legislators, law enforcement, and advocates increasingly prefer terms that emphasize the abusive nature of the content, such as "child sexual abuse material" (CSAM) or "child sexual exploitation and abuse imagery" (CSEAI). In the United States, efforts to fight CSAM online, and most recently the use of AI to generate it, have wide bipartisan support.
Under federal law, knowing possession of CSAM is a crime. Importantly, the law treats AI-generated CSAM the same as real-life CSAM. Specifically, federal laws prohibit:
- Any visual depiction of CSAM that is indistinguishable from an actual minor engaging in sexual conduct.
- Visual depictions of any kind, including computer-generated images, drawings, cartoons, sculptures, or paintings that show a child engaging in sexual conduct if it is obscene or lacks serious artistic value.
Notably, the law does not require that a depicted minor actually exist. Thus, people and organizations risk criminal liability even if the CSAM they host does not depict an actual child.
Studies have shown that the majority of people possessing and distributing CSAM also commit hands-on sexual offenses against minors. Frequently, the abuse has been committed by someone that the child knows and trusts. Offenders often use grooming techniques to normalize sexual contact and encourage secrecy.
AI-generated child pornography, including deepfakes and images of purported children, is illegal under federal law. Remember that a crime generally becomes a federal matter when it crosses state lines, and crimes committed over the internet almost always involve interstate communication. AI-generated child pornography, therefore, may be prosecuted under both state and federal law.
What is the Federal Definition of Child Pornography?
Under federal law, the term "child pornography" is defined in 18 U.S.C. § 2256(8). This statute clarifies that child pornography includes any visual depiction, such as a photograph, video, or computer-generated image, that involves a minor engaged in sexually explicit conduct.
Importantly, for cases involving modern technology, the definition does not stop at traditional forms of media. Federal law explicitly includes the following within its scope:
- Computer-Generated Depictions: This refers to visuals that are digitally created to depict a child engaging in sexually explicit conduct. This means that digital creations, even if no real child was involved in their production, can still qualify as CSAM.
- Indistinguishable Depictions: A computer-generated image or picture that is "indistinguishable" from a real visual depiction of a minor engaged in sexually explicit conduct is classified as child pornography. This addresses the increasing sophistication of AI-generated content to ensure that hyper-realistic depictions fall under the same legal restrictions.
- Computer-Modified Depictions: This refers to altering or modifying an image to make it appear that an identifiable child is engaged in sexually explicit conduct. For example, if you superimpose the head of an actual child onto an AI-generated sexual scene, the result is still child pornography even though the child was not actually involved in the depicted act.
By framing the law this way, Congress leaves little room for ambiguity regarding AI-generated material. The statutory language reflects the understanding that CSAM is harmful in itself, even if no actual minors were abused in the production of the material.
Simply put, federal law prohibits the production, advertisement, transportation, distribution, receipt, sale, access with intent to view, and possession of child sexual abuse material (CSAM). Underlying every sexually explicit image or video of a child is abuse, rape, molestation, or exploitation, and the production of CSAM creates a permanent record of the child's victimization.
What are the Federal Penalties?
The penalties for possessing or distributing child pornography, whether it involves real children or computer-generated images, are severe under federal law. The principal statutes governing these offenses are found in 18 U.S. Code § 2252 and 18 U.S. Code § 2252A. These laws carry strict sentencing provisions, including:
- Possession: Anyone found in possession of child pornography, including AI-generated materials, can face up to 10 years in prison. If the image in question depicts a prepubescent minor or a minor under the age of 12, sentences can increase.
- Distribution or Production: If the individual is accused of producing, distributing, or receiving child pornography, the penalties are even more significant. Sentences range from a mandatory minimum of 5 years to a maximum of 20 years in federal prison. Repeat offenders or those with prior convictions for similar crimes face harsher terms, with mandatory minimums of 15 years and maximums of 40 years.
Defending Against Federal Child Pornography Charges
The stakes are extraordinarily high for those accused of federal crimes involving AI-generated CSAM. Federal prosecutors and courts take these offenses seriously, with penalties designed to reflect the gravity of the crime. The law does not differentiate between traditional and AI-generated images when it comes to the harm inflicted or the societal impact.
This means that the fact that no minors were used in the creation of the material will not be accepted as a defense against the charges. If you are accused of a federal crime involving child pornography, hiring a knowledgeable federal criminal defense attorney is an important first step in navigating the charges.
Despite the severity of the crime and the possibility of severe penalties, a good attorney can still employ several defense strategies on your behalf. Among the most common defenses are:
- Lack of Knowledge or Intent: A conviction under federal child pornography laws requires prosecutors to show that you knowingly produced, distributed, received, possessed, or viewed the material in question. Showing that you were unaware the material was child pornography, that your viewing of it was accidental or incidental, or that you did not know the depictions were of minors can therefore serve as valid defenses.
- No Foreign or Interstate Commerce: For child pornography of any type (AI-generated or traditional) to be prosecutable under federal law, the offense must involve federal infrastructure (e.g., the mail or the internet) or cross state or national boundaries. If your alleged activity did not meet these criteria and was contained within one state, your attorney may be able to argue that it falls under state rather than federal jurisdiction, which could lower the potential penalties.
For more information, contact Eisner Gorin LLP, our federal criminal defense lawyers based in Los Angeles, California.