@Xeraser I actually appreciate how specific they are being.
They deliberately use very specific wording. Here are some examples.
Federal law prohibits the production, advertisement, transportation, distribution, receipt, sale, access with intent to view, and possession of any CSAM, including realistic computer-generated images. (They even underline the “Including realistic computer-generated images.”)
Individuals have been known to use content manipulation technologies and services to create sexually explicit photos and videos that appear true-to-life.
In November 2023, a child psychiatrist in Charlotte, North Carolina, was sentenced to 40 years in prison, followed by 30 years of supervised release, for sexual exploitation of a minor and using AI to create CSAM images of minors. Regarding the use of AI, the evidence showed the psychiatrist used a web-based AI application to alter images of actual, clothed minors into CSAM
In November 2023, a federal jury convicted a Pittsburgh, Pennsylvania, registered sex offender of possessing modified CSAM of child celebrities. The Pittsburgh man possessed pictures that digitally superimposed the faces of child actors onto nude bodies and bodies engaged in sex acts.
They are using terms like “true-to-life,” “realistic computer-generated images,” and “actual, clothed minors.” These specific details are exceptionally important when talking about legality, and I’m of the opinion they’re included for a reason. So much legal language is left vague; they did not have to include these details, some of which they specifically underline.
Also, take a look at those cases: both of the referenced cases appear to involve actual abuse of an actual minor, with additional charges incurred for the images.
Obviously, we’ll see how this goes, but these details jump right out at me.