SALEM, Ore. (KATU) — A new law set to take effect in a few weeks will criminalize AI-generated sexually explicit images, expanding the crime of illegal intimate images to include digitally created, manipulated, or altered images.
This legislative move comes in response to a significant rise in AI-generated intimate deepfakes, which experts argue predominantly victimize women and children.
“This is a serious problem because artificial intelligence has become so sophisticated,” said Rep. Kevin Mannix of Salem, a Republican and one of the chief sponsors of the bipartisan bill. “This is not funny. People are being victimized and we want to put a stop to it,” he added.
SEE ALSO | Multnomah County investigates AI-generated nude images of Corbett High School students
Under the new law, prosecutors must prove that the accused intended to harass, humiliate, or injure a victim and that the victim experienced harassment, humiliation, or injury.
Mannix said the bipartisan law will send a strong message. “I hope that all of our schools around the state will get the message out to their students that the use of artificial intelligence to create a fake image is, in itself, wrong.”
RELATED | Corbett families question administration’s delayed reporting of nude AI image of student
Concerns over artificial intelligence have been growing, particularly after an incident in the Corbett School District where parents and students claimed administrators did not act quickly enough after an AI-generated picture showing a nude student began circulating online.
The district reported that a Corbett High School principal discovered the AI photo last week and, following a multi-day investigation, reported the incident to police.
The Multnomah County Sheriff’s Office is now investigating the case as potential child sexual abuse material, though it remains unclear what charges, if any, will be filed.
Starting in January, the unlawful dissemination of an intimate image will be classified as a Class A misdemeanor, punishable by up to a year in jail and a fine of more than $6,000.
Repeat offenders could face felony charges, and victims will have the right to sue.
Meanwhile, Washington state already has laws on the books addressing AI-generated explicit imagery of both children and adults.
President Donald Trump also signed the “TAKE IT DOWN Act” in May, legislation aimed at combating the spread of artificial intelligence-generated explicit imagery.
