Sometimes it is only because we already recognise the victim that we can tell the difference. The Internet Watch Foundation (IWF) has identified a significant and growing threat in which AI technology is being exploited to produce child sexual abuse material (CSAM). Our first report, in October 2023, revealed more than 20,000 AI-generated images on a dark web forum in a single month, over 3,000 of which depicted criminal child sexual abuse activities. While such material, commonly known as child pornography, predates the digital era, smartphone cameras, social media and cloud storage have allowed the images to multiply at an alarming rate.
Media Portrayals
Officials arrested 337 individuals around the world following an international investigation into a dark web child sexual abuse site that sold illegal videos for untraceable digital currencies. The suspects worked as administrators of the site and advised members on how to evade law enforcement while using the platform. “With thanks to our law enforcement partners in Germany and Europol, a dangerous site hosting tens of thousands of child sexual abuse videos has been taken down,” the NCA’s Neil Keeping said. Horrifyingly, forum members referred to those creating the AI imagery as “artists”.
Both recirculated and new images occupy all corners of the internet, on platforms as diverse as Facebook Messenger, Microsoft’s Bing search engine and the storage service Dropbox. Last year, tech companies reported over 45 million online photos and videos of children being sexually abused — more than double the previous year’s figure. The permanence of these photos and videos inflicts continuing trauma on victims, who know the images remain in circulation. Generative AI is exacerbating the problem, as watchdogs report a proliferation of deepfake content featuring real victims’ imagery.

The federal investigation that uncovered and shuttered the first dark web site also led to the closure of three others. “The leadership team that operated one of the sites also operated several of the others.” The search warrant was so important to the bureau that it was approved by the FBI director himself.
Start Of International Cooperation Investigation
Spearman went by the nickname “Boss” and was labeled by the Justice Department as “one of the most significant” purveyors of child sex abuse material in the world. His arrest in 2022, his guilty plea a year later and his eventual life sentence were part of an unprecedented takedown of a prodigious child abuse network. In November 2019, live streaming of child sex abuse came to national attention after AUSTRAC took legal action against Westpac Bank over 23 million alleged breaches of anti-money laundering and counter-terrorism laws. The man had shared sexually explicit videos online, the police said, including one of a 10-year-old boy being “orally sodomized” by a man, and another of a man forcing two young boys to engage in anal intercourse. In 2016, a federal court held that the national center, though private, qualified legally as a government entity because it performed a number of essential government functions.
Illegal Pornography
Serafini called the site “horrific” and attributed McIntosh’s involvement to a lifelong sexual addiction. Given a “psychiatric diagnosis as a pedophile” in the 1990s, McIntosh said he had suffered from the affliction for 50 years and had fought it with varying degrees of success. Her decision followed an hour-long discussion of the website’s contents, and of whether McIntosh’s role warranted a longer or shorter sentence than those of his codefendants, three other men identified as the site’s leaders.
- Adults may offer a young person affection and attention through their ‘friendship,’ but also buy them gifts both virtually and in real life.
- The site had more than 200,000 videos which had collectively been downloaded more than a million times.
- Disturbingly, the ability to make any scenario a visual reality is welcomed by offenders, who crow in one dark web forum about potentially being able to “…create any child porn we desire… in high definition”.
- To get away with such crimes, he says predators use sophisticated tools such as encryption, virtual private networks (VPNs), and cryptocurrency to cover their tracks.
- It may also include encouraging youth to send sexually explicit pictures of themselves which is considered child sexual abuse material (CSAM).
- In its investigation, the National Society for the Prevention of Cruelty to Children (NSPCC) compared official figures published by Apple with numbers gathered through freedom of information requests.
According to law enforcement, when requests are made to the company, Snap often replies that it has no additional information. The anonymity offered by the sites emboldens members to post images of very young children being sexually abused, in increasingly extreme and violent forms. With so many reports of the abuse coming their way, law enforcement agencies across the country said they were often besieged. Some have managed their online workload by focusing on imagery depicting the youngest victims. The Justice Department, given a major role by Congress, neglected even to write mandatory monitoring reports and failed to appoint a senior executive-level official to lead a crackdown. And the group tasked with serving as a federal clearinghouse for the imagery — the go-between for the tech companies and the authorities — was ill equipped for the expanding demands.

Since April last year, the IWF has seen a steady increase in the number of reports of generative AI content. Analysts assessed 375 reports over a 12-month period, 70 of which were found to contain criminal AI-generated images of child sexual abuse. Some of the deepfake videos feature adult pornography altered to show a child’s face; others are existing videos of child sexual abuse onto which another child’s face has been superimposed. According to the snapshot study, there has been a 17 percent increase in online AI-altered CSAM since the fall of 2023, as well as a startling rise in material depicting extreme and explicit sex acts.
Once More, A Judge Rules Against Gov’t In Tor-enabled Child Porn Case
A young person may be asked to send photos or videos of themselves to a ‘friend’ that they might have met online. These photos and videos may then be sent to others and/or used to exploit that child. Alternatively, they may also be used as a threat or manipulation tool to get a young person to participate in sexual or illegal activities. It may seem like the best solution is to restrict or remove access to digital media, but this can actually increase the risk of harm. A youth may then become more secretive about their digital media use, and they therefore may not reach out when something concerning or harmful happens. Instead, it’s crucial that children and youth have the tools and the education to navigate social media, the internet, and other digital media safely.
- The Times found that there was a close relationship between the center and Silicon Valley that raised questions about good governance practices.
- At some point on this timeline, realistic full-motion video content will become commonplace.
- It also demonstrates the dangers of allowing a young child unsupervised access to an internet enabled device with a camera.
- Progress in computer technologies, including progress in generative AI, has enormous potential to better our lives, and misuse of this technology is a small part of this picture.
- The European Union’s law enforcement agency Europol said 79 suspects had been arrested for sharing and distributing child sexual abuse material on a platform known as Kidflix.
“The website monetized the sexual abuse of children and was one of the first to offer sickening videos for sale using the cryptocurrency Bitcoin,” the NCA said in a statement. A search of Mendonsa’s electronic devices revealed thousands of illicit images of children, “approximately 6,500 of which depicted identified victims of his conduct,” the statement said. Westpac was accused of failing to monitor $11 billion worth of suspicious transactions, including those to the Philippines suspected to be for child sexual exploitation. “Offenders often request how they want the child to be sexually abused either before or during the live-streaming session,” the report said.
The Internet Is Overrun With Images Of Child Sexual Abuse. What Went Wrong?
Kidflix was a dark web streaming platform that provided access to thousands of videos depicting extreme child sexual abuse—including crimes against very young children. According to the Internet Watch Foundation, which tracks down and removes abuse imagery from the internet, there has been an 830% rise in online child sexual abuse imagery since 2014.
A new investigation by The New York Times found that the internet’s largest tech platforms are failing to effectively shut down the vast quantities of online child sexual abuse material found in search engines, social networks and cloud storage. Viewing, producing and/or distributing photographs and videos of sexual content involving children is a form of child sexual abuse. This material is called child sexual abuse material (CSAM), once referred to as child pornography. It is illegal to create this material or share it with anyone, including young people. Police have broken up one of the largest child abuse networks in the world, which operated in nearly 35 countries.

Tips included tutorials on how to encrypt and share material without being detected by the authorities. Tech companies are legally required to report images of child abuse only when they discover them; they are not required to look for them. In interviews, victims across the United States described in heart-wrenching detail how their lives had been upended by the abuse.
Testimony in his criminal case revealed that it would have taken the authorities “trillions of years” to crack the 41-character password he had used to encrypt the site. He eventually turned it over to investigators, and was sentenced to life in prison in 2016. Multiple police investigations over the past few years have broken up enormous dark web forums, including one known as Child’s Play that was reported to have had over a million user accounts. Alicia Kozakiewicz, who was abducted by a man she had met on the internet when she was 13, said the lack of follow-through was disheartening. Now an advocate for laws preventing crimes against children, she had testified in support of the 2008 legislation. The most likely places for such behavior to start include social media, messaging apps, and chat rooms – including on gaming devices.