AI-generated child sexual abuse videos are now “indistinguishable” from real footage, a major charity has warned.
The Internet Watch Foundation (IWF), which finds and helps remove abusive imagery online, said criminals were rapidly making more realistic and more extreme content, and warned that the technology may soon enable the production and distribution of feature-length films of such material.
Highly realistic videos are no longer limited to the short, glitchy clips that were common with earlier versions of the technology; criminals are now using AI on a huge scale to produce material that often includes the likenesses of real children.
New IWF data published on Friday recorded 1,286 individual AI-generated child sexual abuse videos in the first half of this year.
Last year, only two such videos were discovered during the same period.
A government minister described the figures as “utterly horrific” and said the criminals behind the videos were “just as disgusting as those who pose a threat to children in real life”.
The IWF said all of the videos confirmed so far in 2025 were so convincing that they had to be treated under UK law exactly as if they were real footage.
More than 1,000 of the videos were assessed as Category A material, the most extreme category, which can include depictions of rape, sexual torture and bestiality.
The statistics also showed that AI-generated child sexual abuse material was discovered on far more webpages in the first half of this year than the 42 recorded over the same period in 2024, while reports of such imagery to the charity increased by 400%.
Each webpage can have several images or videos.
Earlier data released by the IWF showed that 291,273 reports of child sexual abuse imagery were made last year.
The charity has urged the government to ensure the safe development and use of AI models by introducing binding regulation to make sure the technology’s design cannot be misused.
The IWF’s interim chief executive, Derek Ray-Hill, said: “We must do everything we can to prevent a flood of synthetic and partially synthetic material joining the already record quantities of child sexual abuse imagery we are battling online.
“I am dismayed to see the technology continues to develop at pace, and that it is being abused in new and unsettling ways.
“Just as we saw with still images, AI videos of child sexual abuse have now reached the point where they can be indistinguishable from genuine films.
“The children being depicted are often real and recognisable, the harm this material does is real, and the threat it poses threatens to escalate even further.”
Mr Ray-Hill said the government needed to “get a grip” on the issue because it was currently “far too easy” for criminals to produce the videos, and that feature-length AI-generated child sexual abuse films depicting real children were now inevitable.
“The prime minister recently pledged that the government would ensure tech can create a better future for children. Any delay only undermines the government’s efforts to fulfil its pledge to keep children safe and halve violence against girls.
“Our analysts tell us nearly all of this AI abuse imagery features girls. It is clear this is yet another way girls are being targeted and endangered online.”

The safeguarding minister, Jess Phillips, said: “These statistics are utterly horrific. Those who commit these crimes are just as disgusting as those who pose a threat to children in real life.
“AI-generated child sexual abuse material is a serious crime, which is why we have introduced two new laws to crack down on this vile material.
“Soon, perpetrators who own the tools that generate the material, or manuals teaching them to manipulate legitimate AI tools, will face longer prison sentences, and we will continue to work with regulators to keep more children safe.”
An anonymous senior analyst at the IWF said creators of AI child sexual abuse imagery had moved far beyond the quality of video that was available last year.
“The first AI child sexual abuse videos we saw were deepfakes – a known victim’s face put on to an actor in an existing adult pornographic video,” he said. “It wasn’t sophisticated, but it could still be pretty convincing.
“The first fully synthetic child sexual abuse video we saw, early last year, was just a series of jerky images put together – nothing convincing.
“But now they have really turned a corner. The quality is alarmingly high, and the categories of offence depicted are becoming more extreme as the tools improve their ability to generate video showing two or more people.
“The videos also include sets showing known victims in new scenarios.”
The IWF urges the public to report images and videos to the charity anonymously, making each report only once and including the exact URL where the material is located.