
OpenAI unveils new tool to identify AI-generated images


OpenAI on Tuesday unveiled a new artificial intelligence (AI) tool for identifying AI-generated images. The AI firm announced the tool while highlighting the need to authenticate AI-generated content and raise awareness about it. The company has also formally joined the Coalition for Content Provenance and Authenticity (C2PA) committee, which has created an open standard for labeling AI-generated content. Notably, OpenAI has been applying this standard to its Dall-E-generated images since February 2024 and continues to add AI-related information to image metadata.

In a blog post, OpenAI highlighted the new challenges that have emerged with the spread of AI-generated content. The company said, “As generated audiovisual content becomes more common, we believe it will become more important for society as a whole to adopt new technology and standards that help people understand the tools used to create the content they find online.” Additionally, the ChatGPT maker said it is taking two different measures to contribute to AI content authentication.

As its first step, OpenAI formally joined the C2PA committee, calling it a widely used standard for digital content authentication. The company also highlighted that the standard is followed by a wide range of software companies, camera manufacturers, and online platforms. Simply put, C2PA advocates adding information to the metadata of images and other file types to reveal how they were created. While an image taken with a camera will include the camera’s name and specifications, an AI-generated image will include the name of the AI model that produced it.

This type of authentication is used because the provenance information travels inside the image’s metadata, so it can persist even when the image is shared, cropped, or otherwise edited.
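C2PA manifests are stored and verified with dedicated C2PA tooling, but the general idea of creation details travelling inside the file can be illustrated with a short sketch. The snippet below uses the Pillow library to list whatever EXIF metadata an image file carries; the file path is hypothetical, and this shows embedded metadata in general rather than the C2PA verification flow itself.

```python
# Minimal sketch: list the EXIF metadata embedded in an image file.
# Assumptions: Pillow is installed and "example.jpg" is a hypothetical path.
# C2PA provenance manifests actually live in their own container (JUMBF/XMP)
# and are checked with C2PA tooling; this only illustrates that creation
# details can travel inside the file itself.
from PIL import Image, ExifTags

path = "example.jpg"  # hypothetical file path

with Image.open(path) as img:
    exif = img.getexif()
    if not exif:
        print("No EXIF metadata found in", path)
    for tag_id, value in exif.items():
        # Map numeric EXIF tag IDs to readable names where possible
        tag_name = ExifTags.TAGS.get(tag_id, tag_id)
        print(f"{tag_name}: {value}")
```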

As its second measure, OpenAI said it is working on a new tool that can identify AI-generated images. The company has not given the tool a formal product name, referring to it simply as “OpenAI’s Image Detection Classifier.” The tool predicts the probability that an image was created by Dall-E. According to the post, the tool correctly tagged about 98 percent of Dall-E-generated images when distinguishing them from real images, even when the images had been cropped or had filters applied. However, the tool struggles with images generated by other AI models; the AI firm said that in those cases it makes mistakes on about 5 to 10 percent of the sample.
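As a rough, back-of-the-envelope illustration of what those reported rates would mean in practice, the sketch below applies them to a hypothetical batch of images. The sample sizes are assumptions chosen only for illustration; the rates are the figures cited above.

```python
# Back-of-the-envelope sketch of the reported rates.
# The sample sizes are hypothetical; the rates are the figures cited in the
# article (~98% for Dall-E images, roughly 5-10% error on other models' images).
dalle_images = 1000       # hypothetical count of Dall-E-generated images
other_ai_images = 1000    # hypothetical count of images from other AI models

dalle_tag_rate = 0.98     # share of Dall-E images correctly tagged

# Reported 5-10 percent range applied to the hypothetical other-model sample
low = int(other_ai_images * 0.05)
high = int(other_ai_images * 0.10)

print(f"Dall-E images tagged: ~{int(dalle_images * dalle_tag_rate)} of {dalle_images}")
print(f"Other-model images misjudged: ~{low}-{high} of {other_ai_images}")
```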

For now, OpenAI has opened the tool for limited public testing and is inviting research labs and research-focused journalism nonprofits to register with the AI firm to gain access to it.



