
OpenAI Announces Media Manager; Starts Testing Deepfake Image Detector


OpenAI made several announcements today related to how its generative AI tools are used, and how images created with these tools can be recognized.

The first announcement reveals that OpenAI is working on a tool called Media Manager. It is designed to let content creators and owners inform OpenAI of what they own and whether they want that content used to train the company’s AI models.

OpenAI stated:

It will require innovative machine learning research to build a first-of-its-kind tool that will help us identify copyrighted text, images, audio and video across multiple sources and reflect creator preferences.

The tool is still in its early stages and will not officially launch until 2025. OpenAI may have been spurred to build it by the many lawsuits it currently faces from news organizations, including The New York Times, which claim the company illegally used their content to train its AI models.

Today’s second announcement revealed that OpenAI is working on another tool designed to detect whether images have been created with its DALL-E AI art generator. OpenAI says:

Our goal is to enable independent research that evaluates the effectiveness of the classifier, analyzes its real-world application, raises relevant considerations for such use, and investigates the characteristics of AI-generated content.

OpenAI says an early version of the currently unnamed tool correctly identified over 98 percent of images generated by DALL-E, while incorrectly flagging non-AI images as DALL-E creations less than 0.5 percent of the time. The tool is currently in private testing with a number of “research labs and research-oriented journalism associations.” There is no word yet on when it will be generally available.

OpenAI also said in the same blog post that it is joining the Steering Committee of the Coalition for Content Provenance and Authenticity (C2PA), a group that creates standards for certifying digital content. In addition, OpenAI and its major partner Microsoft are contributing $2 million to what the companies are calling the Societal Resilience Fund, which will support a number of organizations dedicated to the “education and understanding of AI”.
