
DeepBrain AI, a South Korean company specializing in generative AI, announced that it provided a deepfake detection solution to verify the authenticity of entries in an AI contest hosted by the Korea International Cooperation Agency (KOICA). With recent advances in generative AI, it is becoming increasingly difficult to discern AI use with the naked eye, raising the need for specialized detection technology to ensure fair evaluation.
DeepBrain AI deployed its deepfake detection solution, AI Detector, at the "2025 Republic of Korea Humanitarian Aid AI Promotional Video and Poster Contest" hosted by KOICA last October. The solution identifies AI-generated content by analyzing pixel-level changes on screen and subtle patterns in video and audio. It also examines the flow of the production process, including visuals, audio, and scene transitions, enhancing the fairness of the evaluation.
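The article does not disclose how AI Detector works internally, but the idea of flagging unnatural pixel-level changes between frames can be illustrated with a toy example. The sketch below is purely hypothetical: the function names, the synthetic frames, and the z-score threshold are assumptions for illustration, not DeepBrain AI's actual method.

```python
# Minimal sketch of frame-difference analysis, assuming grayscale frames as a
# NumPy array. This is an illustrative toy, not the AI Detector implementation.
import numpy as np


def frame_difference_scores(frames: np.ndarray) -> np.ndarray:
    """Mean absolute pixel change between consecutive frames.

    frames: array of shape (num_frames, height, width), values 0-255.
    """
    diffs = np.abs(np.diff(frames.astype(np.float32), axis=0))
    return diffs.mean(axis=(1, 2))


def flag_suspicious_transitions(scores: np.ndarray, z_threshold: float = 3.0) -> np.ndarray:
    """Flag transitions whose pixel change deviates strongly from the video's norm."""
    mean, std = scores.mean(), scores.std() + 1e-8
    return np.where(np.abs(scores - mean) / std > z_threshold)[0]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    video = rng.integers(0, 256, size=(120, 64, 64))  # synthetic stand-in for decoded frames
    video[60] = 255                                   # inject an abrupt, unnatural change
    scores = frame_difference_scores(video)
    print("Suspicious transitions at frame indices:", flag_suspicious_transitions(scores))
```

A production detector would of course combine many more signals (audio artifacts, scene-transition consistency, learned models) than this single statistic.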
Jang Se-young, CEO of DeepBrain AI, said, "As generative AI technology spreads, a system that can transparently verify how content was created and how AI was used is essential to ensuring fairness and reliability." He added, "We will continue working to create a safe and transparent environment for AI use across both the public and private sectors."
DeepBrain AI is continuing to advance its deepfake detection solution through the 'Cultural Technology Research and Development' project led by the Ministry of Culture, Sports and Tourism and the Korea Creative Content Agency. It is also pursuing related initiatives, such as launching an API service and developing technology to determine whether generative AI video has been manipulated.