An image purporting to show disaster damage in Shizuoka Prefecture, supposedly taken by a drone after the heavy rains caused by Typhoon No. 15, was posted on Twitter, and the person who posted it has acknowledged that it was a fake image created with an image-generation AI.



Experts point out that AI-generated images are rapidly becoming accurate enough that people find them difficult to distinguish from real photographs, and they urge caution, such as checking the source of the information.

On September 26, while Shizuoka Prefecture was suffering from flooding and other damage caused by the heavy rains of Typhoon No. 15, an image claiming to show the disaster in the prefecture as captured by a drone was posted on Twitter.



The image, which appeared to show a residential area flooded over a wide area, was widely circulated, but the poster later admitted that it was a fake, saying he had created it using an image-generation AI service.

We asked Professor Isao Echizen of the National Institute of Informatics in Chiyoda-ku, Tokyo, who specializes in information security, to analyze this image.



According to his analysis, the service the poster is said to have used generates images that follow input text, which makes it possible to create convincing fake images.

A close look at the posted image reveals some unnatural parts, such as the flow of the river suddenly disappearing, but the trees and the river are difficult to distinguish from the real thing.

By contrast, the buildings have unnatural features, such as distorted outlines, and are easier to identify as fake.



In addition to examining the image itself in detail, Professor Echizen says that checking the source of the information is an important way to judge whether an image is genuine.

Specifically, he urges people to search for the original image by entering keywords into an internet search engine to see whether similar images exist, and to check the comments on the post for anything suspicious.



Misinformation on social media during disasters has caused problems before: during the Kumamoto earthquake six years ago, a person who posted false content on Twitter, including a claim that "a lion escaped from the zoo," was arrested on suspicion of fraudulently obstructing the zoo's business.



In addition, if images of affected areas circulating during a disaster are false, residents may make the wrong evacuation decisions, and disaster-response agencies may be burdened with verifying the information.

Professor Echizen says, "The field in which AI learns and generates images is progressing rapidly, and the accuracy of the images is increasing, making it difficult to tell whether they are real. If you cannot confirm the source of the information, you need to suspect that the image may have been generated by AI. Because AI can create images that look like actual disasters, I want people to be careful not to be misled."