We are rapidly approaching a point where AI image generation will be indistinguishable from real-life photos and video. With AI audio generation, we are basically already at that point: AI can generate fake clips of real people "saying" things they never actually said, or "singing" songs they never actually sang.
What are the implications of living in a world where a genuine photograph, video, or audio recording is totally identical in quality to an AI-generated fake? How will humans ever be able to trust recorded evidence of ANYTHING again, knowing that any "photo" or "footage" they see could simply be an AI-generated fake?
Studying history will become impossible, because every piece of evidence you find for past events could simply be an AI forgery. Recording current events will become impossible, because even if you capture high-quality 4K footage of something happening in real time, people will just dismiss it and say "yeah but that could just be AI". Nobody will trust anything anymore unless they see it with their own two eyes in real life.
>inb4 "b-b-but you can still tell the difference between AI and real life"
Yes, right now you can, because there are still noticeable flaws, e.g. the weird arms on the girls in pic related. But one day these flaws will be ironed out, and AI image/video will look identical to IRL. What will humans do when that day comes? How will we cope?