The success of these AI tools seems to be driven by their ability to view and collect millions or billions of lightweight images, currently sourced mostly from Common Crawl.
If obtaining images via a script at scale were made much more complicated or resource-intensive, I wonder whether that could successfully counter these tools?
On their own site, someone could at least provide a robots.txt file. But if artists want to use popular social media sites that don't offer that, would uploading their work as videos with some sort of canned intro to prevent thumbnailing be a good deterrent?
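For the robots.txt idea, a minimal sketch might look like the following. CCBot is Common Crawl's documented user agent, and GPTBot is OpenAI's; other crawler names would need to be looked up and added individually.

```
# Disallow Common Crawl's crawler site-wide
User-agent: CCBot
Disallow: /

# Disallow OpenAI's crawler site-wide
User-agent: GPTBot
Disallow: /
```

Worth noting that robots.txt is purely advisory: it only deters crawlers that choose to honor it, so it can't stop a scraper that ignores the convention.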