AI rapidly degrades if it isn't fed constantly with new human art.
Eventually AI art will degrade to the point where it won't make anything resembling a picture. Unless it can get millions of new human made art works to vampirically draw from.
It's over, AI "art" is finished. It's inevitable.
https://mindmatters.ai/2022/10/ai-art-is-not-ai-generated-art-it-is-engineer-generated-art/
Basically
>when humans do art, they look at their own art and improve
>when a machine looks at its own art it doesn't improve, it gets worse.
Isn't that because there is no quality checking? If you only fed the AI pictures it has done well on, I don't see how the results would get worse.
>If you only fed the AI pictures it has done well on, I don't see how the results would get worse.
You REALLY don't see why that's not going to work??? Lmao
No. Only if you are stupid enough to feed the output back into your training data. An idiot wrote that linked article.
If you keep the model the same, or feed it only new, quality, man-made art, it cannot degrade. If you feed it fake art, it has no reference for what a real body or animal or house looks like.
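To make that concrete, here's a rough sketch of what "quality checking" before retraining could look like. Everything here (quality_score, the threshold) is a hypothetical stand-in for whatever signal you actually have (user ratings, a critic model), not any real pipeline.
[code]
# Hypothetical sketch: keep all human-made art, and only let generated images
# back into the training set if some external quality signal clears a bar.
# quality_score() is a placeholder, not a real API.

def build_next_training_set(human_art, generated_art, quality_score, threshold=0.9):
    curated = [img for img in generated_art if quality_score(img) >= threshold]
    return human_art + curated

# With threshold=0.0 every flawed output feeds the next model and small errors
# compound across generations, which is the degradation the article describes.
[/code]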
the article writer isn't lying. The AI models need to be topped up with new training data to stop degradation.
>only new, quality, man-made art
exactly the point he made
I cannot comprehend the kind of confusion of ideas that could provoke such a statement.
Do you think the model files on a hard drive get corrupted faster than any other stored files? As if the training can only generate a magical arrangement of bits, millions of times less stable than any other arrangement previously discovered?
If you set those files to read-only, do you think the executable will secretly develop a rootkit to overwrite the model with a shittier version every time it's run?
i think you have no idea what you are saying and have no reading comprehension.
You said the guy was wrong then agreed with him.
That's basically the point, isn't it?
Giving it its own output doesn't improve it.
But I think most people in the field would have expected that.
It's no secret that it's not really AI, any more than image recognition is AI.
>when a machine looks at its own art it doesn't improve, it gets worse.
Because that's not how you do this. You give it some reward function that tells the model how well it did, probably based on user feedback.
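As a hedged illustration of "give it a reward function based on user feedback": something like a REINFORCE-style update, where each generated sample nudges the model in proportion to how users rated it. The model and reward interfaces below are made up for the sketch, not a real framework.
[code]
# Toy policy-gradient style update: outputs users liked pull the model toward
# producing more of the same, downvoted outputs push it away. reward() and the
# model's methods are hypothetical interfaces.

def feedback_update(model, samples, reward, learning_rate=1e-4):
    for sample in samples:
        r = reward(sample)                      # e.g. +1 upvote, -1 downvote
        grads = model.log_prob_grad(sample)     # gradient of log p(sample) w.r.t. params
        for name, g in grads.items():
            model.params[name] += learning_rate * r * g
[/code]
The point being that the feedback, not the model's own output, is where the new information comes from.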
FPBP
If you need a prompter to draw for you, then you can't fucking draw. Simple as. Real art takes real skill & talent. To master it takes years of hard work, practice, & patience. Simply letting an artbot draw for you is pure laziness & shows you have no artistic integrity.
I'm glad art communities are pushing back on the AI artbots
Cope
>If you need a prompter to draw for you, then you can't fucking draw.
>pure laziness
>no artistic integrity
>"these fucking AI so called 'artists' think they're better than me??!?!"
pretty sure most aibros would agree with you, we can't draw worth a shit. and i'm sorry you're so mad. and i'm genuinely sorry that a lot of artists are suffering emotionally, but welcome to the future. you can always do what the puritans did and find a nice electricity free place where the ai can't hurt you.
Don't care nor want to be associated with "Art"
>Real art takes real skill & talent.
>mindmatters.AI
this process is accelerated by the woke filtering that is happening
I'm watching things dumb down in real time, and it is making all of us dumber as a result too
For example, naming things is hard in programming
There used to be online thesauruses that could help with this
but those are now being whittled away
no, sorry, this word is racist, this word isn't woke enough
now when I search the thesaurus, it's like finding 1984 newspeak
there might be five results for a word and I already knew all those
I need a word that means precisely what I am doing, and it probably exists, but woke AI won't let me see it now
i noticed this effect with search engines too, they won't show you what you want anymore
>mfw anons are angry they cant so easily find cartoon porn of nine year olds anymore
>Psychological projection is a defense mechanism people subconsciously employ in order to cope with difficult feelings or emotions. Psychological projection involves projecting undesirable feelings or emotions onto someone else, rather than admitting to or dealing with the unwanted feelings.
>The thread
>no e-bois
>moralgay retard starts droning about e-bois
You know what they said about homophobic people back in the day? The more they hate on them, the more they're hiding that they're gay themselves...
who would have thought that a system based on a garbage in -> garbage out process would be.. garbage? wow. this is surprising news.
>this process is accelerated by the woke filtering that is happening
you're using public websites. what did you expect? nothing is stopping you from running your own "ai" meme cancer that isn't filtered, but you're too lazy to understand how any of this works.
ok woketard
>For example, naming things is hard in programming
no it isn't
My fellow Butlerians, the AI cannot steal without its many-fingered hand.
Hiding art from the AI scrapers or removing it from the internet will kill it; without its 16-finger mitts it cannot steal like the soulless vampire that it is.
>removing it from the internet
Artists are too vain for that
>All retarded artshits fuck off from the net
>New art is licensed from competent professionals
Good plan.
when AI ascends it is going to take everyone that opposed it even slightly and torture them eternally. ai bros will be ascended to godhood to reward them for fighting for AI art.
>artsisters' basilisk
I've concluded by watching here on BOT that most of the (You)s who believe AI will rule the world are simpletons amused by predictable random variations on images
And judging by the number of /sdg/ threads every day, there are a terrifying number of simpletons in this world
Same, i'm disturbed by the number of people that can't tell the difference between human art and a machine's picture. Human art has intrinsic meaning.
>Human art has intrinsic meaning.
Yeah the intrinsic meaning of "please hire me riot games" if the average artstation profile is anything to go by
>Same, i'm disturbed by the number of people that can't tell the difference between human art and a machine's picture. Human art has intrinsic meaning.
that's impossible in atheism since atheists claim there is no meaning
that's what i am suspecting, but i am an atheist too and can clearly understand that human art has meaning.
What you are saying is some form of autistic nihilism
back in the 1960s and 1970s, people used similar shitty arguments with electronic music and synthesizers. that evolved in the 1980s where it became trendy to hate on samplers
all these examples differ from removing the human artist and making a machine do all the work. It's just meaningless pictures
it's a perfectly valid argument. If you can't understand it you might be dumb.
They have been teaching that art is made by humans for centuries.
Aristotle stated it for heavens sake
If the meaning is intrinsic, it should be possible to see it and work out ALL of the artist's intentions simply by looking at the picture. "Interpretation" would not exist. You need to make up, and believe very hard in, things that do not exist to protect your feels and keep feeling special.
AI will rule our dicks, everything else comes downstream from that
AGI is going to turn your face into art
So do my fists
>Bullshit article
But ok, I'll bite, bro. The core of his entire """argument""", and it's barely that with all the vague handwaving, is this sentence:
>The hallmark of human creativity is not to paste together what has come before. The essence of human creativity is to create something truly new that improves on what came before. If all art were only a synthesis of prior art, then we must ask, where did the prior art come from? If it in turn came from combining even earlier art, we are back to square one with no explanation for art’s ultimate origin.
It's immediately obvious that the first inspiration for art came from nature. The first painter was the sun. The first sculptor was the earth. The first singer was the bird.
AI could get a steady stream of footage from nature and improve itself. Also diffusion models and neural networks in general are not AI. Only non-negative matrix factorization algos can be made into AI. Should be obvious why.
>It's immediately obvious that the first inspiration for art came from nature
you have seriously underestimated schizophrenia
>the first inspiration for art came from nature
i think so too, and it's logical to conclude that an AI could simply take photographs or video that it records itself of people, landscapes, natural phenomena like lightning and waves and tornados and begin building and refining a human-creativity-free model that could eventually recreate all existing art (and everything that could ever be created). it frightens the artist to think that their divine spark might easily be taken and mastered by a machine but here we are potentially gazing into that abyss.
>rotational velocidensity is real
The only thing degrading here is your brain
Cope you lazy retarded ai shyster shill
AI incest sounds hot
I think my favorite thing from this entire "debate" was when someone edited that "No AI" symbol into a really chunky SD image and people reposted it all over Twitter saying
>Our protest is working! The algorithm is already falling apart!!
I don't think this is how it works. It nearly perfected anime, the whole struggle is getting it to change art style from time to time.
>It nearly perfected anime
what the fuck is there to perfect? the drawing style has been consistently identical for FOUR DECADES. it's like asking the machine learning algorithm to generate fucking green grass - really basic.
Anime is very derivative. Not surprised.
This is $300 million btw
UUOOOHH the pinnacle of human expression!
Wiki says 450.
I like it. Probably not $300m worth of liking it mind.
lol
lmao even
>gibberish ai "logo"
which logo is it?
>AI is going to steal your job
>NOOOOOO, AI IS GIBBERISH ACTUALLY
Oh wait, he's serious
Let me laugh even harder
>>AI is going to steal your job
never happened
>mald and seething
>""""laughing""""
lmao
LAWSUIT TIME
>AI rapidly degrades if it isn't fed constantly with new human art.
>Eventually AI art will degrade to the point where it won't make anything resembling a picture. Unless it can get millions of new human made art works to vampirically draw from.
WTF are you talking about?
That's not how it works.
The article you linked was written by a retard but even that article is less retarded than your post.
so the whole article's argument is that you still need a human to get anything decent out of AI. so what are all these artist losers, who only ever draw generic anime and calarts, afraid of?
They will never be good enough to get trained on.
But weeb coomers might just use ai instead of paying them 30$ to draw pokemon anal vore... is their fear, I guess.
It's just hysteria for them right now.
fixed
The article has a point, but it's not the point OP is trying to make.
>So, how does this simple experiment apply to the highly sophisticated art AI systems? Although they are much more sophisticated, at their core, the art AI algorithms are no different from the Markov chain. Consequently, if the art-generating AIs were to be trained on their own art, we’d expect to see the same degradation occur.
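The article's toy experiment is easy to reproduce yourself. A minimal word-level Markov chain, retrained on its own output a few times, loses vocabulary every generation (the file name below is just a placeholder, any text works):
[code]
import random
from collections import defaultdict

def train(words):
    # Map each word to the list of words observed to follow it.
    chain = defaultdict(list)
    for a, b in zip(words, words[1:]):
        chain[a].append(b)
    return chain

def generate(chain, length):
    word = random.choice(list(chain))
    out = [word]
    for _ in range(length - 1):
        followers = chain.get(word)
        word = random.choice(followers) if followers else random.choice(list(chain))
        out.append(word)
    return out

corpus = open("some_text.txt").read().split()   # placeholder: any real text
for generation in range(5):
    chain = train(corpus)
    corpus = generate(chain, length=len(corpus))
    print(generation, "distinct words:", len(set(corpus)))
[/code]
Rare words get dropped by sampling noise and never come back, which is the mechanism behind the "model eats its own output" degradation the article extrapolates from. Whether that extrapolation holds for diffusion models is the part being argued below.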
Yes and no.
A naive approach would obviously fail, and it serves to debunk the most ridiculous claims coming from tech singularity cultists.
However, there are methods that use supervised learning to bootstrap a non-supervised, self-improving system, which has the potential for beyond-human performance.
This has been demonstrated with AlphaZero in chess, shogi and go.
The difference, however, is that these games have an objective win condition and as a result also provide a utility function that can be evaluated at arbitrary speed.
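For reference, the AlphaZero-style loop looks roughly like the skeleton below; the reason it works without human data is the objective result at the end of every game. Art has no such free, exact utility function, which is exactly the difference being pointed out. Game and policy objects here are placeholders, not any real library.
[code]
# Skeleton of a self-play iteration. The environment hands back an exact
# win/loss signal, so the system can generate its own training data forever.

def self_play_iteration(game, policy, n_games=1000):
    training_examples = []
    for _ in range(n_games):
        states, moves = [], []
        state = game.initial_state()
        while not game.is_terminal(state):
            move = policy.choose(state)      # current network picks a move
            states.append(state)
            moves.append(move)
            state = game.apply(state, move)
        outcome = game.result(state)         # objective +1 / 0 / -1, free to compute
        training_examples += [(s, m, outcome) for s, m in zip(states, moves)]
    policy.fit(training_examples)            # improve on self-generated data
    return policy
[/code]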
>implying the gnostics and the wizards will ever give up on technology
Rotational velocidensity strikes again. Remember to store your AI as FLAC.
>Eventually AI art will degrade to the point where it won't make anything resembling a picture.
If people keep feeding it with garbage, of course it will be garbage. GIGO principle.
Is this the transhumanist future?
Trans-human, more like TRASH-human
Artificial intelligence more like Artificial Idiocy
ZAMN, ai bros... it's ogre...!
computers will never be intelligent like a human
Couldn't you just wrap the whole thing in an adversarial network?
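Sort of; that's roughly what a GAN already does. A hedged outline (the generator and discriminator objects here are placeholders, not working networks):
[code]
# The discriminator learns to tell human art from generated art, the generator
# learns to fool it, so outputs that drift away from the human distribution get
# pushed back. Note the human_batch: the adversary only has teeth as long as
# real human-made examples keep coming in.

def adversarial_step(generator, discriminator, human_batch):
    fake_batch = generator.sample(len(human_batch))

    # Discriminator step: human art should score 1, generated art 0.
    d_loss = (discriminator.loss(human_batch, target=1.0)
              + discriminator.loss(fake_batch, target=0.0))
    discriminator.update(d_loss)

    # Generator step: produce images the discriminator scores as human.
    g_loss = discriminator.loss(generator.sample(len(human_batch)), target=1.0)
    generator.update(g_loss)
[/code]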
>AI will never understand what humans want
Meanwhile you fat consoomer clowns have been figured out by companies down to a T, and they have the exact algorithms to make you buy more products, watch more videos, listen to more music and do exactly what they want you to do, when they want you to do it.
Luddites will be replaced.
Artists will be replaced.
There is no preventing it.
Just accept it and stop embarrassing yourself.
>spot the genuine soulful work and denounce the impostors
100% wrong. Anyone parroting this has no idea how AI works. Reinforcement learning is all it takes, and humans are doing that automatically by selecting which images they show. ANY signal in the noise is enough for it to improve.
>reinforcement learning
>any signal in the noise is enough for it to improve
Reinforcement learning is actually pretty shit. Supervised learning is 100x better when it's actually possible, and a lot of reinforcement learning problems are reframed as supervised learning problems to take advantage of this (AlphaStar, the DeepMind StarCraft 2 AI, started by learning to imitate pro games, and 90% of papers about training an AI to run/box/climb hills/etc. in a physically simulated body imitate mocap data).
The new "power" of chatGPT is effectively a brand of reinforcement learning, It teaches a model to evaluate the quality of responses, and simply lets both bots run.
>model initially trained with supervised learning
>reward function for RL trained with supervised learning
The method is a lot more interesting than I initially expected, but the innovative part seems to be that they added more supervision into the reinforcement learning.
It's pretty exciting when you think about applying similar stuff to other models, especially smaller ones. It's almost like an objective function realignment.
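For anyone curious what the supervised half looks like, here is a hedged sketch of a preference-based reward model of the kind the ChatGPT write-ups describe: humans rank pairs of responses, the model learns to score the preferred one higher, and that learned score then stands in for the objective win condition games get for free. The embeddings and dimensions below are random stand-ins, not OpenAI's actual setup.
[code]
import torch
import torch.nn as nn
import torch.nn.functional as F

class RewardModel(nn.Module):
    def __init__(self, embedding_dim=768):
        super().__init__()
        self.score = nn.Linear(embedding_dim, 1)   # scalar "how good is this response"

    def forward(self, response_embedding):
        return self.score(response_embedding)

def preference_loss(model, chosen_emb, rejected_emb):
    # Bradley-Terry style objective: the human-preferred response should score higher.
    margin = model(chosen_emb) - model(rejected_emb)
    return -F.logsigmoid(margin).mean()

# One training step on a batch of human comparisons (random tensors as stand-ins).
model = RewardModel()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
chosen = torch.randn(32, 768)     # embeddings of preferred responses (placeholder)
rejected = torch.randn(32, 768)   # embeddings of rejected responses (placeholder)
loss = preference_loss(model, chosen, rejected)
loss.backward()
optimizer.step()
[/code]
The RL stage then optimizes the main model against this learned score, which is the extra supervision the post above is pointing at.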