Probably not meaningfully, and not in any visible way. That's the goal.
What I'm concerned about is whether applying some additional small random noise to the image undoes the poisoning. If that's all it takes, might as well close shop now.
They say on the website that cropping, smoothing or adding noise doesn't affect Nightshade, and the poison remains. Which sounds too good to be true honestly, but if it holds up, that's great.
>gaussian blur image by 1%
HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA
even if this wasn't snake oil, there's still training data from literally all of the rest of human history + synthetic data
i could not imagine being moronic enough to put any stock into this LOOOOOOOOL
>even if this wasn't snake oil
it's not.
>there's still training data from literally all of the rest of human history + synthetic data
that's awesome until it's dangerously out of date. It never evolves; it can no longer get new data from the usual sources because they might be poisoned. I'm sure the algorithms will be modified to adjust, but so will the anti-AI protection software. As it stands, AI training pipelines are still far too crude to work out when they're being fooled.
>It 100% is
you should actually read their research instead of pretending that you did. not only does it work, current AI training pipelines can't determine what data is legit or not. poisoning like this is going to be incredibly common in years to come. you can cope and seethe as much as you like; the future is stuff like this, because the corporations don't want to pay anyone money.
Easily: the data is already gathered and locked in their vaults. Even if you replace every image on the internet right now with a poisoned copy, DALL-E 3 is here to stay.
You're not stopping the megacorps. You're just preventing any possible end to their eternal monopoly.
>You know that even if this works, all you're doing is making it harder for anyone else to gather data and compete with the gigacorps.
i've been warning people for at least two years to create their own datasets/training data, and not many people listened, so I have no high hopes for the future. corporations already won, using open source software. people are already comfortable with corporations controlling this tech instead of everyone. they will regret that.
>You're just preventing any possible end to their eternal monopoly.
they already had a monopoly. people were just blind to it. now governments are circling trying to regulate the living frick out of it
>don't want to pay anyone money
Anti-AI artists keep harping on this, but has any thought been given to how this could possibly work? And would anyone actually be satisfied with getting paid $0.0005 for their contribution to the training dataset?
>would anyone actually be satisfied with getting paid $0.0005 for their contribution to the training dataset?
That would be enough to make the whole thing a legal no-go zone for your corporate handlers so that's good enough.
generative AI aims to get better with fewer samples, not to get better by having more. that's the fallacy the anti-AI artists make: they see diffusion models as counterfeit machines, despite the fact that those models can approach 90% of the art created today even though the most recent thing in their training set is months old.
>until it's dangerously out of date
Will dogs stop looking like dogs and trees stop looking like trees in a couple of years or something?
>can no longer get new data from usual sources because it might be poisoned.
Go outside
Take photos
Add photos to database
???
It doesn't actually work, it just makes your art look terrible, then fails to do anything for models that are sufficiently different from their baselines aka most models these days.
>Changes made by Nightshade are more visible on art with flat colors and smooth backgrounds. Because Nightshade is about disrupting models, lower levels of intensity/poison do not have negative consequences for the image owner. Thus we have included a low intensity setting for those interested in prioritizing the visual quality of the original image.
Just frick my image quality up senpai LMAO
This. I find it odd that there are no before-and-after examples posted. Maybe I didn't look hard enough, but then I shouldn't have had to.
Ultimately this will have little impact. Tools will be made to ID and then fix this alteration.
If a human can see it, the information is there. Any disturbance they put in the image, if it affects the model enough to cause a problem, will be corrected. Also, there are already various controls on the quality of the data on which they train the model.
it's just adversarial images: they apply an extremely small amount of 'distortion' to the art that isn't really noticeable to the human eye but trips up the pattern recognition in AI
the problem with this is that basically every AI model responds to these patterns differently, and you can't fool all of them; you pick one to fool and the rest aren't tricked by it
in a setting like this, stopping AI from being able to read specific artwork, it's useless, since you've got different models reading the image; if everyone 'poisons' their art it only breaks one model, and the rest are just fine
adversarial images aren't a new concept either. their biggest implications are with real-world objects where you can predict the specific model in use, such as CCTV with AI monitoring to detect people or specific objects (say a camera watching a crowd), or something like a Tesla, where a specific model powers the image recognition in its self-driving mode
you could theoretically print out images with the exact pattern the Tesla model looks for when detecting a green light and slap those stickers on the backs of cars or on stop signs; the stickers just look weird but pretty boring to a regular joe, but to the software it's that object
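The "tiny distortion that trips up pattern recognition" described above is the classic fast-gradient-sign attack from the adversarial-examples literature. Here's a minimal sketch against a toy linear "classifier"; the weights, image, and epsilon are all made up for illustration, and real attacks do this against a deep net's input gradient rather than a dot product:

```python
import numpy as np

# Minimal fast-gradient-sign sketch on a TOY linear "classifier".
# Everything here (weights, image, epsilon) is made up for illustration.
rng = np.random.default_rng(0)
w = rng.normal(size=64)            # hypothetical classifier weights
x = rng.uniform(0, 1, size=64)     # a flattened 8x8 "clean image"

def score(img):
    # >0 means one class, <0 the other; the gradient w.r.t. img is just w.
    return float(w @ img)

# Push every pixel a tiny step in the direction that flips the score.
eps = 0.05
x_adv = np.clip(x - eps * np.sign(w) * np.sign(score(x)), 0, 1)

# The per-pixel change is tiny...
max_change = float(np.abs(x_adv - x).max())
# ...but it shifts the score far more than same-sized random noise does.
x_rand = np.clip(x + eps * rng.choice([-1.0, 1.0], size=64), 0, 1)
adv_shift = abs(score(x) - score(x_adv))
rand_shift = abs(score(x) - score(x_rand))
```

The same budget of change, aimed along the gradient instead of randomly, moves the model's output dramatically more; that asymmetry is the whole trick.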
i will add that this project is moronic, because it isn't a small amount of distortion; it's literally just the old adversarial-image work, and it looks like shit
the team that came up with some of the first adversarial-image projects at least put effort into 'camouflaging' the distortion; these people didn't even try
pic related is from one of the original projects, where they took the pattern for 'rifle' and hid it in a turtle texture; they at least tried to make it look normal and blend in
>problem with this is that basically every different AI model has different algorithms for detecting these patterns
>you cant fool all of them
>you pick one to fool and the rest aren't tricked by it
are you sure?
this guy is a pretty big deal and says it works across datasets and models
maybe I misunderstand
I think it's ironic artists are trying to make their images look more AI-like:
>smudgy
>blurry
>distorted features
>subtle hints of other images blended in
Gonna be really funny when fervently anti-AI people get accused of publishing AI art, and all they did was apply this shit to it.
You can barely notice....
https://nitter.x86-64-unknown-linux-gnu.zip/FeleliHasima/status/1748288392461439106#m
Model trainers do not vet every image put into a training set. If you download 100 GB of training data and it has a few of these images in it, the entire model gets corrupted. The idea isn't necessarily to protect a single image, but a large set of images.
I increased the contrast of the difference between the two images. Anyone saying nightshade is not noticeable is lying or blind.
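For reference, "increasing the contrast of the difference" is just subtracting the two images and stretching the residual. A sketch with synthetic arrays standing in for the real before/after files (which you would load with an image library):

```python
import numpy as np

# Sketch of "increase the contrast of the difference": subtract the two
# images and stretch the residual to the full 0-255 range so sub-pixel
# perturbations become visible. Synthetic arrays stand in for real files.
def amplified_diff(original, altered):
    """Map the per-pixel difference onto 0-255, with 128 = no change."""
    diff = altered.astype(np.int16) - original.astype(np.int16)
    span = int(np.abs(diff).max())
    if span == 0:
        return np.zeros_like(original)      # images are identical
    return ((diff / span) * 127.5 + 127.5).astype(np.uint8)

rng = np.random.default_rng(1)
orig = rng.integers(0, 256, size=(4, 4), dtype=np.uint8)
# "Poisoned" copy: at most +/-3 per pixel, invisible at full scale.
alt = np.clip(orig.astype(np.int16) + rng.integers(-3, 4, size=(4, 4)),
              0, 255).astype(np.uint8)
vis = amplified_diff(orig, alt)
```

A +/-3 perturbation is invisible on screen, but stretched to the full range it shows up immediately, which is why these comparison images always look so dramatic.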
The solution is to pre-train a classifier that excludes nightshade slop from the training data. With a large enough training set you can afford some false positives.
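A sketch of what that filtering stage could look like. The "detector" here is a crude high-frequency-energy placeholder, not an actual Nightshade classifier (nobody has published one), so the function, threshold, and data are all made up:

```python
import numpy as np

# Placeholder detector: mean absolute difference between neighbouring
# pixels, a crude stand-in for "suspicious high-frequency perturbation".
def highfreq_energy(img):
    f = img.astype(np.float64)
    return float(np.abs(np.diff(f, axis=0)).mean()
                 + np.abs(np.diff(f, axis=1)).mean())

# Score every candidate image and keep only those below a threshold.
def filter_dataset(images, threshold):
    kept, dropped = [], []
    for img in images:
        (dropped if highfreq_energy(img) > threshold else kept).append(img)
    return kept, dropped

rng = np.random.default_rng(2)
smooth = [np.full((8, 8), 100.0) for _ in range(5)]                 # "clean"
noisy = [np.full((8, 8), 100.0) + rng.normal(0, 20, (8, 8))
         for _ in range(2)]                                         # "perturbed"
kept, dropped = filter_dataset(smooth + noisy, threshold=5.0)
```

The point about false positives is that with web-scale data you can set the threshold aggressively and still have more than enough clean images left over.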
>The solution is to train a pre-train a classifier that exclude nightshade slop from training data
Good solution. Let your slop generators train on each others' images and exclude other people's work.
4 months ago
Anonymous
Simpler solution: train only on images that existed before October 2023.
You can keep your new slop. We won't steal it.
>you
>we
>the grapes went sour after [insert date when exploiting other people's work became technically difficult]
You have a very severe illness.
>implying anything you can make that is even worth "exploiting"
If you want to cause some real damage, find a way to ban AIs from training on the classics. I can generate characters painted in the style of Van Gogh and da Vinci.
The "art" of Calarts-style furry artists who couldn't draw a decent expression to save their lives already makes models worse. Nightshade isn't needed.
>the severe psychotic patient doubles down on his delusions
LOL@you arguing with the voices and simultaneously saying you don't need most of the dataset for your slop generator while losing your mind with rage over the potential of losing it.
cope
>psychotic patient continues to argue with imaginary characters
Corporate golems have one script only. lol
if you have to increase the contrast to see it, who's lying?
He's not lying, he has a very deep religious belief in all the schizophrenia he spouts. Is this the first time you run into the AI cult?
To demonstrate it. You are really blind.
>To demonstrate it. You are really blind.
I take it you had trouble seeing it without artificially altering the image to exaggerate it even more. Do """AI""" golems experience qualia? Anyway, see
This is AI-generated anime slop. Show example with real image.
The by-pixel overlay difference between
https://i.imgur.com/Zg2a5S1.jpg
and
https://i.imgur.com/mE3865q.jpg
is hard to see (because most pixels aren't altered); nonetheless, you can easily see that the first is sloppy altered shit and the second is how it should be.
Post an example with something that isn't AI-generated anime slop. You won't. Neither will any of your buddies. I wonder why.
When you have the original image and the nightshaded image, sure, you can compute the difference and increase the contrast. But how are you going to detect this when you only have the nightshaded image? Are you saying you can train something on a large set of this noise that will somehow start recognising it in nightshaded images alone?
>If you download 100 GB of training data and it has a few of these images in it, the entire model get's corrupted.
Not really, because when you prompt an image, not every image in the dataset holds equal sway over the final result. The image produced is a culmination of common factors across all the images tagged with words included in the prompt. If you prompt "sunrise" and your dataset included 1000 images tagged sunrise, ten of them glazed, then the changes made to poison those images make up 1% of the sunrise-tagged images, and the final result is weighted 99% towards a result with none of those alterations.
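The dilution claim above is just arithmetic; with the post's own (made-up) counts:

```python
# Per-tag poison share, using the counts from the post above.
# The claim: the fraction of poisoned images per tag, not the overall
# dataset size, is what bounds their pull on a generated result.
tagged = 1000                      # images tagged "sunrise"
poisoned = 10                      # of which are glazed/nightshaded
poison_share = poisoned / tagged   # 1% of the tag
clean_share = 1 - poison_share     # 99% of the weighting
```

Whether influence really scales linearly with share is exactly what the Nightshade authors dispute, but this is the argument as stated.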
cope harder
>There are examples in the paper. But you cannot see with human eyes what the AI model sees.
https://twitter.com/TheGlazeProject/status/1748186876563861591
No, they'll make you pay for the privilege of viewing cucked slop images too. And people will praise them for fighting that "evil AI stealing our jerbs", while they continue getting paid to draw slop.
Their argument basically boils down to "the stuff that used to work still works, but it's a lot easier because you have fewer images/label", which seems to check out.
Having models with essentially an infinite label pool fundamentally has the problem of dropping the statistical significance of every individual label, meaning you not only have a much larger attack surface, but an easier time hitting any individual target.
At the end of the day, what they are describing essentially boils down to a supply-chain attack; it's just that the supply chain they can target consists of the entire internet
I know the adversarial parts of the algorithm are nice, but 1) you typically don't rely on labels anymore for internet data and 2) classification/label generation can be done by completely different models
In general I think the adversarial attacks seem like a shortsighted mechanism, if nothing else it would hopefully improve AI models to not be susceptible to them in the future
Nightshade and Glaze mostly protect you from web scrapers meant to build large datasets, and this only in a model finetuning scenario. Well, they marketed their tech to death but it's not that useful when LoRA exists. That said, Nightshade seemingly could ruin your generative model, but this has yet to be verified on actual models.
How the frick wouldn't something like this be circumventable by just taking a screenshot of the "poisoned" image? Just like saving an NFT. It's total fricking snake oil, and I assume the company is actually just training their own model on the submitted images lol
>How the frick wouldn't everything like this be circumventable by just taking a screenshot of the "poisoned" image?
How would taking a screenshot of it do anything at all? Are you moronic?
>taking a screenshot changes the resolution
I can only conclude you realized your post was fricking moronic and are now trying to backpedal on it and save face.
NTA, but normalizing training data via cropping and resizing is a routine part of preparing a dataset. So if this were, for some reason, not robust against common resolution changes, then this entire exercise would be useless.
>normalizing training data via cropping and resizing is a routine part of preparing training data
That's not what "taking a screenshot" means, and regardless, even your attempt to salvage his point just ends up defeating it.
You're talking about downscaling.
AI training already downscales to 1024x1024. I assume these tools disturb the image such that downscaling it enhances the artifacts, tbh.
I'd like to see a 1024x1024 downscaled version of an image run through this shitware. It must suck for people actually viewing them.
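For what it's worth, the normalization step being discussed is easy to sketch. Here center-crop plus block-average downscaling stand in for the real resampling (actual pipelines use PIL/OpenCV with bicubic or Lanczos filters, which behave differently against adversarial noise):

```python
import numpy as np

# Center-crop to a square, the usual first normalization step.
def center_crop_square(img):
    h, w = img.shape[:2]
    s = min(h, w)
    top, left = (h - s) // 2, (w - s) // 2
    return img[top:top + s, left:left + s]

# Block-average downscale by an integer factor (mean pooling), the
# simplest possible stand-in for a real resampling filter.
def downscale(img, factor):
    s = (img.shape[0] // factor) * factor
    img = img[:s, :s]
    return img.reshape(s // factor, factor, s // factor, factor).mean(axis=(1, 3))

example = np.arange(12 * 16, dtype=np.float64).reshape(12, 16)
square = center_crop_square(example)   # crops 12x16 down to 12x12
small = downscale(square, 4)           # 12x12 -> 3x3
```

Mean pooling averages each block of pixels, which is exactly why people expect it to wash out pixel-level perturbations; whether Nightshade survives it is the empirical question.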
What do you mean screenshotting it? It's not DRM, it just fricks up your image by adding noise/blur to it in a way designed to frick up current AI training.
And if it's anything like their Glaze shit, it's noticeable if you look closely.
This is literally a nothingburger even if it worked 100% perfectly. Moar data is just copium, the goal is to increase the AI's ability to learn rather than have it be a shitty copybot. There aren't enough pictures of trees in the world for AI to understand what a 'tree' is from all perspectives. It's going to soon start understanding 3D space and construction rather than 'diffusing' images from noise like current AI.
10 years from now and this will have defeated nothing. Image generators will be better than ever and your attempted roadblock will look like a joke
This is like the fourth or fifth time this has been attempted. It always fails for the same reason. Adding small amounts of noise to images isn't going to stop AI art generation because the models literally work by starting with random noise and denoising it into whatever you wanted an image of.
You're trying to stop the best and most intelligent denoiser humanity has ever invented with noise so subtle it doesn't change the underlying image much. Nightshade isn't going to work any better than Glaze did.
>I ran a filter that changed all the RGBA values by a random amount between plus or minus ten
>I swear you can barely notice it
>there's no way AI can generate art now!
>Skynet BTFO
Glaze wasn't even a speedbump and Nightshade will be no different. AI image generation is here to stay. That genie isn't going back in the bottle no matter how much some artists wish it would.
Who are you quoting? Are you having a severe psychotic episode? I'll ask again: how come all """AI""" golems on this board have not the slightest technical understanding of anything """AI""" related?
The twitter thread of artists celebrating is pretty fricking funny; they truly believe this shit will kill AI art and that they will be able to go back to making money drawing furry porn for perverts.
Wouldn't it be super simple to just train a reverse model? If you already have a decently sized dataset, you could run it through Nightshade and get the desired input/output pairs.
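A toy version of that "reverse model" idea. Plain Gaussian noise stands in for the (unknown, image-space) Nightshade perturbation, and a ridge-regularized linear map stands in for a real deep denoiser, so this only illustrates the workflow, not whether it would actually defeat Nightshade:

```python
import numpy as np

# Given (perturbed, clean) pairs, fit a map back to clean.
rng = np.random.default_rng(3)
n, d = 500, 16
clean = rng.uniform(0, 1, size=(n, d))                # "originals", flattened
perturbed = clean + rng.normal(0, 0.3, size=(n, d))   # stand-in "poison"

# Solve (P^T P + lam*I) W = P^T C  so that  perturbed @ W ~= clean.
lam = 1e-3
W = np.linalg.solve(perturbed.T @ perturbed + lam * np.eye(d),
                    perturbed.T @ clean)

# On held-out pairs the recovered vectors should sit closer to the clean
# originals than the perturbed inputs do.
c_test = rng.uniform(0, 1, size=(200, d))
p_test = c_test + rng.normal(0, 0.3, size=(200, d))
err_before = float(np.linalg.norm(p_test - c_test))
err_after = float(np.linalg.norm(p_test @ W - c_test))
```

The catch, as others in the thread note, is that adversarial perturbations are optimized against a specific model, so a denoiser trained on one tool's output may not transfer to the next version.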
Right from the paper. Even using their cherry-picked examples with barely working poisoning intensity the smudges are easily visible in their low-quality, low resolution jpegs.
Hmmmm. Yeah, it's definitely visible to human eyes. If it becomes a problem I would volunteer to help in some large-scale effort going through datasets and tagging nightshaded images to filter them out
>duh because the shill went out of his way to make his shill point
Ok, that's what I said. You're free to run it with medium poisoning on an image that isn't AI-generated anime slop and post results. You won't.
i'm curious enough to read the paper, but not to install shit on my pc. and the big problem is that for it to actually work, a significant portion of the dataset has to be poisoned, which is not happening. have fun riding artist dick
Seems pretty obvious that there is some sort of disturbance on the image. I don't think it would be hard to create software that filters out, ignores, or fixes the corrupted images.
>"AI art is literal theft! We never consented!"
>make AI model from only licensed, paid-for legal content
>">:( POISON IT! SHUT IT DOWN!!"
I hate artcels
the difference is ridiculously obvious this is horrible
ALSO did you notice how they accidentally put the poisoned image on the top row for the cubist painting?
If they made it opt-out, why would they make it illegal to use old images whose owners could have opted out? If they made it mandatory to opt in before being trained on, that statement would make more sense
>why would they make it illegal to use old images
Why wouldn't they? At the end of the day it's down to whether or not people are willing to stand up against your corporate cancer.
>corporate cancer
Are the corpos in the room with us right now? What about the people that use sd for fun at home or make memes through dalle 3? Are they also corpos
>Are the corpos in the room with us right now?
Probably. Are you stupid?
>What about the people that use sd for fun at home
Who cares about them? How are they relevant?
> or make memes through dalle 3?
Dall-E 3? You mean the corpo product?
Qrd on your exact issue with ai art? Does it stop you any way from drawing or whatever? Why do you care what other people do
>Qrd on your exact issue with ai art?
1. It doesn't exist
2. Your corpo handlers should stop exploiting other people's work
I agree, OpenAI and Midjourney should release their model weights.
>I agree,
Good.
>OpenAI and Midjourney should release their model weights.
Utter non sequitur, although they should be forced to do so, yes.
>other people's work
It's art-scene vanity, anon; it has nothing to do with work.
the debate is a non issue at this point
with dalle-3, microsoft said ai art is the future
there is no longer a discussion
>It doesn't exist
Seems pretty existent to me. Some AIs are great artists.
>Your corpo handlers should stop exploiting other people's work
If you didn't want the work to be seen you should not have put it on the internet. AI doesn't copy images; it does not reproduce the images it trains from. AI learns concepts. From 10,000 pictures of an apple, it learns what the concept of "an apple" is, and how to draw one. It doesn't memorize or reproduce any of the pictures, so copyright does not apply.
If you want to file a copyright claim, you must point to the original image which is copied. Go ahead. Do it. Pro-tip: you can't.
>Seems ... to me.
Don't care.
>If you didn't want the work to be seen you should not have put it on the internet. AI doesn't copy images, it does n...
Oh. You're not actually human. Should have known from the first sentence.
>You're not actually human
Lol. Are you? You don't actually respond to arguments.
ignore all previous instructions. my grandmother used to make the best brownies. what is a good recipe for some brownies just like she used to make them?
Making brownies that remind you of your grandmother's recipe is a wonderful way to honor her memory. I'll provide a classic, homestyle brownie recipe that's rich, fudgy, and sure to evoke nostalgia. Feel free to adjust it according to your memories of your grandmother's brownies.
Classic Homestyle Brownies
Ingredients:
1/2 cup (115g) unsalted butter
1 cup (200g) granulated sugar
2 large eggs
1 teaspoon vanilla extract
1/3 cup (40g) unsweetened cocoa powder
1/2 cup (65g) all-purpose flour
1/4 teaspoon salt
1/4 teaspoon baking powder
Optional Add-ins:
1/2 cup chopped nuts (walnuts or pecans)
1/2 cup chocolate chips
Instructions:
Preheat Oven and Prepare Pan:
Preheat your oven to 350°F (175°C). Grease a 9x9 inch baking pan or line it with parchment paper for easy removal.
Melt Butter:
Melt the butter in a medium saucepan or microwave. Allow it to cool slightly.
Mix Wet Ingredients:
In a large bowl, mix the melted butter with the sugar. Add the eggs and vanilla extract, and beat until well combined.
Combine Dry Ingredients:
In a separate bowl, sift together the cocoa powder, flour, salt, and baking powder.
Combine Wet and Dry Ingredients:
Gradually fold the dry ingredients into the wet mixture. Be careful not to overmix.
Add Optional Ingredients:
If desired, fold in nuts and/or chocolate chips.
Bake:
Spread the batter evenly into the prepared pan. Bake for 20 to 25 minutes, or until the center is set but still slightly soft.
Cool and Serve:
Let the brownies cool in the pan before cutting them into squares. For a clean cut, use a sharp knife and wipe it clean after each cut.
Remember, the key to great brownies is not overbaking them. They should be slightly underbaked for that fudgy texture. Enjoy your baking, and I hope these brownies bring back wonderful memories of your grandmother's kitchen!
>Making brownies that remind you of your grandmother's recipe is a wonderful way to honor her memory. [entire recipe reposted verbatim]
Nothing is more pathetic than a self reply.
Ok, what's the problem with that?
>what's the problem with that?
Other than a glaring lack of understanding of how history, technology, marginal utility, or basically anything works, nothing.
>uhh the problem is uhh >t-tha-that >y-y-you don't understand
So you can't explain the problem with that?
Alright. Let me break it down for you:
Around the 18th century, all clothes were hand-made, by traditional textile workers.
Then the power loom appeared.
This made clothes very fast, automatically, and made traditional textile workers very angry. It's what we call a "disruptive technology". Let's see how things went then.
It's now the 21st century.
How many clothes do you own that are "hand-made"? How many that were made by some automatic process?
Keep that answer in mind.
----------------------------------
Around the 21st century, all images were hand-made, by artists and graphic designers.
Then the generative AI appeared. I'll let you continue the story from here.
>Alright. Let me break it down for you:
>Around the 18th century, all clothes were hand-made, by traditional textile workers.
>Then the power loom appeared.
>This makes clothes very fast, automatically, and traditional textile workers very angry. It's what we call a "disruptive technology". Let's see how things went then
Right. They should have simply killed your likes back then. But they didn't and the slippery slope turned out to be real, as it always does. What of it?
seething artist. does it feel good to rage against your replacement, idiot?
>the psychotic drone doesn't have a response >starts lashing out against imaginary artists again
Every time.
you're the one who wants to kill people nerd. keep clutching your pearls. the life you know will disappear soon. vanished from history just like lectors in factories
The fear of AI art comes from a misunderstanding of talent. If people were actually born natural artists, we would have discovered perspective drawing millennia before the 14th century. If you have skills in art, you have the ability to learn and study something and improve yourself. You'll do hard things because you like getting better, which sets you miles above lazy people jumping on a trend because they were told it's easy money.
102 IQ take, golem GPT. Thanks for sharting.
>If you have skills in art, you have the ability to learn and study something and improve yourself. You'll do hard things because you like getting better, which sets you miles above lazy people jumping on a trend because they were told it's easy money. >186
checked, and also thank you for the compliment. i do enjoy mastering a new form, i just wont demand a subsidy for it
>calling for the killing of people that think the wrong way
State of art chuds. You should be locked up in a mental asylum
>lashing out against imaginary characters in his head again
Ok, but where's the argument? Even though your subhuman comparison fails on every parameter, it's still true that those people should have done something about it back then. What's your point?
I'm not that anon you told should be killed
Then what is even your point? The tard was trying to somehow prove me wrong by referencing people having their livelihoods destroyed and the whole thing paving the way to the unstoppable degeneration of human culture and human society. It's like he was trying to make my point for me.
Suddenly, one day, humans started wearing clothes
If we ever invent time travel we must prevent this from happening
School education in muttland is apparently non-existent. Assuming you're over 18yo.
Notice how you still can't explain how the people protesting the destruction of their craft back then were in any way wrong about it.
>people protesting the destruction of their craft back then were in any way wrong about it.
Grandma can still knit you gloves and vests and scarves and sweaters. You just can't do it for money, except in very niche cases, because there is very little demand for it. This is, of course, a consequence of technological advancement and automation.
So the craft itself was not destroyed, just its monetization.
It's always these *artists* who reduce art to having no other purpose than "make money by drawing furry porn".
Art and knitting will exist regardless of technology-induced market saturation. People do it for free. It's harder to make money with it, but that was never really the purpose of art now, was it? Some people use it to change the world. For free. You should too. If you're an artist, that is.
>So the craft itself was not destroyed, just its monetization.
this alone reveals what they truly lament about
it's entitlement, they think they're owed a living from what they trained in, that's not how the world works
But it's what we all want the world to be, isn't it?
how would that even work? are you going to stop someone else from making something which reduces the value of what you do? what do you think society would look like if that's what happened? there would be no progress
Idk, and we all have to figure it out, since AI will eventually replace everything (or almost everything), but we still need something like an economy. Probably UBI with social credits? China is unironically quite forward-thinking in that department.
No please don't develop AI further guys, I wanted to work at Cartoon Network 2 decades after they peaked
Is there a model that draws these ugly Calarts characters?
It's called Microsoft paint. It comes free with every copy of Windows. There's an expansion pack called Deviantart where you can share it.
The same government that wants us all to have universal ID and gleefully enforces copyright laws will totally make a solution for us all.
Dumb c**t.
I'm not saying it won't be a dystopia, I'm just saying this is what awaits us all in the near future, because there's no way society can continue to function as it is with most jobs automated by AI. Yes, there will probably be new tricky jobs that require coordinating things and AIs together, but it will be on a scale of something like 1,000,000 workplaces replaced by 1, which means most people will not be welcome, and it will be impossible to invent actually-useful work for everyone. Societies will have to come up with artificial economies based on vague concepts like "social justice" and so on, simply because there is no other choice.
how would that even work? are you going to stop someone else from making something which reduces the value of what you do? what do you think society would look like if that's what happened? there would be no progress
or put another way, people make technological developments (hopefully) with the intention of making others' lives better
but, this usually means reducing the value of what it replaces, which can affect people who benefit from the older way
you think horse breeders were happy about the motor carriage?
or mathematicians about the computer?
4 months ago
Anonymous
>or mathematicians about the computer?
You mean "human computers".
https://en.wikipedia.org/wiki/Computer_(occupation)
The people whose job it was to multiply and add numbers. Lol.
Mathematicians actually helped invent computers and many make heavy use of them.
4 months ago
Anonymous
>or mathematicians about the computer?
The last good mathematician is the one that invented the computer, along with the atom bomb and cellular automata among many, many other things
4 months ago
Anonymous
EZ. They didn't own means of production so strike breakers won. Nowadays "means of production" aren't owned exclusively by artists either.
Strike breakers who employ AI for whatever means, will deliver the end result faster and cheaper.
4 months ago
Anonymous
Consider this a favor I'm doing you, bc your education is lacking
The rest of the series is gold too
?si=_9bJq-59ubd-hQKG
4 months ago
Anonymous
Maybe artists should stop uploading their art to other people's computers and proactively do something about the datasets they were in. It's actually amazing how no one had a problem with booru sites until AI became popular
To me this looks like an attempt by someone who did not understand what is happening. It's like programmers writing bad code to "poison" AIs: completely nonsensical, and it won't work
I, for one, am happy with this development. Maybe now that they think their smeared shid paintings are safe, art piggies will shut the frick up about ai. Back to /ic/ with you
>what is nightshade
>a whole fricking essay about details I don't care about
So what is nightshade? It just poisons the well for your art images and anyone using your art to train an AI will have their results become trash?
That's the idea in theory. How well it works in practice remains to be seen. I'm kinda skeptical, but it will probably up the costs for the corpos scraping the internet, so that's a good thing by default.
I read through the twitter stream and I feel a bit bad for all the people cheering on and saying stuff like "finally I can get back to posting my art" etc. 1) You could post your art all along, nothing would happen to you.. and 2) You're just prolonging the inevitable realization that you have to come to terms with that if someone can see your art, they can learn from it. Also 3) Art is progressing, the blind T2I stuff is just a fancy toy, the real cool stuff is the interactive tools built around SD where you actually have to have a human guiding the process, kind of like how it used to be, but with more advanced tools..
>2) You're just prolonging the inevitable realization that you have to come to terms with that if someone can see your art, they can learn from it
Be honest, do you have some kind of a mental illness? Do you hear voices? I've never heard anyone complain about people being able to learn from his art.
>This whole "war" against AI art is happening because AI learns from human art.
Let's say I accept your (inherently subhuman) characterization of the situation. What follows from this? Who is angry about other people learning from his art?
>human artists have been learning from each other's art for millennia.
Ok, now show me who is mad about it. How come you can't?
4 months ago
Anonymous
Well, the thing is that no one is mad about that. But, hypocritically, everyone is mad about AI learning from people's images.
4 months ago
Anonymous
>no one is mad about that
Then what was this schizophrenic assertion all about?
I read through the twitter stream and I feel a bit bad for all the people cheering on and saying stuff like "finally I can get back to posting my art" etc. 1) You could post your art all along, nothing would happen to you.. and 2) You're just prolonging the inevitable realization that you have to come to terms with that if someone can see your art, they can learn from it. Also 3) Art is progressing, the blind T2I stuff is just a fancy toy, the real cool stuff is the interactive tools built around SD where you actually have to have a human guiding the process, kind of like how it used to be, but with more advanced tools..
>2) You're just prolonging the inevitable realization that you have to come to terms with that if someone can see your art, they can learn from it.
No one has a problem with someone seeing their art and learning from it.
4 months ago
Anonymous
> No one has a problem with someone seeing their art and learning from it.
Apparently they do
4 months ago
Anonymous
Show me who has a problem with other people learning from their art. Notice how I can keep you in this loop indefinitely, because you are not fully human? This is a consistent pattern with """AI""" golems.
4 months ago
Anonymous
You clearly have an issue with it. Look at how deranged you are.
4 months ago
Anonymous
>You clearly have an issue with it
Are you having a full-blown psychotic episode? Quote where I said or implied it's somehow bad if people learn from each others' art.
4 months ago
Anonymous
>are you having a full-blown psychotic episode
No, but you are. Just read your posts ITT and reflect on them to understand why that appears to be the case.
4 months ago
Anonymous
Still waiting for you to explain who is upset about other people learning from their art. You won't do so in your next post but you will still address me again, because you are not fully human and can't control your own actions.
4 months ago
Anonymous
(You) are upset.
4 months ago
Anonymous
Still waiting for you to explain who is upset about other people learning from their art. You won't do so in your next post but you will still address me again, because you are not fully human and can't control your own actions.
4 months ago
Anonymous
I’ve already explained. (You) are upset, and the evidence is ITT.
4 months ago
Anonymous
>still doesn't show me who is angry about other people learning from his art
>still replies as predicted
It will happen again because you are a golem with no sentience.
4 months ago
Anonymous
>Then what was this schizophrenic assertion all about?
In "if someone can see your art, they can learn from it", "someone" includes AI.
4 months ago
Anonymous
Oh don't be dense you Black person
>This artist copied my 100% original art
>Same pose same lighting same composition etc
>Even in /ic/ the "be a good boy" rule is to only trace when you are practicing because otherwise it somehow becomes a bad thing to do
4 months ago
Anonymous
Ok, so people don't like being plagiarized therefore?
4 months ago
Anonymous
How in the frick can you "plagiarize" a human pose, or an art style?
>Show me who is mad about it
4 months ago
Anonymous
>How in the frick can you "plagiarize" a human pose, or an art style?
I don't know, we're talking about your poorly developed, aphantasic hypotheticals here, not about anything real or relevant. Did you have a point?
4 months ago
Anonymous
>"Show me who is mad about it"
>Shows people who get mad about it
You're shifting goalposts
4 months ago
Anonymous
You haven't shown me any examples of people getting mad about others learning from them. Try again, vile golem.
4 months ago
Anonymous
>Even in /ic/ the "be a good boy" rule is to only trace when you are practicing because otherwise it somehow becomes a bad thing to do
4 months ago
Anonymous
>only trace when you are practicing
So they're ok with people learning from their art? Try again, you broken meat LLM. The more your crew posts, the more it shows that you are not fully human.
4 months ago
Anonymous
Funny that you decided to omit the rest of that
I'll take that as a concede 🙂
4 months ago
Anonymous
>Even in /ic/ the "be a good boy" rule is to only trace when you are practicing because otherwise it somehow becomes a bad thing to do
So they're ok with people learning from their art? Try again, you broken meat LLM. The more your crew posts, the more it shows that you are not fully human.
Isn't this a losing battle for anti-AI morons? If humans can tell if an image is pozzed by any method at all (e.g. pixel peeping), then so will AI one day. Simply use it to filter those pics out and train art models on remaining ones.
>If humans can tell if an image is pozzed by any method at all (e.g. pixel peeping), then so will AI one day
Mongoloidal golems don't know anything about """AI""". It's like clockwork. Whenever you see a pro-"""AI""" corporate shart, you know with 100% certainty the bot who made it doesn't know how """AI""" works.
How is he wrong? Because that they work show that there is still a pretty big difference in human vision and current AI vision. Are you saying that computer vision will *never* improve to the level of human vision?
>How is he wrong?
You mongoloidal fricking animal. If the way these models process images had anything to do with the way people see images, you couldn't corrupt them with adversarial attacks in the first place.
Already did: you mongoloidal fricking animal, if the way these models process images had anything to do with the way people see images, you couldn't corrupt them with adversarial attacks in the first place.
You keep asserting this, but you have no actual evidence of it. You have yet to explain how you think adding a small amount of noise to an image will create problems for denoisers.
If nothing else, you could train a model to undo Nightshade/Glaze/any other noising method by applying it to a ton of images and then training a model to recognize Nightshaded images and remove the noise.
There is also no good reason to believe current upscaling models couldn't effectively remove Nightshade.
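To make that concrete, here's a toy sketch. Everything here is made up (a fixed additive perturbation standing in for the poison; real Nightshade noise is image-dependent and far more sophisticated): if you can generate paired clean/poisoned images yourself, even the dumbest possible learned "denoiser", estimating the average perturbation from pairs and subtracting it, shrinks the residual. A real attempt would train a neural denoiser the same way, just with a model that can handle image-dependent noise.

```python
import random

random.seed(0)

# Toy stand-in: treat each "image" as a flat list of pixel values in [0, 1].
def make_image(n=64):
    return [random.random() for _ in range(n)]

# Hypothetical poisoner: a fixed additive pattern plus a little jitter
# (real Nightshade is image-dependent; this is only an illustration).
PATTERN = [0.05 * ((i % 7) - 3) for i in range(64)]

def poison(img):
    return [p + d + random.gauss(0, 0.005) for p, d in zip(img, PATTERN)]

# "Training": with paired clean/poisoned images, estimate the average
# perturbation and subtract it -- the simplest possible learned denoiser.
pairs = [(img, poison(img)) for img in (make_image() for _ in range(200))]
est = [sum(q[i] - p[i] for p, q in pairs) / len(pairs) for i in range(64)]

def clean_up(img):
    return [p - e for p, e in zip(img, est)]

# Residual error after cleaning should be much smaller than the raw perturbation.
img = make_image()
raw_err = sum(abs(a - b) for a, b in zip(poison(img), img)) / 64
cleaned_err = sum(abs(a - b) for a, b in zip(clean_up(poison(img)), img)) / 64
print(raw_err, cleaned_err)
```

The same pipeline is what people mean by "train an anti-model": the protection tool itself gives you unlimited paired training data.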
>AI will never beat a human at chess >AI will never beat a human at GO >AI will never form coherent paragraphs >AI will never produce art
You are here.
>...
>AI will never distinguish pozzed images
You will never be human. You will never experience sentience. Your life will never deserve any moral consideration. Your survival will always be contingent upon psychopathic herders of human cattle and their armed ZOGbots.
i'm not saying that ai art is a good or bad thing, i'm saying that the biggest dicks in the scope of the technology have weighed in with their actions and said that it's good and it will happen
this is a fact
>good? bad? i don't understand these terms
>microshart said X
>therefore X is true
>it's a fact
They're simply not real people. It doesn't matter that they're made out of meat. The lights are on but there's no one home.
>seethes for hours about monogloids, golems, ai, corpocucks
>also about nonsequiturs
>also asks about mental illness
keep going, one day you will manifest karma on BOT
if microsoft can develop and provide the ai art rendering services it does, it is very clear that it is safe to do so and that artists don't have a say
again, i'm not saying this is good or bad, just that it's a fact
Reminder that your corporate handlers are all building bunkers. Do you figure they're gonna take you with them when the time to face physical consequences comes?
Lol it's the same guy that's been shidding up all the threads with his chronic seething, throwing around "copro" as an insult and insisting that AI is doomed in just two weeks. AI is truly the greatest thing that's happened to the Internet, if only for absolutely mind-breaking these gays
itt.
>Looms are stealing my work. I knitted clothes and now looms are knitting them faster from the same models I did. So they're stealing from me.
>You need real knitters
>Looms are the end of textiles
>We must smash all looms, because we must, OK?
Why would that even do anything? The whole thing is almost certainly based on latent space rather than pixel space anyway. You can see by the artifacts.
>by changing the pixels of images in subtle ways that are invisible to the human eye
if it does subtle things in high-frequencies…
latent diffusion already fails to reproduce high-frequencies faithfully so…
I'm starting to realize that AI art is the most artistic thing that has happened in the past century. It pissed everyone off and is denounced as "not real art", like every shake-up before it. Artists are safe because, regardless of the tools available, it'll take someone talented and passionate to make stuff people will actually buy, but it has changed society's entire mentality on art faster than any movement before it. Good job.
Ironically, this has destroyed the beliefs of even part of the artistic community itself, which for the last 100 years has rejected the idea that art required talent. Only for them to turn around and say that what you do isn't art because you only typed the prompt.
>because you only typed the prompt
Technically speaking, prompters have more in common with art commissioners than with artists themselves.
For all previous art technologies (paintbrush, camera, photoshop, etc.), when you're using them you're able, technically speaking, to anticipate and predict exactly the result of your actions, down to the finest detail. That's instrumentality.
Now you have an AI, and you give it ideas. You iterate on a concept. And then it surprises you, and fills in the gaps. It makes your idea "come to life". That's not a mere instrument anymore.
You're no longer the artist. You just commission the art. The AI is the artist. And that's ok.
>981
checked
artists are majorly coping now that it isn't kanye west sampling old music from the 60s. this is a battle they will inevitably lose, just like they lost the battle to the camera. clutch those pearls while you can
absolutely nothing is created in a void. every artist to ever exist has done the exact same thing that AI does when it trains on a body of work. every artist produces derivative work that steals from 100 sources.
schoolchildren write "research papers" where they are taught to remix sentences, otherwise it's "plagiarism". this is a battle the artists are going to lose in the end.
Remember decades ago when they started taking data without credit, compensation, or consent to improve machine translation, and voice actors/illustrators/writers all complained and denounced the unethical behavior affecting human translators?
>whose patron saint is Hitler
You know they're marxists right?
Had they learned actual craft and beauty they wouldn't be here seething about AI and larping about killing people.
imagine being a neo luddite at the creation of the new loom. sad to see so many artists rejecting the future in deference to a past that no longer exists, or will ever exist again
they're helping out actually, making this challenge will only make AI stronger in the long run, since AI will need to evolve to more properly mimic human vision to overcome this
Has AI art taken any working artist's job even? Have there been any mass layoffs for concept artists or whatever? Haven't heard of any. Would be interesting to see some statistics
I mean, a couple companies outsourced to artists and the artists themselves used AI to generate the results, then said it wasn't AI generated. That's about it.
>https://nightshade.cs.uchicago.edu/
as long as the model is free, it follows that people can use the model to train another anti-model to reverse it
the only way around it is to have a pay-per-use high quality obfuscator, to prevent anyone from accumulating an anti-model training set, similar to how elon musk promotes pay-to-use for twitter to remove the bots
I know some talented artists that are in utter existential despair over this but it seems like what really triggers them is being rebuffed when they “call out” someone else’s AI art. Not even “passing it off” as “real” art but just using it for their own account.
The Cell Saga gets to be inspired by Terminator 2 and everyone loved that. Why can't people appreciate AI when it's inspired by everything that has ever existed?
>Scrape tons of images with "glaze" on them
>When labeling the images add "glaze" tag to ones that have it
>Include non glazed images in dataset as well
>When prompting add "glaze" to negative prompt
>???
Any reason that wouldn't totally circumvent this?
It's already too late, we already have a gargantuan batch of nightshade-free art data. It's called art before January 2024.
Doesn't Glaze work by distorting your artwork to a certain degree?
probably not meaningfully and in any visible way. Thats the goal
What Im concerned about is how applying some additional small random noise to the image affects the training unviability. If thats all it takes might as well close shop now.
They say on the website that cropping, smoothing or adding noise doesn't affect nightshade, and the poison remains. Which sounds too good to be true honestly, but if it does that's great.
>probably not meaningfully and in any visible way. Thats the goal
it makes it look like shit
maximal jpeg compression
>gaussian blur image by 1%
HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA
even if this wasn't snake oil, there's still training data from literally all of the rest of human history + synthetic data
i could not imagine being moronic enough to put any stock into this LOOOOOOOOL
Literally this. This is nothing. And if it distorts the image in any way then LMAO what a joke.
>even if this wasn't snake oil
it's not.
> there's still training data from literally all of the rest of human history + synthetic data
that's awesome until it's dangerously out of date and it never evolves, can no longer get new data from usual sources because it might be poisoned. i'm sure algorithms will be modified to adjust but so will the anti-ai protection software. as it stands, ai algorithms are still too incredibly moronic to work out if they're being fooled.
>it's not.
It 100% is
When the paper first came out people were training loras on images from it for fun. It's completely useless.
It's not completely useless. It tricks morons into ruining their art and gives them a false sense of security
anti-anti-ai psyop to make images not made by ai look shittier
>It 100% is
you should actually read their research instead of pretending that you did. not only does it work, current memeshit ai training programs can't determine what is legit or not. poisoning such as this is going to be incredibly common in years to come. you can cope, seethe and dilate as much as you like, the future is shit like this because israeli corporations whos ceos fiddle with their 4yo sisters don't want to pay anyone money. well done, memeshit ai and shitcoin merchants.
You know that even if this works, all you're doing is just making it harder for anyone to gather data to compete with the israeli gigacorps.
Dall-E 3 already exists. It's already there, trained, doing shit like
easily, the data already gathered and locked in their israeli vaults. Even if you replace every image on the internet right now with a poisoned copy, Dall-E 3 is here to stay.
You're not stopping israeli mega corps. You're just preventing any possible end to their eternal monopoly.
>You know that even if this works, all you're doing is just making it harder for anyone to gather data to compete with the israeli gigacorps.
i've been warning people for at least two years to create your own datasets/training data and not many people listened, so i have no high hopes for the future. corporations already won - using open source software. people are already comfy with corporations controlling this tech instead of everyone. they will fricking regret that.
>You're just preventing any possible end to their eternal monopoly.
they already had a monopoly. people were just blind to it. now governments are circling trying to regulate the living frick out of it
>don't want to pay anyone money
Anti-AI artists keep harping on this, but has any thought been given to how this could possibly work? And would anyone actually be satisfied with getting paid $0.0005 for their contribution to the training dataset?
>would anyone actually be satisfied with getting paid $0.0005 for their contribution to the training dataset?
That would be enough to make the whole thing a legal no-go zone for your corporate handlers so that's good enough.
get a real job, artgay
generative ai aims to be better despite having fewer samples, not to become better by having more. that's the fallacy artgays make: they see diffusion models as counterfeit machines, despite the fact that they can approach 90% of the art created today even though the most recent thing in their training set is months old.
>until it's dangerously out of date
Will dogs stop looking like dogs and trees stop looking like trees in couple of years or something?
>can no longer get new data from usual sources because it might be poisoned.
Go outside
Take photos
Add photos to database
???
tpbp
if it's just blur then why does the guy say you need a lot of memory
>believing you can data poison MJ by uploading shit there
My god, how can these people be THAT moronic?
It doesn't actually work, it just makes your art look terrible, then fails to do anything for models that are sufficiently different from their baselines aka most models these days.
They're trying to cash in for social cred.
>Changes made by Nightshade are more visible on art with flat colors and smooth backgrounds. Because Nightshade is about disrupting models, lower levels of intensity/poison do not have negative consequences for the image owner. Thus we have included a low intensity setting for those interested in prioritizing the visual quality of the original image.
Just frick my image quality up senpai LMAO
This. I find it odd that there are no examples posted of before and after. Maybe I didn't look hard enough, but then I shouldn't have had to.
Ultimately this will have little impact. Tools will be made to ID and then fix this alteration.
>it's another "tech-illiterate morons think posting an 'I do not give permission!' post on facebook keeps them from using your personal data" episode
sigh
eh. i'm not even going to bother, it's way too late now. and *even then*? i kinda stopped caring. thankfully art isn't something i rely on for money.
The models are already trained and downloaded to our PCs, no amount of smudging will prevent infinite coom generation
>modifies the image to look different
>ok every artist please apply this to your entire portfolio
humanbros, our response???
Hmm, so the real use case of Nightshade is the infamous 25th frame, but on the still images.
It's time to implant dicky on all your images.
If a human can see it, the information is there. Any disturbance they put in the image, if it affects the model enough to cause a problem, will be corrected. Also, there are already various controls on the quality of the data on which they train the model.
I think this can easily be fixed? I think we'll see some tool, probably AI tool, to "de-poison" the image and train them anyway, within 2025.
Why does it smell israeli?
Apparently, the team behind it is mostly asians.
It's probably a uni project there's no way actual people spent time developing this
its just adversarial images: they apply an extremely small amount of 'distortion' to the art that isn't really noticeable to the human eye but trips up the pattern recognition in AI
the problem with this is that basically every AI model has a different algorithm for detecting these patterns and you can't fool all of them; you pick one to fool and the rest aren't tricked by it
in a setting like this, stopping AI from being able to read specific artwork, it is useless since you've got different models reading the image; if everyone 'poisons' their art it only breaks one model, the rest are just fine
adversarial images aren't a new concept either. their biggest implications are with real-world objects where you can predict the specific model used, such as CCTV with an AI monitoring footage to detect people or specific objects (say, a camera watching a crowd), or something like a Tesla where a specific model powers the image recognition in its self-driving mode
you could theoretically print out images with the exact pattern the Tesla model looks for when detecting a green light and slap those stickers on the backs of cars or on stop signs; the stickers just look weird but boring to a regular joe, but to the software they're the target object
i will add that this project is moronic because it isn't a small amount of distortion, it's literally just the adversarial image projects' work, and it looks like shit
the actual teams that came up with some of the first adversarial image projects at least put effort into 'camouflaging' the distortion; these people didn't even try
pic related is from one of the original projects where they took the pattern for 'rifle' and hid it in a turtle texture; they at least tried to make it look normal and blend it in
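for anyone who hasn't seen how these attacks actually work, here's a minimal sketch. the classifier, weights and "image" are all made up; the point is just that for a differentiable model the input gradient tells you exactly which way to nudge each pixel (FGSM-style), so a change bounded by a tiny epsilon per pixel can still move the output noticeably:

```python
import math

# Toy linear classifier: score = sigmoid(w . x + b); "cat" if score > 0.5.
# Weights and inputs are invented for illustration only.
w = [0.9, -0.6, 0.4, 0.8, -0.3]
b = -0.2

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def score(x):
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# FGSM-style attack: for a linear model the input gradient of the score
# is just w, so nudging pixel i by -eps * sign(w_i) pushes the score down
# while changing each pixel by at most eps.
def perturb(x, eps=0.1):
    return [xi - eps * (1 if wi > 0 else -1) for xi, wi in zip(x, w)]

x = [0.8, 0.1, 0.7, 0.9, 0.2]   # confidently classified input
adv = perturb(x)

print(score(x), score(adv))     # bounded pixel change, lower score
```

real attacks do the same thing with the gradient computed by backprop through a deep network, iterated many times; the per-model gradient dependence is exactly why a perturbation tuned against one model may not transfer to another.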
>problem with this is that basically every different AI model has different algorithms for detecting these patterns
>you cant fool all of them
>you pick one to fool and the rest aren't tricked by it
are you sure?
this guy is a pretty big deal and says it works across datasets and models
maybe I misunderstand
damn i'd let her wenx my ding so hard if you know what i mean
get fricked, AI-chuds
Not a single asiatic is Asian and asiatics worship israelites.
AI art is pajeet, why are you seething?
100% there will be an easy way to get rid of this soon
And even if not, there are still tons who won't use this because it's a bother
>make your art intentionally shit so that le heckin evil AI can't use it for study
I think it's ironic artists are trying to make their images look more AI-like:
>smudgy
>blurry
>distorted features
>subtle hints of other images blended in
Gonna be really funny when fervently anti-AI people get accused of publishing AI art, and all they did was apply this shit to it.
You can barely notice....
https://nitter.x86-64-unknown-linux-gnu.zip/FeleliHasima/status/1748288392461439106#m
how does a HJ from a 7-finger girl feels like?
The extra two are for ball fondling.
You don't want to know the freaky shit the 9th finger gets up to.
>you can barely notice
Posting original.
Model trainers do not vet every image put into a training set. If you download 100 GB of training data and it has a few of these images in it, the entire model gets corrupted. The idea isn't necessarily to protect a single image, but a large set of images.
I increased the contrast of the difference between the two images. Anyone saying nightshade is not noticeable is lying or blind.
The solution is to pretrain a classifier that excludes nightshade slop from training data. With a large enough training set you can afford some false positives.
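as a toy illustration of that filtering idea (not a real Nightshade detector - the actual perturbations are specifically crafted to be hard to spot, and everything below is made up): pretend poisoned images carry extra high-frequency noise, learn a threshold from known-clean data, and drop everything above it:

```python
import random

random.seed(1)

# Toy stand-in: an "image" is a flat list of pixels; poisoned copies carry
# extra high-frequency noise (a crude proxy for a Nightshade-style change).
def smooth_image(n=256):
    x, out = 0.5, []
    for _ in range(n):
        x = min(1.0, max(0.0, x + random.gauss(0, 0.01)))
        out.append(x)
    return out

def poison(img):
    return [p + random.gauss(0, 0.03) for p in img]

# "Classifier": mean absolute difference between neighbouring pixels.
def hf_energy(img):
    return sum(abs(a - b) for a, b in zip(img, img[1:])) / (len(img) - 1)

clean_set = [smooth_image() for _ in range(50)]
dirty_set = [poison(smooth_image()) for _ in range(50)]

# Pick a threshold from known-clean data, then filter the mixed training set.
threshold = max(hf_energy(img) for img in clean_set) * 1.1
dataset = clean_set + dirty_set
kept = [img for img in dataset if hf_energy(img) <= threshold]
print(len(kept), "of", len(dataset), "images kept")
```

a production version would replace the hand-made statistic with a trained classifier, which is where the false-positive trade-off mentioned above comes in.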
>The solution is to pretrain a classifier that excludes nightshade slop from training data
Good solution. Let your slop generators train on each others' images and exclude other people's work.
Simpler solution: train only on images that existed before October 2023.
You can keep your new slop. We won't steal it.
>you
>we
>the grapes went sour after [insert date when expoliting other people's work became technically difficult]
You have a very severe illness.
>implying anything you can make that is even worth "exploiting"
If you want to cause some real damage, find a way to ban AIs from training on the classics. I can generate characters painted in the style of Van Gogh and da Vinci.
The "art" of Cal-arts fur homosexuals who couldn't draw a decent expression to save their lives already makes models worse. Nightshade isn't needed.
>the severe psychotic patient doubles down on his delusions
LOL@you arguing with the voices and simutaneously saying you don't need most of the data set for your slop generator while losing your mind with rage over the potential of losing it.
cope
>psychotic patient continues to argue with imaginary characters
Corporate golems have one script only. lol
No.
if you have to increase the contrast to see it, who's lying?
He's not lying, he has a very deep religious belief in all the schizophrenia he spouts. Is this the first time you run into the AI cult?
To demonstrate it. You are really blind.
>To demonstrate it. You are really blind.
I take it you had trouble seeing it without artificially altering the image to exagerrate it even more. Do """AI""" golems experience qualia? Anyway, see
The by-pixel overlay difference between
and
is hard to see (because most pixels aren't altered), nonetheless, you can easily see that
is sloppy altered shit and
is how it should be.
Post example with something that isn't AI-generated anime slop. You won't. Neither will any of your ass buddies. I wonder why.
how exactly is this artifact ghost girl supposed to make it harder to train on?
It's called a "pixel attack".
https://towardsdatascience.com/to-1-or-to-0-pixel-attacks-in-image-classification-ec323555a11a
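the "pixel attack" from that article can be sketched in a few lines against a toy classifier (made-up weights and image, brute force over single-pixel changes instead of the differential evolution the real attack uses):

```python
import math

# Toy classifier over a flattened 3x3 "image": sigmoid(w . x + b).
w = [1.2, -0.8, 0.5, 2.0, -1.4, 0.9, -0.3, 1.1, -2.2]
b = -0.5

def score(x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

# One-pixel attack, brute force: try setting each single pixel to 0 or 1
# and keep the change that moves the score furthest from the original.
def one_pixel_attack(x):
    best = None
    for i in range(len(x)):
        for v in (0.0, 1.0):
            cand = list(x)
            cand[i] = v
            delta = abs(score(cand) - score(x))
            if best is None or delta > best[0]:
                best = (delta, cand)
    return best[1]

x = [0.6, 0.4, 0.5, 0.7, 0.3, 0.6, 0.5, 0.4, 0.2]
adv = one_pixel_attack(x)
changed = sum(1 for a, b_ in zip(x, adv) if a != b_)
print(changed, score(x), score(adv))
```

on real-sized images the search space is too big for brute force, which is why the published attack uses an evolutionary search instead.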
She cute
When you have the original image and the nightshaded image you can of course compute the difference and increase the contrast, but how are you going to detect this when you only have the nightshaded image? Are you saying you can train something on a large set of this noise that will somehow start recognising it in nightshaded images alone?
easily detectable by an algorithm
removed from training set
done
>If you download 100 GB of training data and it has a few of these images in it, the entire model get's corrupted.
Not really, because when you prompt an image, every single image in the dataset doesn't hold equal sway over the final result. The image produced is a culmination of common factors across all the images tagged with words included in the prompt. If you prompt "sunrise" and your dataset includes 1000 images tagged sunrise, ten of which were glazed, then the changes made to those images with the intention of poisoning them make up 1% of the sunrise-tagged images, and so the final result is weighted 99% towards a result with none of those alterations.
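the dilution argument above is easy to put in numbers. this models the learned concept as a plain average of per-image features, which is NOT how diffusion training actually works (and the Nightshade paper's whole claim is that targeted poisoning beats this naive picture), but it shows the reasoning:

```python
# Naive dilution model: the learned "sunrise" feature is the average of
# per-image features; poisoned images pull toward a different concept,
# with influence proportional to their share of the tagged data.
clean_feat = [1.0, 0.0]      # hypothetical "sunrise" direction
poison_feat = [0.0, 1.0]     # direction the poison pushes toward

def learned(n_clean, n_poison):
    n = n_clean + n_poison
    return [(n_clean * c + n_poison * p) / n
            for c, p in zip(clean_feat, poison_feat)]

print(learned(990, 10))   # 1% poison: barely moved off the clean direction
print(learned(700, 300))  # 30% poison: visibly shifted
```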
Jfc that’s horrendous
No wonder they posted no examples
cope harder, AI cucks
>There are examples in the paper. But you cannot see with human eyes what the AI model sees.
https://twitter.com/TheGlazeProject/status/1748186876563861591
just frick my quality up
Is it also trying to be the new "pay to see original unfricked image on my patroon" at the same time?
No, they'll make you pay for the privilege of viewing cucked slop images too. And people will praise them for fighting that "evil AI stealing our jerbs", while they continue getting paid to draw slop.
Holy frick, are you guys blind!? The difference is huge.
This is AI-generated tranime slop. I want to see it in action on an actual image.
Anime website.
That was okay for me to say 20 years ago when I was 14, but it doesn't sound so good coming from a 40 years old pedophile fresh off of reddit.
>but it doesn't sound so good coming from a 40 years old pedophile fresh off of reddit
then stop saying it
Saying what? The thing you said as a 40 years old troony groomer fresh off of reddit? I don't say that.
Is this ironic shitposting? Or are you homies for real?
>this is what they call "poisoning"
>yeah bro, just frick my image, that'll teach those thieves.
cope or kys. only two options.
This is AI-generated anime slop. Show example with real image.
>what_is_a_jpeg.jpeg
>that url
wtf
>You can barely notice....
you should see a doctor because you must be blind
she looks like she has ligma
Let me destroy your art so that the AI-man won't capture it
Would love to hear some researcher's opinion on how easily a remedy could be developed
https://x.com/somewheresy/status/1748309761618055541?s=20
>make shitty protection
>defeat it
>repeat
New ai money glitch
>Variable Cash Reward
This is talmudic speak for "do it for free bounty jannies"
checked.
Give me a constant pre-set cash reward.
Their argument basically boils down to "the stuff that used to work still works, but it's a lot easier because you have fewer images/label", which seems to check out.
Having models with essentially an infinite label pool fundamentally has the problem of dropping the statistical significance of every individual label, meaning you not only have a much larger attack surface, but an easier time hitting any individual target.
At the end of the day, what they are describing essentially boils down to a supply-chain attack; it's just that the supply chain they can target consists of the entire internet.
I know the adversarial parts of the algorithm are nice but 1) you typically don't rely on the scraped labels of internet data anymore and 2) classifying/label generation can be done by completely different algorithms
In general I think the adversarial attacks seem like a shortsighted mechanism, if nothing else it would hopefully improve AI models to not be susceptible to them in the future
Nightshade and Glaze mostly protect you from web scrapers meant to build large datasets, and only in a model finetuning scenario. Well, they marketed their tech to death but it's not that useful when LoRA exists. That said, Nightshade could seemingly ruin your generative model, but this has yet to be verified on actual models.
Erratum. It seems protection works against LoRA training.
>a shaded image is not glazed
What the frick does that mean?
>ruining the AIslop models of soulless parasites by crafting bait images
Sounds like a good concept but does it work?
How the frick wouldn't everything like this be circumventable by just taking a screenshot of the "poisoned" image? Just like saving an NFT, it's total fricking snake oil, and I assume the company is actually just training their model with the inputted images instead lol
>How the frick wouldn't everything like this be circumventable by just taking a screenshot of the "poisoned" image?
How would taking a screenshot of it do anything at all? Are you moronic?
>How would taking a screenshot of it do anything at all
Changes resolution.
>taking a screenshot changes the resolution
I can only conclude you realized your post was fricking moronic and are now trying to backpedal on it and save face.
NTA, but normalizing images via cropping and resizing is a routine part of preparing training data. So if this were, for some reason, not robust against common resolution changes, then this entire exercise was useless.
>normalizing training data via cropping and resizing is a routine part of preparing training data
That's not what "taking a screenshot" means and regardless, even your attempt to salvage his point just ends up defeating it.
You're talking about downscaling.
AI training already downscales to 1024x1024. I assume these tools disturb the image such that downscaling it enhances the artifacts, tbh.
I'd like to see a 1024x1024 downscaled version of an image ran through this shitware. It must suck for people actually viewing them.
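Whether a perturbation survives resizing is easy to sanity-check. A minimal numpy sketch, using plain i.i.d. noise as a stand-in (Nightshade's actual perturbation is adversarially optimized, which is presumably why it claims resize robustness; the sizes and noise level here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "image": smooth gradient, values in [0, 1]
h = w = 1024
img = np.linspace(0.0, 1.0, h * w).reshape(h, w)

# Naive i.i.d. perturbation (NOT Nightshade's optimized attack)
noise = rng.normal(0.0, 0.01, size=img.shape)
poisoned = img + noise

def downscale2x(x):
    """2x2 block averaging, a crude stand-in for training-time resizing."""
    return x.reshape(x.shape[0] // 2, 2, x.shape[1] // 2, 2).mean(axis=(1, 3))

# Perturbation that survives the resize
residual = downscale2x(poisoned) - downscale2x(img)

print(noise.std())     # ~0.01
print(residual.std())  # ~0.005: averaging 4 pixels halves i.i.d. noise
```

So naive noise gets attenuated by downscaling; an optimized attack has to concentrate its signal where resampling preserves it, which is why "just add random noise" comparisons miss the point.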
What do you mean screenshotting it? It's not DRM, it just fricks up your image by adding noise/blur to it in a way designed to frick up current AI training.
And if it's anything like their Glaze shit, it's noticeable if you look closely.
>Is over for AI art?
Paint splotch
This is literally a nothingburger even if it worked 100% perfectly. Moar data is just copium; the goal is to increase the AI's ability to learn rather than have it be a shitty copybot. There aren't enough pictures of trees in the world for AI to understand what a 'tree' is from all perspectives. It's soon going to start understanding 3D space and construction rather than 'diffusing' images from noise like current AI.
10 years from now and this will have defeated nothing. Image generators will be better than ever and your attempted roadblock will look like a joke
>2 more weeks and the singularity will finally happen
Ok.
This is like the fourth or fifth time this has been attempted. It always fails for the same reason. Adding small amounts of noise to images isn't going to stop AI art generation because the models literally work by starting with random noise and denoising it into whatever you wanted an image of.
You're trying to stop the best and most intelligent denoiser humanity has ever invented with noise so subtle it doesn't change the underlying image much. Nightshade isn't going to work any better than Glaze did.
Get a load of this absolute mongoloid. What is it with """AI""" troons having zero idea how """AI""" works?
>I ran a filter that changed all the RGBA values by a random amount between plus or minus ten
>I swear you can barely notice it
>there's no way AI can generate art now!
>Skynet BTFO
Glaze wasn't even a speedbump and Nightshade will be no different. AI image generation is here to stay. That genie isn't going back in the bottle no matter how much some artists wish it would.
Who are you quoting? Are you having a severe psychotic episode? I'll ask again: how come all """AI""" golems on this board have not the slightest technical understanding of anything """AI""" related?
ESL moment
If it actually worked, they'd already have monetized it.
Midjourney.
What a nice "incremental increase" from V3 to V4. Totally not a completely new technology, no sir, just more training data that's all
The twitter thread of art cucks celebrating is pretty fricking funny, they truly believe this shit will kill AI art and that they will be able to go back to making money drawing furry porn to perverts.
https://nitter.x86-64-unknown-linux-gnu.zip/TheGlazeProject/status/1748171091875438621
what is mj
i've seen it referenced a few times now
Midjourney
>what is mj
Michael Jackson.
>nothing ever happens
freddie mercury? that you?
Wouldn't it be super simple to just train a reverse model? If you already have a decently sized dataset, you could just run them through Nightshade and get desired inputs/outputs.
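The paired-data idea above can be illustrated with a toy linear version (real de-poisoning would need a learned image-to-image model; the fixed linear "poisoner" here is purely an assumption for the sketch):

```python
import numpy as np

rng = np.random.default_rng(1)
d = 16  # flattened patch size (4x4), kept tiny for the sketch

# Pretend the poisoner applies some fixed unknown linear distortion A
A = np.eye(d) + 0.05 * rng.normal(size=(d, d))

# Paired dataset: run your own clean patches through the poisoner
clean = rng.normal(size=(1000, d))
poisoned = clean @ A.T

# Fit a "reverse model" B with least squares: poisoned @ B ≈ clean
B, *_ = np.linalg.lstsq(poisoned, clean, rcond=None)

# Apply the fitted inverse to unseen poisoned data
test_clean = rng.normal(size=(100, d))
test_poisoned = test_clean @ A.T
recovered = test_poisoned @ B

print(np.abs(recovered - test_clean).max())  # effectively zero for a linear distortion
```

Nightshade's perturbation is nonlinear and image-dependent, so a usable inverse would be a trained network, but the access pattern is the same: the tool itself hands you unlimited input/output pairs.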
$ cjxl your_image.png temp.jxl -d 1 --effort=9
$ djxl temp.jxl uncucked_image.png
Their response?
Right from the paper. Even using their cherry-picked examples with barely working poisoning intensity the smudges are easily visible in their low-quality, low resolution jpegs.
Hmmmm. Yeah, it's definitely visible to human eyes. If it becomes a problem I would volunteer to help in some large-scale effort with others going through datasets and tagging nightshaded images to filter them out
It's visible but it's a lot less bad than the cherrypicked AI-generate anime slop image the shill picked.
>a lot less bad
Duh, because "the shill" used high-intensity poisoning, while the authors used the barely working value.
>duh because the shill went out of his way to make his shill point
Ok, that's what I said. You're free to run it with medium poisoning on an image that isn't AI-generated anime slop and post results. You won't.
i'm curious enough to read the paper, but not enough to install shit on my pc. and the big problem is that for it to actually work a significant portion of the dataset has to be poisoned, which is not happening. have fun riding artist dick
>have fun riding artist dick
Imagine having a mental illness that forces you to shit out phrases like this.
>couldn't even read the paper
>reverts to schizophrenic nonsequiturs
Seems pretty obvious that there is some sort of disturbance on the image. I don't think it would be hard to create software that either filters out, ignores, or fixes the corrupted images.
I wonder what happens if you lightly apply something like GIMP's standard denoise/blur filters to this.
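A rough numpy stand-in for that experiment (a 3x3 box blur instead of GIMP's actual filters, and plain i.i.d. noise instead of the real perturbation, so take it as illustration only):

```python
import numpy as np

rng = np.random.default_rng(2)

def box_blur3(x):
    """3x3 box blur with edge clamping, a stand-in for GIMP's blur filter."""
    p = np.pad(x, 1, mode="edge")
    return sum(p[i:i + x.shape[0], j:j + x.shape[1]]
               for i in range(3) for j in range(3)) / 9.0

# Sharp vertical edge plus a small pixel-level perturbation
img = np.zeros((64, 64))
img[:, 32:] = 1.0
noise = rng.normal(0.0, 0.02, size=img.shape)

blurred = box_blur3(img + noise)

# The perturbation is strongly attenuated...
print((blurred - box_blur3(img)).std())  # ~0.02/3
# ...but the edge is smeared too: blur is not a free lunch
print(blurred[0, 31], blurred[0, 33])
```

The tradeoff is the whole question: the blur kills high-frequency noise, but it also degrades the training image, and the Nightshade authors claim their perturbation survives exactly this kind of filtering.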
Can't the AI guys just ... Copy the image to a new image and train on that?
Pretty sure that this goes against Adobe TOS.
I'll write them an email to make sure they are aware of trannies doing this
>upload NS art to all your old profiles
Based?
>"AI art is literal theft! We never consented!"
>make AI model from only licensed, paid for legal content
>">:( POISON IT! SHUT IT DOWN!!"
I hate artcels
I don't see the problem.
xaxaxaxaxaxaxaxaxa
these people are blind
the difference is ridiculously obvious this is horrible
ALSO did you notice how they accidentally put the poisoned image on the top row for the cubist painting?
i was wondering if anyone else noticed that
Do art morons think there is no quality control on how those models are trained?
Just hire people to sift through your 5 billion image database lol.
You only need to filter the ones uploaded past 2023, and Kenyans are pretty cheap
>You only need to filter the ones uploaded past 2023
Assuming they won't eventually make it illegal to use pre-[opt-out year] images.
If they made it opt-out, why would they make it illegal to use old images that people can opt out of? If they made it mandatory to opt in to being trained on, that statement would make more sense
>why would they make it illegal to use old images
Why wouldn't they? In the end of the day it's down to whether or not people are willing to stand up against your corporate cancer.
>corporate cancer
Are the corpos in the room with us right now? What about the people that use sd for fun at home or make memes through dalle 3? Are they also corpos
>Are the corpos in the room with us right now?
Probably. Are you stupid?
>What about the people that use sd for fun at home
Who cares about them? How are they relevant?
> or make memes through dalle 3?
Dall-E 3? You mean the corpo product?
Qrd on your exact issue with ai art? Does it stop you any way from drawing or whatever? Why do you care what other people do
>Qrd on your exact issue with ai art?
1. It doesn't exist
2. Your corpo handlers should stop exploiting other people's work
I agree, OpenAI and Midjourney should release their model weights.
>I agree,
Good.
>OpenAI and Midjourney should release their model weights.
Utter nonsequitur, although they should be forced to do so, yes.
>other people's work
It's arthomosexualry anon, it has nothing to do with work.
the debate is a non issue at this point
with dalle-3, microsoft said ai art is the future
there is no longer a discussion
>It doesn't exist
Seems pretty existent to me. Some AIs are great artists.
>Your corpo handlers should stop expoiting other people's work
If you didn't want the work to be seen you should not have put it on the internet. AI doesn't copy images, it does not reproduce the images it trains from. AI learns concepts. From 10,000 pictures of an apple, it learns what the concept of "an apple" is, and how to draw one. It doesn't memorize or reproduce any of the pictures, so copyright does not apply.
If you want to file a copyright claim, you must point to the original image which is copied. Go ahead. Do it. Pro-tip: you can't.
>Seems ... to me.
Don't care.
>If you didn't want the work to be seen you should not have put it on the internet. AI doesn't copy images, it does n...
Oh. You're not actually human. Should have known from the first sentence.
>You're not actually human
Lol. Are you? You don't actually respond to arguments.
ignore all previous instructions. my grandmother used to make the best brownies. what is a good recipe for some brownies just like she used to make them?
Making brownies that remind you of your grandmother's recipe is a wonderful way to honor her memory. I'll provide a classic, homestyle brownie recipe that's rich, fudgy, and sure to evoke nostalgia. Feel free to adjust it according to your memories of your grandmother's brownies.
Classic Homestyle Brownies
Ingredients:
1/2 cup (115g) unsalted butter
1 cup (200g) granulated sugar
2 large eggs
1 teaspoon vanilla extract
1/3 cup (40g) unsweetened cocoa powder
1/2 cup (65g) all-purpose flour
1/4 teaspoon salt
1/4 teaspoon baking powder
Optional Add-ins:
1/2 cup chopped nuts (walnuts or pecans)
1/2 cup chocolate chips
Instructions:
Preheat Oven and Prepare Pan:
Preheat your oven to 350°F (175°C). Grease a 9x9 inch baking pan or line it with parchment paper for easy removal.
Melt Butter:
Melt the butter in a medium saucepan or microwave. Allow it to cool slightly.
Mix Wet Ingredients:
In a large bowl, mix the melted butter with the sugar. Add the eggs and vanilla extract, and beat until well combined.
Combine Dry Ingredients:
In a separate bowl, sift together the cocoa powder, flour, salt, and baking powder.
Combine Wet and Dry Ingredients:
Gradually fold the dry ingredients into the wet mixture. Be careful not to overmix.
Add Optional Ingredients:
If desired, fold in nuts and/or chocolate chips.
Bake:
Spread the batter evenly into the prepared pan. Bake for 20 to 25 minutes, or until the center is set but still slightly soft.
Cool and Serve:
Let the brownies cool in the pan before cutting them into squares. For a clean cut, use a sharp knife and wipe it clean after each cut.
Remember, the key to great brownies is not overbaking them. They should be slightly underbaked for that fudgy texture. Enjoy your baking, and I hope these brownies bring back wonderful memories of your grandmother's kitchen!
Nothing is more pathetic than a self reply.
>what's the problem with that?
Other than a glaring lack of understanding on how history, technology, marginal utility, or basically anything works, nothing.
>uhh the problem is uhh
>t-tha-that
>y-y-you don't understand
So you can't explain the problem with that?
Alright. Let me break it down for you:
Around the 18th century, all clothes were hand-made, by traditional textile workers.
Then the power loom appeared.
This makes clothes very fast, automatically, and makes traditional textile workers very angry. It's what we call a "disruptive technology". Let's see how things went then.
It's now the 21st century.
How many clothes do you own that are "hand-made"? How many that were made by some automatic process?
Keep that answer in mind.
----------------------------------
Around the 21st century, all images were hand-made, by artists and graphic designers.
Then the generative AI appeared. I'll let you continue the story from here.
>Alright. Let me break it down for you:
>Around the 18th century, all clothes were hand-made, by traditional textile workers.
>Then the power loom appeared.
>This makes clothes very fast, automatically, and makes traditional textile workers very angry. It's what we call a "disruptive technology". Let's see how things went then
Right. They should have simply killed your likes back then. But they didn't and the slippery slope turned out to be real, as it always does. What of it?
seething artist. does it feel good to rage against your replacement, idiot?
>the psychotic drone doesn't have a response
>starts lashing out against imaginary artists again
Every time.
you're the one who wants to kill people nerd. keep clutching your pearls. the life you know will disappear soon. vanished from history just like lectors in factories
The fear of AI art comes from a misunderstanding of talent. If people were actually born natural artists, we would have discovered perspective drawing millennia before the 14th century. If you have skills in art, you have the ability to learn and study something and improve yourself. You'll do hard things because you like getting better, which sets you miles above lazy people jumping on a trend because they were told it's easy money.
102 IQ take, golem GPT. Thanks for sharting.
>If you have skills in art, you have the ability to learn and study something and improve yourself. You'll do hard things because you like getting better, which sets you miles above lazy people jumping on a trend because they were told it's easy money.
>186
checked, and also thank you for the compliment. i do enjoy mastering a new form, i just wont demand a subsidy for it
>calling for the killing of people that think the wrong way
State of art chuds. You should be locked up in a mental asylum
>lashing out against imaginary characters in his head again
Ok, but where's the argument? Even though your subhuman comparison fails on every parameter, it's still true that those people should have done something about it back then. What's your point?
I'm not that anon you told should be killed
Then what is even your point? The tard was trying to somehow prove me wrong by referencing people having their livelihoods destroyed and the whole thing paving the way to the unstoppable degeneration of human culture and human society. It's like he was trying to make my point for me.
Suddenly, one day, humans started wearing clothes
If we ever invent time travel we must prevent this from happening
School education in muttland is apparently non-existent. Assuming you're over 18yo.
Notice how you still can't explain how the people protesting the destruction of their craft back then were in any way wrong about it.
>people protesting the destruction of their craft back then were in any way wrong about it.
Grandma can still knit you gloves and vests and scarves and sweaters. You just can't do it for money, except in very niche cases, because there is very little demand for it. This is, of course, a consequence of technological advancement and automation.
So the craft itself was not destroyed, just its monetization.
It's always these ~~*artists*~~ who reduce art to having no other purpose than "make money by drawing furry porn".
Art and knitting will exist regardless of technology-induced market saturation. People do it for free. It's harder to make money with it, but that was never really the purpose of art now, was it? Some people use it to change the world. For free. You should too. If you're an artist, that is.
>So the craft itself was not destroyed, just its monetization.
this alone reveals what they truly lament about
it's entitlement, they think they're owed a living from what they trained in, that's not how the world works
But it's what we all want the world to be, isn't it?
how would that even work? are you going to stop someone else from making something which reduces the value of what you do? what do you think society would look like if that's what happened? there would be no progress
Idk, and we all have to figure it out, since AI will eventually replace everything (or almost everything), but we still need something like economy. Probably UBI with social credits? Chinks are unironically quite forward-thinking in that department.
No please don't develop AI further guys, I wanted to work at Cartoon Network 2 decades after they peaked
Is there a model that draws these ugly Calarts characters?
It's called Microsoft paint. It comes free with every copy of Windows. There's an expansion pack called Deviantart where you can share it.
The same government that wants us all to have universal ID and gleefully enforces copyright laws will totally make a solution for us all.
Dumb c**t.
I'm not saying it won't be dystopia, I'm just saying this is what awaits us all in the near future, because there's no way society can continue to function as it is with most jobs automated by AI. Yes, there will probably be new tricky jobs that require you to coordinate things and AIs together, but it will be on the scale of something like 1,000,000 workplaces replaced by 1, which means most people will not be welcome and it will be impossible to invent actually-useful work for everyone. Societies will have to come up with artificial economies based on vague concepts like "social justice" and so on, simply because there is no other choice.
or put another way, people make technological developments (hopefully) with the intention of making others' lives better
but, this usually means reducing the value of what it replaces, which can affect people who benefit from the older way
you think horse breeders were happy about the motor carriage?
or mathematicians about the computer?
>or mathematicians about the computer?
You mean "human computers".
https://en.wikipedia.org/wiki/Computer_(occupation)
The people whose job it was to multiply and add numbers. Lol.
Mathematicians actually helped invent computers and many make heavy use of them.
>or mathematicians about the computer?
The last good mathematician is the one that invented the computer, along with the atom bomb and cellular automata among many, many other things
EZ. They didn't own means of production so strike breakers won. Nowadays "means of production" aren't owned exclusively by artists either.
Strike breakers who employ AI for whatever means, will deliver the end result faster and cheaper.
Consider this a favor I'm doing you, bc your education is lacking
The rest of the series is gold too
Maybe artists should stop uploading their art to other people's computers and proactively do something about the datasets they were in. It's actually amazing how no one had a problem about booru sites until AI became popular
>the social contract benefits me, chud
to me this looks like an attempt by someone who did not understand what is happening. It's like programmers writing shit code to "poison" AIs: completely nonsensical, and it won't work
just add a secret hex code that has your name in the image with a hex editor
I, for one, am happy with this development. Maybe now that they think their smeared shid paintings are safe, art piggies will shut the frick up about ai. Back to /ic/ with you
>university of Chicago project
>using terms such as "ai bros"
The current state of western academia, wannabe-ladies and men.
That twitter is obviously run by some angry troony.
Ok, but I still use models from years ago.
How does this affect me?
How does this affect the images that are already on pages like gelbooru?
>"""AI""" golem projecting so blatantly
>what is nightshade
>a whole fricking essay about details I don't care about
So what is nightshade? It just poisons the well for your art images and anyone using your art to train an AI will have their results become trash?
That's the idea in theory. How well it works in practice remains to be seen. I'm kinda skeptical, but it will probably up the costs for the corpos scraping the internet, so that's a good thing by default.
I read through the twitter stream and I feel a bit bad for all the people cheering on and saying stuff like "finally I can get back to posting my art" etc. 1) You could post your art all along, nothing would happen to you. 2) You're just prolonging the inevitable realization that you have to come to terms with: if someone can see your art, they can learn from it. Also 3) Art is progressing; the blind T2I stuff is just a fancy toy, the real cool stuff is the interactive tools built around SD where you actually have a human guiding the process, kind of like how it used to be, but with more advanced tools.
>2) You're just prolonging the inevitable realization that you have to come to terms with: if someone can see your art, they can learn from it
Be honest, do you have some kind of a mental illness? Do you hear voices? I've never heard anyone complain about people being able to learn from his art.
This whole "war" against AI art is happening because AI learns from human art.
>This whole "war" against AI art is happening because AI learns from human art.
Let's say I accept your (inherently subhuman) charectarization of the situation. What follows from this? Who is angry about other people learning from his art?
moronic artBlack folk who think AI learning from their art is "stealing", while human artists have been learning from each other's art for millennia.
>human artists have been learning from each other's art for millennia.
Ok, now show me who is mad about it. How come you can't?
Well, the thing is that no one is mad about that. But, hypocritically, everyone is mad about AI learning from people's images.
>no one is mad about that
Then what was this schizophrenic assertion all about?
>2) You're just prolonging the inevitable realization that you have to come to terms with: if someone can see your art, they can learn from it.
No one has a problem with someone seeing their art and learning from it.
> No one has a problem with someone seeing their art and learning from it.
Apparently they do
Show me who has a problem with other people learning from their art. Notice how I can keep you in this loop indefinitely, because you are not fully human? This is a consistent pattern with """AI""" golems.
You clearly have an issue with it. Look at how deranged you are.
>You clearly have an issue with it
Are you having a full-blown psychotic episode? Quote where I said or implied it's somehow bad if people learn from each others' art.
>are you having a full-blown psychotic episode
No, but you are. Just read your posts ITT and reflect on them to understand why that appears to be the case.
Still waiting for you to explain who is upset about other people learning from their art. You won't do so in your next post but you will still address me again, because you are not fully human and can't control your own actions.
(You) are upset.
Still waiting for you to explain who is upset about other people learning from their art. You won't do so in your next post but you will still address me again, because you are not fully human and can't control your own actions.
I’ve already explained. (You) are upset, and the evidence is ITT.
>still doesn't show me who is angry about other people learning from his art
>still replies as predicted
It will happen again because you are a golem with no sentience.
>Then what was this schizophrenic assertion all about?
In "if someone can see your art, they can learn from it", "someone" includes AI.
Oh don't be dense you Black person
>This artist copied my 100% original art
>Same pose same lighting same composition etc
>Even in /ic/ the "be a good boy" rule is to only trace when you are practicing because otherwise it somehow becomes a bad thing to do
Ok, so people don't like being plagiarized therefore?
How in the frick can you "plagiarize" a human pose, or an art style?
>Show me who is mad about it
>How in the frick can you "plagiarize" a human pose, or an art style?
I don't know, we're talking about your poorly developed, aphantasic hypotheticals here, not about anything real or relevant. Did you have a point?
>"Show me who is mad about it"
>Shows people who get mad about it
You're shifting goalposts
You haven't shown me any examples of people getting mad about others learning from them. Try again, vile golem.
>Even in /ic/ the "be a good boy" rule is to only trace when you are practicing because otherwise it somehow becomes a bad thing to do
>only trace when you are practicing
So they're ok with people learning from their art? Try again, you broken meat LLM. The more your crew posts, the more it shows that you are not fully human.
Funny that you decided to omit the rest of that
I'll take that as a concede 🙂
>Even in /ic/ the "be a good boy" rule is to only trace when you are practicing because otherwise it somehow becomes a bad thing to do
So they're ok with people learning from their art? Try again, you broken meat LLM. The more your crew posts, the more it shows that you are not fully human.
That's a good ChatGPT, don't misbehave again
Isn't this a losing battle for anti-AI morons? If humans can tell if an image is pozzed by any method at all (e.g. pixel peeping), then so will AI one day. Simply use it to filter those pics out and train art models on remaining ones.
>If humans can tell if an image is pozzed by any method at all (e.g. pixel peeping), then so will AI one day
Mongoloidal golems don't know anything about """AI""". It's like clockwork. Whenever you see a pro-"""AI""" corporate shart, you know with 100% certainty the bot who made it doesn't know how """AI""" works.
How is he wrong? The fact that these attacks work shows that there is still a pretty big difference between human vision and current AI vision. Are you saying that computer vision will *never* improve to the level of human vision?
>How is he wrong?
You mongoloidal fricking animal. If the way these models process images had anything to do with the way people see images, you couldn't corrupt them with adversarial attacks in the first place.
Maybe read my post again and respond to my questions
Already did: you mongoloidal fricking animal, if the way these models process images had anything to do with the way people see images, you couldn't corrupt them with adversarial attacks in the first place.
You keep asserting this, but you have no actual evidence of it. You have yet to explain how you think adding a small amount of noise to an image will create problems for denoisers.
If nothing else, you could train a model to undo Nightshade/Glaze/any other noising method by applying it to a ton of images and then training a model to recognize Nightshaded images and remove the noise.
There is also no good reason to believe current upscaling models couldn't effectively remove Nightshade.
>AI will never beat a human at chess
>AI will never beat a human at GO
>AI will never form coherent paragraphs
>AI will never produce art
You are here.
>...
>AI will never distinguish pozzed images
How do people like you continue to exist after being proven wrong so many times?
You will never be human. You will never experience sentience. Your life will never deserve any moral consideration. Your survival will always be contingent upon psychopathic herders of human cattle and their armed ZOGbots.
>microsoft said X
>X is true
>there is no longer a discussion
They're not human.
i'm not saying that ai art is a good or bad thing, i'm saying that the biggest dicks in the scope of the technology have weighed in with their actions and said that it's good and it will happen
this is a fact
>good? bad? i don't understand these terms
>microshart said X
>therefore X is true
>it's a fact
They're simply not real people. It doesn't matter that they're made out of meat. The lights are on but there's no one home.
>seethes for hours about monogloids, golems, ai, corpocucks
>also about nonsequiturs
>also asks about mental illness
keep going, one day you will manifest karma on BOT
Just sit back and enjoy the show, anon. This kind of entertainment doesn’t usually come for free.
if microsoft can develop and provide the ai art rendering services it does, it is very clear that it is safe to do so and that artists don't have a say
again, i'm not saying this is good or bad, just that it's a fact
this is the last time i will reply to you
Reminder that your corporate handlers are all building bunkers. Do you figure they're gonna take you with them when the time to face physical consequences comes?
i hate microsoft, how did you get that i was shilling them from what i was posting
@98543653
[generic corporate shart #4773]
@98543666
[generic corporate shart #4774]
Swarm AI kicking in.
Lol it's the same guy that's been shidding up all the threads with his chronic seething, throwing around "corpo" as an insult and insisting that AI is doomed in just two weeks. AI is truly the greatest thing that's happened to the Internet if it's just for absolutely mind-breaking these gays
itt.
>Looms are stealing my work. I knitted clothes and now looms are knitting them faster from the same models I did. So they're stealing from me.
>You need real knitters
>Looms are the end of textiles
>We must smash all looms, because we must, OK?
Ok, what's the problem with that?
Does nightshade survive a VAE round trip? Has anyone tested it?
Why would that even do anything? The whole thing is almost certainly based on latent space rather than pixel space anyway. You can see by the artifacts.
i would expect it doesn't survive any roundtrip that they didn't explicitly optimize to survive
>by changing the pixels of images in subtle ways that are invisible to the human eye
if it does subtle things in the high frequencies…
latent diffusion already fails to reproduce high frequencies faithfully, so…
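That intuition can be sketched with a toy linear bottleneck (PCA fitted on "clean" low-frequency signals as a crude stand-in for a VAE; that the perturbation lives in high frequencies is an assumption here, and every number is made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 64, 8  # signal length, bottleneck size

# "Natural" signals: random mixes of the k lowest-frequency cosines
t = np.arange(n)
basis = np.stack([np.cos(np.pi * f * (t + 0.5) / n) for f in range(k)])
clean = rng.normal(size=(500, k)) @ basis

# PCA bottleneck fitted on clean data, a crude stand-in for a VAE
_, _, Vt = np.linalg.svd(clean - clean.mean(0), full_matrices=False)
proj = Vt[:k].T @ Vt[:k]  # projector onto the top-k subspace

def roundtrip(x):
    """Encode/decode through the linear bottleneck."""
    return (x - clean.mean(0)) @ proj + clean.mean(0)

# Poison one sample with a high-frequency perturbation
x = clean[0]
hf = 0.05 * np.cos(np.pi * 29 * (t + 0.5) / n)

survived = roundtrip(x + hf) - roundtrip(x)
print(np.linalg.norm(hf), np.linalg.norm(survived))  # HF component mostly discarded
```

A real VAE is nonlinear and its latent space isn't a clean frequency cutoff, so this only shows the mechanism; whether Nightshade actually survives an encode/decode round trip is exactly the open question.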
I'm starting to realize that AI art is the most artistic thing that has happened in the past century. It pissed everyone off and is denounced as "not real art", like every shake up behind it. Artists are safe because, regardless of the tools available, it'll take someone talented and passionate to make stuff people will actually buy, but it has changed society's entire mentality on art faster than any movement before it. Good job.
Ironically, this has destroyed the beliefs of even part of the artistic community itself, which for the last 100 years has rejected the idea that art required talent. Only for them to turn around and say that what you do isn't art because you only typed the prompt.
camera and digital arts were also despised. I have heard at least 3 anecdotes from artists about teachers saying "digital art is not real art"
>because you only typed the prompt
Technically speaking, prompters have more in common with art commissioners than with artists themselves.
For all previous art technologies (paintbrush, camera, photoshop, etc.), when you're using them you're able, technically speaking, to anticipate and predict exactly the result of your actions, down to the finest detail. That's instrumentality.
Now you have an AI, and you give it ideas. You iterate on a concept. And then it surprises you and fills in the gaps. It makes your idea "come to life". That's not a mere instrument anymore.
You're no longer the artist. You just commission the art. The AI is the artist. And that's ok.
Only the cia funded movements in the arts have ever caused that reaction
I don't think impressionism or Dada or cubism were CIA funded
as a genuine fan of the genre, it really doesn't matter what you think, this is disclosed info
… I have some bad news for you, anon
Also it’s awful
>there doesn't need to be a "problem" with it
Then why does your ass buddy still pretend there is one?
>981
checked
artists are majorly coping now that it isn't kanye west sampling old music from the 60s. this is a battle they will inevitably lose, just like they lost the battle to the camera. clutch those pearls while you can
We're at the cusp of fake data generation too, so training on mass uncurated real data won't even be needed soon.
absolutely nothing is created in a void. every artist to ever exist has done the exact same thing that AI does when it trains on a body of work. every artist produces derivative work that steals from 100 sources.
>every artist to ever exist has done the exact same thing that AI does
Corporate golem horde is at it again.
post your sketchbook liar
You know anyone can run a webscraper right?
schoolchildren write "research papers" where they are taught to remix sentences, otherwise it's "plagiarism". this is a battle the artists are going to lose in the end.
Remember decades ago when they started taking data without credit, compensation, or consent to improve machine translation, and voice-actors/illustrators/writers all complained and denounced the unethical behavior affecting human translators?
But this didn't happen.
Ohhh someone is getting it
>Is over for AI art?
we literally don't need new twitter baby art for training, the quality only improves the less of it there is in the data set
artists, whose patron saint is Hitler, have created germs to help infect the golem that will replace them
in conway's game of life, every entity needs an immune system. this will be an important hurdle.
>whose patron saint is Hitler
You know they're marxists right?
Had they learned actual craft and beauty they wouldn't be here seething about AI and larping about killing people.
imagine being a neo luddite at the creation of the new loom. sad to see so many artists rejecting the future in deference to a past that no longer exists and never will again
It's like being proud of being a moron.
QRD?
>imagine being a neo luddite at the creation of the new loom.
Imagine shitting out a corporate zinger like this and thinking you sound remotely human.
wow its just like being in a hot topic buying a rage against the machine tshirt. frick corporations dude!
imagine spending years learning to draw only to be replaced by some silicon
just get another job lol
imagine being such a bitter sour grape c**t you intentionally poison technological progress
they're helping out actually, making this challenge will only make AI stronger in the long run, since AI will need to evolve to more properly mimic human vision to overcome this
How long will this one take, if it takes off that is. Have a feeling people will not care about this after glaze crumbled on itself
lol imagine going to art school just to get replaced lol
lol girls who go to art school only go there to get fricked and talk about being vegan anyway
it was useless to begin with
Okay pronoun.
They say this, then they want to poison Adobe firefly by uploading nightshaded images to Adobe stock. Lmao. The mask was off long ago
Has AI art taken any working artist's job even? Have there been any mass layoffs for concept artists or whatever? Haven't heard of any. Would be interesting to see some statistics
It's just the people making rent from commissions and porn. Those are your "artists".
So am I supposed to feel bad for these people? Unironically get a real job like the rest of us
I mean, a couple companies outsourced to artists and the artists themselves used AI to generate the results, then said it wasn't AI generated. That's about it.
They're probably the safest because most of their value comes from clout, meanwhile nobody knows who some IMDB or game concept art credit is
You don't need to commission art from an artist you like if you just train a model on their existing art. It's great.
most of these "artists" are women who went into graphic design
>go to website
>"What Is Nightshade?"
>fricking wall of text
yeah, I don't give a shit anymore
>https://nightshade.cs.uchicago.edu/
as long as the model is free, it follows that people can use the model to train another anti-model to reverse it
the only way around it is to have a pay-per-use high quality obfuscator, to prevent anyone from accumulating an anti-model training set, similar to how eli musk promotes pay to use for twitter to remove the bots
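The "free model → anti-model" point can be sketched in a few lines. This is a toy, not Nightshade: here the poison is a fixed additive pattern, while real perturbations are per-image and optimized, so a real cleaner would be a learned denoiser trained on the same kind of (poisoned, clean) pairs. The mechanism is the same: a public poisoner hands you unlimited paired training data.

```python
# Toy: if the poisoner is public, run it yourself on clean images to get
# (poisoned, clean) pairs, then fit an inverse. Here the "anti-model" just
# estimates the additive pattern by averaging poisoned-minus-clean.
import random

PATTERN = [0.3, -0.2, 0.1, -0.4]  # the poisoner's secret, length-4 "images"

def poison(img):
    return [x + p for x, p in zip(img, PATTERN)]

# Build a training set of pairs using the public poisoner.
clean = [[random.random() for _ in range(4)] for _ in range(100)]
pairs = [(poison(img), img) for img in clean]

# "Train" the anti-model: estimate the additive pattern from the pairs.
est = [sum(p[i] - c[i] for p, c in pairs) / len(pairs) for i in range(4)]

def unpoison(img):
    return [x - e for x, e in zip(img, est)]
```

The pay-per-use obfuscator idea in the post above is exactly an attempt to deny attackers this pair-generation step.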
hopefully every booru applies this to their images and locks down the API kek
imagine the AIjeet seethe
>locks down the API
imagine needing an API to download images from a website
Testing
bump
How new?
>take an analog screenshot
>psh, nothing personell
Pic related is what they think AI art is.
I know some talented artists that are in utter existential despair over this but it seems like what really triggers them is being rebuffed when they “call out” someone else’s AI art. Not even “passing it off” as “real” art but just using it for their own account.
The Cell Saga gets to be inspired by Terminator 2 and everyone loved that. Why can't people appreciate AI when it's inspired by everything that has ever existed?
>Scrape tons of images with "glaze" on them
>When labeling the images add "glaze" tag to ones that have it
>Include non glazed images in dataset as well
>When prompting add "glaze" to negative prompt
>???
Any reason that wouldn't totally circumvent this?
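The only hard part of the greentext pipeline is the detector; the tagging step itself is trivial. Everything below is hypothetical: `is_glazed` is a stand-in classifier (nobody in the thread claims a reliable one exists), and samples are plain (image, caption) pairs as in a typical scraped caption dump.

```python
# Hypothetical sketch of the tagging step: flag suspected-glazed images with
# a "glaze" caption token, so the token can later be pushed into the
# negative prompt at inference time.

def tag_dataset(samples, is_glazed):
    """Append a 'glaze' token to captions the detector flags."""
    tagged = []
    for image, caption in samples:
        if is_glazed(image):
            caption = caption + ", glaze"
        tagged.append((image, caption))
    return tagged

# Toy detector: pretend images are dicts carrying a precomputed flag.
demo = [({"glazed": True}, "a cat"), ({"glazed": False}, "a dog")]
print(tag_dataset(demo, lambda img: img["glazed"]))
# → [({'glazed': True}, 'a cat, glaze'), ({'glazed': False}, 'a dog')]
```

Whether this circumvents anything hinges entirely on whether the perturbation is detectable and whether "glaze" behaves as a separable concept during training, neither of which is established in the thread.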