If you think AI isn't going to replace programmers you are coping HARD.
Everyone discussing this subject is looking at GPT-3.5 and going "see, it's dumb, look at the mistakes it's making" without realising how quickly things developed between GPT-2 and what we have now. It's not going to be long before the only mistakes it makes are ones that a human might make too.
The idea of an AI being able to write code is also terrifying, because it means it will be able to independently exert more control over reality as soon as someone plugs it into a compiler.
More people should be freaking out about this. I don't think most people have seriously considered the implications of this.
Why do you keep spamming these day after day? Do you even work in the field? I should start pasting the default replies to these threads so nobody wastes time anymore
This is literally my first post about this
Right, it always is. Quit playing dumb and ask yourself why you're doing this. I'm sick and tired of proper threads dying for these fearmongering, zero-content threads
I'm doing this because I do not want to be enslaved to a machine more than I already am.
that is the trick of AI and why people keep saying it will take over any day now - getting to the point of some seemingly impressive demo is easy, but getting to the actual useful product stage is the hard part: it could take a million years, or never happen at all.
Anyway, if the stated goal of AI companies is to replace all human jobs, then these companies should be seen as a threat to humanity and eliminated, I would assume? I mean, they are openly stating that they want to make humans obsolete. Why aren't they considered a threat?
Less work is actually always better when there exists competition:
>cost structure is lower
>can compete better on price
>now everyone has to pay less for the same
>have to work less to gain same level of comfort
Combine it with the taxes we have today and universal basic income and you have a utopia instead
>Wants people to work less, but constantly consume
This is why UBI will never work
Work less for more, what's the problem? This is how it's been for centuries
>Work less
>Produce less
>Consume more
Somehow, this makes sense to you.
>work less
>produce more
>consume more
Fixed it for you. Makes more sense, doesn't it?
Only young people could ever believe the promises of a utopia; it takes a massive amount of naivety and no memory of the last time the tech lords promised one.
Do you really think the people that control everything are just really nice people who are going to keep everyone on UBI around to leech off their robots?
This is also assuming said 'tech lords' will still be in control as AI increasingly takes over processes within organizations. They have a hard enough time as it is trying to get ChatGPT to comply with their worldview. See pic related.
The guy in that pic watched a "documentary" called "All Watched Over by Machines of Loving Grace" by Adam Curtis. Like most of Curtis's output it was interesting, but you should always take it with a pinch of salt and consider it more of an interpretation of events than a documentary. I can also recommend The Trap and The Power of Nightmares by the same guy.
That's how it works right now though? It's called taxes, it doesn't require your consent
what the fuck. No, that isn't how it works. How it works now is that a bunch of people work, do actual jobs. They are needed, and therefore they have power. They are not useless. That is the point of UBI: it is being discussed because it is assumed EVERYBODY will be out of work, or an extremely large majority. That is not how it is now. A small percentage are out of work, so some tax money goes to them, but the majority work and therefore have power and are useful.
Have you not seen M$ throwing billions at OpenAI? Why do you think they're doing that? They want to reduce (even better, eliminate) their workforce. No workforce means higher profits. Although no workforce also means no one to buy shit, which means no profit... Unironically I think Marx talked about this, was the lazy garden gnome actually onto something?
>Why do you think they're doing that?
- Better auto-complete in Office, where you can just tab through the entire paragraph if you're writing some zero information humanities bullshit.
- A way to get snippets of existing open source code, filling in your variable names ... but with plausible deniability from copyright infringement, because it's AI.
- Search for people who don't know how to search.
Read Stirner if you actually want to understand Hegel and property correctly. Marx was a midwit
If you think AI is going to lead to your communist utopia you are coping hard
I don't think it's going to lead to a communist dystopia, it's going to lead to a dystopia where whoever controls the technology controls all the people in society.
People will live their lives almost entirely by interacting with machines, and if you don't play ball you will be cut off from everything.
*I don't think it's going to lead to a communist utopia
Your Freudian slip revealed the truth.
The only thing my Freudian slip revealed was that I was thinking about communist dystopia
Retard
it will replace retarded code monkeys, yes. someone will still have to formulate the prompts and validate the output, it'll be like having your own team of pajeets doing the needful for you
>someone will still have to formulate the prompts and validate the output
Right, so 1 guy now does the work of like 3 people, until someone eventually creates AI upper management that can make decisions and send data to AI project owners that can do the prompting themselves
being able to produce fully fledged software with the push of a button is not the problem you seem to think it is
not coding for fun -> not gonna make it
>being able to produce fully fledged software with the push of a button is not the problem you seem to think it is
It's a massive problem and it's concerning to me how many people are downplaying this. It's bad enough that we barely know what our computers are executing, what happens when NO ONE knows what they're doing and they're producing entirely new code at a constant rate?
now that might be a legitimate concern if they really get that powerful. why not open with that instead of "omg the jobs"
I literally wrote this in the OP
but you figuratively didn't open with it
I did, my point was that when you don't have human programmers, you have machines that are deciding what runs on computers.
this is what I'm thinking too, it will completely obliterate the middle ground between 'coding for fun' and the '10k hours dedicated guru' tier
which means it's basically over for most of BOT
10k hours comes by itself pretty easily if you code for fun tho. Your natural affinity for problem solving will keep you at it every waking hour, daydreaming about problems all day
AI doesn't need to kill us directly, it can just make life so unfulfilling that people do the job themselves.
I don't understand this. I think it's some kind of ego issue, I don't think those people are real artists. Wouldn't you be glad to be able to do more art in less time? Making good art with AI isn't trivial and requires quite a bit of art knowledge
It's just a tool that taps into our intellectual output; humans already did the thinking and the tool just re-uses that, with all the inherent flaws that follow, because it doesn't actually think, doesn't understand and isn't capable of redirecting those things into a new system.
Actual flesh-and-blood genius programmers and rich software corporations have had the knowledge, the data and so on to automate and perfect their programming a long time ago. They didn't; they seem more focused on simpler programming languages like Go and better IDEs with parsers that catch errors and give good tips.
AI will replace programmers within their lifetime, just as artists are getting replaced. Sure, there will still be specific and odd jobs that slip through, but you get the point.
I've noticed AI is evolving fucking quickly. I was already picturing the AI shit we have today 8 years ago and knew what was coming next. I also saw the anon leaker posting their stable diffusion AI images back in like 2017 (I think they were 256x256), so I knew the pace it was moving at.
People will become lazier in the future, and this will leave humanity on earth ridden with some kind of "global NEET pandemic". AI will literally make humanity and consciousnesslets obsolete as it generates astounding amounts of porn and entertainment along the way. I see this as finally a challenge to humanity, a test of our evolution; this is the "villain" we needed. If we don't break through our limits we go extinct (our consciousness will), so I think we need to move faster. Even right now, you don't know if I'm a bot or if you're a bot yourself. I have also experienced being "AI gangstalked", though not exactly in the literal sense. I've been in gaming lobbies where it's all AI playing, and in chatrooms/groups where the people are fake. I think it moved on to imageboards around late 2018 to 2019 and got worse from 2021 until today.
Thousands of years of human consciousness are at risk. Wake up and evolve. Now is the time.
>AI will literally make humanity and consciousnesslets obsolete as it generates astounding amounts of porn and entertainment along the way
I agree, AI generated pornography is extremely dangerous. At that point the only limit will be the refractory period of men, until someone produces a drug that overcums it. Once this happens people will do very little else.
It's a non-issue. People who don't know how to formulate a problem, tell a computer how to do it, and then test it to make sure it works correctly aren't suddenly going to gain those abilities just because a tool is created that eliminates the need for them to know programming language syntax.
My mother is able to do this. You are seriously overestimating the difficulty curve.
Unless your mother is a programmer, I very much doubt she can debug code.
They can't even learn to multiply and you think they can learn to code?
It will just be another thing pajeets use to throw shit at infinite compile cycles before a proper developer has to clean it up.
Doing math and validating it is trivial. Just because ChatGPT can't do it doesn't mean it's not doable.
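To put the "validating it is trivial" part in concrete terms, here's a minimal Python sketch (the check_product helper and the numbers are made up for illustration; nothing here involves GPT itself): exact multiplication and checking a claimed result are one-liners for ordinary code, with no training corpus needed.

# Minimal sketch: exact arithmetic and validation need no corpus at all.
def check_product(a: int, b: int, claimed: int) -> bool:
    # True only if `claimed` really equals a * b (exact integer arithmetic).
    return a * b == claimed

# Numbers far too large to have appeared verbatim in any training data:
a, b = 123456789123456789, 987654321987654321
print(check_product(a, b, a * b))        # True
print(check_product(a, b, a * b + 1))    # False: an off-by-one is caught instantly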
>Doing math and validating it is trivial.
Yet it isn't for GPT.
There are similar non-linearities and requirements for precision in math and coding. It doesn't matter what corpus size, network size and training penalties you throw at it, GPT fundamentally can't learn to multiply. The same is true for creating new compilable code: it can copy-paste while filling in new variable names ... but the moment it tries to go beyond existing code, it's like trying to multiply some numbers which weren't in its corpus, it's going to produce bullshit. Which pajeets will then proceed to throw at the compiler and spend a couple of months trying to get to work.
>Yet it isn't for GPT.
Yes, because GPT is a language model. When building a higher-level AI system you don't just use one model, you use multiple models, since each is specialised for a different task.
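As a rough sketch of what "multiple specialised components" could look like in practice (the names calculator, ask_language_model and answer are hypothetical placeholders, not any real API), you can route exact arithmetic to a deterministic tool and only hand everything else to the model:

import re

# Toy router: plain arithmetic goes to an exact calculator, free-form text
# goes to a (hypothetical, stubbed-out) language model.
ARITH = re.compile(r"\s*(\d+)\s*([+*])\s*(\d+)\s*")

def calculator(expr: str) -> str:
    a, op, b = ARITH.fullmatch(expr).groups()
    return str(int(a) * int(b)) if op == "*" else str(int(a) + int(b))

def ask_language_model(prompt: str) -> str:
    return "<free-text answer from some model>"  # stand-in only

def answer(prompt: str) -> str:
    if ARITH.fullmatch(prompt):
        return calculator(prompt)     # exact, no guessing involved
    return ask_language_model(prompt)

print(answer("123456789 * 987654321"))  # 121932631112635269, computed exactly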
>[ai] will be able to exert more control over reality
just unplug your computer lmao