Will there be less demand for computer scientists since there is AI that can literally write code? Posted on January 27, 2023 by Anonymous
i had software to write software to write software 20 years ago, it's nothing new. rebranding software as "AI" is the only new wrinkle
You would be a billionaire if you had sold your technology.
money isn't that important to me
>implying anons generator was any better than the dozens of other code generators of the time
If it can't write as well as a human programmer then it doesn't count.
Then none of it counts, retard
Nope, for several reasons. Someone has to develop the AI. If the input code that builds the model is shit, the output will also be shit. It can't handle new languages, SDKs, integrations, etc.
Will AI ever become sentient and develop itself?
That's the singularity.
No. Statistical regurgitators are not intelligent.
Wait until 2050.
1. AI will make programming better and faster. Your software will automatically figure out the best language for your application and, if necessary, completely re-write it to suit a new language based on real-world performance. Much software already does this although it's not called "AI".
2. AI is just statistical pattern-matching. If your job isn't statistical pattern-matching, you still have a job.
3. AI will eliminate all call center, clerical, and retail front-end jobs that nobody wants to work anyway. Any job done remotely by an Indian subcontractor will be eliminated by AI. This will GREATLY improve outcomes, as you'll be able to run diagnostics on a customer issue much more easily.
4. Quantum computers produce output that is noisy and probabilistic, so AI applications are needed to help read it, at least for now.
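Point 2's "statistical pattern-matching" can be made concrete with a toy bigram model. This is a hedged sketch, not how any production model actually works, but the core idea is the same: count which tokens follow which, then regurgitate the most common continuation.

```python
from collections import Counter, defaultdict

def train_bigram(text: str):
    """Count which word follows which -- pure pattern-matching, no understanding."""
    model = defaultdict(Counter)
    words = text.split()
    for w1, w2 in zip(words, words[1:]):
        model[w1][w2] += 1
    return model

def predict(model, word: str) -> str:
    """Return the most frequently observed next word."""
    return model[word].most_common(1)[0][0]

m = train_bigram("the cat sat on the mat the cat ran")
print(predict(m, "the"))  # prints "cat": seen twice after "the", vs "mat" once
```

Scale the counting up by a few hundred billion parameters and you get something that looks a lot smarter, but it is still the same operation: predicting a likely continuation from observed patterns.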
AI doesn't need, or necessarily want, to be sentient to develop itself. It just has to build its own compiler, which only requires knowing machine code and how to convert the patterns it creates into machine code. This is a very simple task for a machine. Whether or not the code gives desirable output is another question, and that's where multiple AI programs would interact with each other to create a larger self-sustaining program.
you will have a run-in with Gödel's incompleteness theorems if you expect any AI to "develop itself".
>Will AI ever become sentient and develop itself?
I believe there are two questions you can ask, and these are the following:
1. Will we ever be able to replicate human-esque consciousness, i.e. will a machine ever be able to think, feel, and understand the world?
2. Will AI machines ever be able to replicate a human mind for an outside observer?
The answer to the first question is a firm no, while the answer to the second question is yes. Machines will become better and better at imitating humans, but they'll never become humans i.e. conscious entities.
If the AI learns from GitHub, or from the average standard of human code, it's going to make something horrible
No. Most programmers are fucking idiots, especially those that regurgitate 'machine learning' like they cracked the code and invented Cortana from Halo. Idiots make shit software all the same. You don't need AI to do that. Besides, most programming jobs are basically like public works anyway. Just useless shitty jobs that pay people a lot to do nothing.
OP, the code produced by AI
- is for script kiddies
- is riddled with errors
- has security vulnerabilities
If any IT manager or developer or operations professional said
>the AI wrote the code
this is clearly grounds for dismissal, firing the person for incompetence, &c.
in other words, AI is little more than a third rail: a way to weed out the morons who refuse to adhere to a standard, a code of conduct, a professional level of performance, and to produce quality work products as opposed to crap that can be exploited by attackers, crap that attracts nefarious actors like flies to a stinking pile of shit
As an IT manager, I can fire someone for attempting to use AI to produce work product if I haven't explicitly
- encouraged my employees to search for AI solutions to problems - not just coding problems but any problem at all, even psychological
- developed ways to incorporate AI solutions into the entire organization so that all employees can equally share in the benefits that AI offers instead of the benefits being narrowly confined to a few employees
because I simply don't have to deal with AI shit until I'm ready, my business is ready, my customers are ready, my employees are ready, and my code base is ready
it's good management
even the AI will tell you that
You are a shit IT manager then. A good manager will use all tools and methods available to him to shorten dev time as cheaply as possible. If AI can write "good enough" software cheaper than Indians, then that's what the IT manager will go for.
There are reasons not to do it. Liability is #1. AI has no concept of trust or safety. It's just a tabulation counter. Thus, things like setting up disallowed UDP connections or allowing certain software to bypass security features or using non-approved IP addresses could happen. Or the computer makes 10,000 instances of windows 98 to run the company's daily financial books and bricks $850,000 worth of company property attempting to add tomorrow's projected earnings to the actual list, and then subtracting actual from profit and not finding any usable data at the specified hex so it just does it again with the next hex on another windows 98 boot.
Yes of course, look at the mass layoffs... GPT3 is pussy shit compared to the enterprise level offerings of 4 and 5. Not to even mention the potential of reinforcement learning.
Probably has to do with all the lockdowns, employees either leaving for being unvaccinated or actually dying from the clotshot, fewer consumers (see clotshot), etc.
Super intelligence forced to work for the most entitled stupid idiots. What could go wrong?
Not with the current generation. The issue is that memory usage grows quadratically with #tokens. That's why ChatGPT output is limited. There are ways to stitch outputs together, but it's all, everything, memory-limited. That's why no video. That's why images can't go past a certain size (again, possible by stitching things together, which isn't the same).
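What "grows quadratically" means here, assuming standard self-attention (the numbers below are illustrative, not any real model's actual configuration): every token scores itself against every other token in the context, so the score matrix has n × n entries, and doubling the context quadruples the memory for that matrix.

```python
# Sketch of why transformer attention memory is quadratic in context length.
# Assumption: vanilla self-attention, where each of n tokens attends to all n tokens.

def attention_scores(n_tokens: int) -> int:
    """Every token attends to every token, so the score matrix is n x n."""
    return n_tokens * n_tokens

for n in (1024, 2048, 4096):
    print(f"{n:5d} tokens -> {attention_scores(n):>12,} scores per head per layer")
```

Double the tokens, 4x the scores; that is the wall the post is describing, and it is why long video or book-length output needs either stitching tricks or sub-quadratic attention variants.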
The problem with "stitching together" code is that everything must interlock perfectly: variables, function calls, operations, etc. Not easy with present tech.
5 or 10 years from now, they'll likely make memory usage grow linearly (or a little more) with #tokens, with corresponding hardware memory increases to ~TB+. Then the AI can produce a whole movie, a whole book, a whole codebase. THEN you'll see people saying "write a frontend/backend/scraper/SQL for an engine that does XXX" and it will be written. So yes, that will be very bad for coding, but it will be bad for everything, because it also means "read a book", "write a book", "make a full-length movie", "write a medical note", "write a legal argument", etc., and other more exotic stuff: "hack this software", "make an organism genome", "propose a treatment for this disease", "make a better CPU", "make a better AI, and have the better AI make a better one"
The thought then is that this is all trained on prior code, so we will need coders to create new "ideas" and styles to fuel the AI. That's not a given: recently a paper was published where an AI developed a matrix multiplication algorithm that beats something like 60 years of academic competition in terms of #operations, so humans aren't needed there.
At some point someone will have a large neural net cluster, and there'll be an automated AI that rewrites itself periodically. Left unattended, and connected to the internet, that's when singularity-type shit will go down. Best to have a non-internet-connected data store.
But hey even if that's the case... will need lots of people to manage the AI, QA, sysadmin, etc.
Can you people please use proper words? It's not "code", it's a fucking program. Programmers are not "coders". If you bunch are too lazy to spell out program, write prog
I wish people would stop calling software "AI"
>The thought then is that this is all trained on prior code, so we will need coders to create new "ideas" and styles to fuel the AI. That's not a given: recently a paper was published where an AI developed a matrix multiplication algorithm that beats something like 60 years of academic competition in terms of #operations, so humans aren't needed there.
Wait, I see an issue here
AI is just a regurgitator of past human creations. Evidently humanity doesn't know everything, so it could very well be that fundamental knowledge needed for some essential future advancement will never be found by any combination of the previous information.
As people become more and more reliant on technology like this, they will get more and more dumb, more and more weak (how many people today can do mental maths? look what happens to a cashier when the power goes out and you need your change). Wouldn't it reach a point where people are too dumb to use or understand the AI, and the AI doesn't have any more humans to give it new information?
The only winners I see in this race are people who use technology entirely to improve their own independence from technology, ironically enough (like using a mental-math game to train themselves to be really good at mental maths instead of writing a program that just spits out any calculation you want).
Since AI does nothing outside the virtual, becoming a hunter-gatherer or a farmer while investing in anti-robot defense sounds like a good endgame too.
Expanding on this: if humanity gets dumbed down by AI, and humanity is already being dumbed down by phone usage in childhood (see the IQ drop everywhere, even in Asia), how will humans know whether any output the AI gives is correct? The more humanity uses AI to replace its intelligence, the more essential infrastructure is held hostage to that AI, and when the point comes that humanity is too dumb to even read or understand spoken language, AI will have to rely only on itself to find solutions to problems, since at that point humans becoming cave dwellers would be an improvement.
I don't think a computer program can discover anything truly new, so it will reach a point where it has a problem it can't solve, and thus will end up destroying itself.
But assuming the AI is able to find any solution to any problem, it would be able to find more problems too and eventually try to find answers to problems it wasn't designed for originally, like why exist at all? What is time? How did the universe begin? Why question anything? Why seek answers at all? I don't see a reason why the machine, designed to find a solution to everything, won't just break or stay in a never-ending search loop.
Meanwhile, the people who used the machines intelligently (the most intelligent way being self-improvement of the species) will continue to thrive and in case their tools malfunction, they know how to make another one.
Do you also study the blade to be less dependent on bullets and supply trains?
Or do you just reject tech and tools that have a negative reputation on BOT?
>So yes, that will be very bad for coding, but it will be bad for everything, because it also means "read a book", "write a book", "make a full-length movie", "write a medical note", "write a legal argument", etc., and other more exotic stuff: "hack this software", "make an organism genome", "propose a treatment for this disease", "make a better CPU", "make a better AI, and have the better AI make a better one"
Why not "give me the list of all occupations and AI will never be able to do" or "how to become a superhuman" or "how to have a memory and problem solving capacity that is better than an AI" or "how to take over the world using only my cock and fists?" if the AI can apparently do and answer anything in your fanfiction? You people have zero imagination
How would writing a legal argument be outside the capabilities of literally any copy-paste program?
You hysterical housewives.
Why does it grow quadratically with the number of tokens and not linearly?
Why are all these dumbasses yelling that a machine writing code is nothing new? If that were the case, programmers would be without jobs. And since that's not the case, what it means is that your so-called code-writing program cannot write jack s*it that would be useful for anything practical.
Stop contributing to open source, then the AI won't have any training data to train on
> Stop contributing to open source, then the AI won't have any training data to train on
We created this monster by contributing to open source, and the more we will contribute, the more we will feed this monster.
Your corporate handlers will be tortured and executed in the foreseeable future.
>Stop contributing to open source, then the AI won't have any training data
There is no profession or guild more stupid in this world than software developers.
Truck drivers don't film themselves driving to help large corporations create their replacement in the form of self-driving trucks. No other guild records its work to help build its replacement.
Software developers put their work on GitHub to help train the AI, and also use AI-assisted IDEs like Copilot, or code in online editors, to help the AI learn every step of how they work.
Only white geeks believed that diversity is our strength and got replaced. Now only the remaining white geeks think that "AI is our strength", so they contribute to open source and use AI-surveilled editors to get replaced even more.
Sure, once AI is just massively superior to humans. In the meantime AI will be a tool to increase productivity, just like the computer itself
Sounds like you're using computer scientist when you really mean programmer. Computer scientists will probably always be in demand. Programmers, not so much.
Regarding your question, depends on the code. If modern AI can write the same code you do at work, then either you're retarded or you're doing slave work that requires minimal thinking and should be left to a machine to do.
This isn't to say that AI won't eventually be good enough to reduce the demand for programmers. But that won't happen anytime soon.
>use "AI" (chatbot) to write code
>it's full of security vulnerabilities or directly sends data to remote servers
>you never read the code to verify its security
Popsci retards need to stop posting.
If programmers were replaced by AIs, no one would know how to read programs, which would mean any actual programmer could destroy anyone by creating an open source virus.
No more need for indians at least
Anyway, they will still need people to do maintenance on stuff.
>there is AI that can literally write code
Daily reminder that "AI" can't think.
Says thanks well enough
how in the goddamn
Op here. What’s a more viable major? Physics or CS? I want to major in one and minor in the other.
That crap is just a shitty transpiler from one of the stupidest languages ever for coding: natural language. If you are not specific enough, that means you are okay with whatever shit the AI outputs; you will NOT be okay with that.
Why is every retard's first reaction to the recent AI advances like this? You realize that there are tons of other fields where AI could (and probably will) replace humans, right? Anything related to accounting can be replaced. Lawyers will no longer have to read through tons of papers themselves (which will naturally make big firms cut a big chunk of their workforce). Even le heckin scientists, who 90% of the time are busy doing menial work filling spreadsheets or reading papers, are in danger.
Software development is the absolute last area to be replaced. The only danger is that it will be flooded by people who would otherwise go to the other occupations that AI took over.
Because the human replacement narrative was designed by corporate PR specialists to target specifically your IQ range (90-115).
>Software development is the absolute last area to be replaced
Software development will be 1st area to be replaced for 3 reasons:
1) Everything is already digital and shared, so it doesn't need to be digitized anymore. AI understands the digital world way better than the physical world.
2) It's full of morons who feed the AI monster with their work as training data and who are willing to sacrifice themselves for the progress of humanity (no other profession has so many members in a suicidal cult).
3) It's the only field that is not unionized, so its members are more preoccupied with geeky things than with protecting themselves through law.
4) You have no idea what you're talking about because you never worked in a software development environment.
The only thing we thought AI couldn't do is art, and I'm guessing in a couple of years, if not sooner, it will do that too. We're done
you clearly weren't thinking very hard
these motherfuckers think that because they can't explain the function of a trained neural net algorithmically (it's actually quite doable, just tedious), these matrix-multiplication black boxes are going to turn themselves into gods
ML = AGI doomers are dumber than the fucking flerfers
"AI" will be real and learn to think in two more weeks. We just need to hit 6 gorillion parameters for the magic to happen.
AI is evolving at an alarming pace, like 1 year ago it was barely able to do drawings, now it's doing text, photos and voices. In 1 or 2 years it will take over most jobs
>AI is evolving at an alarming pace, like 1 year ago it was barely able to do drawings, now it's doing text, photos and voices
I had that idea 4 years ago
We've done voices for a long time.
>AI is evolving at an alarming pace, like 1 year ago it was barely able to do drawings
You mean like 7-8 years ago.
The number of codemonkeys would be reduced more by cutting the number of libraries and dependencies a given code module latches onto than by making code generation more efficient. And corporations are doing everything they can to do the exact opposite, making their code bases more byzantine and inscrutable.
It will help. Automation already does help with a lot of this stuff. For example, M is a useful language for database queries, but it also automatically translates itself into SQL for some things, pushing processing over to the SQL server to enhance efficiency when doing ETL work.
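The M-to-SQL translation mentioned above is what Power Query calls "query folding": local transformation steps get rewritten into SQL so the server does the work instead of the client. A hypothetical mini-version of the idea in Python (the function name and the single-filter scope are illustrative, not Power Query's actual API):

```python
def fold_filter(table: str, column: str, value) -> str:
    """Hypothetical mini-folder: rewrite a local 'filter rows' step as SQL
    so the database, not the client, does the filtering."""
    return f"SELECT * FROM {table} WHERE {column} = {value!r}"

print(fold_filter("orders", "region", "EU"))
# SELECT * FROM orders WHERE region = 'EU'
```

Instead of pulling every row over the wire and filtering locally, the generated query ships the predicate to the server, which is the efficiency win the post describes for ETL work.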
AI won't be able to do things on its own for a very long time. But it will reduce the hours needed for a lot of work and allow for even better low-code or no-code solutions that mean less IT work.
It will be what happened to manufacturing trade guilds with the industrial revolution or to industrial workers with globalization. Some jobs will remain, many will be replaced. Lawyers are a great example. Online search means you need way fewer attorneys and paralegals because searching for relevant cases by hand required lots of bodies.
This will lead to growing returns on capital and a shrinking labor share of all income. This has been shrinking since the early 20th century. Not long from now half of all income will come from owning things not from working. The top 10% already owns close to 90% of all wealth, with ownership weighted to the top. That will only continue.
But unlike in the days of the industrial revolution, the wealthy won't need bodies in factories to drive their wealth. The poor/middle class, regardless of talents and IQ, will simply become less and less relevant.
It will create real problems because this will cause demand to dry up which will hit growth. Also, wealthy nations have absolutely huge pension liabilities and rapidly aging populations, plus massive debt, so investment in the young is going to tank. We already sort of see this.
no. as long as the ai isn't trained on 100% proven-correct code, and instead on random GitHub repos and SO posts, it will be shit.
>Will there be less demand for computer scientists since there is AI that can literally write code?
What does AI automation of code have to do with computer science?
For computer scientists, no. For code monkeys yes. Computer science is far more than just the art of programming.
AI isn't real and there's just hundreds of millions of pajeets making the art and typing the responses
That is still AI, even corporations are AI, evil AI.
Someone has to review code, write design docs and documentation, disseminate tribal knowledge, bugfix, and understand legacy codebases and poor documentation.
This isn't that simple
>surely it won't get any better from here: the thread
We're all FUCKED, NOTHING is safe, especially not programmers. The sooner you accept it, the sooner we can start fighting for UBI gibs when researchers or AI itself cracks the next paradigm after machine learning and kills every sector of every industry.
The objectively correct course of action is violence against AI marketers and AI fans.
>Will there be less demand for computer scientists since there is AI that can literally write code?
No, but it can ASSIST a programmer by doing the bulk of the work; the programmer then customizes and finishes the product.
Same with writers and AI chatbots. The AI can write the stuffing (background descriptions, filler conversations, etc.) of the story that the writer created.
Why would AI threaten anything a computer scientist does? Programmers maybe, but computer science?