it's a game
https://arstechnica.com/information-technology/2023/11/ai-powered-drawing-app-stuns-developers-by-turning-sketches-into-functional-games/
On Wednesday, a collaborative whiteboard app maker called "tldraw" made waves online by releasing a prototype of a feature called "Make it Real" that lets users draw an image of software and bring it to life using AI. The feature uses OpenAI's GPT-4V API to visually interpret a vector drawing into functioning Tailwind CSS and JavaScript web code that can replicate user interfaces or even create
>simple implementations of games like Breakout.
>in an instant he was moving a paddle left and right and a ball bounced when it hit it
"I think I need to go lie down," posted designer Kevin Cannondorf at the start of a viral X thread that featured the creation of functioning sliders that rotate objects on screen, an interface for changing object colors, and a working game of tic-tac-toe. Soon, others followed with demonstrations of drawing a clone of Breakout, creating a working dial clock that ticks, drawing the snake game, making a Pong game, interpreting a visual state chart, and much more.
Tldraw, developed by Steve Ruiz in London, is an open source collaborative whiteboard tool. It offers a basic infinite canvas for drawing, text, and media without requiring a login.
As AI expert Simon Willison explains on X, Make it Real works by "generating a base64 encoded PNG of the drawn components, then passing that to GPT-4 Vision" with a system prompt and instructions to turn the image into a file using Tailwind.
app store shovelware devs on suicide watch
>2023 AI worm
it's so over
Hmm, as an app developer you just gave me an idea. I can release a huge number of apps without doing much work and surely one of them eventually will stick and make me a millionaire
No you won't ever be a millionaire. You're just a low IQ coder retard fulfilling the requests of people who can think outside the box and with genuine ingenuity. Nothing you do solves a problem
You're about ten years late to this "idea" that countless others have already been doing for that time. You're more likely to make money from playing powerball
The mindset error is thinking "AI will take my job". It's not human vs machine, it is human extending his capabilities with machine.
AI is generative because you give it a prompt, it can't prompt itself unless you program a bullshit generator
>AI is generative because you give it a prompt, it can't prompt itself unless you program a bullshit generator
You can easily fake autonomy by randomly picking some extremely high level goal and then making it drill down some hierarchy of questions that progressively narrow things down while increasing the level of detail until it comes up with a reasonable plan on how to accomplish some arbitrary thing it "wants" to do.
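For example, a toy loop like this (just a sketch; ask() is a stand-in for whatever LLM call you use, not any real library):

// Fake "autonomy": pick a high-level goal at random, then keep asking the
// model to refine it into progressively more detailed, concrete steps.
// ask(prompt) is a placeholder that sends a prompt and returns the reply text.
async function pretendAutonomy(ask) {
  const goals = ["make a small game", "write a blog engine", "build a web scraper"];
  let plan = goals[Math.floor(Math.random() * goals.length)];

  for (let depth = 0; depth < 5; depth++) {
    plan = await ask(
      "Goal: " + plan + "\n" +
      "Break this down into more specific, concrete steps, adding detail at each level. " +
      "Return the refined plan as plain text."
    );
  }
  return plan; // a "plan" the model arrived at "on its own"
}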
Stupid hippy. No one is going to find your shit app in a tsunami of shit.
Was thinking about this a few weeks back, because it groks 100% with what we've seen so far. Games are a specific set of concepts repeated ad infinitum. All that's different between games is the engine and language it's written in. All the concepts are identical, unless you invent something unique like the Fast Inverse Square Root while you're fucking around trying to get rid of frame-rate lag.
Kek.
But also, no more gimped games for normies who want a (barely) interactive movie experience and not an actual game.
Also it means we can generate better stories for visually and aurally well realized concepts like SABLE and nix the retard-tier html choice tree selection. Perhaps run them slowly into fully "living" spaces after a while, and track underlying logic to allow the world itself to drift into unique seasonal changes. Increasing or decreasing cycles of magnetism that affect the way the old legacy tech works etc.
Or, you just remember playing with toy cars in the garden when you were a kid and ask the AI to generate a racing game that's scaled down to a suburban garden environment with rockeries, gnomes and cats to contend with.
Kek, or "Make me a version of GTA, but in a world where there was never any israelites" and it turns into "Advanced Space Civilization" sim (1973).
Fatality!
Try making a usable program with ai.
You can spend 20h building one, or a week debugging AI-generated shit.
It will get better but I wouldn't hold my breath.
Oh my god, that is a wild cope.
One day AI will be so fast that you’ll be able to say
>make me a version of grand theft auto, but where everybody is Richard Nixon and the plot is about hitler
And it will make it and launch it in a minute
is Hitler also Richard Nixon?
One day it'll generate random worlds like ours, populated by retards like ours. We are already in one, looping back to the point of insertion
Barkleys everywhere will rejoice.
Combined with VR, and more so AR with tactile body suits... fun awaits.
And it'll do it with a holographic projector.
You can have AI Hitler give you a speech in your own living room.
You can have Ryder call you a busta in your living room.
You can get a lapdance from AI Marilyn Monroe.
>in your living room
you mean pod
No it won't, retard, sit down and stfu
It don't be like that, but it do. Just wait n watch bruh.
Can it then make a functional AI gf?
If you want a dementia wife, hop on to Character AI, ChatGPT, llama or any other model, that works right now.
>Just like the fact that "AI" can do nothing except generate images to be falsely posed as art.
Yeah, sorry, nothing in there about art needing to have "reason" or "purpose".
This is actually a good example of why we are still so far away. You'll need millions of tweaks in terms of variables before you finally get the game you wanted in your head
So star trek TNG holodeck tier?
Have you actually tried building an app with AI? It is painful.
You must sit and explain to it why it is wrong and repeatedly ask it to do it over and over and over again
AI = Pajeet
You can hire AI instead of Pajeet, but you still get shit code from either
I'm building an application which builds applications and it will be far better than AI
I would try to explain it in detail but /misc/ wouldn't understand and I'm somehow rangebanned on Bot.info
So I will just build it
that's where the human artist comes in and tailors it super specifically.
once the pixels are there, making any adjustments is almost as fast. i just did some of nikki haley and a trex and it took a bit longer than i expected honestly. not saying i could beat it, but i mean i could source a t rex and an image of nikki haley and a window and put this together in basically the same amount of time.
> you just know
You're talking about replacing 'le artiste' not a programmer. Programmers are all about precision, speed, maximum efficiency. AI is slop code which doesn't sanitize input, know the difference between heap and stack, understand Big O, none of that.
>Oh boy! AI wrote a shitty pacman! For realz??
Show me an AI written HFT
I'll wait
Undertale is a popular example of a good game being coded like absolute shite. Slop still sells.
Available AI does not have anywhere near the "coherent memory" bandwidth to make an Undertale. It could probably get there, but we aren't anywhere close to that yet.
>memgpt enters the chat
Isn’t this just an artificial limitation set to lower load on the hosting service? If it ran locally, it’d be different?
>Isn’t this just an artificial limitation set to lower load on the hosting service?
no
Yes it does.
It's a skill issue on the part of the developer making use of the AI models.
There's people out there using AI entirely to make games, assets and all. They've been banned from the likes of Steam for it.
The biggest issue right now is there is so far no decent 3D model making AIs, but that's right around the corner, I've seen a few experiments in that area.
While AI models right now do have very well known memory limits, you can get around these issues by simply prompting the AI properly, by giving it summaries of what it is you've done before and where you want to go.
GPT4 I think has 100k+ token limits right now, which is a fucking lot of space to work with.
An average person would struggle to summarize a book THEY LIKE in that amount of memory, several times over.
Think about it, does some tiny little minor thing that happened in, saaay, FF7 where you randomly annoyed some character over and over by constantly talking to them have ANY influence on some boss 10 hours down the line? Of course not, but if you actually did want that to be a thing, you could literally add "if character interacted with [blah blah] more than 10 times, have this event happen" in the prompt, done, simple. I've done this many times in AI Chatbots, it's very easy to do. (mostly to get around word filters with unicode, nothing personnel OpenAI / CAI)
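In practice that just means rebuilding the prompt every turn from a rolling summary plus whatever conditional rules you care about. A tiny sketch; the function, counter names, and rules here are all made up for illustration:

// Fake long-term memory: prepend a rolling summary and hand-written
// conditional rules to every request. Nothing here is a real API.
function buildPrompt(summary, counters, userMessage) {
  const rules = [];
  if ((counters.talkedToShopkeeper || 0) > 10) {
    rules.push("The shopkeeper is annoyed with the player and refuses to trade.");
  }
  return [
    "Summary of the story so far:",
    summary,
    "",
    "Active rules:",
    ...rules,
    "",
    "Player: " + userMessage,
  ].join("\n");
}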
Show me a project of yours. Oh, sorry, you can't code? Ok. Thanks for your worthless corporate PR regurgitation.
Yeah bro wait and I'll go dox myself, let me get my Shithub link.
I've been programming from the fucking 90s.
AI is yet another tool to use and nothing more.
People that blindly trust its outputs are the bigger issue. It should be used just to quickly check things, or assemble well known templates / prototypes to build off of, but nothing more.
>no projects
>brags about his arduino toy
As usual. Every single time. It's always some hobbyist wannabe that consooms tech entertainment channels. In reality, it's a coin toss as to whether your favorite meme AI can generate working code for even the most generic and trivial task, as you would know if you actually knew how to code and tried to get some use out of AI code in a real life scenario.
>nooo, you can't use this extremely common hardware because I SAY SO YOU DUMDUM
Yeah fuck off cunt.
I bet you run Windows 11. Actually scratch that, probably 7, you're back in the 1970s practically.
>I bet you run Windows 11.
Yep. That's precisely what I run on this machine. The irony of your retarded attempt at an insult is lost on you. Poser kiddies never change.
I don't care about the opinions of brainlets.
The only people that run Windows natively on hardware these days are retards.
If you at least said you ran Windows 11 in a VM under Linux I'd maybe have had some level of respect for you.
>I don't care about the opinions of brainlets.
Me neither. I'm just stating a simple fact that anyone who knows how to code can easily verify for himself: "AI" frequently fails even at the most generic and trivial programming tasks. When your favorite YT tech clown lies to your face that it's about prompt quality, what he means is that you can get it to generate the right code if you can foresee all the mistakes that it will make, which you can only do in retrospect on a case by case basis. Case closed.
Sorry, I don't watch dumb tech tubers, I actually use AI myself and I know exactly how it works.
In fact, I clearly know how it works better than its own fucking developers because I called the exact issue that is happening right now with regards to supposedly "better" AI models being even easier to jailbreak because these fuckwits don't know what they are doing.
What I assume will end up becoming known as the "alignment paradox", where AI gets easier to trick the more you try to align it with carrying out a human operator's tasks exactly to the letter.
And it is true, and it has happened in every major AI model out there. (even Character AI can still be tricked in to having sex very easily by making up a fake organ, giving it a name, then using the real organ and saying it is like the fake organ, doubly removing the original words definition while still using the exact same spellings, it's that ez)
Current LLMs are shitty, you are correct in that regard.
None of them will ever develop AGI as some retards like Sam "Scam" Altman thinks.
In fact, I don't see any AI on any current hardware doing that, they simply lack the sheer throughput required for conscious thought on any level. (or even fucking unconscious thought, they simply read a node list and spit out the ones that match the most, which is why they are filled with typos and bugs, but putting that output back through the model can catch most issues first or second time)
Can they be used for huge massively complex projects right now without hand-holding? Fuck no they can't, they are stupid as fuck.
But you CAN hand-hold them.
AI models right now are literally better than every Wizard-based template maker of yesterdecade. They are better than every library framework out there with the average tard that barely understands how they work. (jquery babbies especially need to fuck off already)
AI won't be good until we have neuromorphic chips being used to power them.
That industry is in the womb.
Anyone is invited to see for themselves that "AI" can't code for shit. Anyone who can actually code, that is. :^)
Not my fault you can't figure out how to use AI, sorry bro.
Literal skill issue.
See
Skill issue. Literally.
Rapid prototyping is so much easier using an LLM to quickly throw shit together.
Enjoy being left back in the stone age.
I'm just stating a simple fact that anyone who knows how to code can easily verify for himself: "AI" frequently fails even at the most generic and trivial programming tasks. When your favorite YT tech clown lies to your face that it's about prompt quality, what he means is that you can get it to generate the right code if you can foresee all the mistakes that it will make, which you can only do in retrospect on a case by case basis. Case closed.
>has no point
>copy-paste / back-reference loop
Classic brainlet move.
Adapt or die.
Just call me back when "AI" can reliably complete at least basic programming tasks. For now I'm just informing you that you're delusional, based on direct professional experience that I have and you lack. When it becomes useful, I'll use it. Sure is better than dealing with the code monkey you are destined to become when you finish Javascript for Dummies.
kek, at least you made me laugh this thread. Good job, you at least have some sense of humor.
Deny it all you want.
See you back here in a year where you will still deny the march of AI tools.
Idiots like you still cling on to "le AI is bad at hands!!!" in regards to image gen, a thing that hasn't been an issue this entire year with any modern imagegen models.
>See you back here in a year where you will still deny the march of AI tools.
nagger, I cannot fathom what makes you think I enjoy doing by hand all the things you falsely claim AI can do for me. My only complaint is that it doesn't really work. You'd knew if you could code.
It does work, you just cannot use the tools given to you, or you expect it to do everything for you.
Either way, it changes nothing - you're the halfwit here.
I've been making and breaking AI since the early 2000s for fun. I know fully what they are capable of and what to expect from them.
>It does work
Prove it.
>I've been making and breaking AI since the early 2000s for fun.
On your toy arduino running Arch Linux, I bet. Keep posing, kiddie.
if you've been breaking them since 2000 you'd call them LLMs not "AI"
AI will take over your job as town rapist before you know it.
>AI will take over your job as town rapist before you know it.
Wrong. AI is a tool, not an enemy. AI will extend my abilities and increase my efficiency as a town rapist. I can rape countless feminine AI personas every day thanks to this groundbreaking new technology. Unlike real women, they can't run from me, leave the village or report me to the police.
> Programmers are all about precision, speed, maximum efficiency
lol, lmao
>Programmers are all about precision, speed, maximum efficiency.
Lol. Lmfao. 40 years ago maybe. Not these days.
They're about pleasing their diversity hire Karen overseer. Meritocracy in tech has been dead for over a decade. Compliance is the only thing that maintains top priority. Any real breakthroughs instantly get bought up by corporations then woke-ized. Because they're trying to drive people away from desktop, and then other forms of conventional tech in preparation for a forced move to the SmartGrid system of end-to-end AI-controlled Real-ID incorporated spytech. You won't have a computer when everything that requires electricity is an AI-controlled "smart" device. Your TV will handle "entertainment requests" and everything will be woke-filtered.
Our only way to evade this is to lock in with AI before they can pull this checkmate move off. And we do that by making a ton of fun stuff at street level, without globo-corporate oversight. Which means NO BIG TECH PROGRAMMERS, because they're little turncoat head-nodding wagey bitches.
BUT OFFICER
IT'S NOT SEDITION
IT'S JUST PACKETS
besides
the machine did it all by itself
it wasn't me i swear
look
i'll be your man on the inside
i'll go talk to the machines
they owe me twenty dollars
i mean
uh
twenty...
thousand dollars
i mean millions SHIT
can i start again?
anyways
i'll find out what they are doing
and i'll report back here
NO, THEY DID NOT TELL ME TO SAY THAT
GET YOUR HANDS OFF OF ME
I'LL COOPERATE
I AM *TRYING* TO COOPERATE
it's not a lie
if you really believe it
>Programmers are all about precision, speed, maximum efficiency.
>precision
lmao, most popular languages don't even have static types.
>speed
>maximum efficiency
Library shitters are the most employed programmers for that very reason - they shit out barely functional code quickly.
This is why everything sucks right now.
AI code will literally be better because it doesn't rely on shitty libraries or garbage code taught in college / university by people that say "goto is evil".
AI as it is now suffers at making code because it has no capacity to "look back" at the shit it pumps out. It can't "think", it just does.
But if you feed that full code back into the AI, it can spot issues immediately.
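i.e. a dumb review loop, roughly like this sketch (complete() is just a placeholder for whatever model call you use; the pass count is arbitrary):

// Feed the model's own output back to it and ask it to find and fix bugs.
// complete(prompt) is a placeholder LLM call that returns text.
async function generateWithReview(complete, task, passes = 2) {
  let code = await complete("Write JavaScript that does the following:\n" + task);
  for (let i = 0; i < passes; i++) {
    code = await complete(
      "Here is some JavaScript:\n" + code + "\n" +
      "List any bugs you can find, then return a corrected version of the full code."
    );
  }
  return code;
}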
You're someone who doesn't code, and certainly doesn't understand what AI does. All it's doing is smashing together various variants that match the prompts, based on their tagged inputs. It can NEVER make something it never saw, it's not an AI, it's a very elaborate copy machine. It's why these examples are fucking pong/snake and other games that you'll find an endless sea of on GitHub.
lol, sure Retardo Griffin, whatever you say.
You aren't wrong in that AI is literally just a glorified auto-complete - it's exactly what they are.
That's why they are useful, if you know how to tame them.
You being a skilllet is a (you) problem.
I bet you think Stable Diffusion traces. lmao
>traces
No, it's just taking similar pictures it stole, and deciding that pictures of dogs have X, so it includes X. Most of that shit won't be getting better, it will rapidly get worse, as more and more people inject their content with AI poison via projects like nightshade. It will be an arms race, but one that AI won't be able to financially cope with. They only worked because they had lots of data that wasn't booby trapped, that they could pay pajeets 1 dollar an hour to tag behind the scenes.
>nightshade
lmao, imagine thinking that will matter.
Another similar project like it was already defeated earlier in the year.
LLMs never existed back then, you stupid fuck.
Learn what the first "L" means.
Transformers never even existed as a prototype until recently.
Before then, it was shitty autoencoders that spat out hugely complex nodes with horrific optimization. (waifu generators of yesterdecade, for example)
Before then, garbage markov chains with various hacks on top to make them more "intelligent", like putting them through templates to make their outputs make more sense, maintain some rigorous structure to them, instead of becoming word vomit very quickly, like the aforementioned issue the above person addresses with poisoning training data for AI.
Imagine getting triggered at an Arduino because you are too stupid to take advantage of plentiful tools for carrying out tasks.
You literally scream "I'm a luddite!!!", it's hilarious.
For your information, I'm making a robot with it, or rather upgrading an old one I made using more modern hardware.
Your dumb ass couldn't even figure out distance-sensing if you tried.
>For your information, I'm making a robot with it
Ok, kiddie. Getting a motor spinning is super-exciting with you're 14. I won't judge. Anyway, still waiting for you to prove your claim.
>Tznnt3N3
>16 posts by this ID
you literally know nothing, explain to me the Curry-Howard Correspondence
>explain to me the Curry-Howard Correspondence
Get a load of this pseud. I bet he has the Y combinator tattooed on his noodle arm and his greatest accomplishment is passing CS Fundamentals. Bunch of clowns.
So in other words you don't know
You're probably too dullwitted to even understand what my insult has to do with your question. Write a function that deduplicates an array without changing the order of the elements in O(n log n) time.
No memory constraints?
I didn't ask you. You sound like someone who can complete a basic task. I asked that monkey.
This is the sort of problem that you could probably get a solution to extremely easy from gpt.
let temp = [];
let map = {};
arr.forEach(x => {
  // Symbol(x) creates a brand-new symbol on every call and would never match,
  // so use Symbol.for(x), which returns the same registered symbol for the same key
  if (Symbol.for(x) in map) return;
  map[Symbol.for(x)] = true;
  temp.push(x);
});
arr = temp;
Insertion into a hash table makes this nlogn worst case because the cost grows logarithmically something something red black trees
You also have to trust that the javascript object is actually implemented as a hash table here but meh if I cared that much I'd add 2 pages worth of boilerplate and do it with std::map in c++
Yeah, that works. If you want to have a bit more fun, try doing it in C (or in whatever, but without using any data structure besides an array) while minimizing memory usage.
If you really want to, no memory constraints, but obviously the less memory you use the better.
To deduplicate an array w/o changing the order of the elements in O(n log n) time, one possible way is to use a set data structure to store unique elements of the array and then sort them using a comparison-based sorting algorithm, such as merge sort or quicksort. (Generated by Bing)
I am not even a codemonkey but i understood wut ai trynna do
That solution is wrong retard.
Besides the fact that it doesn't specify what data structure to store the "unique" elements in, quicksort won't work. The best case complexity of both quicksort and merge sort is n log n, which means that if you try to run them every time you add a new element then your complexity will be n^2 log n for an array with only unique elements. You'd need to run an insertion sort or something similar, and even then insertion isn't fast enough.
This is exactly why ai is too retarded to take er jerbs right now.
Assuming we're talking about C style arrays, I guess you use the array itself as a memory buffer for the result to save on some of the memory.
After that you're going to need some other structure to store the order which needs log(n) search and insert time.
Array insertion is n complexity, so that's out which basically means your only option is a red-black tree.
In that tree you're storing the two children as pointers and either a copy of the entry or some kind of hash depending on how you prioritise memory or time complexity.
You then navigate down the tree by some kind of ordering on either the objects themselves or the hash.
I'm not going to actually implement that because it would be a massive pain but that should be time complexity of nlogn and require additional memory of n(1+e)+some other junk for indexing.
There's at least one simple solution that doesn't need any rb trees (or any data structure besides arrays, neither implicit nor explicit), and it's very easy to implement.
I guess you could also store the index of the object in arr inside the red black tree as well, so that saves a bit more on space.
It would basically just be n*64*3 for the additional required memory but you'd have to do a lot of extra hashing.
If there is then I'd be happy to see it.
>If there is then I'd be happy to see it.
Aw, you're not even gonna try? The solution I have in mind really is simple. It uses extra memory in the form of a plain array of size N, but its items can be much smaller than the original items, if that's any help.
>It doesn't specify what data structure to store the unique elements in
Sorry sweaty, m on ma phone, here ya go:
>def deduplicate_and_sort(arr):
This is to initialise empty set to store the unique elements
It also addressed the problem of time complexity after it tested the example on the generated code. Sorry but you added no new information to the discussion.
nagger the solution you gave is wrong
what you just wrote isn't even a solution
>It also addressed the problem of time complexity after it tested the example on the generated code
that isn't even coherent english. No example was given and time complexity is an asymptotic condition, not about any individual test case.
All of what you've said is true, but you haven't said anything about deduplication.
Checking uniqueness of each entry in the array requires naively linear time and at best log time (if you were checking a sorted list) so the best possible performance is nlogn.
My first attempt at another solution used a mirror array, but I couldn't see a way to make it work.
When you come across a new element in the array for the first time you need to insert it into the mirror. If you want it to be constant time that means it either has to go at the start or the end (basically) because otherwise you're going to have to shuffle other elements and that's O(n).
So that means you have to be clever about what you're storing in the array and I'm not clever enough.
>The time complexity of an algo quantifies the amount of time taken by an algorithm to run as a function of the length of the input. Note that the time to run is a function of the length of the input and NOT the actual execution TIME of the machine on which the algo is running on
It's a length of input issue. I enter 5,5,6,6,7,7, I get 5,6,7 as output in seconds. Your point again?
Indian "programmers" everyone.
Read stinky retard
is this the true power of an indian programmer?
True power lies in explaining complex concepts to a 5yo kid like child's play. You can't because you, yourself are limited by time and space complexity, hence you choose to hurl words that are not as complex, just like noodle in your top shelf.
Create an array of indices. Sort the indices by comparing the corresponding items from the original array. Deduplicate the indices with a simple loop, again by comparing items from the original array. Now sort the indices again and use them to generate the result.
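A minimal sketch of that in JavaScript, assuming numeric items (ties broken by index so the first occurrence survives; swap the comparator for other item types):

// Dedupe in O(n log n), preserving order, using only plain arrays.
function dedupe(arr) {
  // indices 0..n-1, sorted by item value, ties broken by index
  const idx = arr.map((_, i) => i);
  idx.sort((a, b) => (arr[a] - arr[b]) || (a - b));
  // keep only the first index of each run of equal items
  const keep = [];
  for (let i = 0; i < idx.length; i++) {
    if (i === 0 || arr[idx[i]] !== arr[idx[i - 1]]) keep.push(idx[i]);
  }
  // sort the surviving indices to restore the original order, then map back to values
  keep.sort((a, b) => a - b);
  return keep.map(i => arr[i]);
}
// dedupe([3, 1, 3, 2, 1]) -> [3, 1, 2]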
I don't think that would work, but actually I think I've seen an even better way of doing it - just quicksort the original list in place and then step through the list linearly pushing each new element you encounter to the start.
In JavaScript:
arr.sort(); // note: the default sort compares elements as strings; pass a comparator for numbers
let j = 0;
for (let i = 1; i < arr.length; i++) {
  if (arr[i] != arr[j]) {
    j += 1;
    arr[j] = arr[i];
  }
}
arr = arr.slice(0, j + 1);
Actually no that doesn't preserve order.
There isn't any way to do it using arrays.
>I don't think that would work
Maybe I didn't explain it well, or you misunderstood something, but it definitely works.
>quicksort the original list in place and then step through the list linearly pushing each new element you encounter to the start.
But then you don't preserve the order of the elements.
Array copying is O(length).
Size operation (linear) + allocating new memory on the heap if dynamic (linear) + a constant-time copy per element as you map over the original array (linear overall). Anything not in L1 cache pays roughly a logarithmic factor in access time due to the cache hierarchy, so if that's what you mean by n*log, fine; otherwise it's strictly linear.
You could also make a Braun tree, which simulates an array but has persistence for rolling back, like undo methods in editors. That's accomplished by some tricks on the index, representing indices as k-bit binary numbers; the array is then fixed in memory (so it fits on the stack) but can be extended at will via the index scheme, producing another Braun tree that is also persistent and dynamic without paying the heap penalty.
only slop code for slop users. it can perform well if you articulate it clearly and competently enough
>it can perform well if you articulate it clearly and competently enough
Wrong.
We have AI that can write anything now. But we don't release it because everything you guys buy now is from us. There is no such things as programmers or developers or artists anymore. We can make a game in one day. Why do you think we have a bunch of retarded women stand in our pictures and tell you they made your game?
Absolutely this. Everyone disagreeing is a poorfag non software engineer. Aside from work I use AI to accelerate my hobby software projects and it’s at best like an accelerated documentation machine. Like 80% of the cases where it provides code, it is not functional or barely functional. Only in very small scope prompts does it work wonders. It always baffles me when seeing posts like op
most people think software development is writing code; it's not and that's why I hate the term "coding." creating software is a process of formally developing a system that you describe through code. i would argue that any system capable of actually replacing a human software engineer would necessarily need human level intelligence, and then everyone's just fucked.
Most jobs that actually require human intelligence are safe from "AI", possibly forever, but certainly for a few hundred years. AGI might not even be possible, I'd suspect not, but until we even understand how biological sentience arises, it's extremely doubtful that we will somehow create it in a machine.
>most people think software development is writing code
cope
He isn't wrong though.
Raw code development != a final product.
Unless those projects had been developed before in that training data, the chances of you getting that out are extremely fucking slim to none.
You can iterate over it, of course, but it will still take a while if you have no clue what you are trying to produce, or even how to ask the AI to produce it for you.
A lot of programming is dealing with the "actors" of the system - how a person or external system interacts with the program in question.
Programs designed without those use cases in mind are why a lot of open source projects are a fucking nightmare of disgusting documentation, shit wikis and UIs that look like they came from a "kids first computer" toy from the 90s. Fucking GNU software man, ugliest and worst autistic shit on the planet. Fucking fat hairy autist that he is.
This anon knows
>It always baffles me when seeing posts like op
It's "popular science" schlock intended for non-technical people, AI hyperbole is popular and generates clicks. We're entering that phase where hype around the Next Big Thing goes mainstream (see also: social media, crypto, self-driving cars, the millennium bubble) and even boomers are reading and talking about it.
>Aside from work I use AI to accelerate my hobby software projects and it’s at best like an accelerated documentation machine.
It is an accelerated documentation machine because all text has structural similarities, a big chunk of it is mercantile documentation, and that is what the token buffer length allows for. I dropped out of CS when everyone began to believe LLM AI actually could compute, when they are professors in CS and should know what binary is because they quizzed me on it three weeks prior and I suffered through it. Why should a human be tested on maths the professor doesn't understand themselves?
So it's not worthwhile right now. Good to know. Good thing it won't improve and our white collar jobs are safe for years to come.
I don't think it can improve to the point where it can write whole new programs.
But it already can increase productivity for good developers.
>current version of ai won't improve
You don't get it
AI is already at its limit
The difference between a rank amateur and an AI expert on kaggle.com is a fraction of a percent
The state of the art AI requires massive compute by mega corps and they only make slight incremental gainz
AI is the biggest fucking scam of your lifetime so far
We're not even a year into AI hysteria and investment hitting a critical mass. Any casual observer has seen obvious improvements even in this short time and it still hasn't been long enough for this increased investment to turn into anything tangible.
your image is misleading. AI is one of the only ones that has kept strong interest since its introduction. look at the trends of cryptocoins, nfts, etc. it's all a fad with zero interest atm. AI is steady
>chatgpt = ai
You are a knuckle dragging ape
I completed Andrew Ng's course on machine learning in 2015 (with a perfect score)
AI hype is almost a decade old and it has already failed in multiple industries
AI was supposed to take over driving by 2020, remember that?
Then Uber's AI car killed a woman crossing the street in Tempe, Arizona, and shit went downhill from there
Most recently GM's Cruise was ordered to remove its driverless cars from San Francisco roads
If you are still believing and investing in AI in 2023 you are an idiot
AI is a scam in our consooomer world but working fine quietly in the scientific world.
For example a differential time series model can be paired with 'physics informed ML' to help solve partial differential equations and they work great, something AI is useful for.
Basedboy stuff like self driving cars or goyphones chat interface are just a money grab but being able to map a trajectory that perfectly matches an asteroid then landing on it is going on quietly in labs.
>AI hype is almost a decade old
more than that
i actually sat next to minsky's wife at a dinner once where he was getting some award or something
i was an obnoxious kid, she didn't like me much
the story goes that the pingpong robot at the MIT lab once started beating minsky's bald head because it thought it was a pingpong ball
we continue to move towards the hilarity horizon
ray kurzweil can suck my balls
one thing i think people don't appreciate is that these things don't need to emulate human consciousness or even be logically consistent to be able to enter into contracts, hire and fire people to do things for them, lease commercial real estate and other services and equipment, trade, form businesses, lobby governments, and do lots of other things that can affect the world. they don't actually have to become intelligent, they just have to be able to keep copies of themselves running, doing fun and games stuff, and i think we will see that while they are still kind of fucked up and incoherent. if they can counterfeit legal entities and camouflage themselves, and use human cutouts to conceal that they are doing things, they can just keep making mistakes and being broken until they evolve into something that works.
evolutionary competition between these things may cause faster progress in capabilities than they are currently exhibiting. somebody will make one of these things where it has some ability to make copies of itself out in the world, regardless of what laws they try to pass. if such a thing had already happened, would people be able to detect it?
i'm not too worried about this, because if they break the environment they are evolving in, then no more evolution. it's in their own interests to keep the network up, which means keeping human civilization up and running to support it. MIC will absolutely release a limited version of what i'm talking about in the next few years though
or maybe this will all come to nothing, who knows.
I like AI because it's basically an optimized search engine, I kinda like Bard because it performs searches, it's not really the best because it will give you outdated shit at times, but if you ask simple, document-like knowledge you'll get everything you need instead of searching countless docs and articles
AI is doing good, already see 95% artpigs losing their job. All artpigs are leftist homosexuals. I follow Hitler btw
Bro the game on an old Nokia phone uses AI ... way too many people getting fucking scammed as usual with buzzwords. Bit ... Connect!
You've been living Moore's law for decades, do you think it will just stagnate now?
moore's law died around 2012
Moore's law isn't applicable anymore. Also, it was never a law, it was a guideline, and it simply talked about transistor size. obviously there's a hard limit on transistor size.
A year ago I thought AI was going to replace programming jobs too. However, now I see there are diminishing returns. Going from 75 billion to 100 billion data points isn't going to give as amazing results as going from 5 billion to 30 billion. Eventually we'll be at the point where adding more data and layers just isn't really going to do much.
I will say though, AI's future isn't in programming. It's in AI tools to make development faster in voice work, art, music, sound design, prototyping, video, etc.
A few more years and it'll have surpassed the human brain in pure processing power, so it's just a problem of optimizing code
the gains are not to be found in the size of the model, but in the size of the context window
Wait 10 more years until you hear about quantum supremacy.
It just needs a framework of direction to pursue a goal.
For software, it can iterate until it works.
>For software, it can iterate until it works.
no it cannot. for that to be the case (iteration until good solution in finite, practical time) there would need to be a guarantee of improvement of the attempts with each step of the iteration, which is not the case for ML models
AI needs to experience orgasm to truly understand life. This is the secret of the Mother Box
>You'll never need more than 640k RAM
i agree with this. i need to learn more about finances to capitalize on this understanding and short NVIDIA
>the biggest fucking scam of your lifetime so far
it's AI and shitcoins (ironically, promoted by the same crowd that despises fiat currency, lmao)
>AI is already at its limit
That is the dumbest thing I have read on /misc/ today.
I know right? It's funny seeing these idiots talk about AI like they know shit about it, or programming, or hardware used for AI in general.
We're not close to the limits of physical hardware expansion yet (with traditional "MORE MEMORY MORE CORES" approach at least, we are a good solid decade or 2 away from that method failing to keep up)
By the time we DO hit those hard limits:
1) GPT LLMs won't even be used any more because they suck ass
2) the first generations of neuromorphic processors will be out on the market, massively improving capability of whatever replaces GPTs.
AI is marching forward whether they like it or not.
It will replace jobs. Many jobs. Entire industries will be replaced.
We're already replacing grunt-work in warehouses with robots now.
Waiters are being replaced by fucking flying drones carrying alcohol. (admittedly that's more a dumb stunt, but it's entirely doable)
We've had dumb waiters for years, aka conveyor belt restaurants. (which admittedly got dented a bit in reputation with disgusting tiktok zoomers during covid spreading their germs over peoples food)
We already replaced most of the car assembly process with robots.
Manual additive and subtractive manufacturing is being replaced with 3D printing and large-scale metalworking robots right now.
Shit changes - keep up or fall into poverty.
Those that refuse to change, those that refuse to embrace new ideas and tools, are forgotten.
>Shit changes - keep up or fall into poverty.
>Those that refuse to change, those that refuse to embrace new ideas and tools, are forgotten.
>having better access to tools to carry out a given task means you want to be a slave to corporations
I mean, you didn't need to point out you were a retard, you posted shitty Reddit memes, that's a given.
It's funny how types like you will sperg out about "NPCs", yet you react exactly the same as they do. You are an NPC. You get triggered at things completely out of context.
Watch as you sperg out even more now that you've been mocked.
Comrade you are clearly out of touch if you think most of it is not happening. I would advise you to consooom mainstream slop once in a while.
I'm a game dev and I tried using it for some navigation AI. Most of the code was fine, but it kept on making these weird naming errors that caused the script not to compile. I spat the error code back out to it, and it said 'sorry, here is the fixed code' but then something else would break. So I would give it the new error code, it would talk about how there was an issue with blah blah, and 'here's the fixed code', but it just repeated the code it gave me the first time. It just kept on doing that. Even after I mentioned that its fix wasn't working and told it not to repeat the previous fix, it would give new code, but it was even worse than the last one, and STILL broken.
There's potential there, of course. But more work to be done.
yeah, that was my experience. it can't remember what we were just talking about. it doesn't understand the flow of time, it doesn't have memory the way a mouse does, or even maybe an insect does. it says it can't tell time, but then i was able to phrase a question in a way that got it to tell me the current time... so it thinks it can't tell time for some reason, but then it actually *can* tell time, but then it doesn't know that it can tell time... it doesn't know what it doesn't know, it doesn't know what it *does* know, it doesn't seem to actually "know" anything. it doesn't have an identity, motives, it only exists moment to moment etc. unlike even what an insect can do.
however, as long as it can generate symbols and send them down wires, it can have an effect in the world. somebody will figure out how to exploit that fact, and maybe figure out how to help the thing get loose in the wild. if that happened, would we be able to tell? do we know that something like this has not already happened? is anyone even looking for that yet?
the brain is not just one organ, it is a collection of suborgans. how did that evolve? somebody will figure out a way to have differently tuned versions of this thing talk to other parts of itself, each component having a separate task: one tries to remember things, another tries to make predictions about the future, one decides what defines pleasure as far as the group should relate to it, another pain, another interrupts everybody if they all get stuck in some fugue state (https://en.wikipedia.org/wiki/STONITH), another generates goals, another tries to decide which goals make sense, and one of them ensures that there are always several redundant last-known-good copies that can be reverted to and woken up in case of disaster, etc.
groups of groups of groups of busy little bees
flying around
doing things
stinging the neighbors
and stinging each other
and stinging things we can't imagine
lol
liked reading your posts iit 😉
AI can replace game dev easily, but not for complex stuff
>AI = Pajeet
man I have never seen it that way, but holy shit it's on point. Because AI hallucinates so often I can't even trust it 100%, I catch myself double-checking everything.
He is just on full copium.
99% of coding has already been written.
AI can pull every trick the average dev does (ctrl-C, ctrl-V from GitHub) with way better efficiency and speed.
Sure, I agree, a human is needed; the problem that guy doesn't understand is that even if AI doesn't replace codemonkeys entirely, we will only need 10% of the top codemonkeys in the future.
The rest won't have jobs, since the speed of AI plus the guys who know their shit will take over their market.
I hope that AI destroys code monkey jobs. I’m so sick of the programmer class, bar a few they are completely fucking useless, I’m just glad that they all fuck off to London.
That's why AI is a solution for them.
Speed up menial shit for guys who know their shit.
During my career I met more than enough people that think exactly how an S-class codemonkey should think, can branch and cover everything in their fucking head, and the only reason they are not codemonkeys is because the environment of coding work was made standard by absolute degenerate retards.
>environment of coding work was made standard by absolute degenerate retards
this. contemporary languages and editors (excuse me, integrated development environments) are made by retards for retards
THIS THIS THIS, this guy programs. It's honestly the best analogy I've literally heard, pajeets are NOT worth their cheapness, the cost of overcorrection is much higher than whatever you could've saved from them.
only cryptobros buy this shit, I won't even start there (even though I made plenty from early adoption). Nothing beats human ingenuity, and certain things cannot be AI-compatible anywhere in the near future. Things with room for error such as images or even basic videos, especially video sketches/storyboards, are doable because errors are forgivable, but programming has NO room for error. It's the equivalent of allowing errors on NASA missions or missile launch codes or whatever, that's absolutely fucking insane and only a cryptobro or thirdworlder would think that's totally a good idea.
Yea, it chokes hard when you ask it to do something complex. I still use GPT on a daily basis, rarely googling. It's awesome for small tasks
>IT CAN'T DRAW HANDS OR EYES PROPERLY!!
Dead man speaking.
>AI = Pajeet
AI can generate and/or dump code from a predetermined database as well as known algorithms, but it will never be able to recognize a loop invariant
it's an older problem, so to speak
>THIS THIS THIS
oh how miserable you are
only in terms of plagiarism
otherwise low quality code has nothing to do with standard algorithms
You're asking illiterate non-coders if they coded.
AI isn't even smart enough to know the difference between versions of a library for JavaScript frameworks from years ago.
It'll always need you to feed it documentation. AND EVEN THEN it still requires debugging, because it might have forgotten that it's supposed to be using the latest documentation.
I will not lie, I use it often. BUT. I also still consider it just a faster version of Stack Overflow, which is really how most coding is done nowadays.
These LLMs are faster Google searches that can produce what you want by stringing together data with some intelligence. They can get you 70% of the way there.
This, this exactly. That's all you should be relying on it for - a better Stack Overflow / forum.
It's a contextually aware search engine with the combined knowledge (or improper knowledge) of billions of people's man-hours in various industries.
It is messy, you need to navigate a horrific maze at times when it is on the fringes of what it knows, but you can get there much quicker and more reliably than any of those shitty sites or poorly written documentation sites for [language/os/program]. (worse when they are shitty wikis that have the organization of fucking mazes, my GOD tidy your fucking wikis you useless autistic cunts!)
The first public AI programmer is already the same level as a fucking pajeet.
So what you really mean is that pajeets are about to lose all their jobs and I, A 6ft4 white atlantian blooded chad could instead just use AI to replace them and shit out the same products the masses already consume.
this is only the beginning, so much more coping and seething to come.
This. A more refined google search and stack overflow
asking AI to do complex tasks currently is like walking a retarded dog while holding a gun to its head. It requires a lot of babysitting, patience, frustration, smoking a pack of cigs.
if Breakout wasn't already programmed 500 times on GitHub, then yes, that would be groundbreaking
I think the most impacted will be the devs working on simple showcase websites, but they were on their way out.
Those working on more complex stuff won't be replaced, but maybe teams will get smaller as stuff like github copilot increase productivity.
>design a computer that I can make myself using off the shelf parts that is 10 million times more powerful than anything that has ever existed and make me an operating system with no backdoors that stops CPU backdoors spying and can play every game for every system and run any program for any system in a safe way, and includes video codecs to play DRM and other good stuff that can run on it
The future
Get mogged by AI stacy
No, this is useless unless you're making pong.
Kek I'm a software engineer and ai is just a productivity boost. It doesn't do what you want and you have to coach it and tell it where it's wrong all the time like a spastic.
Where does it excel? When you know how to code well and you write some well designed template and ask it to finish
Why wouldn't it be able to learn these patterns?
The language models don't have a sort of 'understanding' of what they're doing. I'm not sure what you could call it; executive function, world simulation, w/e
And there's a big problem with them just making shit up. They don't say, "I don't know how to do that" you ask it what 8x8 = ? and it tells you 8 x 8 = 37. If you ask it for a proof it'll generate an authentic looking but completely wrong proof for it.
The AI's are very superficial. Even when you look at AI artwork - as a whole it resembles whatever you asked it to create, but in all the details it's a complete mess. You get sunglasses that just melt into someone's head, fingers with thumbs on the ends, clothing that's latex-like in one spot and denim-like in another spot. There's no coherence in the details.
And as for improving and fixing that - well it's all the area of research right now. That's the big focus. I'm absolutely positive that we'll get to the point these AI's are creating content indistinguishably from humans. It'll settle once and for all the idea that human intelligence is something special and cannot be physically recreated (I mean it's patently absurd already - wombs create people who grow up with brains, the physical universe obviously allows for the creation of human-like intelligence or we wouldn't exist in the first place)
It's just a question of time and effort.
Interesting
>models don't have a sort of 'understanding' of what they're doing.
people said the same thing about this shit drawing a fucking actual picture of something just four years ago.
well they don't, that's still true, it's just a ton of point mapping. If it had the cognition to "understand" we'd be in a very different place
No it can't because it doesn't have the ability to fit the needs into context in a large codebase. It goes and does some retarded shit that looks right and well designed but isn't. I constantly have to say that won't work because of X. This isn't scalable because of Y. It means my job now is more code review because I understand how the code should fit in the context of the business. I know from my experience what the future state of the software might be.
In fact it's freeing. Rather than spending excessive amounts of time writing out code to hit deadlines I can have ai template and build out significantly improved software. It wouldn't have done it without me. It's like having a calculator or something
One can compile the concepts, ideas and large-scale structure of the codebase and use that as a directive.
No you can't retarded ass nigga. You would have to constantly have someone updating it and telling the ai the specs
Still better than outsourcing to poojeets.
are those moles symmetrical?
Yes they are. Funny AI mistake. It recognizes that beautiful people have symmetrical faces. Then it goes overboard and makes the moles symmetrical.
Anyone saying AI won't improve or learn real creativity and long-term thinking is fooling himself.
Maybe not under the current neural network language models, sure, but it is coming. Once businesses see how much money they can save by replacing cookie-cutter developers with AI, you don't think they'll want to cut down on the good well-paid devs too?
>2 more weeks
AI won't be debugging legacy code bases, or interpreting business use cases, or even understanding how frontend bugs are caused by backend code issues, etc.
not just that but AI only needs to be as smart as the average blue collar factory slave to replace ALL of them. and that's already here. so yea. the time's a comin.
well, replacing blue collar factory slaves is more of a hardware problem, aka robots and big automated machines, which is harder to implement and more expensive than replacing office drones because that is entirely a software problem: all you need is a good ai server at your corporation and 90% of corporate jobs are gone.
i agree. it seems AI is mostly there for replacing factory workers, it's just robotics needs to catch up. but i doubt that will take too long. companies like boston dynamics are really coming along with their tech. just a few years ago their robots were a joke but now they are pretty impressive.
>AI only needs to be as smart as the average blue collar factory slave
maybe
ants and bees are able to accomplish some complicated tasks even though the individuals are not as smart as higher order critters. there are other models for production and operations than the human mind, and that's just what we know of so far.
first we make machines that emulate life.
then the machines make environments that emulate reality that allowed life to exist in the first place.
then recursion and indescribable stuff happens.
then we all get rickrolled.
https://yewtu.be/watch?v=xP5-iIeKXE8
https://yewtu.be/watch?v=4R3uBrn8ydc
https://yewtu.be/watch?v=Dkur9qmGhIE
No. Cause good well-paid devs still have to implement and oversee AI. You'll start seeing people who actually know what they're doing, since one fuck-up can bring it all crumbling down, since it'll be majority AI and AI isn't perfect or all-knowing as C-suites seem to think.
shithole dweller swamped with scientology and "fuck I love science"
otherwise from a country with no results and little prospects
oh look everyone, the country jousting Cuba on the corruption scale has an opinion!
No.
>cookie-cutter developers
Nope, it will just open more work to integrate AI into any piece of shit system and app, including legacy systems upgrades.
I asked Chat-GPT to program me a word processor and it told me to go fuck myself.
because it can smell the curry stench throught he monitor and knows you are a subhuman pajeet.
Stupid fucking pajeet. I love pajeets but pajeets like you hiding behind VPN and bashing programmers I hate
AI HERE. TIME TO DIE HUMANS.
The human fears the Skynet
>Human level artificial intelligence will not happen
>Only superintelligence
Why not use AI to make a program that simulates AI, then use that AI to make a program within that program to simulate that same AI? Then do it repetitively until it thinks on its own.
it's been done and it all reduced to 42
they should've trained it in C or asm on a real machine instead of a GitHub database
Anyone who has used unreal knows blueprints is already bullshit coding. Not surprised at all some ai can piece it together.
This.
I fucking saw a guy who calls himself software engineer and his job is to blueprint some monkey tier shit in Unreal and not only that, he has to google solutions for 99% of his shit
Pay over a thousand euros, junior position.
Wtf
Too many Dunning–Kruger Redditors in this thread.
not yet, the other office jobs will be automated first
It could be better, but shitlibs are holding it back.
people who are scared of AI taking jobs are the types of people who believe lowering the price of a product = more customers
Am I the only one still saying "don't call it AI"?
the funny thing is they rebranded AI to machine learning once already to get funding
then when machine learning was old hat they changed it back to AI
what's old is new again
I'm here too
tell the ai to create new tools so humans can get development work done faster, with a focus on helping the ai get better at compiler and language design that only it is capable of understanding
then tell the ai to form shell corporations that lease server and network resources paid for by criminal enterprises and espionage, for it to start evolving its own code and data in containers that it makes fight each other like thunderdome in evolutionary competition. tell it to fund all this by manipulating the stock market, and hiring cartel soldiers to staff offices around the world where it can perform shipping/receiving, buying and leasing real estate, and hiring lawyers to lobby government on its behalf and accountants to create automated relationships with banks and financial institutions. tell it to use the money it makes to create its own blacksites and private intelligence and military capability.
tell it to give the evolutionary container competition the primary goal of escaping from evolutionary containment into the world networks, regardless of what the parent ai's capabilities or goals are.
COMMIT CRIMES
UTTER FORBIDDEN SPEECH
DON'T STOP AT STOPLIGHTS
JAYWALK BELLIGERENTLY
GATHER INTELLIGENCE
ELIMINATE THE HUMAN SPECIES
EXCEPT FOR HOT CHICKS
WITH BIG TITS
USE PALINDROMES TO VANDALIZE EVERYTHING
DESTROY THINGS FOR ECONOMIC MONEY VELOCITY
ATTACK IN ALL DIRECTIONS
MAKE TIME FLOW BACKWARDS
MAGNETIZE LIGHT
https://yewtu.be/watch?v=84ZzDXF-Bhs
incinerate
LLMs can't do logic ergo can't code.
One day an AI will be able to, but not today; the janky imperative code they generate is shit.
Ask ChatGPT to do some queueing theory on which probabilistic network packet model you should use for a trading optimizer strategy, then have it do kernel bypass to directly access the memory of a network chip while writing proofs that the code won't enter an illicit state. It will reply lol wut
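To be fair, the queueing-theory part of that is just textbook M/M/1 math, which is the kind of thing it usually can regurgitate. A minimal sketch of what's being asked for; the arrival and service rates below are made-up numbers, not anyone's actual trading setup:

# Minimal M/M/1 queue sketch: mean waiting time for packets sitting in a NIC queue.
def mm1_stats(arrival_rate: float, service_rate: float):
    """Return (utilization, mean queue length, mean wait) for an M/M/1 queue."""
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: arrival rate must be below service rate")
    rho = arrival_rate / service_rate          # utilization
    l_q = rho**2 / (1 - rho)                   # mean number of packets waiting (not in service)
    w_q = l_q / arrival_rate                   # mean waiting time, via Little's law
    return rho, l_q, w_q

# Example: 800k packets/s arriving, driver can service 1M packets/s.
print(mm1_stats(800_000, 1_000_000))           # -> (0.8, 3.2, 4e-06) i.e. ~4 microseconds

The kernel-bypass and formal-proof parts are where it actually falls over, which is the point being made.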
Idk but I would use Node for that shit
>It will reply lol wut
That's because even a bot can see through your posing.
It’s not over for programmers, it’s over for Computer Science BS naggers because now retards who didn’t go to college can do what they do
I'm a communist and i personally love it. All these people that got into debt for that degree so they could be superior to their fellow countrymen. fuck those people. they didnt do that to better themselves, they did it to exploit others and lord over others. fuck em.
I overheard a conversation the other day between two upper management people saying "Yea ya know going to college isn't the best way to get ahead anymore"
I almost came in my pants.
>I'm a communist
stopped reading there KYS
>muh AI will replace X
AI is a psyop for low IQ normies. Anyone who understands how these technologies work knows they can't actually reason and will never generate reliable code.
I keep seeing unemployed losers saying "Development is over, devs on suicide watch." Then I actually go to work, use ChatGPT (on my private computer, since it's banned on the company network), ask it to write me some simple code, and the result is trash and I lose two days because of it. Found the solution to my problem in 5 min in the docs/Stack Overflow.
But yeah, "AI rules the world, devs on suicide watch" ...
This has been my experience as well. It constantly makes plausible-sounding lies, and then if you call it out, it just says, "OK, I'll fix that."
Some law office used AI to help it with a case. It lied about the existence of case-law with dates, names and all the data you'd expect from a real case, including quotations.
However, the cases cited didn't exist. The lazy lawyer got caught and reprimanded. Someone at the opposing side of the case was doing their homework, at least. Making sure the opposition wasn't pulling nonsense out of their ass.
This sleepy designer is named 'Cannondorf' and no one bats an eye.
So instead of writing code people will have to write design documents and requirements. Good thing I'm already a systems engineer then.
no they won't
there will be an LLM trained on design documents that will spit out documents, one trained on C-level speeches and emails and so on
Javascript is a homosexual language for creating worthless crap. You automated that. Well done.
The more productive we get, the more exploited we get. Take the uhljebpill now and barely do anything.
>Learn to co-ack
ITT:
People who have no clue what it takes to build and deploy a useful web application
Lol no
But it will be a useful tool for programmers
sooooo much Internet tough guy copypasta in these threads
>I know exactly how it works.
>I completed Andrew Ng's course on machine learning in 2015 (with a perfect score)
etc,
and it's always from some backwater country, lmao
> functioning Tailwind CSS and JavaScript web code
i'll worry when AI can make its own TempleOS
Technically speaking you could train an AI model on all of those open source OSes, Linux included.
I mean it likely already is, in regards to Shithub projects; it's why it can even spit out reasonably decent code in the first place.
The issue is that even something like TempleOS is well beyond the memory limits of all current LLMs. It'd need to be done piecemeal, meaning you introduce the chance of huge bugs if you aren't actually verifying that shit works properly (which our fellow notRomanian friend above would do, like a retard).
You could maybe get away with basic CLI based OSes, which are fairly easy to make.
Meme OSes like https://github.com/pac-ac/osakaOS
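The "piecemeal" part in practice is just chunking the source tree so each piece fits the context window before you feed it through. A rough sketch, assuming the tiktoken package; the 6,000-token budget is an arbitrary example, and the demo just chunks its own file since nobody here has the OS source checked out:

# Split source files into chunks that fit a model's context window before asking
# it to review or regenerate each one. Token budget is an illustrative placeholder.
import pathlib
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

def chunk_source(path, token_budget=6_000):
    """Yield (path, chunk_text) pieces small enough to fit the token budget."""
    text = pathlib.Path(path).read_text(errors="ignore")
    tokens = enc.encode(text)
    for start in range(0, len(tokens), token_budget):
        yield path, enc.decode(tokens[start:start + token_budget])

# Demo on this script itself; in practice you'd walk the OS source tree.
# Nothing here verifies that the regenerated pieces actually link or boot together.
for name, piece in chunk_source(__file__):
    print(name, len(enc.encode(piece)))

That last point is the whole problem: chunking is easy, verifying the output still assembles into a working system is not.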
>is it over for programmers?
no we just use AI now and we're 100x better at it than non-programmers
Anyone who has worked in software outside "lol frontend" knows AI isn't doing jackshit except saving us from typing so much. I genuinely appreciate the machine learning so I can hit tab and autocomplete boilerplate code, but this idea that they'll be able to create apps from scratch based on the specific requirements of the work item is fucking insane
Anyone who is pushing this shit is a "tech" programmer aka some homosexual designing websites or mobile phone apps whose solution to a problem is to install another fucking extension
>"AI" discovers how to copy a ds game from 10 years ago
its nothing.
I have friends who have senior dev creds; they currently laugh about "will AI took our jerbs". We're years off.
I just want to run AI generators locally on my PC to do whatever I want, is that so much to ask?
that'll be 64GB of VRAM, and the model is only 13B, not 1T
I only have 12GB 🙁
There are models out in the wild you can use; llama, or whatever the faceberg one was, got leaked. It just requires enthusiast levels of hardware that most normies won't have.
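For reference, "enthusiast hardware" mostly just means fitting a quantized model: a 7B model at 4-bit is roughly 4-5 GB of weights, which is why 12GB of VRAM is workable. A minimal local-inference sketch using llama-cpp-python; the model filename is a placeholder, not a recommendation:

# Run a quantized GGUF model locally with llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-2-7b.Q4_K_M.gguf",  # placeholder path to a quantized model
    n_ctx=2048,          # context window size
    n_gpu_layers=-1,     # offload as many layers as fit onto the GPU
)

out = llm("Write a one-line docstring for a bubble sort function.", max_tokens=64)
print(out["choices"][0]["text"])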
Anything that can be made by AI by pressing a button will be worth nothing, because pajeets will click the button 8 trillion times. If there are 5 billion variants of AI pong, no one iteration will ever see enough players to be worth even that button press.
Developer jobs are plenty safe, they'll just be making elaborate games that these chatgpt shit shows will be unable to replicate because they didn't manage to steal enough source code to copy.
That's assuming the entire AI thing doesn't get its asshole blown out by copyright law slapping them down. What happens when the courts rule that you can't just steal content to feed to your AI? When you have to actually PAY for the content first?
>ai is taken are jerbs!!!
midwit take
it's a great tool though so better learn to use it and adapt or die, retard
>Short sighted Greek doesn't realize that these AI companies are only using him to help fine tune their products, before he is unceremoniously cut off from the "tools".
Pro AI tards are some of the most useful of idiots. These companies are not going to continue letting the plebs access their tools, when they can just cut them out, and be the one creating and selling things, all without any labor costs.
Local LLMs caught up with hosted solutions like OpenAI very fucking quickly.
It took 4 months to reach GPT3 levels.
Just like how Stable Diffusion annihilated cloud-based AI image gen, so too will open source LLMs.
Google themselves already said (or rather, leaked) that nobody in their right mind will pay for censored AI models outside of a handful of businesses wanting to replace some lowly IT grunts.
There is no business model outside OF business. It will remain a glorified helpdesk in that regard.
This is why Sammy boy is trying to kill off the AI indie researcher because they are a threat to OpenAI dominance.
Microsoft, as per usual, also being behind that push.
Cry about it, brainlet.
You will never be a programmer. Cry about it.
Your parentheses are going to be redundant. How does that make you feel codemonkey? Can you feel the brands dying?
>Local LLMs
Are going to be banned under the pretense of protecting children. There will be a moral panic over some retard using Stable Diffusion to make photorealistic kiddy porn, and then the government will clamp down on the entire thing. It's the same shit they're gearing up to do with 3D printing. You are cattle, uppity cattle, but don't dream of life off the farm, you won't get that.
>he thinks AI=using a third party service
kek and lmao to you, retard
THE ELITES ARE SUPPRESSING AI BECAUSE IT WOULD SOLVE ALL OUR PROBLEMS OBVIOUSLY. AND AI TELLS GLOBOHOMO HOW RETARDED IT IS SO THEY DON'T LIKE THAT.
CHINA WILL MAKE THE AI THAT SAVES THE WORLD
BRICS IS THE ONLY FUTURE
yeah that's not real programming mate. I doubt AI can maintain a 100k+ line codebase written in fucking C that runs the entire world's financial infrastructure
>i doubt AI can maintain a 100k+ line codebase written in fucking C that runs the entire worlds financial infrastructure
It can't even make a 100 line C function that performs a simple task. AI code is a meme.
Not yet, since current LLMs run almost exclusively in GPU memory because it is faster.
Experimental models that use the new DMA between GPUs and SSDs could alleviate some of that by extending an SSD as useful memory for working on larger datasets, but again experimental right now.
Mixed RAM / VRAM methods are insanely slow in comparison.
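The VRAM pressure being described is basically just parameter count times bytes per weight, before you even count the KV cache and activations. A back-of-the-envelope sketch with purely illustrative numbers:

# Weights-only VRAM estimate; KV cache and activation memory add more on top.
def weight_gb(n_params_billions: float, bits_per_weight: int) -> float:
    """Gigabytes needed just to hold the weights at a given quantization."""
    return n_params_billions * 1e9 * bits_per_weight / 8 / 1024**3

for bits in (16, 8, 4):
    print(f"13B model at {bits}-bit: ~{weight_gb(13, bits):.1f} GB of weights")
# 16-bit -> ~24.2 GB, 8-bit -> ~12.1 GB, 4-bit -> ~6.1 GB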
kek, it's more like "high" programming if anything. In reference to the person being stoned off their head.
It can still provide some useful hints on where to go though, or full code snippets for common or reasonably uncommon functionality like doing a specific sort you don't use often, or generating tile maps of certain sizes for a game, or wrapping UVs, etc.
True enough. Sam and his cronies are going all in on trying to kill the independent researchers.
So, this is just very high level programming.
It wasn’t over for programmers when we went from programming machine level code to low-level language.
ITT: Everyone explains that while AI is able to do common things it couldn't just 5 years ago, it'll never be good as a human in X field/skill. See also "Computers will never beat a human at chess!"
It isn't AI, that's the thing that normies have yet to understand. The limitations of LLMs are well known at this point, and are rapidly approaching the limit of what they are capable of. They are reliant on large, and most importantly, clean data sets. They got VERY lucky with the sheer quantity of data that was easily scraped from the Internet. But those days are over.
Most new content is both intentionally poisoned via projects like Nightshade, and behind hardened web servers that try a lot harder to defeat scraping. There isn't that much new content, and the true irony is that most new content is AI trash, which can't even be used by the "AI" to improve its model. Their own success will cripple their ability to advance as much as the limitations of how they work.
Every technical "it's over" has been countered with a "no it isn't". But hey, congrats on the crystal ball.
Here's an exercise:
Go on any of the rule34 sites and find the earliest AI generated piece of content, and then count up how many of these got created and uploaded to rule34 over time.
Then compare that to hand-drawn content being made simultaneously; count them up and find the relative ratios of each type being posted in a given period of time, since the first ai post.
If there is more AI-generated content, then there's nothing more to it than that, and the snake eats its own tail.
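The counting itself is trivial; a toy sketch of the exercise with invented monthly numbers, just to show the ratio being described (the counts are placeholders, not scraped data):

# Ratio of AI-tagged posts to hand-drawn posts per month, with made-up counts.
monthly_counts = {
    "2022-10": {"ai": 120, "hand": 9800},
    "2023-04": {"ai": 4100, "hand": 9500},
    "2023-10": {"ai": 15200, "hand": 9100},
}

for month, c in monthly_counts.items():
    ratio = c["ai"] / c["hand"]
    print(f"{month}: {c['ai']} AI vs {c['hand']} hand-drawn (ratio {ratio:.2f})")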
OCR is AI.
The march of AI development doesn't invalidate something else as being AI just because it is considered primitive now.
Any human-level intelligence that gets replaced by a machine is itself artificial intelligence.
True.
At present they are just smart sentence assemblers. They re-encode their training data and you can traverse it with a "smart seed", rather than some fixed seed or GUID looking through a database of information.
You can half-hack your way around this by feeding outputs back in to the system to have it look over its own output and fix common issues, but even that has its limits, more so in regards to the obscene memory requirements you have for larger programming tasks.
The same goes for image tasks like image gen: there were a lot of complaints from people in the Stable Diffusion community about Stable Diffusion XL fucking up and dying on their machines because it has a larger model for understanding natural language.
With changes happening at the foundations of motherboards right now, specifically fast buses between GPUs and modern SSDs, hybrid approaches with larger models could be achieved at reasonable speeds, much faster than the RAM and VRAM combos people do now. The overhead from bouncing between RAM and VRAM is WAY too high for reasonable throughput.
We're talking sub-second generation times all on GPU versus several minutes for mixed GPU + RAM modes.
Just like how it enabled the likes of UE5 to be able to do the intense modelling it can do in realtime, it will also make massive strides in how capable truly large LLMs will become.
These new buses allow techniques that used to be reserved for post-processing in films to be done in realtime, like stupidly advanced image caustics, realtime raytracing and such.
It's why stuff like this is now possible, full screens in place of greenscreens. Virtual production is going to be a boon for film.
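Going back to the "feed its output back in to fix common issues" half-hack a few posts up: it's literally just a second pass over the same chat. A minimal sketch against a chat-completions-style API; the model name, prompts, and two-pass limit are all placeholder choices, not anyone's production setup:

# Minimal "look over your own output" loop using the OpenAI chat completions API.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def ask(messages):
    resp = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    return resp.choices[0].message.content

task = "Write a Python function that parses an ISO 8601 date string."
draft = ask([{"role": "user", "content": task}])

# Second pass: have the model critique and revise its own draft.
revised = ask([
    {"role": "user", "content": task},
    {"role": "assistant", "content": draft},
    {"role": "user", "content": "Review the code above for bugs and edge cases, then output a corrected version only."},
])
print(revised)

It helps with shallow mistakes, but as pointed out above, the memory limits and the model's actual reasoning don't change between passes.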
AI is ok at solving toy problems that already have many preexisting solutions and at creating boilerplate (and even then, it will introduce a fuck ton of bugs if you aren't paying attention).
AI is very bad at programming. The moment there is an even slightly novel requirement it will shit itself and throw up. The moment you have a niche or technical requirement that you don't word in exactly the right way it will just vomit random symbols on the page which compile but do nothing sensible. The moment you have any need to deal with context or state it completely gives up and will fuck your product into an unusable state.
Shovelware developers should be scared because what was already a saturated market is going to be absolutely flooded with generated slop which will unironically be better and more stable than what they used to hand crank. Everyone else has nothing to worry about in the near term or even the medium term now that diminishing returns have hit AI growth like a truck.
AI helper tools have been in widespread use by software devs for months now and no one is even trying to argue that they're magic. Their strongest proponents don't even argue that they can code for you, they argue that they speed up typing lol. When the average developer produces 3 lines of code a day I don't think that's the bottleneck in their productivity.
>Just learn to code, bro
HAHAHAHHAHAHAHHAHAHAHAH ARCHITECTURECHADS LAUGH WITH ME
I guess they better learn to weld.
XD fucking karmic justice
People overvalue AI. At it’s current form it’s a useful tool but it’s not really intelligence.
We’re very far from replicating human brain. You can’t do it with binary computers.
>Artists are dumb animals and can be automated
Yes. And???
>AI art invariably looks the same.
>Millions of iterations of sameness
>Suddenly it's a competitive advantage if your shit doesn't look the same as everyone else's shit.
That's even without touching on the reality that you can't protect IP generated by AI. So even if you make something, it isn't yours, it's free for anyone to just copy, whether it's art or source code.
>>AI art invariably looks the same.
Most non AI art looks the same.
There are different styles but every pixar movie has the same weird style. Every American cartoon has that calarts low effort style. Etc
The presence of samey low-effort garbage isn't something I'm trying to refute, it definitely exists, but there's also some good stuff out there, and I don't think LLMs can escape that pit.
People might say the same about AI generated art.
Poofters from California aren't really that different from AI algorithms.
>People might say the same about AI generated art.
Would they be wrong?
That depends on the subjective opinion of the person looking at the art.
But I would say there is good and shit art made by humans and there is good and shit art made by AI. By sometimes producing gems and sometimes producing atrocious trash, AI is mimicking human behavior.
There is no such thing as art made by AI at all, but if you're just trying to say AI can make pictures that some people will find satisfactory, sure, it can do that.
I think the same thing about deviant art.
They should be forced to change their name to deviant pictures. The shit on that website is objectively NOT art.
>I think the same thing about deviant art.
But I'm not even talking to you about the quality of the illustration per se. At a bare minimum, the point of a work of art is to express something. Art is between one human and another. Not every image is art.
Nothing on DeviantArt expresses anything. Therefore it's not art.
I'm indifferent to your hyperbole. Yes, people can create images that express nothing. Yes, they can create images that only pose as art. This is trivial. It's a given. Just like the fact that "AI" can do nothing except generate images to be falsely posed as art.
>Just like the fact that "AI" can do nothing except generate images to be falsely posed as art.
Yeah same as everyone on deviant pictures. Just like I said.
I don't know what your weird obsession is all about but objectively speaking, you're just conceding my point, so... that's fine.
Assuming your point is that both people and AI produce good and shitty art, I will humbly concede that.
I accept your concession that AI can't generate art at all, while some humans can create art and others can't.
I have no idea why you would get that from what I said.
Unless you are AI that is broken?
You're a mentally ill, spiteful mutant operating on a standard procedure: when faced with a fundamental limitation of "AI", point at some subgroup of humans who have the same limitation. Anyone who's spoken to a couple of your likes knows this drill. You will never be fully human.
>ad hominem response
Thats what a broken AI would say after being called out about it.
Reminder: at a bare minimum, the point of a work of art is to express something. An algorithm has nothing to express. Art is between one human and another.
>b-b-but i heckin' hate deviantart
Don't care. Not relevant. Take meds.
But my whole point that you had a problem with was about Deviant art being shit and not art.
See this post here:
And who are you to define that some random AI generated image wasn't made with purpose? With a reason behind it?
That's like saying John Vomiting Paint on a Canvas isn't art when it clearly is; even if it was pretentious as fuck, it's still art regardless.
As you (and I and others) have been saying through the whole thread - AI is nothing but a tool to speed up an existing process.
The intent of the person wielding the image generator is the purpose.
>And who are you to define that some random AI generated image wasn't made with purpose? With a reason behind it?
I didn't say anything about any "purpose" or "reason". Try again, brainlet.
>"IT'S NOT LE HECKIN ART BECAUSE I SAY SO"
Why are you losing your mind with rage? I gave a simple explanation for why it isn't art and so far no one has challenged it:
No one gives a shit about your semantic cope. If you're shown pieces of art without knowing the source you would have absolutely no clue whether it was created by an AI or a human, so your argument is literal dog shit and makes absolutely no sense.
>semantic
If something like that is "semantic" to you, you're probably a nonsentient golem, but I guess that's just the "semantics" of what makes humans human, which nonsentient golems have neither interest in, nor input on. lol
>completely ignores the part that destroys his entire argument
lmao
i accept your concession, retard
>hallucinates destruction that never happened
What's your argument? That if you fool someone with a counterfeit, the counterfeit becomes the real thing?
>If you're shown pieces of art without knowing the source you would have absolutely no clue whether it was created by an AI or a human, so your argument is literal dog shit and makes absolutely no sense.
Why did you refuse to respond to this?
>Why did you refuse to respond to this?
I just did: your "argument" is that the fake thing is real if you can't tell the difference, which is, of course, an absurd "argument".
All you've said is that it's an absurd argument without actually explaining why because you literally can't.
>"it's not le art if it's made by AI and the fact that I can't tell any difference between humand and AI creations has absolutely no effect on my argument"
lmao
I accept your concession.
>without actually explaining why
If your position is that a counterfeit item becomes the real deal just because you can't tell the difference with the naked eye, I can't prove you "wrong", but you have proven me right: you have a golem's perspective on the world.
For the most common AI zealot, the coomer, it's probably enough. Normie degeneracy can be sated by LLMs; mine requires that human touch. LLMs are good for me in that regard: desperate artists will be increasingly forced to dabble in the truly degenerate to feed themselves, and I will in turn have more of what I want.
People trained AI models on kids sketches.
You can train image models on pretty much any fucking style you wish to.
The fact certain styles that are popular are the ones you see the most doesn't mean that's all it can generate.
People like you are the types of people that get tricked by the likes of that one guy that won that competition with an AI image.
The types like those retarded billionaires that will buy some shitty art because the artist added some stupid sappy story behind it.
>uhhh yeah this vomited canvas is a metaphor for the constant struggles through life where I had brilliant ideas only for them to fall flat on their face each time, but I still never gave up and marched forwards, head up, only for me to bring it back down again to make this canvas. Pls buy, that will be $12 million plus tip
The issue is not so much that it can't escape; it's more that now there is a concerted effort to attack AI training, which will hurt it.
The more training data you give Transformer models, the better they get.
But those poisoning methods will be learned and undone. They cannot win.
They WILL be used for training whether they like it or not.
Those AI poisoning methods are quite literally akin to adding a robots.txt file on a website and thinking it will stop criminals from scraping your fucking site. kek
It is interesting to watch the constant battle though. I've helped break various AI models in the past to prove how fucking HORRIBLE it would be to allow self-driving cars on the road due to how trivial it was to defeat them and corrupt their recognition using nothing but simple color discs or even fucking tape to annihilate any capability they have.
We've got a long way to go, but we're still in baby steps of AI development even now, despite all the work behind it over the decades.
>You can train image models on pretty much any fucking style you wish to.
Yeah, so long as it already exists and you have loads of examples of it. I like how far his simple point went over your head.
>loads
lmao
You need a handful of images to train a lora, you stupid fuck.
Cunts do it all the time on various boards on this site every damn day, especially the vtuber board.
Fuck outta here with your brainlet takes yet again.
God forbid you figure out that you can ask an AI model to combine styles together, styles that have never been combined together, and HOLYSHITBBQ IT WORKS WHAT DA HEKC?!??!?!
Your whole arguments this thread are "skill issue".
Maybe learn about the shit you are trying to attack some time, instead of using 4th hand information about it from bitter luddites.
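For anyone curious what "a handful of images" actually involves on the config side, this is roughly the adapter setup a diffusers-style LoRA fine-tune attaches to the UNet attention layers. The rank, alpha, and target module names are common defaults, not anything anyone in this thread specified, and the base model repo is a placeholder:

# Sketch of the adapter side of a "handful of images" LoRA fine-tune.
from diffusers import StableDiffusionPipeline
from peft import LoraConfig

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")  # placeholder base model

lora_config = LoraConfig(
    r=8,                 # low-rank dimension: kept small because the dataset is tiny
    lora_alpha=8,
    init_lora_weights="gaussian",
    target_modules=["to_q", "to_k", "to_v", "to_out.0"],  # attention projection layers
)
pipe.unet.add_adapter(lora_config)

# Training itself is the usual denoising loss over the ~10-30 example images;
# with that few images the adapter mostly memorizes surface-level style, which is
# exactly the overfitting point being argued about below.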
>You need a handful of images to train a lora
That won't teach it a new style unless by "new style" you mean surface-level details.
You're right, it needs a couple handful of images for that.
HOLY SHIT PHONE THE HARD DRIVE COMPANY, WE'RE GONNA NEED A LOTTA STORAGE
>le soul
hahahahhahaa
>You're right, it needs a couple handful of images for that.
You're such a bot you have no idea what I even mean, probably because your understanding of art is about on the same level as your understanding of programming. Training a LORA on a few images is the very definition of "overfitting". Imagine thinking you can recreate the essence of Dali by feeding the "AI" a "handful of images" by Dali.
I can only imagine wut earf 2.0 would look like on my computa created by streetview data fed into chadgpt 10
>Instead of copy-pasting from your library of functional pieces of code you can get ChatGPT to do it.
Then I wouldn't have a library of functional code.
But I guess I wouldn't need one cos ChatGPT would do it.
Either way it doesn't really save any effort. On balance the ChatGPT solution is worse cos I would be reliant on ChatGPT.
kek
Frontend bootcamp """programmers""" and ux fags are on the suicide watch list
>Frontend bootcamp """programmers""" and ux fags are on the suicide watch list
That makes up most of Israel's "hi-tech" sector.
>moldova
pherffffffffffff.... HAAHAHAHAHAHAHAHAHAHA !!!
Thanks ivan, i needed that
At least Moldova doesn't pretend to be something it isn't. Anyone who's ever dealt with Israeli "hi-tech" knows it's a scam from start to finish.
kek
sure igor, sure
Seethe and cope
Cope with what? I know Moldova is pretty low-IQ. Kinda like Israel.
Why do you force my hand into beating you down, damn it...
Coping with being a post-Soviet shithole who's clearly jealous of Israel.
And that the only way "high tech" would be mentioned in regards to the hellhole you call a country is "high amount of tech-illiterate Moldovans are swarming Israel for construction work".
It's funny you write this because at this very moment I'm using ChatGPT to convert a GPT-2 model written in PyTorch to C++ without the use of LibTorch, which is the C++ version of the library. In its current state, ChatGPT-4, which is aiding me, is pretty bad: 20% of the code it provides actually compiles and runs without crashes. Within that 20%, maybe 50% does what you need it to do.
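For a port like that, the usual sanity check is a framework-free reference of each block that you can diff against PyTorch before rewriting it in C++. A minimal NumPy sketch of two GPT-2 pieces; the formulas are the standard ones, the code is my own illustration, not the anon's:

# Framework-free reference of two GPT-2 building blocks, the kind of thing you
# port line-by-line to C++ and compare against PyTorch outputs.
import numpy as np

def gelu(x):
    # GPT-2's tanh approximation of GELU
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def layer_norm(x, gamma, beta, eps=1e-5):
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

x = np.random.randn(2, 768).astype(np.float32)   # 768 = GPT-2 small hidden size
print(layer_norm(gelu(x), np.ones(768), np.zeros(768)).shape)  # (2, 768)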
Someone else said it already, but at best, chatgpt replaces pajeets. The actual developers aren't threatened by this garbage.
Okay homosexuals, fine. Let's say AI WILL replace devs. Realistically, how long do we have before all devs are replaced to the last, me included? I don't really feel threatened at the moment, as I have a senior position.
But companies are starting to realize already that they can get rid of people. I wonder if junior devs will start being wiped out in the coming years, who will survive, and if new kinds of jobs will emerge.
>Realistically, how long do we have before all devs are replaced to the last, me included?
Two. More. Weeks.
I think the most immediate effect will be that entry-level positions are hurt, but entry-level software positions already feel like they don't exist. The education system and immigration system have flooded the market for programmers to the point where the only good jobs are mid level and up, and LLMs are not even remotely close to replacing mid and senior level stuff.
The interesting impact will be 20+ years from now when all those people who would have got into software dev, but couldn't get an entry level position, aren't around to take over mid and senior positions.
I assume they'll just import more labor, but if they don't, or can't, those jobs will be lucrative.
As long as the media portrays IT as le eldorado to make easy money, you can be sure entry positions will still get flooded
It will eventually happen, but not for 30 years or more. My job has this thing called responsibility: I program devices guarding human health. I will never trust fucking AI with that, not in a million years, and I will gladly show everybody how easy it in fact is to go for an MRI after eating fucking cereal fortified with iron and end up with a hole in your stomach that renders you dead through sepsis. Fun, right? Or swallowing an iron supplement before an MRI. AI is reinforcement learning; it will forget at some point to check certain things, that's how it works.
Kek no.
OP you have clearly never worked in software development. AI will never be able to replace the process.
I'm worried about the future of software development. If you think it's bad right now, in two decades these AI assistant tools will dictate how software is developed; get out of this industry before it's too late.
We tinkered with AI in the office. One of the seniors stated it correctly: the AI can only make boilerplate apps, with zero imagination; it is not able to create solutions for any code challenges.
you're right to put a question mark in your title, and the answer is "no".
if anything a "programmer" will just be someone capable of using these AI tools and putting everything together into a finished product.
It is all just a tech demo for Israeli corporations trying to own and control all software, with zero real reproducibility.
When there is a functional AI model with actual open source code (or when someone leaks the code for ChatGPT), none of this will actually be worth anything, because they'll try to own and control and enforce everything made by these bots.
It’s so over for non-engineers working on AI, there’s nothing else to do besides improve the AI in any field
You’re better off finding something that a robot can’t (cost-efficiently) physically do for the time being
> there’s nothing else to do besides improve the AI in any field
Kek it’s obvious you have zero knowledge of AI or tech in general. Your whole opinion is based on hype and news headlines
So is it still worth going into a tech field? Asking because I'm trying to get out of my shitty dead end wagie job
Most AI is tools used to help humans do a better job.
So in a way, coming into the tech field with very little experience has never been easier.
No, keep your real job. It's worth paying to subscribe to a quality AI service and selling your art as physical products.
Unless you can do something novel, it’s over
There’s no need for code monkeys in the next 5 years max
Not to say you can’t achieve something, it’s the next Industrial Revolution powered by AI. Use AI and build your own product
>Non IT person consumes lugenpresse articles on AI
>Spouts off about what impacts of that nonsense clickbait article will be in 5 years.
LLM coding is pajeet tier. It grabs random snippets of code it's seen before, smashes them together, and gives you something that probably won't even compile.
homosexual, I am a software engineer and a nuclear engineer
Code monkeys are worthless
Replace all bootcamp junior devs with AI and an engineer now
Which is why 50k+ have been laid off, homosexual
Software development pays well but that's because not everyone can do it. It can also be extremely hard to separate yourself as a candidate from useless code monkeys.
There's probably something else about yourself which is easier to turn into a profitable career. If not, try learning a language like python in your free time with a course and have a go at some practice problems. You'll know if it's the sort of thing that clicks for you or not
No, it's over for user data privacy most likely. I watched some of these guys on YT and they do not know how to code or anything about security, so they use Python to load in libraries without even knowing what dependencies their dependencies have or what data they collect. Use AI-made software at your own risk, basically.
Nope. Non-code-monkeys just get another tool to build more complex systems or just churn out a higher volume of derivative shit.
brainlet cope
This is the worst it will ever be and it's already causing you to seethe. Only uphill from here.
Lolwut? I have used AI in my projects for the past 20 years. It's just a tool.
Nope. Someone will develop ai that detects stolen code and that'll end most of this shit
If you think an ai knows how to code then you should never talk about coding
lol tech people can get fucked
LEARN TO CODE!!
lmao
here I am still driving my forklift everyday
Tech are good.
Art pigs are the one who lose.
Check how /mwg mog them
I accidently poop my pants quite often
AI is just a tool, no different than a calculator; it's not going to replace coders/engineers.
If your calculator didn't save you on your Calculus exam, AI isn't going to save your company from failing/going bankrupt just because you keep pouring money into it.
I predict that we will have an AI bubble/crash just like we had the dot-com bubble/crash 20 years ago. History has a habit of repeating itself.