I don't ask for code, I ask for advice. That is the true way of using an LLM to aid you code. Here's a better way to rephrase that:
> what are some efficient algorithms to detect collision ...
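To make the rephrased question concrete, here is the kind of answer you'd hope to get back: an axis-aligned bounding-box (AABB) overlap test, the usual broad-phase collision check. This is a hedged sketch of the standard technique, not anything from the thread; the function name and box layout are my own.

```python
def aabb_overlap(a, b):
    """a and b are axis-aligned boxes given as (xmin, ymin, xmax, ymax).

    Two boxes overlap iff their projections overlap on both axes.
    Touching edges count as overlapping (closed boxes)."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return ax0 <= bx1 and bx0 <= ax1 and ay0 <= by1 and by0 <= ay1

print(aabb_overlap((0, 0, 2, 2), (1, 1, 3, 3)))  # overlapping -> True
print(aabb_overlap((0, 0, 2, 2), (3, 3, 4, 4)))  # disjoint -> False
```

The separating-axis idea generalizes: for convex shapes, if you can find one axis on which the projections don't overlap, the shapes don't collide.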
nah, LLMs are just google from 15 years ago, back when it could actually find things other than dead links and spam. You should use them the same way.
I copy-pasted the code in and it just worked.
Based
I do the same, I don't like mindlessly copy pasting, I like understanding and learning
Aaaand you just wasted 1 hour implementing what the AI gave you. Your competition is already shipping the product. Too bad.
ask it to implement Winograd or something more advanced for GPU, and you will see how good it is. It always fails.
>just put "fafafadfgayaha" into google and you'll see how good it is. It always fails. Checkmate Chud
Winograd is a simple algorithm; you can read the paper and implement it, and it was definitely part of the training data. Your claim was that AI would be doing the thinking for you, but now you cope and move the goalposts. It can't even advise you, since it gets the implied hardware structure wrong 99% of the time.
>Your claim was AI would be doing the thinking for you but now you
obviously I was being hyperbolic you autist, and also winograd is an intentional ambiguity test, it checks whether the AI can still understand a shit prompt, just learn to prompt bro
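For reference, the Winograd minimal-filtering scheme the subthread is arguing about really is small enough to write by hand. This is a sketch of the standard 1D F(2,3) case (2 outputs of a 3-tap filter using 4 multiplications instead of the naive 6), checked against the direct convolution; it is my own transcription of the textbook derivation, not AI output from the thread.

```python
def winograd_f23(d, g):
    """Winograd F(2,3): two outputs of a 3-tap FIR over a 4-sample input tile,
    using 4 multiplications (m1..m4) instead of the naive 6."""
    d0, d1, d2, d3 = d
    g0, g1, g2 = g
    # the (g0+g1+g2)/2 terms can be precomputed once per filter
    m1 = (d0 - d2) * g0
    m2 = (d1 + d2) * (g0 + g1 + g2) / 2
    m3 = (d2 - d1) * (g0 - g1 + g2) / 2
    m4 = (d1 - d3) * g2
    return [m1 + m2 + m3, m2 - m3 - m4]

# sanity check against the direct sliding-window convolution
d, g = [1.0, 2.0, 3.0, 4.0], [0.5, -1.0, 0.25]
direct = [d[i] * g[0] + d[i + 1] * g[1] + d[i + 2] * g[2] for i in range(2)]
print(winograd_f23(d, g), direct)  # both [-0.75, -1.0]
```

The 2D convolution variants used on GPUs nest this same transform over tiles, which is where implementations usually get hairy.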
Now ask it to do something in some random obscure proprietary API that's hardly documented online
looking up obscure API functions is exactly what its best at
Every time I try it because I'm stuck it recommends I use functions that don't even exist
skill issue
sounds like a prompt issue.
chatgpt has access to the entire python library with EVERY obscure addon/lib.
it's impossible for chatgpt-4 to not find the correct solution for your coding problem and then translate the solution into your obscure programming language
learn to prompt
Sounds like you have no idea what you're talking about. Come back again when you've had something more complicated to code than fizzbuzz
>no argument
okay
>Python
The other day it shat itself trying to suggest something nonexistent about FDCAN node masking on an STM32 microcontroller.
You don't know what "obscure" is.
You're right because
is moronic, AI hallucinates functions that don't even exist. It's best to just use AI for explanation or with popular APIs.
No, it's literally the worst use case; often it doesn't even know what functions exist or how they work, and it invents functions out of thin air
still better than every junior developer in existence. god im glad they'll be obsolete.
>LLMs are bad at parsing text
do jeets really?
You're a moron, I never said nor implied anything like that. Zero reading comprehension
>now ask it something thats not part of the training data
why are autists like this?
It fails at all(!) WPF questions that are not literally lifted from some trivial online example.
In my experience if you're skilled, these ai coding assistants end up wasting your time more than they help, unless you're generating some very basic boilerplate code.
Worst is when they base their solution on some non existent function/library they hallucinated.
>"write me a haxe build macro which parses the GADT type parameters of an enum's constructors and stores them in a static runtime map"
>even the most advanced AI can't write a snippet that compiles, much less one that comes close to working correctly
never been able to get an AI to code anything that I don't already know how to do.
this is true, if you don't need to look up anything anyway, AI can't do anything for you
The thing is AI only really works for already solved problems, which is why midwits and codelets are so enamored by it.
In the real world though, you don't get paid to figure out solved problems. In most cases, if I were to use AI for coding help, the time wasted feeding it all the relevant data and context just to get a chance at a helpful answer, going over all the details, explaining the things it got wrong, and then carefully reviewing and testing its code simply isn't worth it compared to writing the solution myself.
Why do you people have this weird idea that a search engine is supposed to do your job for you?
I literally explained why it's not fit to do your job. Maybe you should use AI to teach you basic reading skills.
No one said it was going to "take our jobs" you seething moron
Can I run a local AI coding assistant?
Checked those prophetic trips.
99.9% of BOT can't think through a novel programming problem to (literally now) save their lives.
what AI/assistant is that?
I honestly don't get it, if you just want someone else to do the work for you, why would you become a programmer? The whole appeal of the job is the problem solving aspect of it. Programming feels like playing a puzzle game. AI doing the work for you is like looking up answers to all the puzzles. What is even the point then?
The same reason you look for the pieces of the puzzle in books and on stack overflow
OP's example wasn't looking up pieces of the puzzle, it was looking up the entire solution to the puzzle.
>algorithms are the ACKtual puzzle not the program you are writing to solve a problem
You sure you're a programmer?
The algorithms are the only part of programming that's actually interesting. Everything else is mostly just boilerplate and data structures and design patterns that have been done a million times already.
rewriting a 50-year-old algorithm IS boilerplate
Which model should I try to integrate to my IDE?
I have wizardcoder downloaded but it wasn't that good...
Doesn't this code fail?
Consider a square with edges parallel to the axes, and a point to the left of it.
I get that it will return true even though it's outside.
Is it an off by one error? Or a missing check?
Never mind: squares aren't concave, but a square is a useful case for setting up simple examples of the code breaking.
Consider a tetris block.
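The bug being described (a point to the left of the square reported as inside) is typical of a crossing count without the parity toggle. A hedged sketch of the standard fix, the even-odd ray-casting test, exercised on the thread's own examples of a square and a concave tetris L-piece (function name and coordinates are mine):

```python
def point_in_polygon(px, py, poly):
    """Even-odd rule: cast a horizontal ray to the right of (px, py)
    and toggle on every polygon edge it crosses. An odd count means inside,
    so a point left of a square crosses both vertical edges -> outside."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # does this edge straddle the horizontal line y = py?
        if (y1 > py) != (y2 > py):
            # x coordinate where the edge crosses that line
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:  # crossing lies to the right of the point
                inside = not inside
    return inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(point_in_polygon(-1, 2, square))  # point left of the square -> False
print(point_in_polygon(2, 2, square))   # centre -> True

# concave "tetris" L-piece: the notch at x > 1, y > 1 is outside
l_piece = [(0, 0), (2, 0), (2, 1), (1, 1), (1, 3), (0, 3)]
print(point_in_polygon(1.5, 2, l_piece))  # in the notch -> False
print(point_in_polygon(0.5, 2, l_piece))  # in the vertical arm -> True
```

The `(y1 > py) != (y2 > py)` comparison also handles the degenerate case of the ray passing exactly through a vertex, by counting each vertex for only one of its two edges.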