Learn English you retard. Of course it doesn't understand you.
conputer
how is this wrong?
I don't even understand what "cut down" means.
Yeah it's retarded when it comes to that
Am I the only other person ITT that knows that "cut down" is a euphemism for "kill"? OP is saying "I kill every second cow".
Lol, stupid computer.
If the question is worded so absurdly poorly that it makes humans think you just had a stroke, why should anyone be surprised when a computer can't parse it worth a damn?
So then can you explain why OP thinks the answer is wrong? Because it seems correct.
It is correct. I think the point was to show the stochastic nature of LLMs (and indeed of all AI models).
Everyone understands that. The thing that's losing us (and the AI) is his nonstandard use of "every second" where "every other" would be the more appropriate phrasing.
How did you get 17670?
I got ln(0.25) = -1.386
divide that by ln(0.5) = -0.693 and it's about 2
so the final answer is about 2 * 5730 = 11460
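The arithmetic above is easy to double-check in a few lines (Python sketch; the 5730-year figure is carbon-14's half-life, which is what the thread's numbers imply):

```python
import math

half_life = 5730        # years; carbon-14, implied by the thread's numbers
fraction_left = 0.25    # the fraction remaining in the question

# number of half-lives elapsed: ln(0.25) / ln(0.5) = -1.386 / -0.693 = 2
n = math.log(fraction_left) / math.log(0.5)
age = n * half_life

print(round(n, 3))   # 2.0
print(round(age))    # 11460
```

Neither 17670 nor the model's other outputs survive this check.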
>How did you get 17670?
I think the point is to show how easy it is to make GPT give you the wrong answer. Anyone looking at that would see that both answers are clearly wrong just by log rules. The machine doesn't understand dick, so it just regurgitates the best set of tokens it can find. Interestingly, ln(x) = 2.303 => x ≈ 10
Yeah I got that but that has always been obvious. Ofc you can just force the AI to believe something you say and that will fuck everything up.
But I still care whether the AI can, without being fucked with, give a good logical answer to something it hasn't already seen and which has "medium" complexity. The kind of thing a human who knows middle-school math would be able to solve.
>give a good logical answer
I'd argue that it didn't give a good logical answer either. It went from trying to compute ln(0.25)/ln(0.5) to trying to compute ln(10) * ln(0.5) for no reason. It also multiplied a positive number by a negative number and got a positive number. And 2.303 * 0.693 * 5730 ≈ 9145 anyway.
It failed at every possible step, and then just lied when it was told what the "real" answer was.
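Each claimed mis-step can be verified numerically (a quick check, assuming natural logs throughout):

```python
import math

# the correct ratio the model started with: exactly 2, so 2 * 5730 = 11460
right = math.log(0.25) / math.log(0.5)

# the detour: ln(10) * ln(0.5) is a positive times a negative, i.e. negative,
# so reporting a positive product is a sign error on top of the wrong formula
detour = math.log(10) * math.log(0.5)   # ≈ -1.596

# and even with the sign dropped, the detour lands near 9145 years,
# nowhere near the 17670 the model reported
print(round(abs(detour) * 5730))   # 9145
```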
true
the only thing I'd like now is to test this on GPT-4 instead of ChatGPT so I can de-stress knowing that AGI is still at least a year away 🙂
2*5730=11460
Why are russian pigs so bad at english. At least morons know ebonics
>Why are russian pigs so bad at english. At least morons know ebonics
I agree with the premise that GPT is retarded and fails at understanding even the most basic logic scenarios, but if you word your question like a knuckledragging retard you can't expect it to work at all
>if you word your question like a knuckledragging retard you can't expect it to work at all
Even so, it's not showing any signs of understanding math of the kind that kindergarteners get.
Which is exactly as expected. GPT always produces just plausible-sounding bullshit. That works with some things, but not with math or reasoning.
I'd be surprised if gpt didn't just directly copy/paste some free code when asked to write a basic program function. Anything more complex than that and brainlets are going to have a fun time tearing their hair out trying to rationalize why their genie in a bottle doesn't magically solve all their problems for them
ESL moment.
if i have six cows, caboozle dingle dingle then buy a pig, how many farms animals do I have?
the only issue here is that it tries to answer your bs at all instead of calling you a retard.
Literally what the fuck is that question anon? It's like those stupid ass bait pictures where it wants you to count the value of fruit
>russmoron can't into computers, math, or English
Many such cases
did i solve it
sirs why can't computer cut down my holy cow
please sirs I need to cut down the cow
I’m sure it would solve your issue if you could communicate it better than a toad
maybe ask a less retarded question you fucking retard. GPT did my 2nd stats final for me and it got 100% lol
did you mean every second as unit of time? for how long do you redeem the cows, sir?
Your ESL english is so bad I made the same mistake as chatGPT and thought you were saying "remove a cow every second that passes" instead of "remove every second cow".
>cut down every second
What the fuck are you trying to say? Nobody could answer this because your English is broken
>pass since 2021
is this supposed to impress me?
>native english speaker here
You’re probably American, so that’s arguable
It has conceptualized and mastered languages beyond any human's understanding; it's an LLM after all, not a large logic/mathematics model.
You should understand that language is much harder to conceptualize and master than logic and math, so such models will follow...
cowbros.. is GPT right?
>cut down every second and buy pig
>cut down and buy pig every second
also
>9 seconds
>native english speaker here
>t. Julio Vasquez
>retarded mass-reply pass user filtered by modern BOT captcha thinks he has a point
lol
Final cow question
loos
>still shits on the street
>...and by lord Krishna, my cows are redeemed
kekaroo
Yeah bro just keep asking it the same ambiguous question over and over again, I'm sure you'll get the answer you're looking for
garbage in garbage out
I hope no one cuts you down when you rope, trannoid.
Human Powered Intelligence here.
You will have 3 living cows, 3 dead cows and a pig. So you have 4 farm animals.
It's a language model not a calculator. You've literally just proved that you are not a sentient being.
>dude GPT is so smart!! it'll take away all of our jobs!
I'm surprised it didn't get that one
GPT4 definitely would
gpt4 is available in india sandeep. pay for it if you're going to shit up this board with these posts.
native english speaker here, after having a stroke for a few minutes i realized you meant you kill every second cow
cows: [1] [x] [2] [x] [3] [x]
pigs: [4]
chatgpt was right, why did you sperg out over this?
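Under the "every other cow" reading, the diagram above checks out; a short simulation confirms it (hypothetical sketch, Python):

```python
cows = [1, 2, 3, 4, 5, 6]

# "cut down every second cow": one pass removing the cows in even positions
survivors = [c for i, c in enumerate(cows) if i % 2 == 0]   # cows 1, 3, 5

animals = survivors + ["pig"]   # then buy the pig
print(len(animals))             # 4 farm animals
```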
That you are able to translate illiterate rambling into English doesn't mean that the AI can do the same.
If I were to play out continuous lewd scenarios of me being an elf at the mercy of a goblin tribe, what would be my best bet in terms of chat AI?
1 cow and 1 pig = 2 farm animals
if you have 6 cows and cut down every second one as you go, then after the second cow is cut down the third becomes the new second and gets cut down too, and so on, leaving only the first one to survive
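That renumbering reading can be simulated too (a sketch; under this interpretation the cull repeats as long as some cow occupies the second slot):

```python
cows = [1, 2, 3, 4, 5, 6]

# keep cutting down whichever cow is currently second in line;
# each cull shifts the later cows forward into the vacated slot
while len(cows) >= 2:
    cows.pop(1)

print(cows)            # [1] -- only the first cow survives
print(len(cows) + 1)   # 2 farm animals once the pig is bought
```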
>another ClosedAI thread
You should try to rephrase this question in English.