le ebin reddit screenshot thread
For redditors, channers only know how to whine, be sarcastic, spout memewords and never finish any projects
Cope all you like, the proof is in the pudding. The capabilities of AI are clear as day.
Memes aside, how do I start. I got GPT-BOT downloading. Yes i found the model lol
You got a link?
answer my question first
Not that guy
k
https://archive.org/details/gptBOT_model
Thanks but I thought this was going to be something like OpenAI's Playground where I just type text and get results.
Looks like it's a bunch of files and shit that I need to compile or whatever. Not my scene
YEAH u put it into openai Playground. some anons did it but their links expired so imma do it myself. maybe i'll come back and post it
aw fuck nvm it was the gptBOT playground. I will make use of this model anyway and btfo huggingface. as i said ill prolly post it here. by prolly i mean if we dont get nooked.
t. pol browsing disease spreader
>The capabilities of AI are clear as day.
OCR that still can't differentiate i from l, image recognition that still can't differentiate a dog from a cat or a black person from a gorilla, and product recommendation algos that think I still want to buy a DVD-R/RW drive because I bought one 20 years ago. True magic.
Btw, the idiom you're looking for is
>The proof of the pudding is in the eating
Not
>The proof is in the pudding
Which is nonsensical gibberish. Hth
no one says the former ESL
the proof is in these nuts
are you being sarcastic or are you retarded? even the flashiest modern AI is absolute dogshit tier and each paper that comes out proves it more and more to be a dud. AI is almost as much of a meme as "blockchain"
>implying I start any to begin with
We get laid tho
Bruh BOT is incel central
Only after the election because you people flooded in
>channers
holy fuck go back moron
>projecting
we're not you
You should already know calculus and nonlinear optimization.
AI is God
this was me...yesterday. We code the AI nerd. Im not even gonna bother explaining. Anons r stubborn
Machine learning requires a pretty strong grasp on probability and statistics, which imo is harder than calc 1 2 3 and linear algebra.
It's really not easy, but those machine learning libraries definitely make it seem easy when everything is abstracted away for you.
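To show what "abstracted away" means: here's a complete train-and-evaluate run in a handful of lines, using scikit-learn's bundled iris dataset purely as an illustration. All the actual math (optimization, loss, regularization) is hidden behind one fit() call.

```python
# A toy sketch of how much the libraries hide from you:
# a full train/evaluate loop on a pre-packaged dataset.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# One call does the entire optimization; none of the statistics is visible.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(clf.score(X_test, y_test))
```

The point being: the five lines above tell you nothing about whether you could do this on data that isn't already clean, numeric, and shipped with the library.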
ML is easy peasy compared to statistics. That's the main point he's trying to make in OP's pic.
>ML is easy bro, just use a pre-made python library lmao
Some elements of machine learning are easy. Others are stupidly hard. Taking some nicely pre-packaged dataset and throwing Keras or PyTorch or Scikit-Learn or whatnot at it? Not hard. Figuring out how to take real world data that isn't all a bunch of continuous values and turn it into a format that an ML model can work with? Sort of hard, depending on the data. Trying to do hyperparameter tuning on a dataset that just doesn't want to give you good performance? Somewhat of a pain, and can end up with you doing lots of pointless trial and error. Trying to tune hyperparameters on a GAN so you can generate a dataset with lots of highly categorical values, knowing that your evaluation metric involves training a classifier on the generated data to predict real data, leaving you with three areas where poor optimization could fuck everything up? You know, I think it might be a little bit difficult!
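The "sort of hard, depending on the data" middle step looks something like this: real-world data mixes strings and numbers, and a model only eats floats. A minimal sketch with pandas and scikit-learn's ColumnTransformer; the column names are made up for illustration.

```python
# Turning mixed real-world columns into a numeric matrix a model can use.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Hypothetical raw data: one numeric column, two categorical ones.
df = pd.DataFrame({
    "age":     [34, 51, 28, 45],
    "country": ["US", "DE", "US", "JP"],        # categorical
    "device":  ["phone", "pc", "pc", "phone"],  # categorical
})

# Scale the numeric column, one-hot encode the categorical ones.
pre = ColumnTransformer([
    ("num", StandardScaler(), ["age"]),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["country", "device"]),
])

X = pre.fit_transform(df)
print(X.shape)  # (4, 6): 1 scaled numeric + 3 country + 2 device columns
```

And that's the easy version — high-cardinality categoricals, missing values, and leakage between train/test splits are where it actually starts to hurt, long before you get anywhere near tuning a GAN.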