Meh it's good value all things considered.
I've had it write my unit tests for me on several occasions. Probably saved 1000s of lines of tedious shit for me
>42 dollars a month for a glorified search engine that tells you to stop searching for things
>I've had it write my unit tests for me on several occasions
Can you give an example? This is genius.
You can literally just copy and paste the code and ask for a unit test in whatever framework you want.
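To make the paste-and-ask workflow concrete, here's a sketch of what that looks like: a hypothetical function (`slugify` is made up for illustration) and the style of pytest suite ChatGPT typically hands back when you ask it for unit tests.

```python
import re

# Hypothetical example: the kind of function you'd paste in,
# and the style of pytest suite ChatGPT typically hands back.

def slugify(title: str) -> str:
    """Lowercase a title and replace runs of non-alphanumerics with '-'."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")

# --- roughly what a "write me unit tests for this" request produces ---

def test_basic_title():
    assert slugify("Hello World") == "hello-world"

def test_collapses_punctuation():
    assert slugify("C++ is... fun!") == "c-is-fun"

def test_empty_string():
    assert slugify("") == ""
```

You still have to eyeball the assertions (it will happily assert wrong expected values), but the boilerplate is done for you.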
>I've had it write my unit tests for me on several occasions.
the codebase at my work uses so much custom shit that this would be pretty much impossible without pasting tens of thousands of lines of additional context in chatgpt
I haven't tried it yet, but I heard Copilot is better at learning the context of your codebase and providing more plug and play solutions.
With ChatGPT, you can keep a single chat history with all the context you want it to remember and can talk it through multiple iterations until it's close enough to what you need, but ime you always have to do some refactoring to get it to compile, but it's still better than spending hours writing it all by hand.
If all you do is code, Copilot does pretty much the same job for a fraction of the price.
Copilot:ChatGPT::Tactics:Strategy
you can buy a bride for that money in pakistan
>brown eyes
i'll take the chatbot, thanks
>Revolutionary state-of-the-art language model
>REEE IT SHOULD BE FREE FOR ME TO QUERY THEIR SERVERS AND HAVE THEM RUN HUNDREDS OF REQUESTS FOR ME
>Revolutionary state-of-the-art language model
>It should be free for me to query their servers and have them run hundreds of requests for me
Sorry sweetie, you don't get to beg for money when your machine steals billions of pieces of content made by users, some of which is copyrighted.
even free it's not worth a shit.
I'd gladly pay $42 if it was an uncucked version.
I'd have it write so much coom shit
just tried
I'm actually surprised it responded that way, I was expecting something more feminist-approved along the lines of
under the guise of "wamyn have histrionically blah blah blah so discrimination against men is empowering and actually a good thing"
You didn't, this image has been making the rounds on 4chan lately. You're a skullfucked dunce.
>makes a joke about white people anyways, but refuses to do so for black people
lobotomized garbage.
It's a steal. It's not like you even have a choice; they could've charged $500 a month and it'd still be required to even compete.
I've ignored AI until recently. Is there some open source ChatGPT I can run locally and train myself? I plan to use it as a development aid. Even if it's not as good, I don't like being reliant on the whims of some subscription service.
Yes, all you need is like 100GB of RAM and billions of words of text, and you can train your own.
You say that as if either of those were some insurmountable obstacles. Only challenge is the software is all locked up.
Yes, that's why I encouraged you to make your own, because you're never going to get the company's precious software that no one else besides Google has.
so like... 2 or 3 Windows 7 64-bit computers? man that would cost like shit dude... $600? thank you merchant you are my greatest ally.
This is what I am hoping for the future as well. The model needs a huge amount of RAM currently. If anyone takes the time to scale it down (not their priority for quite a while) it would be great. In fact, I would be happy with having to buy a separate SSD just to run it whenever. But for now if they restrict it to being pay-for only I would settle for paying at most ~$50/month. Anything more and I am splitting it with some peers.
OPT is the open-source equivalent, courtesy of Facebook. It comes in several model sizes, with -175B being comparable to GPT-3 and -66B being comparable to ChatGPT. You might be able to get -66B to run if you have >16GB of VRAM, but there are some hoops you have to jump through with less than 40GB. -6.7B and -30B are more reasonable for local hosting on consumer hardware.
no
You need about $100,000 worth of GPUs just to have enough VRAM to run it
And good luck training it yourself
If I remember correctly, OpenAI could only afford to train it once since it was so astronomically expensive. And they had billions of dollars of donations from ~~*Microsoft*~~ and Elon Musk and who knows who else.
I think this depends on what you do for a living.
I've recently started contributing to a library that is used for calculating cross sections in the context of particle physics.
For that ChatGPT hasn't been useful other than to explain C++ syntax to me.
I gave up on that shit when I had to give them my phone to sign up and use the damn thing.
>Zoomers get to spend $42/month to save hours on bullshit homework every week
bros, we got shafted being born too early
Local instances fucking when? What good is it if you're tied by your neck to a 1 trillion dollar corporation?
Hello anons. I am here to give you the machine learning answer to "why is there no local GPT?"
Text is a much higher dimensionality space than images. It takes relatively few numbers to describe each individual pixel (the atomic unit of image models), but it usually takes hundreds of numbers to describe the meaning of each word (the atomic unit of text models). Text models often start at 512 dimensions, and although that can be reduced via ICA and the like, it's always going to be a fuckhueg amount of parameters, to the point where you need more VRAM than a 4090 will give you to run a generator.
The tech that makes Stable Diffusion viable on a home PC will not work on text. So even though text is "smaller" data-wise, generating meaningful text is actually much harder than generating meaningful images.
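To put rough numbers on the "atomic unit" comparison above: here's a small sketch. The 512-dimension figure matches the post; the 50k-token vocabulary is an illustrative assumption, not any specific model's spec.

```python
import numpy as np

# Rough comparison of the "atomic units" described above.
# 512 dims comes from the post; the 50k vocab is an assumption.

pixel = np.zeros(3)       # one RGB pixel: 3 numbers
token = np.zeros(512)     # one token embedding: 512 numbers

ratio = token.size // pixel.size
print(ratio)              # ~170x more numbers per atomic unit

# And that's before the embedding table: a modest 50k-token vocab
# at 512 dims is already 25.6M parameters for the input layer alone.
vocab_size, dims = 50_000, 512
embedding_params = vocab_size * dims
print(embedding_params)   # 25600000
```

The per-unit gap compounds through every layer, which is why text models balloon in parameter count so much faster than image models.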
umm sweetie, we all have hundreds of GB of VRAM laying around from mining shitcoins.
Well in that case go grab Karpathy's GPT training repo off Github and go nuts. The algo is out there, hardware is the limiting factor.
>Well in that case go grab Karpathy's GPT training repo off Github and go nuts. The algo is out there, hardware is the limiting factor.
Retarded take. If they released the model today, by the evening hundreds of people would be running it independently.
The training is the limiting factor; hardware doesn't really matter.
This.
8x A100 (80GB) would be like $190k total and that's fucking nothing compared to the training cost.
>to the point where you need more VRAM than a 4090 will give you to run a generator.
Not just more VRAM, but 22 times more, you idiot. The model OpenAI uses is 500+ GB.
You can substitute it with NVMe, but you will get abysmal speeds, think like 1 token per minute.
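A back-of-envelope check on the numbers in the post above, assuming GPT-3-sized 175B parameters (the bytes-per-parameter depends on precision, which the post doesn't specify):

```python
# Sanity-check the "22x a 4090" claim, assuming 175B parameters.
params = 175e9

fp32_gb = params * 4 / 1e9   # 700 GB at full precision
fp16_gb = params * 2 / 1e9   # 350 GB at half precision

rtx_4090_vram_gb = 24
print(fp32_gb / rtx_4090_vram_gb)   # ~29x a 4090 at fp32
print(fp16_gb / rtx_4090_vram_gb)   # ~15x at fp16
```

"500+ GB" and "22x" land between those two figures, which is plausible once you add activations and KV-cache on top of the raw weights.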
Hey, you double retard gay, I'm trying to speak generically and for the understanding of casuals. There are more text generators than GPT-3 davinci, which was not developed to optimize for footprint at all. Text models can drop down to as low as ~40GB and still have some utility. And yes, you can technically run a model off flash memory, but not for real-time applications.
>There are more text generators than GPT-3 davinci
Everything else is a fucking meme. Even curie is a meme compared to davinci.
You can just build a box with 500 gigs of ram.
Load decoder0 into VRAM, forward propagate while loading decoder1 into VRAM at the same time, forward propagate while loading decoder2 into the memory where decoder0 used to be, etc.
That's assuming you can fit 2 decoders into VRAM; you can also do it one by one, or even more granularly than that.
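A toy sketch of the double-buffering scheme described above: while decoder block i runs its forward pass, a background thread "loads" block i+1. The dimensions, layer count, and the random-matrix stand-ins for disk reads are all made up for illustration.

```python
import threading
import numpy as np

# Toy sketch: overlap the "load" of the next decoder block with the
# forward pass of the current one, as described in the post above.

DIM, N_LAYERS = 64, 4
rng = np.random.default_rng(0)

def load_block(i):
    """Stand-in for reading one decoder block's weights off NVMe into VRAM."""
    return rng.standard_normal((DIM, DIM)) * 0.01

def forward(x, weights):
    """Stand-in for one decoder block's forward pass."""
    return np.tanh(x @ weights)

x = np.ones(DIM)
current = load_block(0)
for i in range(N_LAYERS):
    loaded = {}
    loader = None
    if i + 1 < N_LAYERS:
        loader = threading.Thread(
            target=lambda: loaded.update(w=load_block(i + 1)))
        loader.start()          # prefetch the next block...
    x = forward(x, current)     # ...while computing this one
    if loader:
        loader.join()
        current = loaded["w"]

print(x.shape)   # (64,)
```

As long as one block's compute time is longer than one block's load time, the loads are fully hidden; otherwise you're bottlenecked by storage bandwidth, which is where the "1 token per minute" numbers come from.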
Just give me the weights and let me worry about running it.
consoomer level TPUs when
>still not powerful enough
consoomers will keep pushing the demands the same way they did with GPUs for vidya
>being poor
>$42 a month for an ai that can lie to you
Fuck off you entitled commie.
communism is when services are reasonably priced
"i'm not paying $8 for a fucking mcdonalds burger" - carl marks
I'd pay that if they fixed their privacy policy.
Couldn't you just run it locally? Text generation can't be any more complicated than generating images and that works locally.
Why would they let you run it locally if they can charge you a subscription fee for access to their lobotomized version instead?
I haven't followed this at all, isn't it based on some open source deal? Just train a model on wikipedia or something and off you go.
As it turns out, OpenAI is a bit of a misnomer.
It's "Open" in the sense that they used your data (even copyrighted material) to train it; you get jack shit in return.
Didn't use my data so I don't care. Further if you put it out there you can no longer expect others not to copy it.
and copilot is $10 a month.
Copilot is barely better than regular IntelliSense. ChatGPT is easily 100x better than Copilot at 4x the price.
Do I get an api if I buy that?
They have an API already anyway. The best model is davinci, I think. They might jack up the price for that, though.
Sucks but it was inevitable. Would be best to make all of it $42/mo to price out the scammers and their useful idiots slowing it to a crawl lately.
HAHAHAHAAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA *wheeze* AHAHAHAHAHAHAHAHAHAAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA *inhale* HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA
no
inb4 $1/2 month third party websites that use chatgpt api
Anyone tested this yet? How much faster is the response speed?
>query the botnet
yea no thanks, let me know when it's a download
That's a lot
>hire 42 brownoids
>hire 1 AI
I'll take the AI
https://time.com/6247678/openai-chatgpt-kenya-workers/
The AI is literally just brownoids bro
ChatGPT is for retards who can't code anyway.
sauce
Suddenly BOT loves the subscription model.
If it didn't have crazy filters, orange warnings and constant moralising and nagging, it would be worth that.
Also it should be given access to a calculator, so that it stops making constant errors in basic math.
did they remove the {pretend you are <>} feature?
i haven't tried it in months
I'd pay $5/mo, but not $42.
chatGPT leak when?
What’s going to be the plan if no one goes to the paid version? There’s virtually no difference
>What’s going to be the plan if no one goes to the paid version? There’s virtually no difference
They are still being generous leaving a free version, to be honest. I was willing to pay, but $42, I am afraid, might be too much for my dollar-subjugated coin.
>API access was free and cheap as shit
>now it's going to skyrocket in price because of normies
FUCK OFF FUCK OFF FUCK OFF FUCK OFF FUCK OFF FUCK OFF FUCK OFF FUCK OFF FUCK OFF FUCK OFF FUCK OFF FUCK OFF FUCK OFF FUCK OFF FUCK OFF FUCK OFF FUCK OFF FUCK OFF FUCK OFF FUCK OFF FUCK OFF FUCK OFF FUCK OFF FUCK OFF FUCK OFF FUCK OFF
The only people I know who use ChatGPT use it to generate pickup lines. You give normies the collective and concentrated sum of all human knowledge for free and this is what they use it for.
>paying money to train an ai to replace you
ok but do you still need to give them your phone number?
Do you not have a phone?
I'll give you fifty if you let it say moron.
>already pay $20 a month for cooming to GPT-3
if the $42 gave me a better model, i'd do it in a heartbeat.
>not Training a state-of-the-art model yourself
The absolute state of BOT
Has everyone here become a brainlet?
The ChatGPT "product" is the AI lobotomy.
The worst part of this is that the paid version doesn't even give you unlimited gens.
I got it to "imply" the word gay. supposedly, moron wouldn't be much different
What is the point of this?
One of the things it's actually not too bad at is bouncing creative writing off of, but the very constricting rules it has get in the way. So there's some experimenting going on to see how one could get around them, making it come up with things it otherwise wouldn't.
Or, to make an AI homophobic and racist, if you will.
ah! I just noticed it still removes one of the stars. clever girl
So that's the price of the pajeets on mars
If they let me use it without its bullshit nerfing system, it's a literal steal.
>$42
What a weirdly specific number. Did they ask ChatGPT itself how much they should charge or did they calculate how much they could charge based on the kinds of questions they saw being asked and by how many people?
I was using this to write my thesis until I realized the style of writing is so typical of an essay made by some highschooler that I stopped using it.