This very recent interview got nuked (apparently at the request of OpenAI) but it got archived first
Several interesting things in it but the most interesting is that they're thinking about open sourcing GPT3
Open source davinci or code-davinci-002 could be super interesting, even though they're expensive to run
forgot the link like a retard
https://web.archive.org/web/20230531203946/https://humanloop.com/blog/openai-plans
I wonder why it was pulled
the information Altman gave is interesting but doesn't seem THAT spicy
maybe they think the part about how the API is struggling from lack of GPUs makes them look bad
Holy shit. Getting a little worried about overdoing it on the dopamine.
if you're an aicg coomer the open source gpt-3 is potentially bigger news for you
not a lot of people knew this before it was removed from the API but code-davinci-002 is extremely good at smut, like weirdly good
Why is this surprising? I'm 99% sure most things humans write in their free time is smut. The overwhelming majority of FF.net and AO3 is smut.
There are entire forums dedicated to specific fetish fiction that have been running for decades.
The most prolific genre of fiction (both by books written and copies sold) is romance.
Face it, the future fiction AIs will be trained overwhelmingly on stuff written by fujos, Chris-chan tier degenerates and wine aunts.
That's good news, since GPT-4 is still slow as fuck on the API. It's noticeably faster in the ChatGPT interface for some reason, as is 3.5. They seem to have given themselves secret faster or prioritized versions of the models that API customers don't have.
Any models specifically geared towards translation?
chatgpt was pretty decent at it, but seeing as I'll be off the grid for a few days I was hoping that there's something that runs locally
no, none of the old gpt3 models are for translation, they're all just text autocompletion models
also none of the ones worth using are small enough to run locally anyway
yeah they never made a medium sized model of that series for some reason, it goes straight from davinci (175B) down to curie (6B)
and there's no reason to use curie now since llama 7B is much better
davinci is the only one that would be exciting to have open sourced
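Since those models are plain autocompletion models, translation works by framing it as text to be completed. A minimal sketch of the usual few-shot prompt trick (the prompt layout and the commented-out API call are illustrative assumptions, not an official recipe):

```python
# Sketch: translating with a plain text-completion model by framing the task
# as autocomplete. The model just continues the text, so we lay out
# source/target pairs and leave the last translation blank for it to fill in.

def build_translation_prompt(examples, source_text, src="Japanese", tgt="English"):
    """Build a few-shot prompt where the model 'completes' the missing translation.

    examples: list of (source_line, translated_line) pairs.
    """
    parts = [f"{src}: {s}\n{tgt}: {t}" for s, t in examples]
    # Final pair is left open; the completion model fills in the translation.
    parts.append(f"{src}: {source_text}\n{tgt}:")
    return "\n\n".join(parts)

prompt = build_translation_prompt(
    [("猫が好きです。", "I like cats.")],
    "犬も好きです。",
)
# The prompt would then be sent to a completions endpoint, e.g. (untested):
# openai.Completion.create(model="davinci", prompt=prompt, stop="\n")
```

Stopping at a newline keeps the model from rambling past the one translation you asked for.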
>no, none of the old gpt3 models are for translation, they're all just text autocompletion models
ahh a shame, was hoping there would be something, since all the horny mfs are making all these chatbots and models
>gpt-4 translation i'd rate 9.5/10, very good grasp of language
Yeah, I was genuinely surprised how well it could translate things. Before this I was used to machine-translated garbage that half the time was incomprehensible, compared to GPT's understandable and well-formatted results, even for Chinese, which was my biggest pain in the ass.
aicg translated several japanese visual novels using free davinci-003 and then again with free trial (scale) gpt-4.
the difference in translation is quite big. I'd rate davinci-003 about 6/10 or 7/10 (a lot better than machine translation but worse than a human) and gpt-4 translation i'd rate 9.5/10, very good grasp of language.
davinci003 is gpt3.5 though, not 3, so open sourcing it isn't on the table
I was talking specifically about davinci-003, not 3.5 lobotomy.
yes I understood that and that's what my post is about
text-davinci-003 is gpt 3.5
source:
https://platform.openai.com/docs/models/gpt-3-5
and my point is that for translation text-davinci-003 was used, not the 3.5 "1/10th lobotomy" thing they shill at everyone.
003 can follow complicated instructions a lot better than 3.5 anyway.
Retard read that fucking link, how many ways do I have to say it
TEXT-DAVINCI-003 IS GPT3.5
3.5 isn't just Turbo, it's a whole series of models that includes Turbo, Text-davinci-002 and text-davinci-003 and code
well, the retard here is you.
before "3.5" was released that page looked different. i would say there is still a minuscule chance that they might decide to release 003 on a lark, but they will gatekeep 3.5 because, after all, for businesses price means more than quality: the cheaper the tool is, the better.
This anon is correct, as that page confirms text-davinci-002 and 003 are both considered GPT3.5 so they won't be on the table for open sourcing.
Only the old davinci from 2020 will be, since that's GPT-3, and maybe code-davinci-002 also. That's not nothing though, it'd be huge for open source research to have 2020 davinci open.