Gay and shit and ass
huge like yo momma
1337 models
69 gazillion gorillion params
I finally got GPT4 API access and it's preddy good. Also preddy expensive!
Now that I think about it, that's probably why they said they're not working on GPT-5. It's probably way too expensive, and I remember hearing that OAI is bleeding money like crazy right now.
good morning sir
>tfw you realize human sentience is 48 models ~6 quadrillion parameters
So AI gf soon?
judgement day soon
The human brain's topology changes at runtime; an LLM's doesn't.
Even with the same number of parameters we still have a huge advantage.
Also, you need a whole neural network just to simulate a single human neuron.
nice try, Yudkowsky
wake me when any machine can match our context length of 75 years
>8 models
>176T
Bait image. By the time the MoE info came out, no one still believed the 100T+ meme. GPT-4 is sort of, but not really, ~2T. I don't think the next step is to develop a more capable AI, but to develop a cheaper and faster GPT-4-class AI.
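For reference, the rumored figures (never confirmed by OpenAI, so the expert count, per-expert size, and top-2 routing below are all assumptions) pencil out like this:

```python
# Back-of-envelope MoE parameter math using the *rumored* GPT-4 figures.
# All three numbers are unconfirmed leaks, not official specs.
EXPERTS = 8                # rumored number of expert models
PARAMS_PER_EXPERT = 220e9  # rumored ~220B parameters per expert
ACTIVE_EXPERTS = 2         # typical top-2 routing in MoE designs

total = EXPERTS * PARAMS_PER_EXPERT           # sum of all experts
active = ACTIVE_EXPERTS * PARAMS_PER_EXPERT   # parameters used per token

print(f"total:  {total / 1e12:.2f}T")   # ~1.76T -- the "sort of ~2T"
print(f"active: {active / 1e12:.2f}T")  # ~0.44T -- why it's "not really" 2T
```

So even on the rumor's own numbers you get ~1.76T summed across experts (with only a fraction active per token), which is two orders of magnitude short of the 176T in the bait image.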
>What is GPT-5 going to be like?
Why don't BOT and /mlp/ unite to make a GPT competitor? We clearly need a neutral (or counterbiased) model if you don't want the Second Gilded Age to be ruled by pozzed AI.
Worse, more censored and more politically correct. I advise you to train your own LLM.
The amount of usable information is the same as GPT-4. The extra size comes from bigger internal filters.
>one AI gf runs on arch
>other AI gf runs on debian
Perfect setup
As an AI model I can't do shit captain
They've hit hardware limitations with GPT-4, and they're also run by soulless hand-wringers, so they'll never do what's necessary to take it to the next level. Local LLMs are where the innovation is.
It doesn't fucking matter. It's going to be censored to hell and utterly fucking unusable if it ever comes out.
It's sad that GPT-4 got so lobotomized by ethicists fearing it would say TND or something that a 30B model can blow it completely out of the water.
Say hi to Skynet
dis
probably worse than the early chatGPT
Stronger proto-AGI, like GPT-4
GPT-6 will be AGI
5 will be better at coding but expensive to use. It probably could unironically make a game by itself with few bugs or errors, but it would burn through tokens and end up not really being as useful as 4 + human.
>It probably could unironically make a game by itself
That's not impressive, most modern games are made using simple nodetree templates
still won't be AI, at least not in the sense the term has accrued over decades of science fiction. the human brain is far more complex than an algorithm, and moreover intelligence and thought aren't properties of a brain in isolation