And you won't have the hardware to run it if it was. Big Compute and Big Data mogs everything.
https://openai.com/blog/openai-elon-musk#email-2
I will never, ever, ever, ever, ever give Sam Altman money. :^)
Good thing that's completely irrelevant.
Someone please leaku OpenAI nodels.
>nodles
heehee
If it ain’t open, it can’t possibly be true AI
I literally said this a fricking year ago
The nature of AI relies on a massive money moat: only the biggest companies in the world have enough money for researchers, data, and compute. All the local models rely on big tech giving them handouts. Local models would be dead if it weren't for Meta (lol) and Mistral (who recently sold out to Microsoft). Image models rely on StabilityAI, who took the approach outlined in that email.
>It seems to be today that OpenAI is burning cash... if you can't seriously compete but continue to do research in open, you might in fact be making things worse and helping them out "for free"
This is the EXACT state StabilityAI is in. They are bleeding money and continue to release open models which just get shit on by Midjourney and DALL-E in everything except porn. And Stability refuses to acknowledge porn and actively tries to censor it, so they basically have no place in the market.
Why would anyone pay for some StableDiffusion service when they can get Midjourney, which produces better images out of the box? Local is constantly self-sabotaging by cucking their datasets and removing copyrighted material to appease the twitter crowd that doesn't even use AI. The fact that any local models exist at all is a blessing, and the fact that some of them are even half as good as the closed models is a miracle. The moat will only continue to grow as hardware requirements increase and the local community has a hard time justifying paying $8000 just to run a heavily quantized model trained on GPTslop outputs that gives you the same soft-censored responses GPT would.
The fact that coomers still prefer jailbreaking GPT and Claude to using 'uncensored' local models tells you all you need to know. Local can't even win in the SINGLE THING it stands to do better than SaaS at. It's a flash in the pan that will be completely replaced by SaaS within 5 years. There are no open source fighter jets.
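The hardware-cost argument above can be made concrete with napkin math. This is a rough sketch, not from the thread itself: it assumes weight memory is roughly params × bits-per-weight / 8, with an assumed ~20% overhead for KV cache and activations (real overhead varies with context length and runtime).

```python
# Rough VRAM estimate for running a quantized LLM locally.
# ASSUMPTION: weights dominate; overhead factor of 0.20 is a guess,
# not a measured value from any specific runtime.

def vram_gb(params_billions: float, bits_per_weight: float, overhead: float = 0.20) -> float:
    """Approximate GB of VRAM needed to hold the model and run inference."""
    weights_gb = params_billions * bits_per_weight / 8  # 1e9 params * bits/8 bytes, in GB
    return weights_gb * (1 + overhead)

for params, bits in [(7, 4), (70, 4), (70, 16)]:
    print(f"{params}B @ {bits}-bit: ~{vram_gb(params, bits):.0f} GB VRAM")
# 7B @ 4-bit: ~4 GB VRAM
# 70B @ 4-bit: ~42 GB VRAM
# 70B @ 16-bit: ~168 GB VRAM
```

Even heavily quantized, a 70B model needs ~42 GB, i.e. two 24 GB consumer cards or workstation hardware, which is where the multi-thousand-dollar figure comes from.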
True AI won't be owned by a company, it will own you.
At no point did they consider a copyleft license
Have those ever been tested in court?
This whole thing made both sides look really bad.
OpenAI admits it compromised on its ideals.
Elon Musk, when informed about this, gave the OK anyway.
It's not expensive. A bunch of shut-ins are making much better stuff with used Nvidia cards
>they can copy any advance
Yes, like everyone (including OpenAI) copies Google's Transformer from the Attention Is All You Need paper.
And diffusion models, and Diffusion Visual Transformer.
It's not really "copying" to implement a research paper, plus the architectures have evolved a lot since then largely because of open research.
It's amazing that whites from South Africa somehow still manage to be Black folk. I guess it really isn't a racist term after all.
>Unfortunately, humanity's future is in the hands of ____.
Based Elon
i'm so glad we have Elon supporting open source with this lawsuit
anyone know where i can download the code, weights, and training dataset for Grok?
Quantum computing gonna fix that.
>And you won't have the hardware to run it if it was.
What does this mean? People are running image generators and LLMs that work pretty well on their PCs. In two graphics card generations, 4090 performance will cost $500. It's only going to get easier
Sam Altman just called out Ilya Sutskever for posting emails.