https://i.imgur.com/hS13TkM.png
Why don't you just ask it?
Because it gives me some bullshit response, basically saying it's an AI model and can't.
A lot of people were making bullshit children's books with trash AI-generated "illustrations" and selling them on Amazon. Do they sell? I don't know. Apart from that I can't think of anything else, maybe coding.
At least an honest answer...
Everyone's acting like this thing is mind-blowingly crazy and going to change the world...
bonzibuddy was way more impressive than this thing
It does have a lot more power than previous NLP models, but making things with it beyond basic chat will likely require additional training of custom models for more targeted purposes, which would probably take more setup time than people here would like to spend.
Make YouTube videos about making money with ChatGPT-4.
Get companies to hire you to install an on-premise AI that learns from their data. If possible, make the AI perform automated tasks.
Ask yourself more productive questions.
How can you use ChatGPT 4 to make dicky?
I don't know. If you find out, do tell us.
I am exploring using it to generate endless procedural content for a text-adventure video game thing. I don't expect it to make money.
Build an app, any app, that is "powered by GPT-4". After that shill the fuck out of it on hackernews. You will get at least 500 webshitters to subscribe to your AI cat watering system or whatever. If you name it after a capeshit movie/character you can probably dupe double that.
As an AI language model, I am unable to directly build an app for you.
The more I ponder it, the more I realize it's almost impossible to use it for anything interesting. 8k tokens is not enough, not if you have to fill it with examples, detailed instructions, and past outputs, plus all the data it needs to complete the task.
Even something boring like "write a wikipedia article on this subject". It might need to have 10 books open to pull information from. You can't fit more than a page from each, and even that is pushing it.
Finetuning might solve a lot of this. Not everything, because finetuning does not teach it new facts, only styles of writing. I haven't looked at their finetuning prices, but I'm sure they're absurd.
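To put numbers on the "8k tokens is not enough" complaint, here's a back-of-the-envelope budget check. It assumes the common rule of thumb of roughly 4 characters per token for English text (a real count needs the actual tokenizer); `CONTEXT_LIMIT`, `fits_in_context`, and the 2000-chars-per-page figure are my own illustrative assumptions, not anything from the API.

```python
# Rough check of whether a prompt fits an 8k context window.
# Assumes ~4 characters per token, a common rule of thumb for English;
# a real count requires the model's tokenizer (e.g. tiktoken).

CONTEXT_LIMIT = 8192  # GPT-4's base context window, in tokens

def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 chars per token."""
    return max(1, len(text) // 4)

def fits_in_context(instructions: str, examples: list,
                    data: str, reserve_for_output: int = 1000) -> bool:
    """True if instructions + examples + data leave room for the reply."""
    used = (estimate_tokens(instructions)
            + sum(estimate_tokens(e) for e in examples)
            + estimate_tokens(data))
    return used + reserve_for_output <= CONTEXT_LIMIT

# A book page is on the order of 2000 characters, i.e. ~500 tokens.
one_page_each = ["x" * 2000] * 10   # one page from each of 10 books: fits
two_pages_each = ["x" * 2000] * 20  # two pages each: already over budget
print(fits_in_context("Write a wikipedia article on X.", one_page_each, ""))
print(fits_in_context("Write a wikipedia article on X.", two_pages_each, ""))
```

Ten single pages squeak in at roughly 6k tokens of the 8k budget; doubling to two pages per book blows past the limit, which is the "even that is pushing it" problem in numbers.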
>what is embeddings
I don't know much about how they work. As far as I understand, the majority of the information is lost and the document is compressed into a single small vector.
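For what it's worth, even though the vector throws most of the detail away, it keeps enough to find *which* document is relevant, and that's the whole trick: embed everything, then at query time pull the nearest match into the prompt. A toy sketch with made-up 4-dimensional vectors (real embeddings have ~1500 dimensions, and these numbers are invented for illustration):

```python
# Minimal sketch of embedding-based retrieval with made-up vectors.
import math

def cosine(a, b):
    """Cosine similarity: how closely two vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Pretend each document was compressed into one small vector.
docs = {
    "cat care":   [0.9, 0.1, 0.0, 0.1],
    "tax law":    [0.0, 0.8, 0.6, 0.0],
    "gpu prices": [0.1, 0.0, 0.2, 0.9],
}

query = [0.8, 0.2, 0.0, 0.2]  # imagine this is the embedded question
best = max(docs, key=lambda name: cosine(query, docs[name]))
print(best)  # the closest document is what you stuff into the prompt
```

This is also the usual workaround for finetuning not teaching new facts: the facts live in retrieved documents, not in the model.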
I'm doing my job in half the time. For freelancers, this is a revolutionary tool. It's like having an autistic savant assistant.
What's your job?
WTF
ask it what BOT is, or what /po/ is
interdasting
ChatGPT is supposed to return different results for the same prompt. It's controlled by the "temperature" parameter of the API:
>What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic
https://platform.openai.com/docs/api-reference/completions/create
but this one does seem more detailed than the others
and maybe this one is actually a longer response and the image is cut off
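What the temperature knob does mathematically: the model's raw scores for each candidate next token get divided by the temperature before being turned into probabilities, so low values sharpen the distribution (near-deterministic) and high values flatten it (more random). A sketch with invented scores (the three-token vocabulary and the numbers are toy assumptions, not anything from the API; note that temperature 0 would divide by zero, so real APIs special-case it as greedy decoding):

```python
# Temperature-scaled softmax: how the sampling knob sharpens or flattens
# the model's next-token distribution. Scores below are made up.
import math

def softmax_with_temperature(scores, temperature):
    scaled = [s / temperature for s in scores]
    m = max(scaled)                            # subtract max for stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

scores = [2.0, 1.0, 0.1]  # raw preferences for three candidate tokens
cold = softmax_with_temperature(scores, 0.2)  # top token dominates
hot = softmax_with_temperature(scores, 2.0)   # much more even spread
print(cold[0], hot[0])
```

At temperature 0.2 the top token gets ~99% of the probability, so reruns look identical; at 2.0 it drops to ~50%, which is why the screenshots differ.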
I just let it code for me while playing fortnite
it won't even code for me
Scrape the web for emails, generate sales emails, ???, Profit
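The "scrape the web for emails" step, sketched with a naive regex. This pattern is a rough approximation, nowhere near a full RFC 5322 matcher, and real scraping needs actual page fetching and parsing; the sample HTML is invented:

```python
# Naive email extraction from a chunk of HTML. The regex is a rough
# approximation of an address, not a strict validator.
import re

EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

html = '<p>Contact sales@example.com or <a href="mailto:ceo@corp.io">us</a></p>'
emails = sorted(set(EMAIL_RE.findall(html)))  # dedupe, stable order
print(emails)  # ['ceo@corp.io', 'sales@example.com']
```

The "generate sales emails" step is just a prompt per address; the "???" step remains an exercise for the reader.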
integrate it in blockchain to predict stock market long-term change sir
somewhat reasonable
getting a bit opinionated here
definitely some bias here
makes me sick.
gaygpt has its uses, but dive into anything remotely political and it goes full globohomo in seconds