This is it. My PC for AI for late 2023. No one can stop me
>I5-13500
>4060 Ti 16gb
>DDR5 64gb
>random asrock mb
>500w psu
>4tb ssd *2
That's AI?
Commission artist bros, we are fucked...
Can still tell it's AI. But give it time.
We're truly in the end times.
When do you think code monkeys will be replaced?
I am a first-year CS student and don't want to lose already.
Just change to IT already. AI can't plug in Ethernet cables.
>I am a first year CS student
You'll be fine m8 seriously.
I have used ChatGPT to make some small scripts, but I had to rewrite the prompt like 10 times and it still was a little off. In the time I spent writing to it I could have just written the script myself. It was fun though.
The AI currently can't plug several languages and frameworks together. It doesn't know how the system you are working on works.
There might be a point at which AI can be used by individual companies and it can do stuff that works, but even then there needs to be a human to clean up the shit it steals online.
You're OK anon, don't worry. Learn the way you always would: through trial and error.
I devised a way to write any complex program with ChatGPT. It's simple: first you ask it to write out a detailed list of how it would program it, then you ask it to write that program. I wrote Flappy Bird, chess, Space Invaders and many more programs, plus a web scraper that summarizes whatever you type in: it automatically googles it, pulls a few links, and summarizes those. Easy, took 3 mins each. https://streamable.com/o788zh https://streamable.com/pcecke
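The whole flow is basically two calls. A minimal sketch, assuming the openai Python package (v1 client) with an OPENAI_API_KEY in the environment; the model name and prompts are just placeholders:

from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

def ask(prompt):
    # one chat completion call; "gpt-4" is a placeholder model name
    resp = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

task = "a Flappy Bird clone in pygame"
# step 1: have it write out a detailed plan first
plan = ask(f"Write a detailed, numbered plan for how you would program {task}.")
# step 2: have it implement its own plan
code = ask(f"Now write the complete program following this plan:\n{plan}")
print(code)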
This 100%.
Ask GPT about tasks more intricate than writing whatever rudimentary bs they’re copypasting daily and it will fail you.
Myself, I tried it out for a brief period but ultimately had to cut it out of my workflow because of all the errors and the productivity lost to being sent down endless rabbit holes. This goes for GPT-4 as well. It can be a great tool when learning new APIs, but for production, assuming you're at the appropriate level in your journey? It won't make you omniscient.
90% of what you read about the matter is either said by corporate bootlickers chasing clout/views, or blinded fanatics. It’s without doubt a great means, but far from an end in itself.
>complex program with chatgpt
shitty 50 liner 2D "games" in python, complex programs?
Sir you need to calm the fuck down dear sir.
Here's a video of a guy using ChatGPT to try to optimize some code from Mario 64. It seemed like it was right about 70% of the time or something like that.
An alternative I found was using spot VMs on Azure. It's about $0.19/hour for a VM with 32 cores, 112 GB RAM, a 700 GB SSD, and an AMD Radeon Instinct MI25 16 GB GPU. It's the NVv4-series one on this page:
https://azure.microsoft.com/en-us/pricing/spot-advisor/#pricing
And the detailed specs are here
https://learn.microsoft.com/en-us/azure/virtual-machines/nvv4-series
I initially read about doing it this way here
https://github.com/rcarmo/azure-stable-diffusion
I think it's more for people that know what they're doing, though, so you don't end up spending a lot of money just learning how to train a model. I don't really know what I'm doing. But overall, the ~20 cents an hour for a VM could end up being a lot cheaper than buying a computer for thousands of dollars.
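Quick break-even math ($0.19/hour is the spot rate quoted above; the $1,500 local build price is just an assumed comparison point):

spot_rate = 0.19     # $/hour for the NVv4 spot VM quoted above
local_build = 1500   # assumed price of a local GPU build, purely illustrative

hours = local_build / spot_rate
print(f"~{hours:,.0f} hours (~{hours / 24:.0f} days of nonstop use) before the VM costs more")
# roughly 7,900 hours, i.e. close to a year of 24/7 use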
>let them know my fetish
brilliant
They already know
You can buy a computer yourself with those specs for under $200.
Can you show me?
>4060
3090/4090 or don't bother retard. 24GB VRAM is the absolute bare minimum
get a life tranny
Waste your money however you want you braindead nagger just dont shit up whatever AI general with your bitching when you realize your paperweight cant do shit
A 3060 works for SD. You'll need more VRAM if you want to run the best LLMs, but a lot of them run fine on a $350 3060.
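Rough numbers on what fits, counting weights only (KV cache and overhead add a bit more; the bytes-per-parameter values are the usual fp16 vs 4-bit-quantized assumptions):

def weight_gb(params_b, bytes_per_param):
    # billions of parameters * bytes per parameter ~= gigabytes of weights
    return params_b * bytes_per_param

for params_b in (7, 13):
    for label, bpp in (("fp16", 2.0), ("4-bit", 0.5)):
        print(f"{params_b}B {label}: ~{weight_gb(params_b, bpp):.1f} GB")
# 7B fp16 (~14 GB) won't fit on a 12GB 3060, but 4-bit 7B (~3.5 GB) and 13B (~6.5 GB) do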
did you add chemtrails in the prompt
no he added "john brennen stratisfiric areisal injections bill gates block out the sun" https://www.youtube.com/watch?v=WBG81dXgM0Q
not 13400f
>assrock mb
>500w psu instead of 650w
>not posting psu model
retard
>>500w psu
That shit is going to fry
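Back-of-the-envelope power budget, in case anyone wants numbers (the TDP/TGP figures are approximate spec-sheet values, not measurements):

cpu_turbo = 154   # i5-13500 max turbo power in watts (approximate spec value)
gpu_tgp   = 165   # 4060 Ti 16GB total graphics power in watts (approximate)
rest      = 75    # rough allowance for board, RAM, SSDs, fans

sustained = cpu_turbo + gpu_tgp + rest
print(f"~{sustained} W sustained on a 500 W unit")
# ~394 W sustained is workable on paper, but GPU transient spikes eat most of
# the remaining headroom, which is why people recommend 650 W for comfort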
i run a dual gpu dual gfx card setup
dual psu*
>image search
>0 results
I hope you have a good house and life insurance.
Open up any outlet in your house, it will look worse. I do electrical work and am very confident in my wiring.
Great idea, I plan on a similar setup with the 4060 Ti 16GB. The plan is to start with one and buy a second once I have the money, so I could split models across them for a total of 32GB of VRAM (the 4060 Ti has no NVLink connector, so it's per-GPU splitting rather than pooled memory). For regular RAM I only plan on 32 gigs, but it isn't very costly to upgrade later.
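If the two-card plan happens, a minimal sketch of how the split usually works with Hugging Face transformers plus accelerate (the model name is a placeholder; device_map="auto" is what shards the weights across both GPUs):

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "some-org/some-llm"   # placeholder, swap in whatever model you actually run
tok = AutoTokenizer.from_pretrained(model_id)

# device_map="auto" (needs the accelerate package) places layers across all visible
# GPUs, so two 16GB cards act as roughly 32GB of combined weight memory
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

inputs = tok("hello", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=20)
print(tok.decode(out[0], skip_special_tokens=True))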
Glad I was a WaitFag and will also get the 4060Ti 16GB version for my AI prooompting.