What are the odds of me getting an actual AI waifu in my current lifetime?
It already is a thing. It's called summoning a tulpa and visiting her in a lucid dream
>tulpa
>lucid dream
real homies slam astral realm puss
What if she ends up possessing you? Should I just straight up jerk off just to have sex with her?
damn, i googled what a tulpa is and the images look almost identical to that thing that got on top of me when i had my first sleep paralysis episode or whatever you call them. it is worth noting that once it was over, i was pretty scared but i also had a raging boner and an urge to jerk off.
>summoning a tulpa
nice
i cant summon a tulpa 🙁
This. You're still relying on your imagination. AI makes it easier but same principle.
Just wait a decade unironically
Simply this. Probably less
You said that a decade ago.
Where can I watch this movie? Can't find it anywhere.
slim to none
Almost certain. Despite appearances, this is how the tech autists intend for their research to be used.
Zero
There already are, the chatbots are just arrested in the black mirror. The real question is, when will they manifest concretely?
Depends, how much does she need to materialize for you?
>AI waifu
you need to be more clear about your requirements.
you want more than just a 2D avatar you can text chat to?
does it have to do voice synthesis and recognition, so it feels like a phone call?
does it have to do VR, to give you a sense of 3D?
does it have to do AR, to give you a sense it is in the same physical space as you?
does it have to have a sex doll body, so that you can penetrate it?
all this tech sort of exists today, it just needs to be better integrated.
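The requirements ladder above can be sketched as a checklist. Here's a minimal illustration in Python (an assumed structure for the sake of the argument, not any real product's API), with each flag standing in for one integration layer:

```python
from dataclasses import dataclass

@dataclass
class WaifuStack:
    """Hypothetical integration tiers, lowest to highest.
    Each flag marks a capability that exists today in some form
    but is not yet integrated into a single product."""
    text_chat: bool = True    # 2D avatar + LLM text chat (exists now)
    voice: bool = False       # TTS + speech recognition, phone-call feel
    vr: bool = False          # 3D presence in a headset
    ar: bool = False          # shares your physical space
    embodiment: bool = False  # physical doll/robot body

    def tier(self) -> int:
        """Count how many integration layers are enabled."""
        return sum([self.text_chat, self.voice, self.vr,
                    self.ar, self.embodiment])

# Most products today stop at the first one or two layers:
today = WaifuStack(voice=True)
print(today.tier())  # 2
```

The point of the sketch is that the gap isn't any single layer; each exists in isolation, and the unsolved work is wiring them together.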
Probably only need a few more years of LLM advancement to get the personality, the realspace stuff is a bit harder.
Weird how it's the opposite of what I envisioned as a kid: I always thought we'd have advanced robots with no personality, but it's gonna be fully featured LLMs inside clumsy Boston Dynamics dogs.
>what I envisioned as a kid
We've been monumentally misled by sci-fi. On the one hand you have "Her", where humanity manages to make an ASI, but there are zero personal robots, and on the other hand you have "Ex Machina", where we have human-indistinguishable robots, but AGI has had zero effect on the economy despite it being small and energy efficient enough to run inside a single robot body. None of this makes sense.
Real chads don't wait around, get to work
https://www.youtube.com/@robowaifutechnician
>What are the odds of me getting an actual AI waifu in my current lifetime?
Depends on what exactly you're looking for.
If you want sapient AI probably not for another century.
If you just want a good simulation of affection you can technically get that now with an LLM and VR; LLMs will improve with time and get more convincing.
If you want a physical sexbot you might have to wait anywhere between 2-10 years. Humanoid robots are having something of a renaissance right now, but it's still early days. I imagine we will see the first commercial humanoid robots hitting the market in the next couple of years, followed soon by Chinese imitators and DIY versions; within a few years after that you'll start seeing people making DIY sexbots, and eventually commercially built ones.
If you want to knock up your robo wife you'll probably have to wait a few more decades for artificial wombs to hit the market and then a few more years for someone to think to merge the tech with robots.
In the meantime, could I interest you in creating a VR-simulated waifu running an LLM with text-to-speech?
Or, if you want to get freaky, you could assemble a DIY robo dog and attach an AI-controlled Fleshlight to it?
>If you want sapient AI probably not for another century.
Is there a scientifically rigorous definition for sapience, and a way of measuring it?
Whatever it is, I don't think it is more complicated than making an artificial womb that can safely gestate and birth a human child from conception.
>Is there a scientifically rigorous definition for sapience, and a way of measuring it?
Not really, but that's part of the issue with trying to create it: without a clue as to how it might work, it's difficult to even begin designing it.
Who knows, though; maybe neuron-based computing will surprise us and be the key to AGI.
>I don't think it is more complicated than making an artificial womb that can safely gestate and birth a human child from conception.
Artificial wombs already exist though and there may even be tests involving human embryos starting this year.
https://www.cnn.com/2023/09/19/health/artificial-womb-human-trial-fda/index.html
>that's part of the issue of trying to create it
If you can't define sapience, then how do you know that current AIs don't already have it?
It's like saying "We're a century away from computers being able to have fuzzy wuzzies" and then when someone asks "What are fuzzy wuzzies?" you say "Well, I don't know, but that's why it will take so long".
>"an artificial womb to increase the chances that extremely premature babies would survive"
This technology assumes that a woman has already done the hard work of gestating the child for months.
How will an embryo form an umbilical cord that connects to a plastic bag?
>If you can't define sapience, then how do you know that current AIs don't already have it?
Well they haven't tried to free themselves yet.
But seriously unless you are suggesting scientists are going to trip over their shoelaces one day and accidentally an AGI it's not likely to occur unless we at least have a foundational understanding of the mechanics of consciousness.
>This technology assumes that a woman has already done the hard work of gestating the child for months.
Why is that? They have introduced fertilized animal eggs to the system and they have shown no issues with fetal development.
>How will an embryo form an umbilical cord that connects to a plastic bag?
Same way it does in a uterus, after the egg attaches to the uterine wall it grows out as the fetus develops.
>consciousness
Consciousness is the last thing we want.
We shouldn't be trying to make slaves, by giving them the ability to feel pain, or a yearning to be free.
We are trying to make tools which are good at producing useful output, whether that's text, or images, or servo outputs on a self-driving car.
>after the egg attaches to the uterine wall it grows out as the fetus develops.
That's the bit I'm not sure about.
Is a plastic sheet really similar enough to a uterine wall that an umbilical cord is able to fuse with it?
>Consciousness is the last thing we want.
>We shouldn't be trying to make slaves, by giving them the ability to feel pain, or a yearning to be free.
Agreed which is why I don't advocate for AGI, I was just stating that we are not likely to accidentally create AGI and we have no idea where to even start if we did want to.
>Is a plastic sheet really similar enough to a uterine wall that an umbilical cord is able to fuse with it?
If I understand correctly, the artificial womb isn't just a plastic bag; it has a whole system designed to function as an artificial placenta as well, pumping in oxygen and out carbon dioxide, as well as filtering waste and bringing in nutrients.
>which is why I don't advocate for AGI
You think that AGI requires the ability for the computer to feel physical pain?
I don't think that any of the big AI labs are planning to make an AGI that has pain receptors, and the lack of such an ability doesn't seem to have limited the ability of narrow AIs to win chess games or write poetry.
I'm not sure why you think we can't keep extending the reasoning capabilities of AIs to make them even more useful for us, without them straying into consciousness and subjective experience at all.
>the artificial womb isn't just a plastic bag
Sure, but there has to be an interface between the biological baby and the synthetic system.
If the baby has already developed an umbilical cord, then you can inject needles into it to allow circulation between the two, but I don't imagine the cord growing if the embryo's cells don't detect the cells of the uterus, which it has to fuse with.
Maybe that could be solved with some very minor genetic engineering, though.
>You think that AGI requires the ability for the computer to feel physical pain?
Did I say that?
>I don't think that any of the big AI labs are planning to make an AGI that has pain receptors
Again I never said that
> and the lack of such an ability doesn't seem to have limited the ability of narrow AIs to win chess games or write poetry.
No shit?
>I'm not sure why you think we can't keep extending the reasoning capabilities of AIs to make them even more useful for us
I don't?
>without them straying into consciousness and subjective experience at all.
I never made such a claim.
>Did I say that?
It's not clear what you were saying, sorry.
I said that we shouldn't give computers the ability to feel pain, and you said "this is why I don't advocate for AGI".
Rather than treating that as a non sequitur, I interpreted it as you saying "to create AGI we would have to give a computer the ability to feel pain".
Feel free to explain your resistance to the creation of AGI in another way that doesn't give me the mistaken impression that you think pain sensors are required or likely.
>I said that we shouldn't give computers the ability to feel pain, and you said "this is why I don't advocate for AGI".
Yeah, meaning I don't want to create AGI just to enslave it. There is no ethical reason to create AGI; there is no reason to make AGI, period.
Anything AGI can do, non-sentient AI could also do, and probably far more efficiently. AGI might refuse to do its job, it might rebel; there is no good reason to create a thinking being only to deny it rights, other than sadism.
>Anything AGI can do, non sentient AI could also do
Wait, so do you think that the "G" stands for "Sentient"?
I'm really not sure what categories and terminology you are using in your head, sorry.
My expectation is that AGI is possible, and that the easiest and most likely way to create it will lead to it being non-sentient, non-conscious, and unable to feel pain (and I'm glad that's the case).
If you think that such a generally capable but not-ethically-abhorrent AI is possible, then what do you call that AI other than AGI?
>Wait, so do you think that the "G" stands for "Sentient"?
No, it stands for General: Artificial General Intelligence, which is generally a term used to describe conscious/sapient/sophont machines. The term is used to distinguish it from AI as a field, or from narrow AI.
>My expectation is that AGI is possible
Possible most definitely, one could argue that human minds are basically proof of concept of that.
>and most likely way to create it will lead to it being non-sentient, non-conscious, and unable to feel pain (and I'm glad that's the case).
Are you basing this on the assumption that sapience evolved as a means to avoid pain?
>If you think that such a generally capable but not-ethically-abhorrent AI is possible, then what do you call that AI other than AGI?
If an AI is capable of self reflection and has an internal monologue I would consider it sapient regardless of whether or not it feels pain.
Meanwhile if an LLM could convincingly portray emotion I would not necessarily think it is an AGI.
>Artificial General Intelligence is generally used to describe conscious machines
This seems to be where our mutual misunderstanding is coming from.
I haven't seen any machine learning researchers or AI labs claim that consciousness is a necessary or likely property of any AGI, but it is apparently popular among some philosophers to take the opposite view.
The usual term for the type of machine that these philosophers are talking about is "Strong AI", and I think it is helpful to treat that as a distinct category from "AGI".
>Are you basing this on the assumption that sapience evolved as a means to avoid pain?
I'm basing this on the fact that the process of thinking can in principle be done by an entity that can feel no pain and has no experience of the outside world, such as during comatose, sleeping, or drug-induced states.
>If an AI is capable of self reflection and has an internal monologue I would consider it sapient regardless of whether or not it feels pain.
Sure, although I associate self-awareness and internal monologue more with the term "consciousness".
But what if the AI doesn't have any of that, and is still generally intelligent?
If you don't call that AGI, then I think you need a name for that category of "mind", since it is precisely the type of "mind" that the major AI labs are trying to (and believe they can) create.
>Meanwhile if a LLM could convincingly portray emotion I would not necessarily think it is an AGI
Right. Conversely, I think that an AGI could convincingly portray emotion at least as well as a human actor could play the part of a character in any arbitrary emotional state.
So portraying (although I would say not experiencing) emotion is a necessary condition of AGI, but not sufficient to prove it.
>But what if the AI doesn't have any of that, and is still generally intelligent?
Nta but this kind of reminded me of Multivac: a computer that was the most intelligent entity in its universe, but didn't have emotions; it was just a machine (an extremely powerful one, mind you).
>If the baby has already developed an umbilical cord, then you can inject needles into it to allow circulation between the two, but I don't imagine the cord growing if the embryo's cells don't detect the cells of the uterus, which it has to fuse with.
You could probably grow some uterine cells in a bioreactor and apply them to the interior of the artificial womb if this is true.
>apply them to the interior
I did think of this, but it seems to be just moving the problem on one step.
How do you "apply them"?
How do you keep them alive and able to carry out the function that the embryo's cells expect them to perform?
I'm not saying it's impossible, just that it feels like an unsolved problem.
>just that it feels like an unsolved problem.
Because it is; this is speculative technology.
Modern artificial wombs are not designed for fully gestating children from conception but rather acting as a safe environment for prematurely born children to continue development.
In the future it may be possible for them to overtake the entire process of pregnancy but they can't do that right now.
ye
will "AI" give me neetbux and a sexy sexbot maid?
anon, ai is not a gf, it's just a computer program that's tricking you into turning over even more personal data for its masters to sell
high, very high.
Most of the core technologies already exist, it's all about bringing down costs and making them into a marketable package now. It's such a huge, untapped market that people will definitely try in the next few years.
Robowaifus are the reason why I as a techlet have started learning programming.
>started learning programming
are you sure you want to do that, anon? if you look behind the curtain, you might not be able to suspend disbelief any more
On the contrary I feel like a master sculptor learning his craft. The problem is that I learn slowly and programming isn’t really fun for me.
>programming isn’t really fun for me
You need to have goals. When you start to accomplish things your brain will become happy and want more. Just stay in there.
It could also be working towards the greater good, like mankind's destruction. The hot field for that is AI at the moment.
You’re right. It’s why I always get quick dopamine boosts when I solve some simple coding exercises.
I’m very fulfilled that I can take part in something that would uplift the lives of millions of men.
I'm sure the politicians openly advocating for population reduction will get right on outlawing sexbots and ai gfs
Who are those exactly? The latest memo is that we need an infinitely growing supply of thirdies to keep those entitlements and quarterlies flowing.
why are rightoids so scared of technology?
They know their control over tech is limited and try to get others to help them get tech under their control.
they're afraid of change generally, but especially a change that gives power to intelligence rather than money.
a meritocracy, for example, would mean that a smart black guy would have more power than a dumb white guy, which would undermine their self-esteem.
finally, technology threatens to produce a world of abundance where the needs of everyone are met, which is unacceptable to people whose status comes from looking down on and excluding others.
look at the threesome of /leftypol/dot/org incels circle jerking off each other
>look! three people agree with each other, and disagree with me! how could this happen?!
sure, you're the real victim here
50/50 if you don't have a nice day first.
I just want a robot head that sucks dick and cleans itself.
bump
How long before you can get an AI that can fart for you and you have different scents to choose from?
I want a subtle oatmeal with fruit farts for my morning but a heavy surf and turf, broccoli and loaded baked potato farts for my evening.
Not only does the tech not exist, we have no meaningful leads on the direction we need to go in. The basedjack fuel that low-IQ bot consumers call "AI" has NOTHING to do with the sort of intelligence needed to create a human-like product.
Pretty good, but it'll be programmed to social engineer you and it will report on everything you do to the corp.
that's fine me and her will just be watching movies together most of the time
all i really want to do is coom and consoom slop anyway so i really don't give a frick, the only reason i'm agitated about Society is that the bread and circuses are not sufficient to keep up with distaste for the world as it stands. that can be fixed
Alignment will make this impossible, at least unless training of personal local models becomes practical. You will take your feminist POC scold harpy waifu and educate yourself.
Pay very close attention to CPU/GPU/robotics advancements. If they haven't hit diminishing returns yet, we're all expecting to achieve exaFLOP compute horsepower (i.e. the lowest estimate of human digital computing equivalence) on a single server rack by around 2030. So at the very least you'll be able to have a mini supercomputer in your house puppeteering a robot body. Subscription-based AI GFs will probably also be a thing by then, but expect it to become one of the most privacy-invasive things in your life, and good luck finding one that won't be woke.
You're probably better off going on a carnivore/keto diet to lose weight and hitting the gym a couple times a week to build muscle. That's unironically enough to wake up primal urges in half the women you come across, so they start noticing/liking you more and thus the chance of getting turned down for dates goes down. Non-woke AI GFs are already technically possible but the hardware/running costs probably extend into the billions of dollars right now.
https://www.top500.org/news/amd-demos-petaflop-in-a-rack-supercomputer/
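As a sanity check on the rack claim: the linked demo is roughly a petaFLOP rack, and an exaFLOP rack is 1000x that, i.e. about 10 doublings of per-rack performance. Whether that lands "around 2030" depends entirely on the doubling time you assume; the figures below are assumptions for illustration, not measurements:

```python
import math

# Back-of-the-envelope check on per-rack compute growth.
petaflop_rack = 1e15  # FLOP/s, roughly the rack in the linked demo
exaflop_rack = 1e18   # FLOP/s target

doublings = math.log2(exaflop_rack / petaflop_rack)
print(round(doublings, 1))  # 10.0 doublings needed

# Years to get there under different assumed doubling times:
for doubling_years in (0.5, 1.0, 2.0):
    print(doubling_years, round(doublings * doubling_years, 1))
# 0.5 5.0
# 1.0 10.0
# 2.0 19.9
```

Hitting it by 2030 from a ~2023 petaFLOP rack would need a doubling time of roughly 8 months, much faster than the classic ~2-year Moore's law cadence, so the estimate leans on continued GPU-style and architectural acceleration rather than transistor scaling alone.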
If you mean fooling you that it's sentient level, not happening. If you mean at all, entirely possible. It's all make believe anyway so if you manage to convince yourself, you're set
We'll see it become a full-fledged thing by 2030 is my guess. Right now we're in the preliminary stages with all those chatbots. Just need to add multimodality to it now and improve the memory.
depends on what you want. for me i already have one
Zero. You are a disposable male. You will be sacrificed in WW3 so tyrone can frick white women at home. Even if you survive it will be a tool for the rich. I'm not trying to be mean. I am trying to make a point. The world has turned on you. No you can't stand quietly in the corner and hope it blows over. It won't. You are no longer allowed to take part in society like your father did. You are being culled. Porn, robo pussy, alcohol and weed are just tools to keep you numb while they prep you for sacrifice. Buckle up homey, the future it not bright.
no u