>it's uhhh... transformative
It's a specific term with a specific meaning to reduce uncertainty.
Transformative artificial intelligence (TAI) is artificial intelligence that constitutes a transformative development, i.e. a development at least as significant as the agricultural or industrial revolutions.
>Transformative artificial intelligence (TAI) is artificial intelligence that constitutes a transformative development, i.e. a development at least as significant as the agricultural or industrial revolutions.
Will the Big Obvious chud reply with another NPC post, or are they so smoothbrained they'd lose a Turing test?
Umm, sweaty? Hormone blockers are a transformative technology. Helping trans children will literally transform society and is as important as the industrial revolution. Prove me wrong.
Anons get triggered by the term AGI because they willfully refuse to understand the concept of generality as opposed to specificity.
Use the word transformative to label a more specific phenomenon and they start babbling about chuds.
I know being oppositional is new and exciting for you, kid, but it's old hat for the rest of us.
Still waiting for you to prove me wrong.
Because it's not too much effort, sure.
>Helping trans children will literally transform society and is as important as the industrial revolution
The industrial (and agricultural) revolutions are typically characterized by a dramatic and unexpected increase in wealth and population.
I would normally say that it's highly unlikely trans kids would result in that, but you're just going to turn around and say that I didn't PROVE you wrong.
I'll break the issue down further and we'll eventually work our way down to you admitting you're a solipsist. I tell you to jump off a bridge because nothing matters. Another anon comments how you make your position unassailable by not engaging with the issue. Shit's predictable. Don't you ever get tired of it?
>dramatic and unexpected increase in wealth and population.
So AI is about to cause a dramatic and unexpected increase in wealth and population?
absolutely. with infinite virtual resources at their disposal, the numbers on people's bank balances in the metaverse will have so many zeroes at the end that you won't believe we used to use quaint terms like "millionaire" or "billionaire". and anyone who wants to will be able to be a woman!
Yeah, maybe when this shill talks about population increase, he means Tamagotchi children.
The duplicator is a thought experiment that's easier to understand: imagine a machine that can cheaply copy a skilled worker, and you get exactly that kind of runaway jump in population and wealth.
If you're interested I could go into specific issues on why AI is extremely difficult to align with human values, but it's basically Goodhart's law.
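Since that post leans on Goodhart's law without unpacking it, here is a minimal sketch of the failure mode it names: when you optimize a measurable proxy that is only correlated with what you actually value, strong optimization drifts into gaming the proxy. Everything below (the functions, the effort budget, the numbers) is invented purely for illustration; it is not anyone's actual alignment model.

```python
# Toy illustration of Goodhart's law: optimize a proxy that merely correlates
# with the true objective, and the optimizer ends up gaming the proxy.
# All functions and numbers below are made up for illustration only.

def true_value(real_work: float) -> float:
    """What we actually care about: only real work counts (diminishing returns)."""
    return real_work ** 0.5

def proxy_metric(real_work: float, gaming: float) -> float:
    """What gets measured: real work shows up, but gaming the metric pays off more."""
    return real_work ** 0.5 + 2.0 * gaming ** 0.5

def proxy_optimal_split(budget: float, steps: int = 1000) -> tuple[float, float]:
    """Grid-search the effort split that maximizes the *proxy*, not the true value."""
    best_score, best_split = float("-inf"), (budget, 0.0)
    for i in range(steps + 1):
        gaming = budget * i / steps
        work = budget - gaming
        score = proxy_metric(work, gaming)
        if score > best_score:
            best_score, best_split = score, (work, gaming)
    return best_split

if __name__ == "__main__":
    budget = 10.0
    work, gaming = proxy_optimal_split(budget)
    print(f"proxy-optimal split: real work = {work:.2f}, metric gaming = {gaming:.2f}")
    print(f"true value delivered:             {true_value(work):.2f}")
    print(f"true value if all effort is work: {true_value(budget):.2f}")
```

Under these made-up curves the proxy-optimal split pours most of the effort into metric gaming and delivers less than half the true value of just doing the work, which is the whole Goodhart point.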
So AI is about to cause a dramatic and unexpected increase in wealth and population? Yes or no?
According to Will, there's a 50% chance transformative AI arrives between 2040 and 2050.
It would be disingenuous to make predictions with any higher degree of certainty, given the available evidence and how historically difficult future technology has been to predict.
You can't have a simple yes or no, but I'm sure you've prepared some smoothbrained Aristotelian logic as if I'd said yes, and I wouldn't want to let that go to waste, so go ahead.
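To make the "you can't have a simple yes or no" point concrete: a timeline claim like that is a probability distribution you query, not a binary prediction. A toy sketch follows, where the normal distribution and its parameters are completely made up and chosen only so that about half the mass lands in 2040-2050; this is not "Will's" model or anyone else's actual forecast.

```python
# Toy sketch: a timeline "prediction" as a distribution to query, not a yes/no.
# The distribution and its parameters are invented for illustration only.
import random

random.seed(0)

# Pretend forecast: TAI arrival year ~ Normal(2045, 7.4).
# sd = 7.4 is chosen so roughly half the probability mass falls in 2040-2050.
samples = [random.gauss(2045, 7.4) for _ in range(100_000)]

def prob(event) -> float:
    """Estimate the probability of an event over the sampled arrival years."""
    return sum(1 for year in samples if event(year)) / len(samples)

print(f"P(arrival in 2040-2050) ~= {prob(lambda y: 2040 <= y <= 2050):.2f}")
print(f"P(arrival before 2030)  ~= {prob(lambda y: y < 2030):.2f}")
print(f"P(arrival after 2100)   ~= {prob(lambda y: y > 2100):.2f}")
```

The only point is that "is AI about to cause X, yes or no" isn't a question this kind of claim answers; you only get probabilities over dates.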
So there's a 50% chance that AI is about to cause a dramatic and unexpected increase in wealth and population because "Will" said so?
There are a lot of guesses, but very rarely do experts believe we have more than 80 years.
>oh he said expert. Jump on that
Why would AI cause unexpected increases in wealth and population when the explicit goals of the people funding the research are to decrease population numbers and strip individuals from any social power?
Because your model of the world is inaccurate
>No, it must be everyone else who's wrong
Study after study shows that people don't change their mind after being presented with facts.
You didn't answer my question. Why would AI cause unexpected increases in wealth and population when the explicit goals of the people funding the research are to decrease population numbers and strip individuals from any social power?
Where Y = "the explicit goals of the people funding the research are to decrease population numbers and strip individuals from any social power":
>Why would X when Y
I tell you that Y doesn't exist
>Yeah but why would X when Y? You're not answering my question.
Wall, meet forehead
See
Don't tell me your only defense is psychotic denial of reality.
No one wants to cull the population because they're "inconvenient". That makes no sense. Global wealth is driven by specialization. The developer takes longer to make money if there isn't anyone making their coffee.
Social power, sure. Corruption and authoritarianism are on the rise. No one is disputing that.
I don't care about your psychotic fantasies. I'm going by the explicitly stated intentions of your owners, and the empirical evidence showing a decrease in population, a decrease in freedom and an increase in centralized control. You are subhuman.
I've said before that people don't change their mind according to facts. I would encourage you to explore the possibility that you have attributed phenomena to the incorrect cause. It's really difficult to see when you're wrong from the inside.
I wish you the best on your mental health journey
China and the West both implemented a centrally planned one-child policy in the 1970s. You were only told about China's.
>I've said before that people don't change their mind according to facts.
Do you mean the way you continue to be psychotic despite your handlers constantly bombarding everyone with propaganda about the dangers of overpopulation and the benefits of centralized control? Or the way you don't change your mind despite the empirical evidence of population decline, decreased freedom and increasing control over the population? Thanks for making it very clear that you cannot be reasoned with and will need to be dealt with by some other means.
the central planning models created by the ~~*people funding the research*~~ find humans to be very inconvenient, because they don't necessarily do what they are told. so they want to keep their centrally planned economic models, but switch to using fake humans in them rather than real ones. that way the numbers always go up, and reality never interferes
I know people overuse Dunning-Kruger, but this is a perfect example.
Your posts are a perfect example of a nonsentient bot spam campaign. Every single post of yours is 100% generic tripe.
bro imagine taking 18 years to raise a baby when you could just buy 1000 tamagotchi kids and raise them in 18 minutes. and like 20% of them will probably be newton or shakespeare tier. think of the productivity, bro
>no of course it isn't sadistic garden gnomes making decisions, no humans are responsible for anything at all, it is just "AI" acktually, and for some inexplicable reason we have to keep centralizing power to unaccountable bureaucracies governed by processes they have no interest in understanding
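The tamagotchi-kids bit is a parody of the duplicator argument mentioned earlier in the thread: once workers can be copied like software, output can be spent on making more workers, so growth compounds on population as well as productivity. Here is a toy sketch of that claim with entirely invented parameters (copy cost, reinvestment share, growth rates are all made up); it shows the shape of the argument, not a forecast.

```python
# Toy version of the "duplicator" growth argument being parodied above.
# All parameters are invented; this is the shape of the claim, not a forecast.

def fixed_population_output(years: int, productivity_growth: float = 0.02) -> float:
    """Ordinary economy: one worker, output per worker grows ~2% a year."""
    output_per_worker = 1.0
    for _ in range(years):
        output_per_worker *= 1.0 + productivity_growth
    return output_per_worker

def duplicator_output(years: int, copy_cost: float = 2.0, reinvest_share: float = 0.5) -> float:
    """Duplicator economy: part of each year's output is spent copying workers."""
    workers = 1.0
    for _ in range(years):
        yearly_output = workers * 1.0          # one unit of output per worker-year
        workers += reinvest_share * yearly_output / copy_cost
    return workers * 1.0

for years in (10, 50, 100):
    print(f"{years:3d} years | fixed population: {fixed_population_output(years):10.4g}"
          f" | duplicator: {duplicator_output(years):10.4g}")
```

Whether anything like that is plausible is exactly what the thread is fighting about; the sketch only shows why "copyable workers" and "dramatic increase in wealth and population" end up in the same sentence.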
you think a cabal of garden gnomes is making all the dall-e 2 results in some basement, retard?
Dall-E is a tool. use it if you want. what does it have to do with being transformed?
Transformative AI need not be AGI. A strategic AI for war decisions is transformative. A propaganda/thought control AI for preventing dissent and arresting protestors/poisoning the well of internet debate is also transformative
>transformative doesn't actually mean anything specific
What a profound insight.
meant for