Will modern machine learning techniques (neural networks, stochastic gradient descent, etc) be sufficient to recreate consciousness?
No. Could it convince someone of its consciousness, even though it's not really conscious? Yes
Pure vitalist nonsense
t. optimistic high schooler
Consciousness is not science, it's a word people use to cast an esoteric light over the processes of the brain
>modern machine learning techniques
>stochastic gradient descent
How come machine learning is such a tremendous moron magnet?
You tell us.
unnecessarily aggressive post
Yes, it's almost here already
Which organization do you think is getting close to developing AGI?
Ok portumoor, go back to the copatorium. Consciousness is still not a scientific issue; consciousness cannot be related to anything else (for consciousness is all that exists: experience), and thus it cannot be understood within a scientific framework
watch the video
I've watched half of the video; it's a dumb interview where the guy looks like he's high on weed. It reminds me of a David Foster Wallace interview I saw on yt, minus the eloquence and content. Can't you just tell me what the video says? I'm sure it wouldn't take more than two short paragraphs.
Do you know who Ed Witten is? Do you not know the PHENOTYPE??
As of right now, consciousness is impossible to measure. You literally cannot prove that other people are self-aware and conscious, as opposed to their brains just controlling them the way a CPU controls a printer or headphones
You also can't prove that you yourself don't work the way a CPU controls a printer or headphones
Sure
I think it's an interesting problem to think about
Here's another cool one: an observer who is right now near the event horizon of a black hole could see the future you. Do you have free will? Have you already done everything you're about to do? Is the science wrong about event horizons?
If you start asking those kinds of questions, like consciousness, you're opening a can of worms that science isn't quite ready to deal with
Maybe in 1000 years
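For what it's worth, the event-horizon claim above is just gravitational time dilation. In the Schwarzschild solution (the standard non-rotating black hole), a clock hovering at radius $r$ outside a hole with Schwarzschild radius $r_s$ ticks slow relative to a distant observer's time $t$ by

\[
\frac{d\tau}{dt} = \sqrt{1 - \frac{r_s}{r}}
\]

which goes to zero as $r \to r_s$, so an observer hovering arbitrarily close to the horizon would, in principle, watch an arbitrarily long stretch of the outside universe's future play out. That's the textbook result; it says nothing by itself about free will.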
Free will just means I don't know what I am going to do next or what the future will bring me; it says nothing about my character and is ultimately meaningless
That is to say, I can't know what I will do next. Why? Because there are a million things to consider, and without all that data we can't map the relations between my environment and my actions accurately enough to consistently predict them. What limits our understanding is that it's impossible to account for every factor that influences my actions, and we don't understand how humans work: they aren't predictable in any universally applicable way, and they change over time in unique ways, at least in terms of their ideas (what they believe, what they know, what they are interested in)
There is no objective, empirical way to test for consciousness, so there is no objective answer to this. The opinions people spout about it are always just reflections of their own gut feelings, and if your gut feeling tells you that you are essentially the same thing as a program emulating the symptoms of consciousness, you deserve to be treated like an object by the rest of humanity.
>modern machine learning techniques
Feedback systems and stochastic gradient descent in general have been around for decades, anon; it's not modern in any way, shape, or form
I implore every CS moron in this thread who has ditched every possible math course they could to at least take a class in signals and systems, because seeing you praise ML as some sort of new emerging technology is getting really annoying.
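To the anon's point that SGD is old and simple: the whole technique fits in a few lines. Here's a minimal sketch fitting a slope by stochastic gradient descent on a toy least-squares problem; the data, learning rate, and iteration count are illustrative choices, not anything from this thread. The update rule itself goes back to Robbins and Monro (1951).

```python
import random

random.seed(0)
# Noise-free toy data: y = 3 * x, so the best-fit slope is exactly 3.
data = [(x, 3.0 * x) for x in range(1, 11)]

w = 0.0      # initial guess for the slope
lr = 0.005   # learning rate (step size)

for step in range(200):
    x, y = random.choice(data)     # "stochastic": one random sample per step
    grad = 2 * (w * x - y) * x     # d/dw of the squared error (w*x - y)**2
    w -= lr * grad                 # gradient step: move against the gradient

print(round(w, 3))  # converges near 3.0
```

That's it: sample, compute a gradient, take a step. The "modern" part of ML is scale and architecture, not this loop.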
Consciousness isn't a software problem, it's a hardware problem. You need to design the hardware to have the conscious, self-organizing loop.
Autopoiesis of the hardware is needed. The minds of organisms are embodied; they're not software running on hardware, the minds are the hardware.
When the js want it to. Read Suicide Note by Mitchell Heisman to find out why.