Ray Kurzweil, Cyberprophet or Crack-Pot?
I've never really understood why people are so excited about "mind uploading". I mean, so you upload your exact neural network pattern into a machine capable of emulating its function. Great. There is now an exact functional copy of your brain, which is effectively immortal.
So what? Is that really 'you' or is it just somebody else who thinks he is? How do YOU, personally, achieve immortality like that? Or do you figure we'll have a scientific solution to the "hard problem" by then?
> So what? Is that really 'you' or is it just somebody else who thinks he is? How do YOU, personally, achieve immortality like that? Or do you figure we'll have a scientific solution to the "hard problem" by then?
I've heard that suggestion before. (Which means I probably should have addressed it...) Several years ago I got into an argument about this sort of thing on the Master of Orion 3 forums. As I recall I had to admit that it sounded like it should work. However, the nature of consciousness is still not understood scientifically, and I suspect it never will be (it's not a phenomenon. How do you scientifically study something that can't be detected in principle?). It's still too much of a risk for me...
I don't know that it's undetectable.. We'll know whether it could be once we've got the physical brain integrally mapped out. I think there will be a lot for everyday people to learn once we know all there is to know empirically about the brain.
I reckon that since absolute matter control is supposed to allow at least one brain's worth of processing power in the size of a grain of sand, we ought to be able to at least gradually replace the biological brain with an artificial alternative of equal volume... Eventually. The minimum requirement is a 1:1, identical image of the supporting hardware (wetware in this case). I don't think it's so far-fetched, given that growing new neurons is already something the brain heavily relies on. We just provide a branch for the biological "mind" to cross over onto the hardware.. Whatever the mind is. Already, we see that mind-altering substances have an effect. I think it will work, eventually.
Mapping is only the first step. Then comes meaning.

Betruger wrote:
> I don't know that it's undetectable.. We'll know whether it could be once we've got the physical brain integrally mapped out. I think there will be a lot for everyday people to learn once we know all there is to know empirically about the brain.
> I reckon that since absolute matter control is supposed to allow at least one brain's worth of processing power in the size of a grain of sand, we ought to be able to at least gradually replace the biological brain with an artificial alternative of equal volume... Eventually. The minimum requirement is a 1:1, identical image of the supporting hardware (wetware in this case). I don't think it's so far-fetched, given that growing new neurons is already something the brain heavily relies on. We just provide a branch for the biological "mind" to cross over onto the hardware.. Whatever the mind is. Already, we see that mind-altering substances have an effect. I think it will work, eventually.
Engineering is the art of making what you want from what you can get at a profit.
93143 wrote:
> I've never really understood why people are so excited about "mind uploading". I mean, so you upload your exact neural network pattern into a machine capable of emulating its function. Great. There is now an exact functional copy of your brain, which is effectively immortal.
> So what? Is that really 'you' or is it just somebody else who thinks he is? How do YOU, personally, achieve immortality like that? Or do you figure we'll have a scientific solution to the "hard problem" by then?

http://www.orionsarm.com/eg/p/Pa-Pd.htm ... ity_theory
pattern identity theory
The theory that "I" am the same individual as any other whose physical constitution forms the same or a similar pattern to mine.

http://www.orionsarm.com/eg/c/Co-Cq.htm ... ity_theory
continuity identity theory
The theory that "I" am the same person as various future and past selves with whom I am physically and temporally continuous.
I'm no fan of uploading, but IMO Pattern Identity has more going for it.
Duane
Vae Victis
Heh, I wrote a post about the identity and consciousness problem a while back.

> So what? Is that really 'you' or is it just somebody else who thinks he is? How do YOU, personally, achieve immortality like that? Or do you figure we'll have a scientific solution to the "hard problem" by then?
If I make a copy of you, which one is really you? The answer must be both, and neither.
First off, consider that the you of this moment is not, strictly speaking, the you of a decade ago, a year ago, or even a minute ago -- you constantly acquire new knowledge, evolve new heuristics, etc. So, sadly, you are going to cease to exist by the time you finish reading this. Mourn the you of a second ago, he's gone now.
It seems therefore that there is -- and this is a bit disturbing, because it cuts against our compulsion for bodily self-preservation -- no difference between you just floating along evolving in your body as you normally do, and being physically destroyed and precisely recreated anew each second -- or, for that matter, being duplicated into dozens or trillions of new copies. They're all you for an instant, and soon none of them are you, and you're not you anymore either.
But, of course, you're not temporally continuous. You experience regular periods of unconsciousness.

> The theory that "I" am the same person as various future and past selves with whom I am physically and temporally continuous.
As for physical continuity, it seems relevant until you start asking which pieces we can remove and have you still be you. Our soft tissues are replaced every 90 days iirc, so "physically" you are not continuous either. What seems to matter are the patterns, so if we can replicate the patterns in silicon that could be said to be a new you as well - for an instant or so, before it becomes someone else.
n*kBolt*Te = B**2/(2*mu0) and B^.25 loss scaling? Or not so much? Hopefully we'll know soon...
I think what really bothers most people is the dilemma of the man in the water trap in The Prestige -- what if "I" am the copy that dies?
Looking at your double watching you die, the instantaneous nature of identity becomes apparent. The instant you're created and face different reality as another copy, you cease to be the same person.
Consciousness is complex but not that mysterious once you throw out mystical notions. It's just the interplay of our ten thousand compulsions (hunger, sex, status, sleep, self-preservation, etc.) with the processors that attempt to satisfy them using the mass of sensory data constantly flowing in.

> However, the nature of consciousness is still not understood scientifically, and I suspect it never will be
Is a cat conscious? A housefly? A bacterium? A thermostat? Yes, to increasingly limited degrees.
The issue here seems to be continuity of consciousness. And yes, we do have periods of unconsciousness, but we have a *sense* of continuity. That sense is not possible either in The Prestige scenario, or in the mind-upload scenario because the continuity of consciousness is broken. As long as the process allows you to walk away and leave a separate copy, you have not maintained a continuity of consciousness.

TallDave wrote:
> I think what really bothers most people is the dilemma of the man in the water trap in The Prestige -- what if "I" am the copy that dies?
> Looking at your double watching you die, the instantaneous nature of identity becomes apparent. The instant you're created and face different reality as another copy, you cease to be the same person.
And then there's the Star Trek transporter scenario: would you use that thing knowing that your body is going to be obliterated? I think your experience would be instant death, while a copy of you materializes elsewhere. The only way there could be continuity of consciousness is if there was a soul that magically transferred from one body to the next.
All of this talk of instantaneous identity is somewhat beside the point. I've thought about that before, and it doesn't explain why I am I and you are you, which is the main issue here.
As for unconsciousness, brain states and scientific detectability thereby, is it really unconsciousness, or do you experience things that simply aren't recorded in your memory? Or is something else going on? Can consciousness exist in and of itself, without things to be conscious of? Simple self-awareness? There's really no way to know for sure; we're always dealing with brain-activity-based proxies...
Sorry, but I can't let that go. I've heard that one before too (in a book, even), and it's neither more nor less than an attempt to dodge the issue. Consciousness is conceptually distinct from information processing, or indeed any physical phenomenon.

TallDave wrote:
> Consciousness is complex but not that mysterious once you throw out mystical notions. It's just the interplay of our ten thousand compulsions (hunger, sex, status, sleep, self-preservation, etc.) with the processors that attempt to satisfy them using the mass of sensory data constantly flowing in.
Are you familiar with the idea of the 'zombie'? That is, a biological system functionally identical to a human, but with no conscious experience - nothing inside looking out? There would be no way to tell from the outside that this was not a conscious human being like me. There is nothing whatever in science that precludes this; your reaction to the contrary is simply based on the unfounded assumption that all of reality must be physical; ie: predictably phenomenological and objective, and thus scientifically tractable - which clashes with the very concept of conscious identity.
One scientist actually denied the validity of the 'zombie' concept on the grounds that he couldn't imagine what it would be like to be one! People talk themselves into knots when consciousness is the subject.
I do suspect that the 'zombie' is impossible - though my reasons are more philosophical than scientific...
This is fun stuff. Unfortunately I've got a project to work on that needs to be done last Friday, so I may not engage in timely fashion...
Sure you can tell zombies. They walk kind of stiff and have a vacant look in their eyes, and they take orders from someone else. Usually the bad guy. Also they tend to be frequently found where a lot of Bob Hope jokes get told.
Actually, I think instantaneous identity does explain that. Even if we were physically identical -- twins, say -- we're not the same person because we have experienced different realities. I stayed after school and met a pretty girl and got married, you watched the news and admired a politician and became a senator. Of course, it's far, far, far more complex than that, but that just means there are far, far more diverging experiences than in my crude example.

> I've thought about that before, and it doesn't explain why I am I and you are you, which is the main issue here.
Sure, it's not just information processing. It's hugely complex information processing that's dealing with thousands, perhaps millions of competing priorities and compulsions.

> Consciousness is conceptually distinct from information processing, or indeed any physical phenomenon.
How do I know you're not, in fact, a zombie? Maybe you're just programmed to argue that you're not.

> Are you familiar with the idea of the 'zombie'? That is, a biological system functionally identical to a human, but with no conscious experience - nothing inside looking out? There would be no way to tell from the outside that this was not a conscious human being like me.
If you're talking about something complex enough to imitate human consciousness such that it could not be distinguished from a human, then almost by definition it would have to be at least as complex as a human brain. I'm not sure there's much difference between it and a human consciousness; in fact something capable of perfectly imitating us might actually be as superior to our experienced consciousness as we are to a cat or a fly.
But let's explore the zombie down the scale of consciousness. With a cat a zombie would be something different than a cat, but one might argue the fly is simple enough that it makes little difference, the bacterium even less so, and the thermostat is obviously so simple it makes no difference what animates it. The less complexity, the less it matters. Thus it seems "consciousness" as something separate from "zombieness" emerges from this complexity. Emergent properties don't have to be nonphysical, or any less awesome for their physical nature.
The difference between sleep and nonexistence seems arbitrary. If all that's required is a "sense" of continuity, we can program that feeling into your copy when he wakes up in 2200 A.D.

> And yes, we do have periods of unconsciousness, but we have a *sense* of continuity. That sense is not possible either in The Prestige scenario, or in the mind-upload scenario because the continuity of consciousness is broken.
What has that got to do with the fundamental question of selfhood? Even if two people were physically identical and experienced the exact same things their entire lives, they would still be two different selves. I think you're missing the point.

TallDave wrote:
> Actually, I think instantaneous identity does explain that. Even if we were physically identical -- twins, say -- we're not the same person because we have experienced different realities.
>> I've thought about that before, and it doesn't explain why I am I and you are you, which is the main issue here.
All that could conceivably happen without the actual experiencing part. You're assuming that just because (you believe) consciousness is an emergent property of the operating human brain, there is no conceptual distinction between the operations of the brain and the actual awareness that accompanies them. Even if you're right about the emergent property thing, the distinction between the concepts is still important.

> Sure, it's not just information processing. It's hugely complex information processing that's dealing with thousands, perhaps millions of competing priorities and compulsions.
>> Consciousness is conceptually distinct from information processing, or indeed any physical phenomenon.
You don't.

> How do I know you're not, in fact, a zombie?
I'm not. Human consciousness cannot be "imitated" because it is not phenomenological - there's nothing to imitate. Human BEHAVIOUR, on the other hand...

> If you're talking about something complex enough to imitate human consciousness
It would be exactly as complex as a human being, brain included. Physically, there is no difference.

> such that it could not be distinguished from a human, then almost by definition it would have to be at least as complex as a human brain.
How do you know any of those are conscious? Even the cat? I'm pretty sure a thermostat is NOT conscious; it's just a thing, not a self.

> But let's explore the zombie down the scale of consciousness. With a cat a zombie would be something different than a cat, but one might argue the fly is simple enough that it makes little difference, the bacterium even less so, and the thermostat is obviously so simple it makes no difference what animates it. The less complexity, the less it matters. Thus it seems "consciousness" as something separate from "zombieness" emerges from this complexity. Emergent properties don't have to be nonphysical, or any less awesome for their physical nature.
Uh... surely I don't need to point out that people are indeed often (maybe always) conscious while sleeping. Some people say they don't dream, and this could be true, or it could be that they simply don't remember their dreams. In any case, the important thing is that you wake up the next morning with your selfhood intact. If this is assumed not to be true, we have other problems, and you certainly can't sell mind uploading on the basis of "well, no, it won't be you, but neither will the you a second from now".

> The difference between sleep and nonexistence seems arbitrary.
Okay. What good does that do ME?

> If all that's required is a "sense" of continuity, we can program that feeling into your copy when he wakes up in 2200 A.D.
It's no good just assuming that instantaneous identity is actually how it works, and it's not much help if it is. I personally think instantaneous identity is akin to solipsism in terms of philosophical usefulness, as well as in likelihood of being correct. It's certainly not scientific, if that's what you're after...