This is the correct way IMO. “Uploading” your mind to a computer is making a clone/copy, but the original dies the same.
I agree.
But here is an interesting thing to think about:
What is the perceived difference between falling asleep and waking up the next day, vs. going to sleep and copying your consciousness to a machine/new body?
The body. It’s feeding you vast amounts of information every moment, and it’s the one making decisions; you’re the AI assistant providing analysis and advice.
If you clone a tree, you get a similar tree. The branches aren’t in the same place. If you clone a human, why would the nerves be laid out the same way? Even if it’s wired up correctly, without a lifetime of cooperation why would your body take your advice?
Imagine you wake up. Red looks blue. Everything feels numb. The doctor says “everything looks good, why don’t you try to stand up?” You want to cooperate with the doctor, but you don’t stand up. You could move, but you don’t. Rationalizing your choices, you tell the doctor you don’t feel like it. You feel your toes, you shift to get away from the doctor’s prodding, but you just can’t muster the will to stand.
Imagine you wake up. Your sight is crystal clear, and you feel your body like never before. The doctor says “don’t move yet”. With the self-control of a child, you rip out the itchy IV to get the tape off of you. The doctor says something in a stern tone, and you’re filled with rage. You pummel the doctor, then are filled with regret and start to cry.
Emerging science suggests this kind of situation could lead to brand-new forms of existential horror.
Your brain is still functioning while you’re asleep. If it turned off all the way then you’d become brain-dead.
Some sleep is conscious (dreaming), but dreams are easily forgotten. Perhaps being unconscious still always carries a grain of consciousness that is just forgotten.
It seems there is a grain of reduced experience while sleeping. Copying seems to imply it’s always a clone (a different ego, a different person).
Maintaining continuity of consciousness is the only thing that would make me feel comfortable with converting myself to a machine intelligence.
I hate to break it to you, but our meat brains don’t even have continuity of consciousness. We become unconscious all the time. The only real constant is the “hardware” our consciousness emerges from, but even that is always changing.
I don’t get the downvotes. Did y’all forget about sleep? No one vividly dreams every night, all night long. Often it’s just the fade to black going to sleep, then the sudden awakening.
I think the only way we know it is us for sure is if we are conscious in both the original and clone at the same time. Like… okay… I know this is me in the new brain, I’ll shut down the other one.
Like… okay… I know this is me in the new brain, I’ll shut down the other one.
the other one: i’m pretty sure you’ve got it backwards, pal