Artificial brains and mind uploading

mashers

Stubborn ape
OP
Member
Joined
Jun 10, 2015
Messages
3,837
Trophies
0
Age
40
Location
Kongo Jungle
XP
5,074
Country
I've been thinking a lot recently about artificial brains and mind uploading - that is, the concept of transferring a human consciousness to something other than a human brain. I've been pondering whether I believe this is desirable, or even possible.

Proponents of this concept argue that consciousness is the product of the complex neurological structure of the human brain. They theorise, therefore, that a synthetic brain which was of sufficient complexity and which mimicked the structures of the human brain could achieve consciousness. How we would establish that this had actually happened is debatable. If a computer is sufficiently (artificially?) intelligent to behave in such a way as to utterly convince a human that it is conscious, then how do we know whether it is or is not? For that matter, the only human I actually know to be conscious is myself. I have no objective evidence for the consciousness of anybody, or anything, other than the consciousness I can directly observe. Everything outside my own mind has the potential to be an elaborate hoax.

My next dilemma is whether something artificial can achieve consciousness. I have no spiritual belief whatsoever, so I don't need to worry about concepts like the soul or the spirit. So, the question is purely about whether something man-made can achieve consciousness. Would a human cloned from another human be conscious? Most biologists, spiritualists and laypeople would agree that it would be. How about a real human brain which was grown in a jar, artificially nourished, and given artificial means of communicating and interacting (i.e. some kind of interface to the outside world)? Would this be any different to a human brain which grew inside a skull on top of a person's body? I believe it would not - I believe it would function exactly the same way as any other human brain, albeit with a different developmental trajectory, as its experiences would be very different to those of a brain inside a human body (though the perception of those experiences could be fed in artificially to cause the brain to believe it is experiencing childhood experiences such as play and social interaction). What about a 'brain' made of artificial neurones designed to behave exactly like organic ones and arranged into a structure which physically and functionally resembles a human brain? Or a similar artificial brain which represents a human one functionally but not structurally? What about a software representation of the neural network of a brain?
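To make that last idea concrete, here is a toy sketch of what a 'software representation' of a neural network might look like: a handful of simulated leaky integrate-and-fire neurones passing spikes to each other. Everything here (the neurone count, threshold, leak, weights and wiring) is illustrative and chosen for brevity; a real brain emulation would be unimaginably more complex than this.

```python
# A toy "software brain": a few leaky integrate-and-fire neurones wired
# together at random. All parameters are illustrative, not biological.
import random

random.seed(42)

N = 5                      # number of neurones
THRESHOLD = 1.0            # membrane potential at which a neurone fires
LEAK = 0.9                 # fraction of potential retained each step
WEIGHT = 0.6               # strength of each synapse

# random directed connections between neurones
synapses = [(i, j) for i in range(N) for j in range(N)
            if i != j and random.random() < 0.4]

potential = [0.0] * N

def step(external_input):
    """Advance the network one time step; return the set of firing neurones."""
    global potential
    fired = {i for i, v in enumerate(potential) if v >= THRESHOLD}
    nxt = [v * LEAK for v in potential]
    for i in fired:
        nxt[i] = 0.0                      # reset after firing
    for (src, dst) in synapses:
        if src in fired:
            nxt[dst] += WEIGHT            # propagate spike to neighbours
    for i, ext in enumerate(external_input):
        nxt[i] += ext                     # "sensory" input from outside
    potential = nxt
    return fired

# drive neurone 0 with constant input and watch activity spread
for t in range(10):
    print(t, sorted(step([0.5] + [0.0] * (N - 1))))
```

The interesting part of the philosophical question is whether scaling something like this up, with faithful structure and dynamics, would ever amount to more than bookkeeping.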

I feel that the more abstracted the idea gets from an organic brain, the harder it is to conceive of it achieving consciousness. A human brain is alive, whereas an artificial one is an object. It is really difficult to understand how an object could be self-aware and have thinking and reasoning skills, or even emotions. But why shouldn't it? As a non-spiritual person, I believe that the human brain is just a clump of nerve tissue which has arranged itself slowly and successively to carry out certain functions which we call 'thought' and 'consciousness' and 'language' and 'emotions' and so on. I don't believe there is anything magical about this, so I don't understand why I struggle to conceive that something man-made could achieve the same thing. What I keep coming back to as the stumbling block is two objections: it's not alive, and it's an object.

Now, let's assume it's possible to create an artificial brain and transfer a human consciousness into it. Let's also assume that the transferred mind is an exact replica of the original, including all memories, cognitive skills, attitudes, beliefs, etc. This raises a few issues. The mind would, presumably, be provided with either a form of physical interface with the outside world, or an artificial environment in which to communicate with other uploaded minds, or both. Since the whole point would presumably be to enhance the experience of consciousness and to experience things that are not possible in the 'real world', I would assume that an artificial environment would be provided. Since not all people would want to be uploaded, I would assume that there would also be an interface to the outside world for communicating with biological people, and also for maintaining the infrastructure which hosts the minds and provides their environment. This brings me to my first issue. Who provides this environment, who decides what goes into it, who provides protection from cyber-attacks, and who controls the whole thing? Of course, the same questions could be asked about the physical world - are we really free to live our lives how we want, or are we being controlled (directly or indirectly) by structures and influences beyond our purview? Similar to concerns about natural resources for biological entities, digitally uploaded minds would need to be concerned about what happens if a server farm shuts down or the company running it goes under. We're not talking about closing an online store here, or deleting a forum and losing all the posts - in this scenario, loss of digital infrastructure could effectively kill people.

Another consideration is what happens to your biological brain. A bit like with the theories on dematerialisation/rematerialisation in Star Trek-like transporter technology, it is possible that you would end up with a duplicate of your mind. Does the original then continue to function as a biological being, knowing that it exists elsewhere in digital form? Or would the biological brain die in the process? Or would your original body be killed? Thinking about this reminds me of a film called 'Advantageous' in which a woman's brain is transferred from her original body to a new (organic) one. It's interesting to see how this affects her relationship with her daughter and what she learns about her original brain.


Even having thought about this (a lot), written it all down, and re-read it, I would have to say that given the opportunity I would have my mind uploaded to an artificial environment, but only under a few circumstances:
  1. The life of my biological body is coming to an end, or I am suffering an illness which will severely debilitate my body, remove my quality of life/dignity, or kill me soon anyway
  2. I have some control over what I experience within the digital environment
  3. I am guaranteed that my digital form will be an exact replica of the consciousness within my meat-brain
  4. I have the option at any time to end my digital existence (essentially, purging my mind from the host system, allowing me to die if I choose to)
  5. I am able to interface in some way with the outside world so I can continue to communicate with biological people. I don't care about physically being in that environment (i.e. in a temporary surrogate body or android) as I could reproduce the Earth within the digital environment.


Discuss! :P
 

mashers

Another aspect I have just been thinking about is modification of the uploaded mind. Once a mind has been digitised, could we apply 'patches' to it? This could remove traumatic memories, delete depression, enhance cognition, or implant skills. If this were possible, should we do it? Things like this are already a contentious issue. Ask somebody with autism if there should be a cure for autism and brace yourself for the ferocity of the response; with regard to memories, some would argue that memories of trauma shape our identity and to erase them would alter our personalities. We don't want to end up with a situation like the film 'Equilibrium', where certain (or all?) emotions are considered distasteful or dangerous and are therefore edited out. But then, aren't there situations where certain things should be removed? Murderous intent? Paedophilia? Racism? Extremism? But who decides what stays and what goes? Or could the mind be broken down into modules which can be enabled or disabled?
 

Justinde75

Well-Known Member
Member
Joined
Feb 14, 2016
Messages
2,529
Trophies
1
Age
23
XP
4,559
Country
Germany
It is really creepy to think that you would be giving your mind, memories and personality into the hands of another person. They could change it to their liking, and because they would have your mind under their control, they could make you a mindless zombie without opinions of your own, or even change things like your sexual orientation. It's pretty scary stuff. And because they would have your memories, they could just wipe them so you don't remember transferring your mind at all.

--------------------- MERGED ---------------------------

But your mind would probably be something like Siri or Cortana: a bot that would answer questions the way you would IRL.
 

FAST6191

Techromancer
Editorial Team
Joined
Nov 21, 2005
Messages
36,798
Trophies
3
XP
28,348
Country
United Kingdom
"How about a real human brain which was grown in a jar [snip]"
The brain is hugely complex, and thus when studying it a popular thing to do is to start with something broken*/different and see how that plays out. A basic start for that would be: what if someone starts out blind, deaf or missing some sense, and then gains it later in life? That could provide something to contemplate.
*perhaps a poor choice of words given I mention deafness in the next line.
There is also a philosophical pondering about whether, if someone grew up in black and white but read everything about colour, and they then get let out and see colour, they have learned anything new.

" So, the question is purely about whether something man-made can achieve consciousness"
The '80s answer to that would dwell upon whether humans could feasibly engineer something as complex as themselves. Anybody that has seen modern circuit design and evolutionary design in chips would argue that such a thing is not an issue (http://archive.bcs.org/bulletin/jan98/leading.htm is an old article but a nice intro), though I suppose it would then have to be asked what you call man-made.

"Since not all people would want to be uploaded"
I dare say it would be a case of 'may the luddites perish', or otherwise become a dead branch of the evolutionary tree. There is, however, an interesting projection for the future of humans which argued that they might split into two classes -- 1) the genetically manipulated and 2) the, presumably poorer, artificially augmented. A robot brain would change that, and personally I would probably go for both camps anyway if a robot brain was not available.

"What I keep coming back to as the stumbling block is two reasons: it's not alive. It's an object."
It would be something of a non-issue to me. Less abstractly, life is just a bastard-hard-to-replicate state of a chemical reaction -- if I have to heat something to a given range but go over the limit and burn it, then life is just a far more complex variation on the theme. If you can replicate a sufficiently large part of it (my appendix is probably not inherent to the human condition and I could live without it, so there are things you can peel away) then it is a moot issue. Or, if you prefer, what about the golden-age scenario -- resources are not a thing, and if you die you pop out of the nearest replication pod 20 seconds later with all memories intact. If I kill you in that scenario, have I really done anything?

"Who provides this environment, who decides what goes into it, who provides protection from cyber-attacks, and who controls the whole thing?"
Always a fun one, and usually the major issue I have with films covering this: do I get to play hacker with a human mind in a robot body? (That Surrogates film omitting it really killed it for me.) There is no such thing as an unhackable system, and the incentives to have unrestricted access are too big for anything to really control it.

"Another aspect I have just been thinking about is modification of the uploaded mind. Once a mind has been digitised, could we apply 'patches' to it? This could remove traumatic memories, delete depression, enhance cognition, or implant skills. If this were possible, should we do it?"
We do it anyway. A lot of training is to bypass internal concepts. Sometimes this is troubling -- a popular field of study was what happens to people trained to kill. It was found in World War 2 that an alarmingly small number of people shoot to kill (stats vary a bit, but around 5% -- something that apparently corresponds well to levels of sociopathy -- was the usual figure given); after that, training saw people conditioned differently, and it worked well (more like 90%). The subsequent amount of depression and mental issues also increased.
Anyway after I get robot brain step 1 is figure out how to mod it, 2 is duplicate it for experiment and 3 is improve and alter it. I don't know how long it would take but what started as me would be unrecognisable in very short order should I be able to modify things. Some would ask why bother if I can seed an AI in a different way, perhaps it is just vanity but it is a vanity I could accept in myself and others.

"loss of digital infrastructure could effectively kill people"
It is an issue in AI development -- is turning off an AI a morally justifiable act?
Anyway, back to the server farm thing: if those uploaded are limited to that, then they have screwed up very hard. Worldwide distribution (and possibly off-world as well), redundancy, backups and so forth would be an even higher priority for me in the steps above, though I would hope to have sorted that before uploading.

Your list as I would see it.
1. Nah I would do it this second if my meat would continue and I could do it again later if tech improved. I might revise my opinion if the meat goes with it, I am mainly imagining either some future NMR/MRI or a classic cap with wires and lights.
2. Assuming 1. is the case then I suppose I might have the turning off the AI problem to consider but if copy me can't do it then too bad.
3. We already saw my list of steps.
4. Again I think my list would take care of that or make it not even an issue.
5. I can't envisage a situation where such an AI would not be able to interact with the... universe? Isolation might be a useful safety feature in development though so I guess that could be a thing.

I would try to think of it like your scenario where it is just my consciousness floating around a bit but there is nothing that would stop me from that list short of destroying the AI that was me.

I shall have to find that film you mention, though, mainly as I have that awful Transcendence film going round in my head now -- the group going around killing AI scientists in that would be the ones I was hunting in that world.
 

BurningDesire

Well-Known Member
Member
Joined
Jan 27, 2015
Messages
4,999
Trophies
1
Location
Behind a screen reading news
XP
4,885
Country
United States
Personally, one of my biggest fears is that if religion is indeed false, then that means there is no life after death. If you think about it like that, then what happens after you die is that you're just there. It's all black, you can't feel anything, people have forgotten you, you have no one to talk to, you can't hear. If there was an option for me to upload my brain to a computer, I so would. By the time I am ready to die and be forgotten, I bet there will be a way to, as China already has robots with feelings (I think). In the long term, we just need a foundation that focuses on mapping out the human brain, and in doing so seeing how powerful it really is and how much information can be stored. If we can find that out, that is a huge step towards never dying. You could also be something you were not if you had a robot body. Like, I could finally be a female (although genders will probably be non-existent then). This is a very interesting topic. I am sure none of us at the 'temp can come up with a solution, if you saw what happened last night. However, I bet we as the 'temp as a whole might be able to make a feasible hypothesis for mapping out the human brain.
 

FAST6191

Personally, one of my biggest fears is that if religion is indeed false, then that means there is no life after death. If you think about it like that, then what happens after you die is that you're just there. It's all black, you can't feel anything, people have forgotten you, you have no one to talk to, you can't hear.

Not all religions argue for a life after death. Also, what you describe is closer to a thing called locked-in syndrome. Wouldn't the reality be more that you cannot even think, and thus there is no blackness, no knowledge of the absence of feeling... as your mind is not functioning to process it?

Not EOF but song


Also comedy video
 

BurningDesire

Not all religions argue for a life after death. Also, what you describe is closer to a thing called locked-in syndrome. Wouldn't the reality be more that you cannot even think, and thus there is no blackness, no knowledge of the absence of feeling... as your mind is not functioning to process it?

Not EOF but song


Also comedy video

Thanks, now my fear of death has increased drastically lmao
 

mashers

It is really creepy to think that you would be giving your mind, memories and personality into the hands of another person. They could change it to their liking, and because they would have your mind under their control, they could make you a mindless zombie without opinions of your own, or even change things like your sexual orientation. It's pretty scary stuff. And because they would have your memories, they could just wipe them so you don't remember transferring your mind at all.
Good point. You wouldn't know it had happened because they could erase the memories of it ever happening. Maybe it already has.

I'd say never modify someone's brain, EVER. Even a lost childhood memory could create a completely different person.
But isn't it up to the individual to decide that? Let's say somebody was abused as a child, and that has caused them to become an untrusting adult with attachment issues, depression, suicidal thoughts, etc. They would probably welcome the opportunity to change all that by deleting the memories of what happened. But then we're in the realm of editing history. The Holocaust was a horrible event, so shall we delete it from everyone's memory so nobody has to experience the trauma of thinking about it? Or should we keep it in our collective consciousness to learn from it? I believe the latter, but not everybody would agree.

Anyone seen the movie Transcendence? Interesting movie, which is about so-called brain uploading and such. You should see it.
Actually I've noticed it on Netflix but haven't seen it yet mostly because Johnny Depp is too annoying. Maybe I'd better give it a go.

The brain is hugely complex, and thus when studying it a popular thing to do is to start with something broken*/different and see how that plays out. A basic start for that would be: what if someone starts out blind, deaf or missing some sense, and then gains it later in life? That could provide something to contemplate.
*perhaps a poor choice of words given I mention deafness in the next line.
There is also a philosophical pondering about whether, if someone grew up in black and white but read everything about colour, and they then get let out and see colour, they have learned anything new.
I've heard these allegories before, and I don't think we'll know the answer until we can restore any sense to somebody who was born without it (and thus study their responses to the introduction of the new sense). However, one such restorative procedure does exist in the form of cochlear implant surgery. Observing the responses of people who undergo this procedure, and their adaptations to the experience of sound, is quite fascinating. I'm not sure this necessarily applies in this situation, as I would assume that the artificial brain would be designed to mimic the sensory cortex of a human brain such that the digital environment is perceived as sights, sounds etc. However, who's to say that whole new senses couldn't be discovered or invented in this environment? Of course, we can no more conceive of what these might be than we can imagine a colour which we have never seen.

" So, the question is purely about whether something man-made can achieve consciousness"
The '80s answer to that would dwell upon whether humans could feasibly engineer something as complex as themselves. Anybody that has seen modern circuit design and evolutionary design in chips would argue that such a thing is not an issue (http://archive.bcs.org/bulletin/jan98/leading.htm is an old article but a nice intro), though I suppose it would then have to be asked what you call man-made.
Theories like the Singularity suggest that these discoveries will start when technology becomes able to improve upon itself. I can't decide whether this is exciting or terrifying (or both), but I hope it will be like this:
judgment_day.png


"Since not all people would want to be uploaded"
I dare say it would be a case of 'may the luddites perish', or otherwise become a dead branch of the evolutionary tree. There is, however, an interesting projection for the future of humans which argued that they might split into two classes -- 1) the genetically manipulated and 2) the, presumably poorer, artificially augmented. A robot brain would change that, and personally I would probably go for both camps anyway if a robot brain was not available.
The novel Existence by David Brin follows similar lines, with (sentient) robotic and cybernetic life-forms becoming recognised sub-classes of human. I also like the transhumanistic opinions on this.

"What I keep coming back to as the stumbling block is two reasons: it's not alive. It's an object."
It would be something of a non-issue to me. Less abstractly, life is just a bastard-hard-to-replicate state of a chemical reaction -- if I have to heat something to a given range but go over the limit and burn it, then life is just a far more complex variation on the theme. If you can replicate a sufficiently large part of it (my appendix is probably not inherent to the human condition and I could live without it, so there are things you can peel away) then it is a moot issue.
Yeah, that's what I was trying to get at in the OP - if you can duplicate a brain then it's conscious. But does it still work if it's an inorganic brain? Does that matter? Why [not]? (I realise these questions are currently unanswerable. I'm just pondering.)

"Who provides this environment, who decides what goes into it, who provides protection from cyber-attacks, and who controls the whole thing?"
Always a fun one, and usually the major issue I have with films covering this: do I get to play hacker with a human mind in a robot body? (That Surrogates film omitting it really killed it for me.) There is no such thing as an unhackable system, and the incentives to have unrestricted access are too big for anything to really control it.
Maybe by the time we have developed the technology to do this, we will have evolved beyond the need to dominate or own or control. Or perhaps the emergence of this technology will actually cause that to happen, as we realise that we can have everything we ever wanted within the digital environment and there is no need to take from others.

"Another aspect I have just been thinking about is modification of the uploaded mind. Once a mind has been digitised, could we apply 'patches' to it? This could remove traumatic memories, delete depression, enhance cognition, or implant skills. If this were possible, should we do it?"
We do it anyway. A lot of training is to bypass internal concepts. Sometimes this is troubling -- a popular field of study was what happens to people trained to kill. It was found in World War 2 that an alarmingly small number of people shoot to kill (stats vary a bit, but around 5% -- something that apparently corresponds well to levels of sociopathy -- was the usual figure given); after that, training saw people conditioned differently, and it worked well (more like 90%). The subsequent amount of depression and mental issues also increased.
Really interesting point. How different is editing a digital mind to [re-]training a biological one? Though I would posit that the former is potentially more dangerous, as you could change literally anything with the individual having no recollection of it happening, whereas there are limits to what can be achieved through conditioning, and the individual would have some memory of what had been done to them (unless we're talking about really extreme cases like MKUltra).

Anyway after I get robot brain step 1 is figure out how to mod it
Homebrew IN UR BREIN.

2 is duplicate it for experiment
Another interesting point. If I can access your digital brain, I can copy it, extract from it, spoof it... so there would need to be some considerable DRM involved. Come to think of it, such a situation could actually open up a market: personality traits, skills, memories or even whole identities could be bought and sold, so you could become a completely different person.

and 3 is improve and alter it
I wonder what effect that would have on class structures. Those with the resources or skills to alter their mind (or have it altered by others) would obviously be advantaged in society. Just like access to education and employment now.

I don't know how long it would take but what started as me would be unrecognisable in very short order should I be able to modify things. Some would ask why bother if I can seed an AI in a different way, perhaps it is just vanity but it is a vanity I could accept in myself and others.
It would take a very content and self-controlled person to leave their digital mind exactly as it was if the potential to modify it were there. I suspect that if this situation occurred, modifying a mind would either be illegal or heavily regulated, in a similar way to psychiatric care and cosmetic procedures now.

"loss of digital infrastructure could effectively kill people"
It is an issue in AI development -- is turning off an AI a morally justifiable act?
Well, that depends whether they can be considered to be alive. The film Ex Machina is an interesting portrayal of this issue.


1. Nah I would do it this second if my meat would continue and I could do it again later if tech improved. I might revise my opinion if the meat goes with it, I am mainly imagining either some future NMR/MRI or a classic cap with wires and lights.
Good point. The tech might be refined to get a better copy of your brain, so you'd want the opportunity to re-image later. Unless the bio brain could be preserved (but inactive) so your biological body doesn't need to continue functioning and remain conscious.

5. I can't envisage a situation where such an AI would not be able to interact with the... universe? Isolation might be a useful safety feature in development though so I guess that could be a thing.
I agree it would be bizarre. I would want to be able to turn it on and off at will so I could choose if and when to 'go out' into the physical world (like I do now) and prevent intrusion from it. (As a side note, I think that living in a digital environment with total control of what interaction occurs and how the environment itself is represented would be wonderful for people with autism. Being able to tone down the sensory input and control the rate of presentation of social interaction would be very welcome).

I shall have to find that film you mention though
It's on Netflix. It's clearly low budget and has an indie vibe to it, but I like that.

Personally, one of my biggest fears is that if religion is indeed false, then that means there is no life after death.
I don't think the concept of mind uploading necessarily negates religion or life after death. Though it does raise an additional question: if there is a spiritual entity attached to each mind, what happens when you clone it? Do you clone the spirit as well (I feel this would be beyond our ability even if it existed), or do both minds share the same spirit, or do you end up with a mind with no spirit, or...?

If you think about it like that, then what happens after you die is that you're just there.
No, if there is no spirit (and your consciousness is simply a product of the structure of your brain), then when your brain dies you're not 'just there'; you completely cease to exist. You're not a consciousness floating in blackness, devoid of any contact with others or a body to inhabit; you are just completely gone. Given the ideas most religions have about the afterlife I could expect to experience, I much prefer the idea of oblivion.

what you describe is closer to a thing called locked-in syndrome. Wouldn't the reality be more that you cannot even think, and thus there is no blackness, no knowledge of the absence of feeling... as your mind is not functioning to process it?
Exactly.
 

The Real Jdbye

*is birb*
Member
Joined
Mar 17, 2010
Messages
23,327
Trophies
4
Location
Space
XP
13,904
Country
Norway
I've been thinking a lot recently about artificial brains and mind uploading - that is, the concept of transferring a human consciousness to something other than a human brain. I've been pondering whether I believe this is desirable, or even possible.

Proponents of this concept argue that consciousness is the product of the complex neurological structure of the human brain. They theorise, therefore, that a synthetic brain which was of sufficient complexity and which mimicked the structures of the human brain could achieve consciousness. How we would establish that this had actually happened is debatable. If a computer is sufficiently (artificially?) intelligent to behave in such as way as to utterly convince a human that it is conscious, then how do we know whether it is or is not? For that matter, the only human I actually know to be conscious is myself. I have no objective evidence for the consciousness of anybody, or anything, other than the consciousness I can directly observe. Everything outside my own mind has the potential to be an elaborate hoax.

My next dilemma is whether something artificial can achieve consciousness. I have no spiritual belief whatsoever, so I don't need to worry about concepts like the soul or the spirit. So, the question is purely about whether something man-made can achieve consciousness. Would a human cloned from another human be conscious? Most biologists, spiritualists and laypeople would agree that it is. How about a real human brain which was grown in a jar, artificially nourished, and with artificial means of communicating and interacting? (i.e. some kind of interface to the outside world). Would this be any different to a human brain which grew inside a skull on top of a person's body? I believe it would not - I believe it would function exactly the same way as any other human brain (albeit with a different developmental trajectory as its experiences would be very different to those of a brain inside a human body (though the perception of those experiences could be fed in artificially to cause the brain to believe it is experiencing childhood experiences such as play and social interaction)). What about a 'brain' made of artificial neurones designed to behave exactly like organic ones and arranged into a structure which physically and functionally resembles a human brain? Or a similar artificial brain which represents a human one functionally but not structurally? What about a software representation of the neural network of a brain?

I feel that the more abstracted the idea gets from an organic brain, the harder it is to conceive of it achieving consciousness. A human brain is alive, whereas an artificial one is an object. It is really difficult to understand how an object could be self-aware and have thinking and reasoning skills, or even emotions. But why shouldn't it? As a non-spiritual person I believe that the human brain is just a clump of nerve tissue which has arranged itself slowly and successively to carry out certain functions which we call 'thought' and 'consciousness' and 'language' and 'emotions' and so on. I don't believe there is anything magical about this, so I don't understand why I struggle to conceive that something man-made could achieve the same thing. What I keep coming back to as the stumbling block are the same two objections: it's not alive, and it's an object.

Now, let's assume it's possible to create an artificial brain and transfer a human consciousness into it. Let's also assume that the transferred mind is an exact replica of the original, including all memories, cognitive skills, attitudes, beliefs etc. This raises a few issues. The mind would, presumably, be provided with either a form of physical interface with the outside world, or an artificial environment in which to communicate with other uploaded minds, or both. Since the whole point would presumably be to enhance the experience of consciousness and to experience things that are not possible in the 'real world', I would assume that an artificial environment would be provided. Since not all people would want to be uploaded, I would also assume that there would be an interface to the outside world for communicating with biological people, and also for maintaining the infrastructure which hosts the minds and provides their environment. This brings me to my first issue. Who provides this environment, who decides what goes into it, who provides protection from cyber-attacks, and who controls the whole thing? Of course the same questions could be asked about the physical world - are we really free to live our lives how we want, or are we being controlled (directly or indirectly) by structures and influences beyond our purview? Just as biological entities worry about natural resources, digitally uploaded minds would need to be concerned about what happens if a server farm shuts down or the company running it goes under. We're not talking about closing an online store here, or deleting a forum and losing all the posts - in this scenario, loss of digital infrastructure could effectively kill people.

Another consideration is what happens to your biological brain. A bit like with the theories on dematerialisation/rematerialisation in Star Trek-like transporter technology, it is possible that you would end up with a duplicate of your mind. Does the original then continue to function as a biological being, knowing that it exists elsewhere in digital form? Or would the biological brain die in the process? Or would your original body be killed? Thinking about this reminds me of a film called 'Advantageous' in which a woman's brain is transferred from her original body to a new (organic) one. It's interesting to see how this affects her relationship with her daughter and what she learns about her original brain.


Even having thought about this (a lot), written it all down, and re-read it, I would have to say that given the opportunity I would have my mind uploaded to an artificial environment, but only under a few circumstances:
  1. The life of my biological body is coming to an end, or I am suffering an illness which will severely debilitate my body, remove my quality of life/dignity, or kill me soon anyway
  2. I have some control over what I experience within the digital environment
  3. I am guaranteed that my digital form will be an exact replica of the consciousness within my meat-brain
  4. I have the option at any time to end my digital existence (essentially, purging my mind from the host system, allowing me to die if I choose to)
  5. I am able to interface in some way with the outside world so I can continue to communicate with biological people. I don't care about physically being in that environment (i.e. in a temporary surrogate body or android) as I could reproduce the Earth within the digital environment.


Discuss! :P
I agree with everything you said, and I believe that an artificially made clone of a human brain, with the same memories and consciousness as a person is just as real.
At the same time, if you had your consciousness uploaded to an artificial brain, it wouldn't really be you. It would be a clone of you, with all the same memories, but it would not be directly connected to the "you" that is living in your human body. You could even go as far as to say that it's a different person, but with the same memories and beliefs. When your human body dies, so would the original you. The original body would still experience that death and the clone would have no memory of it.
Thinking about it that way I certainly would not want my original body, and original "me" killed so that I could live on forever in an artificial brain, that seems ethically wrong to me. So it would be an option only if my original body was about to die (or preferably already dead) anyway.
I kind of feel like all those memories the clone would have of its past life could be considered "fake", as they were experienced by the original body, even though they would seem just as real as any memories the clone gained afterwards. It's difficult for me to consider the clone the same person, even if it's literally a duplicate of the original.
 

BORTZ

DO NOT SCREENSHOT
Supervisor
Joined
Dec 2, 2007
Messages
13,243
Trophies
3
Age
34
Location
Pittsburgh
XP
16,018
Country
United States
Can you elaborate? I'm genuinely interested and I think this is relevant to the discussion.
I am just tired of all the latest social justice garbage. And I don't mean to open up a new can of worms or anything. I have no problem with gays, cis whatever, trans, all that stuff, but I am just basically tired of hearing about it. So I am not ready to hear about people starting rights campaigns for sentient AI or whatever. And besides, I feel like the second we get a sentient AI, we are going to have massive problems, unrelated to rights of any kind.
 

mashers

Stubborn ape
OP
Member
@The Real Jdbye
Good point about the digital version not being the original person. Again this is similar to some discussions about Star Trek transporter technology, with some people believing that the person arriving at the destination is not the same person, but a clone with duplicated memories (and that the original person is killed by the process). The philosophical question here is, does that matter? If our conscious mind is just a product of the structure and content of our brain, and that is copied verbatim, then what difference does it make if the duplicate is not a transference but a replica of the original?

@8BitWonder
No I haven't, but I'll look into it. Thanks!

@Bortz
A very fair point actually. Social justice fatigue is an understandable feeling in light of the number, scale and duration of many civil rights campaigns, and more of them won't really help. However, mind uploading might get rid of some of the existing ones. If I exist digitally then I can potentially be anything I want and nothing that I don't. If I want to change sex, colour, shape, density or whatever, I could do that. That will inevitably result in more diversity, eroding many identity groups and rendering meaningless the categorisation of people which inevitably leads to tribalistic opposition.
 

FAST6191

Techromancer
Editorial Team
Joined
Nov 21, 2005
Messages
36,798
Trophies
3
XP
28,348
Country
United Kingdom
Oh god, and I thought the whole social acceptance BS we are dealing with right now was insufferable.
Though there are parallels with animal testing, that debate is reserved more for strong AI.

That said I eagerly await the day I see someone told off for assuming an AI uses positive edge logic and that such an assumption is offensive and they should have asked first.

I've heard these allegories before and I don't think we'll know the answer until we can restore any sense to somebody who was born without it (and thus study their responses to the introduction of the new sense). However, one such restorative procedure does exist in the form of cochlear implant surgery. Observing the responses of people who undergo this procedure and their adaptation to the experience of sound is quite fascinating. I'm not sure this necessarily applies in this situation, as I would assume that the artificial brain would be designed to mimic the sensory cortex of a human brain such that the digital environment is perceived as sights, sounds etc. However, who's to say that whole new senses couldn't be discovered or invented in this environment? Of course, we can no more conceive of what these might be than we can imagine a colour we have never seen.

Maybe by the time we have developed the technology to do this, we will have evolved beyond the need to dominate or own or control. Or perhaps the emergence of this technology will actually cause that to happen, as we realise that we can have everything we ever wanted within the digital environment and there is no need to take from others.

Really interesting point. How different is editing a digital mind to [re-]training a biological one? Though I would posit that the former is potentially more dangerous as you could change literally anything with the individual having no recollection of it happening, whereas there are limits to what can be achieved through conditioning, and the individual would have some memory of what had been done to them (unless we're talking about really extreme cases like MK Ultra).

Another interesting point. If I can access your digital brain, I can copy it, extract from it, impersonate it... So there would need to be some considerable DRM involved. Come to think of it, such a situation could actually open up a market. Personality traits, skills, memories or even whole identities could be bought and sold, so you could become a completely different person.


I wonder what effect that would have on class structures. Those with the resources or skills to alter their mind (or have it altered by others) would obviously be advantaged in society. Just like access to education and employment now.


It would take a very content and self-controlled person to leave their digital mind exactly as it was if the potential to modify it were there. I suspect that if this situation occurred, modifying a mind would either be illegal or heavily regulated, in a similar way to psychiatric care and cosmetic procedures now.


Well, that depends whether they can be considered to be alive. The film Ex Machina is an interesting portrayal of this issue.

Good point. The tech might be refined to get a better copy of your brain later, so you'd want to have the opportunity to re-image later. Unless the bio brain could be preserved (but inactive) so your biological body doesn't need to continue functioning and remain conscious.

Cochlear implants are obviously the big one, though some interesting things have come out of comparative MRIs. It is not limited to them though, as there have been a few cases of some trivial procedure that was not done because the sufferer lived in some third world shithole or belonged to a religious group that disliked medicine.
On senses unknown, I can see magnetic fields* (or sticking magnets in my fingers to feel them), night vision/heat vision, UV-unfiltered vision, or modding my body to include a compass (though it seems some people, Australian Aboriginals being the study I saw, can already pull this off). You mentioned colours, so I reckon it might be worth looking into tetrachromacy: http://www.popsci.com/article/science/woman-sees-100-times-more-colors-average-person
*magnetic film is great fun to play with and pretty cheap these days as well


I did mean to go further into what makes humans human by virtue of being trapped by/defined by their senses (I went on the slight tangent later with the appendix thing). I would be curious to see what happens if you can be perfectly logical too, possibly whilst retaining the knowledge that the illogical is still a path used by certain beings, or occasionally useful in the absence of complete information/models.

A lot of things do seem to boil down to effectiveness. To link it back to the topic that this was slightly inspired by, we see such things in genetic manipulation now -- cows were selectively bred from aurochs, dogs from wolves, carrots are purple in the wild... but each of those took several generations and thus seems perfectly acceptable to use, despite being just as GMO as any glow-in-the-dark sheep.
On limits of conditioning I would wonder what they might be, obviously less than the potential of computer driven stuff, and it might even vary by person (I imagine the raised by wolves/monkeys scenario would be something to look at as part of this).

On DRM, there was a great little comedy video I saw once but could not find for the earlier replies, and still can't. It ran through what might happen if you came back under a version of this world's current copyright regime: since you were experiencing the poor man's version, all the copyrighted songs you knew were stripped from memory.

"Those with the resources or skills to alter their mind"
Resources I could see; skills, though, belong to anybody with a functioning mind. The rate of change might initially be low, but few would argue it would be anything other than an exponential increase. If we are continuing with XKCD then https://xkcd.com/505/ seems relevant at this point.

Contentment and self control? I would see it as a sign of a lack of imagination.
 
