guy recently linked this essay, it's old, but i don't think it's significantly wrong (despite gpt evangelists) also read weizenbaum, libs, for the other side of the coin
“If shitty things happen to you, then you will not like that and it will suck,” still doesn’t break the continuity of self. That same exact thing can happen to the current flesh-and-blood you, and it would be horrible and destructive: you can be disfigured through an accident or through someone’s cruelty; you can be locked in a cage and dehumanized on the whim of the state-sanctioned professional violence men, given a farce of a trial by the dysfunctional shitshow that is the legal system; and so on. But no one is going to argue that shitty things happening to you ontologically unpersons you in some mystical fashion.
You can be reduced, you can be transformed, but you continue existing for as long as your vital functions do. Talk of someone becoming someone else, or of dying in truth long before dying in body, is just a poetic attempt at articulating sorrow and loss.
So I was never arguing that an upload becomes unpersoned by trauma. My point, and the point of the article, is that by focusing solely on the brain we miss the other things that make us who we are.
The goal of an upload is to transfer the self to a machine, right? Well, parts of your self exist outside your brain. It’s no different than if an upload were missing parts of the brain: it’s incomplete.
All that means is that any hypothetical future mind-uploading technology would need to include elements of the body, social life, and society. Otherwise the upload isn’t complete.
I am not my brain. I am my brain, my body, my social life, my place in history, etc. I am the dialectical relationship between the personal and the impersonal.