ChapterAquila92

Resident Bronze Dragon Kasrkin
Banned
As if there weren't already three dead threads on this:

Transhumanism 2007
Transhumanism 2012
Transhumanism 2013

While I contemplate committing thread necromancy, here are a few things relevant to the discussion that I'd like to add.

I've been fascinated by Transhumanism for about as long as I've known about the Furry Fandom, starting back when I was barely a teenager (c. 2004-7). I was particularly interested in genetic engineering at the time, and it was through online research into that field that I discovered for myself a movement focused on (what I understood at the time to be) improving the human condition through the ethical use of technology.

In that regard, it was only fitting that I designed my fursona to be a posthuman cyborg to reflect this interest of mine.

[Detailed character sheet of my fursona - artwork by Ceowolf]

For myself, I've yet to find anything more rewarding than the ability to adapt and overcome whatever obstacle an individual may face during their lifetime. Whether it's through building a well in Africa or enabling a crippled veteran to engage in a physical activity, it's a very powerful feeling to bestow that kind of ability onto those who are less fortunate, and cybernetic implants - bionics, if you will - are a very potent gateway towards that kind of empowerment. Enabling the blind to see, the deaf to hear, the lame to walk... It's one of the few things in this world that gives me even a glimmer of hope for humanity, however it is achieved, and I most certainly do not want to see it fade into obscurity.

It's on this same note that I'm an avid supporter of the notion of uplift - that is, providing the means for another species to be on roughly equal standing with us. The idea is fairly Nietzschean, as he writes in Thus Spoke Zarathustra:
Nietzsche said:
"Companions the Creator seeks; not corpses, herds or believers. Fellow creators the Creator seeks; those who write new rules on new tablets; and fellow harvesters, for everything about him is ripe for the harvest."
Insofar as we don't know whether or not we are alone in the universe, I see no reason why we should leave it to guesswork when there are beings of otherwise comparable intelligence, with whom we share this pale blue dot, that we can potentially elevate to be those fabled companions and fellow creators.

When it comes to other means of improving the human condition, genetic manipulation is treated with the caution it probably warrants. Given my interest in genetics, however, I'd be loath to see the field strangled in its infancy over fears of designer babies - fears which, while pointing at a potential exacerbation of the existing social divide between rich and poor, say little about what the technology could enable us to do. In addition, I'm in an interesting situation: on top of a family history of various health problems, my immediate family had its own experience with SIDS through my baby brother, and as the only surviving sibling there's a bit of an onus on me to consider having children myself at some point in my life. Any self-respecting parent wants nothing less than the best for their children, and for what it's worth I would want mine to be free from that genetic influence on their health. Beyond that, the rest is up to how you want to raise your kid.

Lower on the list of things I support within the movement is extreme, technologically-induced body modification - in part out of support for morphological freedom, but primarily as a means of optimizing oneself for a particular task. However this may be done - surgery, cranial uploading with Eclipse Phase-style body sleeving, NANOMACHINES SON, etc. - it once again falls under that drive for empowerment: to be able to do things that you wouldn't otherwise be able to.

Despite all of this, I have had my own concerns about what goes on in the ranks of the movement. In some circles, especially among the Singularitarians and the Immortalists, there's a disturbing cult-like mentality that pays lip service to the core tenets of transhumanism - reason and logic included - while wishfully dreaming of living in a posthuman era, possibly worshiping a machine god or indulging in some other hubris (which sits oddly with the claim many of them make of being atheists). I wouldn't normally have a problem with this - to each their own - if it weren't for their cavalier attitude towards what they seek to champion, with complete disregard for the potentially existential risks involved.

The Singularitarians, who espouse the idea of a Technological Singularity (beyond which the current generation has little to no ability to predict what may follow), have especially become a breeding ground for such machine cultists. Some of us find that worrisome given the prospect of emerging "smart" AI being built on bad programming - we already know what bad programming can do; look no further than the Therac-25 - never mind a myriad of other horrifying scenarios, including Grey Goo (again, bad programming dooming us all) and Posthuman Treachery (in which god complexes and "contempt for the flesh" tend to be prominent).

The Immortalists, who argue for indefinite life extension through various means, aren't quite as severe in their leanings as the Singularitarians, but they can still be disturbing to listen to. I've probably listened to far too many of them prattle on about immortality being the only thing that matters, about aging being a disease that must be cured, and about anyone who disagrees being a "deathist." It doesn't leave a great impression when it's improvements to the overall quality of health that have raised human life expectancy to what it is today, never mind what the future may hold if we continue to make improvements in the medical field.

In the end, it's safe to say that the transhumanist community has much in common with the Furry Fandom - one common interest, many self-defined niches, and an annoying vocal minority of nutjobs who ruin it for the rest of us.
 

Jarren

You can't just quote yourself! -Me
If I recall correctly, there's some dude in Russia financing serious transhumanist research. His goal is to have a viable consciousness upload system in place before 2050, and he's also stated that his people are working on therapies to combat aging/telomere degeneration to provide a path toward functional immortality. It's a rather pie-in-the-sky plan right now, but I've got high hopes for some good stuff to come out of it.
 

ShamonCornell

Active Member
See, this stuff? This is the stuff sci-fi authors stay awake at night, thinking about. Some with excitement, some with horror.

For one, as I understand genetics, it's (currently) impossible for a human being to live indefinitely. Say, for the sake of argument, you go full Ghost in the Shell: you're a brain in a jar, in a robot body. While your organs, yes, are far more durable and easily replaced, your brain is still tied to the limits of cell reproduction.

So, your DNA has little caps on the ends, called "telomeres". Each time a cell in your body splits, it loses a bit of them. What goes first is junk data, nothing important...until it hits things like your hair color. When your hair goes gray, that's about the half-way point on the gas tank of a human life. From there, it starts to hit more important things, one at a time: your joints, your ability to heal as fast.

Finally, where most people die, is when it hits one of the major vital organs, such as the kidneys, heart or liver. In the case of dementia patients, and those suffering Alzheimer's, it was the brain.

Now, some postulate that you can, one way or another, ADD telomeres back to a person's DNA...except the body already has an enzyme that does this (telomerase), and cells that switch it back on when they shouldn't are a big part of the crazy thing we call "cancer". The extra genetic material is likely to become a disease rather than a cure, much like someone trying to improve their computer's programming and writing a virus instead.
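Just to make the "gas tank" picture concrete, here's a quick toy sketch in Python - every number and threshold in it is made up purely for illustration, it's not a model of actual telomere biology:

```python
# Toy "gas tank" model of telomere shortening. All numbers are invented
# for illustration only -- this is not real telomere biology.

TELOMERE_START = 10_000   # hypothetical starting telomere length (base pairs)
LOSS_PER_DIVISION = 70    # hypothetical length lost with each cell division

def divisions_until(threshold: int) -> int:
    """How many cell divisions until telomere length drops to a threshold."""
    return (TELOMERE_START - threshold) // LOSS_PER_DIVISION

# Made-up thresholds for the milestones described above.
milestones = {
    "cosmetic stuff (gray hair)": 5_000,
    "joints / slower healing":    3_000,
    "vital organ failure":        1_000,
}

for label, threshold in milestones.items():
    print(f"{label}: roughly {divisions_until(threshold)} divisions in")
```

Point being, it plays out as a countdown that hits cosmetic stuff first and vital organs last, not as a switch that flips all at once.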

From there, genetic modification can look...pretty damned scary, when our building blocks aren't so much "a Lego kit" as they are "a precarious Jenga tower". Some things could be easy fixes, like cancer susceptibility, and the like, but a drastic change to an entirely different species? I would say, in all honesty, the furthest reach of the technology (again, as we know the limits of genetics NOW) would be in gender reassignment, as that's a fairly simple, if full-body, change on paper.

Of course, when you apply the above concepts to the Star Trek transporter...hoo boy, how does Jim Kirk not have flipper hands and gills?
 

ChapterAquila92

Resident Bronze Dragon Kasrkin
Banned
ShamonCornell said:
See, this stuff? This is the stuff sci-fi authors stay awake at night, thinking about. Some with excitement, some with horror.

For one, as I understand genetics, it's (currently) impossible for a human being to live indefinitely. Say, for the sake of argument, you go full Ghost in the Shell: you're a brain in a jar, in a robot body. While your organs, yes, are far more durable and easily replaced, your brain is still tied to the limits of cell reproduction.

So, your DNA has little caps on the ends, called "telomeres". Each time a cell in your body splits, it loses a bit of them. What goes first is junk data, nothing important...until it hits things like your hair color. When your hair goes gray, that's about the half-way point on the gas tank of a human life. From there, it starts to hit more important things, one at a time: your joints, your ability to heal as fast.

Finally, where most people die, is when it hits one of the major vital organs, such as the kidneys, heart or liver. In the case of dementia patients, and those suffering Alzheimer's, it was the brain.

Now, some postulate that you can, one way or another, ADD telomeres back to a person's DNA...except the body already has an enzyme that does this (telomerase), and cells that switch it back on when they shouldn't are a big part of the crazy thing we call "cancer". The extra genetic material is likely to become a disease rather than a cure, much like someone trying to improve their computer's programming and writing a virus instead.

From there, genetic modification can look...pretty damned scary, when our building blocks aren't so much "a Lego kit" as they are "a precarious Jenga tower". Some things could be easy fixes, like cancer susceptibility, and the like, but a drastic change to an entirely different species? I would say, in all honesty, the furthest reach of the technology (again, as we know the limits of genetics NOW) would be in gender reassignment, as that's a fairly simple, if full-body, change on paper.

Of course, when you apply the above concepts to the Star Trek transporter...hoo boy, how does Jim Kirk not have flipper hands and gills?
Indeed, genetics has its limits when it comes to longevity.

The Jenga tower analogy is probably still too simplified, though. What actually comes to mind is a program with functions that call other functions, which in turn call other functions in a byzantine recursive network. If any one of those functions malfunctions or feeds back a different result from what it normally does, the final expression might end up exactly the same or something completely different, depending on how many functions rely on it for that particular result.
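If it helps, here's a trivial (and entirely made-up) Python sketch of what I mean - one low-level function starts returning something different, and whether the final expression changes depends on which of its callers actually care about that exact result:

```python
# Made-up sketch of a "byzantine network of functions": one low-level
# function misbehaves, and the final expression may or may not change
# depending on which callers actually rely on its exact output.

def base(x, broken=False):
    # the low-level function that starts misbehaving
    return x * 2 if not broken else x * 3

def mid_a(x, broken=False):
    # depends on base()'s exact value, so the change propagates
    return base(x, broken) + 1

def mid_b(x, broken=False):
    # only cares whether base() is positive, so the change is masked
    return 1 if base(x, broken) > 0 else -1

def final_expression(x, broken=False):
    return mid_a(x, broken), mid_b(x, broken)

print(final_expression(5))               # (11, 1) with base() behaving normally
print(final_expression(5, broken=True))  # (16, 1) -- mid_a changes, mid_b never notices
```

Scale that up to thousands of interdependent functions and you get exactly the kind of unpredictability I'm talking about.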
 

ShamonCornell

Active Member
ChapterAquila92 said:
Indeed, genetics has its limits when it comes to longevity.

The Jenga tower analogy is probably still too simplified, though. What actually comes to mind is a program with functions that call other functions, which in turn call other functions in a byzantine recursive network. If any one of those functions malfunctions or feeds back a different result from what it normally does, the final expression might end up exactly the same or something completely different, depending on how many functions rely on it for that particular result.
Right, what I'd meant to imply with the analogy is that, for all we know right now, most 'fixes' are just as likely to be rejected entirely (the new block doesn't fit in the hole), kill the host by unravelling everything (the tower falls over), or, heaven forbid, cause some kind of as-yet-unexplainable weirdness (the tower defies physics and becomes a Tetris block), as they are to do what you wanted.

Hence, the other analogy, about computer programming. "Here, lemme make my Linux run faster, by fixing this random bit of junk code!" *blue screen* "Huh...okay, restore...let's try making these bits referencing irrelevant files reference just the relevant ones!" *blue screen* "Damn...okay, restore...let's try some other thing." *unleash a virus* "...bugger."
 