Sunday, September 25, 2011

Singularity and Humanity

The article "2045: The Year Man Becomes Immortal" brings up the concept of singularity which is the combining of human and machine, or machine AI overcoming humans'. In my opinion, singularity would seem to take away our humanity, our meaning in life. "By beating death, will we have lost our essential humanity?" Well, yes. The deadline that we have in life affects our decisions and our morals that we have. Most of us choose to do or not do things because of death. Most of us want to do things that will leave a good memory of us when we have passed. Take away death, take away that stoppage, people will get more rash in their decision making and they will feel they can do anything they want. Also this article brings up the question of "Who decides who gets to be immortal?" To juggle something as impactful as immortality around, naturally some people are going to want others not to be immortal. So who would get to decide who gets to live forever? Leaving all that power in people's hands would be to great even for all of our minds combined because their would always be a bias. All in all, even if the singularity did happen, that does not mean it would be a good thing. Becoming super intelligent isn't always a good thing and having all that power would drive so many people to the point of insanity and being obsessed with power, the whole world would probably be in war. In war for technology. In war for power. In war for immortality. Which would eventually lead to death.
