State-Space of Background Assumptions

There is a wide variety of transhumanist strains and clusters, and we don’t really understand why. How do we explain the fact that immortality is the number-one concern for some, while it is a very minor concern for others, who are more preoccupied with an AI apocalypse or with making everyone animated by gradients of bliss?

A possible interpretation is that our values and objectives are intimately connected to our background assumptions about fundamental matters such as consciousness and personal identity. To test this theory, I developed a questionnaire for transhumanists that examines the relationship between transhumanist goals and background philosophical assumptions. If you wish to contribute, you can find the questionnaire here (it takes ~15 minutes):


The link will remain active until July 30, 2015 (EDIT: I have extended the deadline until August 2nd!). Please complete it as soon as possible. Once the results are out, you will be glad you participated.

The very sense we give to words requires an underlying network of background assumptions to support them. Thus, when we don’t share implicit background assumptions, we often interpret what others say in very different ways than what they had in mind. With enough transhumanists answering this questionnaire (about 150), we will be able to develop a better ontology. What would this look like? I don’t know yet, but I can give you an example of the sort of results this work can deliver: State-Space of Drug Effects.


  2. dondeg · August 19, 2015

    >>How do we explain the fact that immortality is the number one concern for some
    Other traditions have options we can utilize. I suspect you know of my interest in yoga. That viewpoint would suggest the desire for immortality is an illusion nested in the illusion of our human lives. Yet they would also see it as deeply ironic insofar as what we really are is already immortal, or at least eternal. Time itself is part of the illusory nature of things they call maya.

    Transhumanism strikes me as little more than an inflated balloon. It seems impressive on first hearing, but a little study reveals it to be vapid. Kurzweil’s notion of unlimited exponential growth is unrealistic. Look at how the computer industry is stalling out. If things grew exponentially unchecked, the Earth would be covered in 50 feet of bacterial muck and we would not exist. There are always limits to growth.

    It’s just another example of something the ancient Hindus discovered: the ever-changing gunas, taking on infinite patterns, each having its moment in the sun, each eventually receding back into the infinite potential. Same old, same old.

    Best wishes,

