



The Transhuman Autonomy Project



Note: click here for a fast loading and more recently updated version of this site.




The focus of the Transtopia Project is the creation of an autonomous, ocean-based transhuman community: a place where like-minded people can enjoy maximal individual freedom in social and economic matters. The project is still in its earliest stages; details such as the exact location, configuration, and methods of funding have yet to be determined, and can best be discussed on the Transtopia mailing list. Some preliminary ideas and related links can be found here.

The more general objective of the Transtopia Project is to find ways of effectively increasing one's wealth and independence, both of which are essential for pleasant living. One topic that deserves special attention is the Y2K computer problem, which is just around the corner (the first malfunctions are expected in late '99) and which could seriously disrupt the world's social and economic structures, and thus the autonomy project. That is why the first phase, which includes buying an island in the Bahamas, should (ideally) be completed before the end of next year. That way you would have both a safe haven in case problems caused by the millennium bug get out of hand, and a sound investment that can withstand the years of global recession that are likely to follow Y2K. With some research and preparation, one could not only reduce financial and other damage, but actually profit from the Y2K glitch.
"Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended."
(Vernor Vinge, NASA VISION-21 Symposium, 1993)

Bad (or promising, depending on how well you've prepared) as it may seem, Y2K is only a relatively minor issue compared to what will likely face us in the next century: the Singularity. This is "the postulated point or short period in our future when our self-guided evolutionary development accelerates enormously (powered by nanotech, neuroscience, AI, and perhaps uploading) so that nothing beyond that time can reliably be conceived" [Vernor Vinge, 1986] (Lextropicon). In the worst-case scenario, this event will cause the extinction of all (intelligent) life on earth due to malevolent or indifferent Super Intelligences (SIs), runaway technology, or some other mishap. In the best-case scenario, everyone will upload, become posthuman, and live happily ever after. In light of historical precedents, however, this second option doesn't seem very likely; a relative handful of the world's rich and powerful will probably monopolize the technologies (such as genetic engineering, molecular nanotechnology, AI, and mind uploading) that will lead to the creation of SI and [thus] the Singularity. Normally, such a temporary monopoly wouldn't be much of a problem; eventually the new technologies would trickle down to the rest of society, so that they too could benefit. Transhumanists could simply sit and wait until the means to transcend virtually fell into their lap.

There is a fundamental difference, however, with things like AI, nanotech, and uploading: they can, for the first time in history, give absolute autonomy to a single individual. Powerful businessmen like Bill Gates still need (lots of) other people to work for them and to buy their products. Dictators still need (lots of) other people to fight in their armies and to keep the country running. An SI, on the other hand, needs no one. It is a being of unprecedented physical, intellectual, and psychological capacity: a self-programming, self-constituting, potentially immortal, unlimited individual. It can therefore, if it has a sufficiently great lead, afford to kill off all competition without suffering much inconvenience. Since we can't, by definition, know the motivations of an SI, we would be taking an enormous risk by relying on its benevolence.

Does this mean that the Singularity is something that must be prevented at all costs? No; instead of trying to stop the massive future changes, which would probably be pointless anyway (as well as undesirable, because, paradoxically, we need these new technologies to survive), the best course of action is to cooperate and become as rich and influential as possible before the Singularity arrives. Only then can we hope to have a reasonable chance to survive and profit from this event (in other words: we must become SIs ourselves). Therefore, one of the major goals of the Transtopia Project will be to form a transhuman mutual aid group, whose purpose is to provide its members with a good starting position for the Singularity. This project is a logical extension of the autonomous island plans. After all, it would be a shame to go through the trouble of setting up a transhuman nation, only to be wiped out a couple of decades later.

To recap, the main (heavily interconnected) goals of the Transtopia Project are:

Disclaimer: although at first glance it may look like one, this is not (just) another of the many (pseudo-religious) millennium-craze survivalist groups that have popped up in recent years. It is merely a coincidence that the Y2K glitch happens around the same date on which all kinds of outrageous "ancient" prophecies are supposed to be fulfilled; we regard it as a purely practical problem/opportunity, not something "mystical". As for the Singularity: it has nothing to do with the year 2000. The event, a logical consequence of advances in various fields of technology (see also above), will probably take place somewhere between 2005 and 2100, unless of course some worldwide disaster occurs before then. (Interestingly, Y2K could, in a very extreme scenario of accidental nuclear strikes and the like due to computer failure, prevent or at least delay the Singularity. In fact, a delay is almost certain because of the expected global recession that will follow Y2K, which means less funding for all kinds of scientific research.) Other events that could stop or delay the Singularity include a massive meteorite impact, WW3, and a naturally occurring or man-made killer virus: obviously, "cures" that are at least as bad as the "disease".

Of course, both the millennium bug and the Singularity may turn out to be relatively minor problems, or, in the case of the latter, even a completely positive or mild event. However, in cases like these, where so much is at stake, it's much better to be safe than sorry. Besides, the autonomous island project is useful no matter what happens; whether the world remains its overregulated self or lapses into complete chaos, we can always use a safe haven. The usefulness of the "wealth project" is obvious, and staying abreast of the latest technologies won't exactly hurt you either.

For the time being, the main function of this web page will be to point people to the Transtopia mailing list, where the above-mentioned ideas & proposals, plus any more or less related topics (libertarian, skeptic, atheist, transhuman, generally freedom-related, etc.), can be discussed. Useful results from the discussions will be added to this site, so that it can become a reference point for transhumanistically inclined pioneers.


Link to us!




Miscellaneous thoughts on sea-based autonomy | The Transtopian Principles | Transhumanism | >H Links Collection | Vernor Vinge on the Singularity (article) | Five Things You Can Do To Fight Entropy Now (essay) | Law of the High Seas | Mailing list | BreakOut (a game) | E-mail








Last updated: 03-09-98