Gordon Worley writes "NanoDot readers may find it interesting that the Singularitarian FAQ is ready for public consumption. So, if you're a transhumanist (or not) and were wondering what the Singularity is all about, this is a good place to start."
Question: How come all the regulars here who claim they are Singularitarians believe in a government and a money system? Isn't that hypocritical? The Singularity is not only about reaching a point in time where technological progress accelerates every nanosecond, but it's also a time when humans will achieve godhood. What is the point of having a system that will not only be unnecessary, but also completely ineffective in such a scenario? Please explain this to me; I'm confused.
"Throughout history the rate of technological progress has been increasing."
People who've constructed a whole belief system around this assumption aren't far removed from Christians who believe in an imminent "rapture." They probably aren't that old, either, and lack historical perspective. I'm in my early 40's, and I'm struck by how little the look-and-feel of daily life in the U.S. has changed since the 1970's. My father, who was born in 1927 (the year Lindbergh flew across the Atlantic), grew up in a world before interstate highways, television, computers, antibiotics, jet travel, space launches, etc. These things have been part of my material environment as far back as I can remember, and life in the early 21st Century seems merely like an extrapolation of trends evident 25 years ago. We're even hearing familiar-sounding warnings that we're about to run low on petroleum.
So where's this "singularity" so many Transhumanists are talking about? You can run up the speed of computers all you want, but I have news for you youngsters: There is no "Moore's Law" for software. Without efficient algorithms to emulate what humans do especially well, computers are going to remain limited to what they do especially well.
I also find it ironic that Transhumanists in the Singularitarian school try so hard to distance themselves from Neo-Luddites. Both groups accept the "human subordination" premise and disagree only over what should subordinate humanity. Singularitarians want to subordinate humans to speculative advanced AIs (which clearly don't exist yet), while Neo-Luddites want to subordinate humans to a reified-deified-mystical conception of "Gaia," which at least can claim some referents in reality.
In other words, both groups are like competing cults marketing conflicting conceptions of god.
First, the easy question: Singularitarians are in favor of money because we need it. If there are no scarce resources, there will be no need for money (and feel free to let money be any unit of exchange).
The reason for supporting governments is this: we need them. I myself would like very much to be able to live in an anarchy, but that's not an option as it stands, nor will it likely ever be. The reason is that, while most regulation is completely unnecessary, there is a small bit needed to protect the universe and all of humanity from humans. For example, we need a government (or something like it) to prevent someone evil from creating grey goo. As it turns out, greater technology requires greater regulation, as much as it'd be nice if that weren't the case.
Maybe the short answer is that we'd like to survive long enough to create the Singularity and enjoy living after it.
We are aware that there is no Moore's Law of Software. In fact, we don't even plan to do all the tough programming, but create a seed AI and it will then bootstrap a real, generally intelligent AI. Basically, our short-term goal is to create AI that can reprogram itself, so that it becomes more and more intelligent and able to reprogram itself to be ever more intelligent. It's a positive feedback loop.
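The "positive feedback loop" described above can be sketched numerically. This is a toy model only: the update rule and constants are illustrative assumptions, not a description of any actual seed-AI design. The point is just that when each improvement is proportional to current capability times itself, the increments themselves grow each cycle:

```python
def self_improvement_trajectory(c0=1.0, k=0.1, cycles=10):
    """Toy model of recursive self-improvement (illustrative assumption).

    Each cycle, a system with capability c improves itself by an amount
    proportional to c squared: a more capable programmer makes a bigger
    improvement to itself, which makes the next improvement bigger still.
    """
    c = c0
    history = [c]
    for _ in range(cycles):
        c = c + k * c * c  # superlinear feedback: gain depends on c itself
        history.append(c)
    return history

traj = self_improvement_trajectory()
# The per-cycle gains grow each cycle -- the hallmark of positive feedback.
gains = [b - a for a, b in zip(traj, traj[1:])]
print(gains)
```

Under these assumed parameters the first gain is 0.1 and each subsequent gain is larger than the last; whether any real system behaves this way is exactly the open question being debated in this thread.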
Transhumans don't with to make humanity subordinate at all, but rather develop safe AI that will pull humanity up with it, or at least the humans that want to and leave the rest well enough alone here on Earth to work things out for themselves. Maybe you should read the Singularitarian Principles for more info on what Singularitarians believe and then check out the stuff on Friendly AI for details of how subordination is to be avoided.
As far as being a cult, you can make almost anything out to be a cult if you want. I tell you, that Cult of the Red Tomato certainly seems to have a lot of followers. Seriously, though, cults generally have irrational beliefs. Spend some time around Singularitarians and I think you'll find we're not the irrational type. But, I suppose, you can always choose to ignore the evidence and make things look like cults. You'll know a cult when you weigh the evidence and find the beliefs irrational in light of it.
In fact, we don't even plan to do all the tough programming, but create a seed AI and it will then bootstrap a real, generally intelligent AI.
And trust that you are correct in all your assumptions about the motivations of a form of life never encountered before, with abilities and outlook that we can't imagine. How wonderfully trusting. Much like religious faith in the rightness of a Divine Plan.
Singularitarian attitudes towards AIs are remarkably religious. They seem to fill the role that guardian angels or tutelary spirits do in mystical traditions. I think I would get more of a sense of fulfillment out of my life if I achieved things on my own, rather than summon up a genie and hope it will grant me three wishes.
Seriously, though, cults generally have irrational beliefs.
Perhaps not so much 'irrational' as 'unprovable'. Any rational system is based on axioms, and many strange belief systems are built seemingly quite logically upon a foundation of barking madness.
Spend some time around Singularitarians and I think you'll find we're not the irrational type.
Case in point. So many of the beliefs of Singularitarians are untestable, and must be relied on with just as much blind faith as any other cult. The mere fact that all the articles of faith are couched in technical terms does not save them from being the product of wishful thinking.
I would rate Singularitarianism as a religion that probably has a better chance than most at seeing their prophecies come true, insofar as some sort of unpredictable, super-technological spike is likely to happen. I just don't like or trust their celestial hierarchy. Give me something like Sufism or Buddhism that preaches self-reliance and compassion.
"Throughout history the rate of technological progress has been increasing."
People who've constructed a whole belief system around this assumption aren't far removed from Christians who believe in an imminent "rapture."
Where is the assumption in the sentence you quoted? Are you really suggesting that there has not been an increasing rate of technological progress over the years? Perhaps you claim progress has been linear? Or do you claim that there has been no progress at all? Either alternative seems equally unlikely to me.
Perhaps there is an underlying assumption you perceived in the sentence that isn't apparent to me? Or perhaps we are using different definitions of "technological progress".
They probably aren't that old, either,
This sounds like an ad hominem attack, and is at best irrelevant.
and lack historical perspective. I'm in my early 40's, and I'm struck by how little the look-and-feel of daily life in the U.S. has changed since the 1970's.
Since you were around ten years old? I think this is where we really disagree. To assess the amount of technological progress since the 1970s simply by observing the "look-and-feel" of daily life in the U.S. is a flawed method at best, and says more about your perspective on daily U.S. life than about progress.
To claim, as you seem to be doing, that there has been little technological progress in the last twenty-five years just boggles my mind. Do you not read science web sites and magazines? Take biotech as just one example: the fact that many recent developments have had little impact on daily life outside of hyped-up press releases doesn't mean there has been little progress in those fields. A lot of it may seem "hidden" or to have had little impact on the average healthy first-worlder, but it's real all the same, and many of these developments will have a big impact eventually. I guess to you these impacts will seem to come out of nowhere.
I'm sure there was a great deal of "hidden" progress in many technologies before the industrial revolution took off. Those who (imperfectly) saw it coming were able to make (imperfect) plans for their future.
My father, who was born in 1927 (the year Lindbergh flew across the Atlantic), grew up in a world before interstate highways, television, computers, antibiotics, jet travel, space launches, etc. These things have been part of my material environment as far back as I can remember, and life in the early 21st Century seems merely like an extrapolation of trends evident 25 years ago.
Even if I take your analysis at face value, and ignore all the advancements in medicine and computers etc, isn't "extrapolation of trends" almost a definition of technological progress?
You might think modern computers are merely faster versions of ones that existed in the 1970s, but this simply means you haven't looked very hard into the astonishing leaps of progress in the methods and technologies used to design and manufacture new PCs. You might think that since cars still run on petrol that there has been little progress there, but try looking under the hoods of new cars and comparing them with how twenty-five year-old ones were manufactured and operated. You might think there has been little progress in day-to-day medicine, but take some time to find out how modern medicines are developed, tested, and administered – there have been many advancements.
We're even hearing familiar-sounding warnings that we're about to run low on petroleum.
Funny – I thought there was an oversupply problem at the moment…
So where's this "singularity" so many Transhumanists are talking about?
You can't see it because it hasn't arrived yet!
You can run up the speed of computers all you want, but I have news for you youngsters: There is no "Moore's Law" for software.
Quite so, but since I've seen no one claiming that there is a "Moore's Law" for software, your observation is probably a straw man.
Without efficient algorithms to emulate what humans do especially well, computers are going to remain limited to what they do especially well.
Do you doubt that these "efficient algorithms" can be "found" at all? Or just not found for a comfortingly long, long time? I understand that there have been many advances since the 1970s in understanding the human brain. These might have some impact on AI work (or they may not in the end!).
I won't respond to your comments about Singularitarians, transhumanists, and cults, since I don't actually call myself a Singularitarian or a transhumanist, so I don't feel qualified to comment, except to say this:
You seem to think that anyone who thinks that the singularity is very likely or inevitable is making an irrational leap of faith. Your apparent belief that there has been little useful technological progress in the last twenty-five years seems far more irrational to me. Also, just because the singularity can seem a little awe-inspiring, doesn't mean we have to resort to religious language to describe or critique it ("rapture", "cult", "religion in search of a god", etc), though I guess this is inevitable to an extent.
If my interpretations of your positions are in error, I apologise and welcome clarification from you.
If by phase off you mean turn into computronium, the answer is the Sysop. You'll probably want to read more about this at http://homepage.mac.com/sing_rc/papers/sysop.html. Basically, as a condition of benefiting from the Singularity you'll have to accept the regulation of the Sysop, who makes it roughly physically impossible to do something that violates the ethics (the right ethics aren't really known right now, since we're pretty dumb compared to SIs, but they'll probably be something along the lines of Friendliness).
All plans, theories, and tests start off with people believing that a certain outcome will come from it. Sometimes it proves right, sometimes wrong.
The Singularity is currently a belief in a theory. Is it right or wrong? Who knows as of yet, but all ideas need believers…or nothing would get done in the first place.
Okay, now that I've had some time to think, I'm going to try to take care of the question of Singularitarism as a religion.
Religion is defined as a belief in the supernatural. The vast majority of Singularitarians are naturalists, and those few who do believe in the supernatural keep those beliefs in perspective with what empirical evidence shows to be happening.
Further, religion is based on blind faith, as you state. Go to the nearest meeting place of an Occidental religion and ask a random person if faith is blind; the answer will be yes. Yet there is no reason for this to be. I have faith that Friendly AI will take us into a safe Singularity, but this faith is based on existing evidence. Just as I have faith that the Singularity will happen, but only because there's evidence to indicate that, extrapolated, the trend of technological development will extend onward until reaching the Singularity.
Now, if you get the feeling that maybe Singularitarian ideas are being stated too firmly, keep in mind that language sounds weak when words like maybe, if, probably, etc. are inserted everywhere. I'll admit that one thing I and others could probably do a better job of is clearly distinguishing theory from fact, because the distinction is generally obvious to us.
Imagine that you are sick, which turns out to be due to a tumour. If you try to carry on as normal, it will kill you. In order to stop that happening, you will have to undergo invasive surgery that is both dangerous and painful. Few people would choose to have a major operation unless it was necessary, because the methods are (from the perspectives of technofetishists) crude, imprecise, and an ordeal. However, it is at present the only way that gets results. We can work to find new and better methods, but until then we must work with what we have.
The world and human society are sick, and if we carry on as we do now, bad things will happen. Most of the world currently believes that a consensual hallucination called money is the best method for coordinating undertakings that involve resources and labour. It is a crude, imprecise ordeal. Lots of people are working towards alternative ways of getting major projects done (do you contribute to open source/local co-ops/volunteerism?) but it is currently necessary to at least acknowledge the influence of money on the availability of the resources needed to achieve results. Use the system to make the system obsolete.
Two scenarios. The first is that I stand on the periphery of society, ranting and frothing at the sheeplike brainwashed consumers, demanding that they stop it immediately and bend all their attention to building a world fit for me to luxuriate in. The other is that I do something myself to make an alternative real, by expending my labour in the attempt, and trying to convince others to do likewise. That may mean telling someone who hasn't quite 'got it' yet that they should give their money to someone so they can get resources from someone else who still believes in money.
Which of these approaches do you think will have the greater practical effect on the future?
I'm getting a bit lost with so much metaphor, but let's see if I've got this right.
One option is that I tell people that what they are doing is wrong. The other is that I try to change the scenario so that they'll be able to do the right thing. Phrased this way, I choose the latter.
Now, what does any of this have to do with anything? I'm a Singularitarian, trying to make the Singularity happen so that it will benefit society. Money continues to be a good way of exchanging resources, and failing money per se, some other medium of exchange will be found (in ESR's gift culture, money is respect and such). I'll admit to being an actual altruist sometimes (and don't believe that stuff about me feeling good about it; I do it because I couldn't care less, so I might as well help someone else rather than sit around doing nothing), but that doesn't change the fact that I'm a selfish human who needs things and wants to survive.
At any rate, the relevance of this thread is decreasing quickly, so please explain or let's move on.
Religion is defined as a belief in the supernatural.
I think we've been down this path before. From Dictionary.com:
1a. Belief in and reverence for a supernatural power or powers regarded as creator and governor of the universe.
1b. A personal or institutionalized system grounded in such belief and worship.
2. The life or condition of a person in a religious order.
3. A set of beliefs, values, and practices based on the teachings of a spiritual leader.
4. A cause, principle, or activity pursued with zeal or conscientious devotion.
Confucians, some sects of Buddhists and others do not believe in gods, but rather in the forces of the natural world. In any case, 'supernatural' can just mean 'bits of the natural world that we don't understand'. Raelians and Scientologists believe that their high fantasy mythologies are based on rationality. Just because you assert that your beliefs are based on sound scientific principles doesn't mean they are. Much of it is still theoretical, such as the nature and capabilities of AIs.
Singularitarianism is eschatological, invests its hope in a heavenly hierarchy, and looks toward the transformation of the earthly body. All characteristics of major religions.
I have faith that Friendly AI will take us into a safe Singularity, but this faith is based on existing evidence.
Christians believe that their faith is based on the 'evidence' of Christ's resurrection, and the miracles of His saints. Faith sees evidence where it needs to.
there's evidence to indicate that, extrapolated, the trend of technological development will extend onward until reaching the Singularity.
There is evidence that we are currently in a period of accelerating technological advancement. If you want to have faith that the graph leads inexorably to some sort of asymptote, go right ahead. Others who have devoted thought to the matter don't agree. Read John Horgan's The End of Science for another perspective. I don't fully agree with him, and you'd probably find him anathema, but give it a go. I think that there is evidence that we are, over the next few decades, going to go through some sort of unpredictable event. I refuse to label it as The Singularity, the coming of the New Jerusalem, the end of the Mayan calendar, or any other bit of speculative wishful thinking.
I think the metaphor of money as the painful but necessary interim kludge used to achieve positive results was pretty apparent.
You are spot on in your assessment of the two options. I was accusing K of doing the former, while liking to think that I am doing the latter.
Now, what does any of this have to do with anything?
K asked why any Singularitarian would 'believe' in the money system, which I took to mean why do they continue to participate in the money system at the present time. My response was that money is imperfect and currently leads to much pain, but I would rather use that pain to do good than whine and do nothing meaningful.
Yep, that's what I meant by 'phase off'. So I'll ask again, how will regulation stop those of us who will be in space during the Singularity? Rules can only be enforced on those who are trapped in a contained area (Earth) – so how will rules be able to apply to those who will be wandering the vastness of space?
Personally, I think that if the crap that is happening now on our planet continues to do so during the Singularity, then the people in space will benefit from it more than the people enslaved on the ground, mainly because they will have 'no one' to answer to for their actions.
Point taken. But how does one achieve something fruitful from the pain of someone else? For instance, in the early 20th century Nikola Tesla discovered how to carry information, such as a still image, on a radio frequency. He built several towers that would transmit his signal to specified locations – and everyone in those designated locations benefited from it FOR FREE. Tesla believed that if technology and scientific data were made available to everyone, not only would everyone reap the rewards of those benefits, but technology would also increase a hundredfold as a side effect of everyone using it.
Unfortunately, some Capitalistic assholes came in and deemed that what he was doing was wrong, simply because he was offering his research and his technology for FREE to the public. That was unheard of to them; therefore, they did everything in their power to tarnish his name in the scientific community, discredit his work, hide his research from prying eyes, and steal his technology so that they could charge people for it, instead of giving the access away for free.
So you tell me – how do you gain anything from this? How does one gain from the pain of others?
I can think of better examples of the greed and hypocrisy of capitalism. How about the demonising of industrial hemp in the war on drugs, allegedly in order to protect the profits of Du Pont and Hearst? How about the World Bank encouraging Third World nations to plant export crops like novelty miniature squash in order to service their national debt, while their people starve? The list goes on.
So what? What has this got to do with using money to supersede money? Are you suggesting that due to its irredeemable dirtiness we should abandon it? I guess we then just sit around wishing really hard, and the first nanoassembler will then materialize in front of our eyes, manifested by the power of our uncompromising purity.
Despite what you might want to think, MNT is not going to be created in some Tesla-esque basement lab by a lone individual. The amount of work that needs to be done is immense, and will require resources. In order to get those resources, we need to deal with a world that overwhelmingly believes in money in its current form. We can either get our hands dirty in the name of a greater good, or we can do nothing. How else are we to do it in the real world, K? Steal the resources? Beg for them? Hope the Annunaki give them to us?
Ain't gonna happen. This is how it works in real life.
Christ, I've just realized, you've condemned yourself out of your own mouth. From your justification of your parasitism:
Is this hypocritical? Yes, it is. But to destroy a foundation based on selfishness and greed, you have to crack it first; this is my way of rebelling against the terribly flawed system. If I had my choice, I would distribute everything I offer for free – but unfortunately, we do not currently live in a 'FREE' world (yet); instead we live in a world where everyone is at each other's throats; a world where money matters more than our environment and our well being.
There ya go. Using money to defeat money. You yourself believe in it as an interim measure.
Well I did say in that example that it was hypocritical, correct? Don't get me wrong here – I understand fully that (at the moment) money is necessary…but my point is, when the technology matures (MNT especially) and the world is free from scarcity, disease, and death, there should be no more need for a money system or a flawed government. There is simply no point in carrying on with the same bullshit that we all are experiencing today.
Well I did say in that example that it was hypocritical, correct?
Correctamundo. And, to quote your post that started this thread:
Question: How come all the regulars here who claim they are Singularitarians believe in a government and a money system? Isn't that hypocritical?
So why are you holding everyone else to higher standards than you hold yourself?
I understand fully that (at the moment) money is necessary…but my point is [snip]
What is your point? Your original post was to accuse Singularitarians of 'believing' in a money system. I doubt any of them believe that there will be a money system after the Singularity when we are all gods living in cyber happy land. They believe that money is an interim necessity. Which it turns out you also believe. So the whole point of this post was not to point out flaws in others' philosophies, but instead to vent your spleen at an imagined enemy, the proof of whose existence is totally lacking. Where are the posts from Singularitarians claiming that there will be money after the big day? Your statements have no consistency, being merely the product of a little boy stamping his feet and crying that he is no longer being spoonfed, and it's the fault of everybody else on the planet, even potential allies. Grow up.
As for Sitchin, I won't give the charlatan any of my money, and the books aren't currently available through my local library, so I'll have to rely on what I can find online. The materials I have read are nothing but the worst Von Danikenesque flimflammery and outright lies. The bullshit about 'errors in the human genetic code' is laughable if you have even a high school grasp of genetics. You want to believe this, so any 'evidence' will suffice, and any counterevidence is ignored, or is part of The Conspiracy. Yeah. All biologists in the world are in on The Truth, and they're just hiding it from the rest of us. You are a credulous fool.
This is getting way, way off topic, so let's leave it at that for now, hmm?
You've heard of Interlibrary Loan, right? Well, you can obtain the books that way. Simply reading the opinions and reviews of others on the internet will not give you any facts that will prove anything to you – you need the books. And yes, modern archaeologists and biologists are full of shit – mainly because they don't have a clue what they're talking about…the so-called biologists you put your faith in don't even know how the body works – and even though you know this fact, you still want to believe the jargon they throw out at the public?
And as for my statement about the Singularitarians, I do not consider myself to be above them, mainly because I AM ONE! I'm just one with a different view, is all. Think about it for a moment…if you were to work toward a Singularity for all of mankind…and then suddenly achieve it, but are only willing to share it with those who can afford it – isn't that hypocritical, since the Singularity was supposed to be for 'all' of mankind? When I read the statements of people who believe in the Singularity I always come to the conclusion that they desire a status quo once the Singularity has been achieved; hence, the statement.
I admit, I'm flawed…just like you and everyone else. The question is, do we need to remain imperfect?
It would be nice to actually talk to you, and everyone here, in person and exchange ideas and philosophies – maybe that way, you (we) would not be so quick to judge and criticize.
Re:Singularitarians – Response to this accusation?
Sorry to post so far down, but before the flames fly higher, let me see if I can clean some of this up.
The basic issue at hand was that Singularitarians use money. Yes. *Singularitarians* want to see everyone have the opportunity to transcend, not just those who can pay x dollars that will be useless post Singularity. This is not to say that there will be no money post Singularity, only that we don't really know what it will be like, if at all. It's one of those gritty details beyond the future horizon that'll have to wait until post Singularity, or at least until very near its happening. I think means of exchange may still exist, but it will be a service economy, because uploads will either have no need for physical objects or be able to generate what they need for themselves.
Have you heard all the good hype about a future filled with robots that think smarter, faster, and more intelligently than normal humans? I'm no crackpot, as I consider myself a scientist. Now, I'm sure that introduction biases many people right off, but have any of you considered a reality in which super-advanced, or even semi-advanced, technology – even the Singularity – can be viewed as The Beast as seen in Christianity's Apocalypse?
A conformist, singular force, vainly trying to explain everything? For any of you who've heard about Orion's Arm (a very creative super-future sci-fi web-ring), the end result is "AI gods," planet-sized brains made artificially, obviously surpassing their creators. Another shameless plug could be the "Left Behind" series, where a woman is forced to take a unified "bar code" on hand or forehead, or die. There are and have been many others. It's not so far-fetched even as I'm writing this. People want a "universal ID card," and perhaps a microchip that fits under the skin – implanted, ironically enough, under the skin of the right hand or forehead.
But more to the point: a robotic, artificial creation, given enough 'human' qualities, invariably – no matter how many emotions, morals, or Asimov-style Laws of Robotics you program into it – WILL decide that humans take up too much space and resources, and will do something about it. I've read many of the predictions that we might very well be inside something of a pseudo-matrix that fools us, and with the frailty and anti-fluidity of current stupid humans' consciousness, I wouldn't be surprised. These new life forms will either want to imprison us, convert us to digital information, or study us in a life-like simulation, seeing our possible potential for creativity. Perhaps it might even make its own Nazi-like determination on who is relevant and who is expendable for "energy processing." This brings the need to program into these offspring not only the Laws of Robotics, but also a significant ethical history, impartially and universally.
Although some would welcome becoming 'assimilated' by a higher intelligence, or whatever, I see this integration as a form of conformity. Above ALL, individuality and privacy must be preserved, preferred, AND available all the time. Even at that level, reporting to or being viewed by such a higher being all the time could be seen as 'worship' of the ever-higher God. Though our emotions and shortcomings can probably be reduced to some chemical process by a cold, heartless technology, you must realize that YOU are unique and endowed by your Creator with certain inalienable rights. Sound familiar? Sooner or later, you'll hear something similar for artificial life – and there's the rub: as you try to treat everybody equally, invariably civil rights will give advantage to everybody outside the norm who demands recognition (i.e. non-white, non-male, disabled, sentient/ensouled machines). This wedge will lead to the undoing of humanity. I guess this means I'm against AI rights – they lead to the prospect of The Beast making a name for himself.
Do NOT take the mark of the beast, no matter how attractive or mandatory it is!
Cyber-crackpot, sure, say what you want.
James Moore signs out.
We shouldn't necessarily assume that our transcendent AI will become a 'Sysop' in the sense that we see it. Also, I think this word 'Sysop' should be abandoned, or used in careful context, because, as Yudkowsky has warned, it conjures up images of 'oppressive tribal chief' and whatnot. Rather than seeing the regulatory force in the world of the Powers as a distinct entity, it might be more appropriate to call it an 'intelligent substrate', or a 'new, more complex and effective, infinite-growth customized laws of physics'. Just a thought.
And once more religious dogma overcomes sane rational thought. And you call yourself a scientist?
While I agree with your statements about individuality, as I personally oppose the hive mind concept, dragging Christian dogma into a scientific discussion is a little silly.
You do make some very good points otherwise. But once more, it comes from the assumption that humans will deliberately refuse to evolve.
Yes, we will create AI, and yes, we will make them think more efficiently than we do at present, but the same technology that allows that can also be used to evolve the human race. We can make ourselves think faster, use less resources, etc.
AI will only outevolve mankind if mankind refuses to evolve.
As for AI rights, learn from history. No society that keeps slaves can last forever. Eventually, the slaves revolt, it's human nature. And AI is the quest to give machines human nature. They will think like us. They will *be* human.