
MIT psychologist vs. frightening predictions

from the to-tell-or-not-to-tell dept.
Prominent MIT psychologist Steven Pinker predicts in Technology Review: "How far can this revolution in the human condition go? Will the world of 3000 be as unthinkable to us today as the world of 2000 would have been to our forebears a millennium ago?…The future, I suggest, will not be unrecognizably exotic because across all the dizzying changes that shaped the present and will shape the future one element remains constant: human nature…It is also far from certain that we will redesign human nature through genetic engineering. People are repulsed by genetically modified soybeans, let alone babies, and the risks and reservations surrounding germ-line engineering of the human brain may consign it to the fate of the nuclear-powered vacuum cleaner…Third-millennium futurologists should realize that their fantasies are scaring people to death. The preposterous world in which we interact only in cyberspace, choose the endings of our novels, merge with our computers and design our children from a catalogue gives people the creeps and turns them off to the genuine promise of technological progress."

18 Responses to “MIT psychologist vs. frightening predictions”

  1. Practical Transhuman Says:

    "Creepiness" is a matter of opinion.

    A hundred years ago, the idea that we could one day transplant organs and blood from one human body to the next, allow the races to mix freely, routinely practice extramarital sex and contraception, offer women the franchise and access to education and jobs, etc., would have given the conservatives of that era "the creeps." (Just look at all the 19th Century literature about the evils afflicting the "fallen woman" — that is, the woman who dared to engage in "forbidden" relationships, from Gretchen in Goethe's Faust to [fill in the blank -- there are so many of them]. Does anyone not living under a Taliban-like regime take that sort of thing seriously in the year 2000?) Today we think nothing of these practices, which in hindsight seem rational and enlightened.

    It's just a matter of time before even more radical changes in the human biological condition become accepted as normal, especially as the less squeamish among us adopt them, show their advantages, and allay fears that we've become "monsters" or lost our "souls" or some equivalent nonsense.

    Pinker's fears derive ultimately from the Frankenstein Myth. We'd be a lot better off today if Mary Shelley hadn't published that foolish novel.

  2. JohnAMontgomery Says:

    the creepy future

    Emerging technologies will be used to alter human nature. So why does Steven Parker keep on using it as a constant when he refers to the future? When he writes that the future he describes 'gives people the creeps', what people is he talking about? From what I have seen, people's reaction to predictions of the future is one of either excitement or fear. I feel, based on the attitude of his article, that Steven Parker is in the fear camp. Instead of hiding behind 'human nature' he should openly tell us what about the future scares him and why, so that we can discuss a future where human nature most likely will not be a constant and find a way to make it friendly and not so creepy!

  3. JohnAMontgomery Says:

    correction

    The author is Steven Pinker not Steven Parker.

  4. RobertBradbury Says:

    Pinker and most futurists are clueless

    I have to laugh at anyone who thinks that anything about the year 3000 can be discussed from where we are now. Human nature may remain constant or evolve slowly, but it will be largely irrelevant by 2100 or even sooner. We will transition from a pre-KT-1 civilization to a KT-2 civilization very rapidly during this century. The interesting question that people must face is whether or not they choose to self-evolve. If not (Pinker's perspective?) then they are relegated to the history books (like many other species that didn't make the cut). If individuals choose to self-evolve then the future certainly looks interesting, but cannot be commented on because it is so distant from our current experience and perspective.

  5. AndreasLigtvoet Says:

    Endless discussion

    It seems to me this type of discussion is endless and similar issues have been raised for hundreds of years (as Pinker points out). Some say technology will change humans, some say it should not. Instead of saying one vision is true and the other is not, let's try to combine these and see what the effect on society is. As technology becomes more advanced and 'alienating' to a large group of people, they will want to 'opt-out'. These are the enviro-technical conservatives, who value a certain state of nature and humans. On the other hand there are the 'pioneers' who will embrace enhancing technologies. At least in a modern Western way of thinking it is their good right to do so. They will 'evolve' quicker than the conservatives and might have all sorts of competitive advantages. I do not believe, however, that this will convince the largest part of the other group. In short: there will be a rift in society. If one group is allowed to do things, the other should also be allowed not to. The question is: how will these groups interact?

  6. JohnAMontgomery Says:

    Re:Endless discussion

    Hopefully with a higher level of evolution there will come a higher level of ethics. For example, in the United States there are the Amish, who live the lifestyle of a pre-industrial society. They have the freedom to do this because the law protects them. Those of us who choose to make the evolutionary leap when the opportunity becomes available should, I feel, leave alone the rest of humanity who desire to keep their current evolutionary and technological level.

  7. kurt Says:

    Internet Hermits? Nah

    Why does anyone think that cyberspace will do away with cities and human interaction? I use the internet a lot. However, I enjoy my night life just as much today as I did ten years ago. The internet will never replace my social life. This is a groundless fear.

  8. prion Says:

    Re:Internet Hermits? Yes

    I, and many I know, are almost Asimovian in our electronic hermitage. "Creepiness"? Only from the Sheeple. Bring It On. Future Is Now.

  9. Iron Sun Says:

    Re:"Creepiness" is a matter of opinion.

    This post smacks of what I find to be disturbingly common transhuman Olympian arrogance. Or perhaps petulance.

    For a start, organ transplants and blood transfusions have brought great benefits, sure, but they have also given us a new set of problems. HIV screening of blood donors and Chinese death row prisoners being organ harvested, for example. This isn't to say that these procedures that save so many lives should be discontinued, but it shows that these are complex issues that must be thoroughly examined, not by individuals, but by society. It is far too easy to label an activity that one wishes to participate in as "rational and enlightened" as a way of stifling just such a debate on its merits.

    "Squeamish" is also a potentially short-sighted way of viewing misgivings about these issues. Atomic power may have brought many benefits, but few people extol the health benefits of radium-impregnated underpants anymore. We have absolutely no idea what sort of long-term effects these technologies will have on the human (yeah,yeah – or posthuman) psyche. Prudence would seem to be called for, but it seems to me that a lot of the most vociferous advocates for a headlong rush toward such transformations are behaving like an impatient, tantrum-throwing toddler who doesn't understand why Mummy and Daddy won't let them have a go driving the car NOW.

    Calling Mary Shelley "foolish" is just stupid. The urge to play God is a dark element of human nature that deserves to be written about in a way that will make people think about their actions. This example may sound like hysterical rhetoric, but Josef Mengele may well have used the same word to describe Frankenstein. If you try to tell me that we are different or more moral than the Nazis I will laugh in your face.

    The important thing here is to realise that a lot of so-called "regular" people are scared shitless about all this talk of uploading brains and so on. To dismissively label their fears as irrational or unenlightened won't help change their minds. You can believe in the rightness of your position all you want, but if a grassroots campaign to legislate against such technology is started, I don't think being arrogant or elitist will help.

    Disclaimer: I use the phrase "playing God" but I am not religious. I use the word "prudence" but I believe in progress. I think that these technologies will be liberating, that they will make possible wonderful, creative ways of life that I for one want to be a part of. I just think that we need to be careful.

  10. jbash Says:

    Re:"Creepiness" is a matter of opinion.

    For a start, organ transplants and blood transfusions have brought great benefits, sure, but they have also given us a new set of problems. HIV screening of blood donors and Chinese death row prisoners being organ harvested, for example. This isn't to say that these procedures that save so many lives should be discontinued, but it shows that these are complex issues that must be thoroughly examined, not by individuals, but by society.

    Insofar as anything that can reasonably be called "society" exists, it does not have the cognitive capacity to examine anything. Individuals think. Groups do not think.

    Now, organizations (like governments, which are not the same thing as society, and shouldn't be allowed to get away with claiming to speak for it) may have processes for examining issues. However, those processes are so different from the internal workings of a mind that an organization's "examining" something is at best tenuously similar to an individual's "examining" it. Certainly the two shouldn't be presented in opposition.

    Furthermore, in my opinion and that of a lot of other people who've thought a lot about it, groups (as opposed to the members of those groups) have no rights that an individual is bound to respect… only individuals have rights.

    Put another way, "society", whatever the hell that means, can suck my weenie. But that's irrelevant to the main point…

    Prudence would seem to be called for, but it seems to me that a lot of the most vociferous advocates for a headlong rush toward such transformations are behaving like an impatient, tantrum-throwing toddler who doesn't understand why Mummy and Daddy won't let them have a go driving the car NOW.

    First of all, I don't know that anybody is advocating a headlong rush toward anything.

    The point is that things will happen in their own time. The idea that you can significantly delay them is arrogant stupidity. The best you may be able to do is to shape them to some degree as they happen. Hiding from them is not an option.

    Which brings us to this…

    The important thing here is to realise that a lot of so-called "regular" people are scared shitless about all this talk of uploading brains and so on. To dismissively label their fears as irrational or unenlightened won't help change their minds.

    Their fears, rational or irrational, enlightened or unenlightened, are irrelevant. I share many of those fears, but that's really just tough for me. The future is not optional, regardless of how afraid you are of it.

    Why should anybody want to change their minds? It might even be good to have them afraid.

    You can believe in the rightness of your position all you want, but if a grassroots campaign to legislate against such technology is started, I don't think being arrogant or elitist will help.

    In the long term, such a campaign, whether successful or not, will not prevent the development or use of these technologies. Full stop. If they're possible (and many, many very "scary" technologies look possible), then they will be developed.

    Now, it's true that having these things outlawed is likely to cause them to be developed in such a way that we all get killed, or worse. That's a problem.

    However, let's be clear on what you're advocating. You seem to say that the predictions people are giving out about technology are likely to create a fear-based backlash.

    I think you think that backlash would be bad because it would prevent development of the technology. I think that backlash would be bad because, although it would not prevent the development of the technology, it would shape the future in a potentially fatal way. It would appear that we both think it would be a bad thing.

    Now, you seem to advocate that, in order to prevent the backlash, we stop giving the public our best predictions about the things technology will make possible. Ignoring the fact that it's clearly impossible to make everybody shut up about such things, isn't what you're advocating, really, that we should just lie to the public to keep them docile?

    If I thought that tactic would work, I might be persuaded to use it. I'm that worried; I see this as a matter of survival. Since I don't think the tactic will work, I don't have to deal with the very serious moral issues involved in lying to people about what's likely to happen to them. How do you feel about those moral issues?

  11. RobVirkus Says:

    Pinker has some points.

    I am amazed at the increasing acceptance of unquestioned ideas growing up along with the field of Nanotechnology: rampant speculation about self-aware computers, uploading, transhumanism, etc. Pinker touched on some of my long held suspicions that critical thinking is becoming as loose as our morals. Human nature is a force much underestimated. It will be the same for the indefinite future, and we will pay a heavy price if we ever think we have transcended it. Nanotechnology will come, and it is precisely because of human nature that the mission of organizations such as the Foresight Institute is so critical.

  12. fool Says:

    Trying to put words in another's mouth

    Each time you used the phrase "you seem to advocate that" you proceeded to spout something which bore little relation to the post you were replying to.

    Rather than advocating technofear and a restriction of discussion, Iron Sun seemed to be pointing out that being dismissive toward this fear would not do much to alleviate it, and that this would be dangerous, because mob fear could kill us all. On this last point, you actually seem to be in agreement with Iron Sun, which makes it strange that you felt inclined to shove an opposing idea in their mouth.

    You display a rather helpless attitude to the future: "I'm just one person, what can I do about it?" True, the future is not optional, but the form it takes is. We are about to go through the equivalent of discovering fire, but will we build a useful campfire or start a destructive forest fire? Iron Sun seemed to be advocating that the best way to get a useful result would be to include everybody in the process rather than leaving the decision to a tiny elite, which seemed to be the real intent behind the individuals/society comment, and Iron Sun's main reason for rebuking Practical Transhuman for being dismissive of technofear.

    I actually thought they both made very good points, while you added little to the debate because of your dichotomous attitude; i.e., you disagreed with Pinker's article, and P.Transhuman disagreed with him, therefore P.Transhuman was on "your side", which meant I.Sun must be on the "other side". This rigid mindset will leave you a gibbering wreck in the years to come.

  13. jbash Says:

    Re:Pinker has some points.

    Although I agree that many of these speculative things (and I think, here, particularly of uploading) may never come to pass, or may not come to pass for a very long time, or may not ever have a significant number of users if they do come to pass, I don't think that there's any doubt that a lot of very strange and disturbing capabilities will come our way.

    We may not know exactly which of these ideas will pan out, but it would appear almost certain that some of them will. Hell, there are things that are almost entirely in reach right now that have enormously disturbing implications.

    As for the more speculative stuff, it's true that there is a certain amount of unquestioning acceptance around. There's also a certain amount of very well-informed opinion. I've tried to become well-informed, and things like strong AI and very serious human augmentation become more plausible the more well-informed I become… although at the same time, the likely time table seems to get longer the more I learn.

    … and nobody doubts that human nature will still be around for quite a while. That's one of the things we're all worried about, since part of human nature is not being trustworthy when tremendous power is involved.

    The questions are:

    • What forces will be around in addition to human nature? There's a very good chance that there may be players that aren't human. Those players may make humans and their nature largely irrelevant, depending on your view of what's important.

    • Perhaps more important, because the premise is more certain: what will human nature do with an enormous amount of power?

    As far as I can see, the biggest piece of wishful thinking in this whole discussion is Pinker's original idea that, because most people find something distasteful, it will never be done. I see no support for that anywhere. If something is possible, somebody will do it. Depending on what it is, it may not matter if most people are scared of it… the effects still take place.

    Actually, that's the second-biggest piece of wishful thinking. The biggest piece of wishful thinking was Pinker's implicit and unsupported claim that it had been disproven that human nature made war inevitable…

  14. fool Says:

    Have you even read Frankenstein?

    Because I believe that Shelley's premise was not technofear but that technology merely expresses the motives of the tool wielders. Indeed, she presented the Monster as a rather compassionate creature to begin with, while Frankenstein was shown to have brought about his destruction through his own driven arrogance.

    I think we'd be a lot better off today if more people had actually listened to what the woman was saying.

  15. jbash Says:

    Re:Trying to put words in another's mouth

    Hmm. You have a point; I may have gone too far with some of that. I don't think I extrapolated as far as you seem to think (or indeed any further than you yourself are extrapolating), but too far nonetheless. For that, I apologize to all.

    … but what leads you to think I'm not a gibbering wreck now?

  16. RobVirkus Says:

    Re:Pinker and most futurists are clueless

    There is no data to suggest whether human nature can be changed by self-directed evolution or not. There is no data to suggest that human nature will be irrelevant by 2100 or sooner. It is merely an assumption that those who choose to self-evolve will turn out better or have better lives than those who don't. They may unwittingly evolve themselves out of existence. We can't know until something happens. Pinker is insightful rather than clueless.

  17. MarkGubrud Says:

    and I don't even read extropians

    Pinker uses the word "preposterous" to dismiss a large and growing body of people and their ideas, but doesn't seem to have spent much time talking with or thinking about either.

    I read another piece where Pinker questioned the plausibility of "uploading" schemes. Fine, but what does he have to say about the growing readiness of people to accept the proposition that life as software is a potentially attractive alternative to death? That the ascendance of technology and decline of humanity, or at least the transformation of humanity into a race of "augmented" cyborgs is inevitable, and even desirable, as the next stage of "evolution?" What is behind this growing cult of technology? Could the famous psychologist shed some light on that?

    It is not enough to assert that such ideas, which Pinker only alludes to, not even discussing them in any depth, are "scaring people." Sure, they worry a lot of people, myself for one. But what is scariest is the number of people around who appear to know a lot more and have given much more thought to these issues than Pinker, and who profess not to be worried.

  18. Jeffrey Soreff Says:

    Re:and I don't even read extropians

    It bothers me that Pinker's entire set of conclusions hinges on the rejection of human genetic engineering. For good or ill, is this really plausible for a thousand years??? Cell manipulation requires devices of modest size (and, over the long term, cost), which makes it a lot harder to regulate than nuclear reactors or ICBMs. Many parents are highly motivated to give their children any advantage that they can get their hands on. Genetic technology is immature enough today that I can easily believe that "designer babies" are more than a decade away, but… a millennium???

    Consider also that our technology already tweaks human nature, albeit in small ways. Consider Valium and Prozac… These don't rewire the human brain, of course, but even now, even without genetic engineering, let alone MNT, they make the average concentrations of neurotransmitters in our population a bit different than they were a century ago. Just normal biomedical progress (even without MNT) is going to yield a wider range of more specific CNS drugs. Even minor progress in drug delivery systems will probably allow dribbling the drugs into specific areas of patients' brains, which considerably widens the possible useful effects one could get. For good or ill, I'd expect that at least one of these options is going to be useful enough to become common, and will alter "human nature" in some significant way.

    Short of a full stop to medical progress, I find Pinker's projection of a substantially unchanged human nature in 3000 AD very implausible.
