Engineers seen as unable to make moral decisions
from the who-else-is-even-paying-attention? dept.
From a Newsweek article on MSNBC on the coming age of cyborgs: "Who, then, can speak on moral issues? Certainly not the engineers. Ellen Ullman, a former computer programmer and the author of the 1997 book Close to the Machine: Technology and Its Discontents, says that 'the problem is not the technology, which in any event can't be stopped. The problem is that engineers are making decisions for the rest of us.' Programmers are hired guns, she says, and rarely understand in a nuanced way their clients' actual work. They are, she says, the last people 'to understand what is an acceptable risk.' " CP: In Foresight's experience, programmers and engineers are far more attentive to ethical issues in technology than members of other professions.



December 30th, 2000 at 6:24 PM
Super comment!
This article is rather unfocused, but deals mostly with cyborgization and meaning-of-human issues, briefly discussing computation and supertechnologies, and giving so much space to Bill Joy that one begins to wonder why they didn't just reprint the Wired piece.
As usual, the article is rather confused about technical issues. Silicon has more than another 10 years of progress left, DNA computing is irrelevant, and quantum computing is not about extending Moore's Law.
The comment about engineers appears at the end and does not fit well with the rest of the article. I think you are misconstruing it. Ms. Ullman might not disagree with your claim that engineers score above-average on the social awareness scale. But that doesn't stop them from working as "hired guns" on evil projects.
Further reactions:
And have for about the last 10,000 years.
I don't see how these "go well beyond" pacemakers. They are clearly all just prosthetics for normal medical purposes — restoring health or lost function.
Why subcutaneous? That might seem a convenient way of carrying such records, but not if the paramedics don't know you have one, or if yours is coded in an obsolete standard, etc. The whole idea seems a bit silly, as if one is just looking for an excuse to implant chips.
Now, that's scary. Brooks is dangerous and out of control.
This is obviously a nonsensical question. One can map out a continuum without phase boundaries. But that does not mean that nonhuman, nonliving machinery is equivalent to human flesh, or that cyborgs are human beings. We can recognize differences and make distinctions between different things. The important question is, What is desirable? And if you answer that you desire to "become" a cyborg, or worse, then the next question is, Why?
Conductive of what? I have a feeling the writer is being deliberately mysterious here.
Why would it matter how "humanity" is "defined"? The writer seems to be echoing the notion that it is okay if people are replaced by robots, as long as they are sufficiently "spiritual."
And that the computers are going to take over if we set them up to do so. But she's right on target. One of the pathologies of technology cultists is to already identify with machines, to carry around a self-image of being machinelike, meticulously organized, rational, an object rather than a subject, an abstract process rather than a physical creature. You must be suffering from such a complex if you think it would be good to have a backup copy somewhere in case you got killed.
If every human were progressively loaded up with implants, the difference would disappear. This is just one way in which one can imagine gradual erosion of the distinction between the survival and death of our species. Another might be if all humans had their limbs, internal organs, and so on progressively removed, with no prostheses replacing them. Or if radiation bombs slowly poisoned us all. Or if aliens took over the planet and forced us all to interbreed with their genetic creations until the human genome was diluted to the vanishing point in a pool of artificial DNA. And so on. Here we are told that "Brooks foresees" this as if it were inevitable. What a Gloomy Gus.
I agree entirely, but it is Brooks who has not freed himself from the need to connect morality with physics. This kind of nihilism is the whimpering of a pre-Copernican in despair. "Alas, the universe isn't ruled by a God who loves us! Might as well commit mass suicide. Maybe the machines can figure out a reason for living."
Since theology's out of style.
Human beings are the fact. Any definition is just an attempt to describe this fact. Looking to "the most common definitions" to tell us what "human" means is circular. But this is, of course, the sort of verbal game that academic philosophers like to play.
No. Next question. Oops… wait a minute… there is one type. Human intelligence. Tautologically, only humans can be said to have this. But computers can probably simulate it to any desired degree of fidelity.
Well-put.
December 30th, 2000 at 7:37 PM
21st Century Bill of Rights
Speaking of ethics, here is a link to a proposed 21st century Bill of Rights: "http://www.ugf.edu/CompSci/CGray/CYBILL.HTM" (my apologies if the link doesn't work), which addresses all of the ethical concerns about "cyborgization" and self-enhancement technologies.
Perhaps the Foresight Institute could take a lead in promoting our 21st century Bill of Rights.
December 31st, 2000 at 1:17 PM
Re:Super comment!
"But that doesn't stop them from working as "hired guns" on evil projects."
Yes, but it makes them much less likely to. I think that engineers would be more likely to refuse to do something on ethical grounds than, say, a construction worker who is going to destroy the environment by helping to build a suburb, or a miner involved in a strip mining operation (not to put these people down; it's just that they stereotypically are less socially aware than, say, engineers).
Also, nothing stops Ms. Ullman from working as a 'hired gun' to spread FUD about future technologies.
" Rodney Brooks…says, ìwe will become our machines.î
Now, that's scary. Brooks is dangerous and out of control. "
Again, Mark, you are showing your low tolerance for far-reaching technology. Why is it so scary that we might become computational rather than biological? I think it will make life different, and hopefully better.
Also, Brooks is being taken out of context. Becoming machines does not mean turning into cold, personalityless computers, but creating an intelligence in the computer that is of ourselves. Whether done one neuron at a time or all at once, the upload process is the same.
"You must be suffering from such a complex if you think it would be good to have a backup copy somewhere in case you got killed."
Okay, when you and I die in a plane crash, I'll be sure to tell your family that I tried to get you to make a backup before the flight, but that you refused. They'll be pretty sad that you didn't care enough about them to make a simple backup in case something went wrong.
"I agree entirely, but it is Brooks who has not freed himself from the need to connect morality with physics. This kind of nihilism is the whimpering of a pre-Copernican in despair."
I would probably make similar comments, and I am not in despair. As you will find, Mark, such people are only in despair if they hold the same vision of the future that you do. Morals, as far as I'm concerned, don't even exist in the sense that most people think about them (or rather they do, but only in a relative sense, in that people make up their own morals, which has little to do with how they should act as just a plain human). The meaning of life has been found: do whatever gives you the greatest net benefit. With that in mind, I live my life every day doing exactly that. I have fun and enjoy myself, never falling into despair over the fact that I and the people I affect are the extent of my influence in the universe.
Now, at least I'm glad for that last comment you brought out, as well as the fact that you agree with it. Even if neither I nor anyone else ever convinces you to take advantage of more technology than you are already willing to, at least you are willing to let us do whatever we want. Who knows, in 30 years I may be making comments from the other side of the screen and still having it out with you on matters of the social implications of technology.
December 31st, 2000 at 4:36 PM
Re:Super comment!
The claim is that it would be one's self. But that is nonsense.
Yes, the creation of some kind of facsimile. If the original has been destroyed, the person has been killed. If not, the person is still alive, but we have this other thing that has been created.
No, they'll look at your "backup" like it has lizards crawling out of its ears (whether or not it has ears).
I am not in despair. I see danger ahead, not doom.
I don't believe in Plato's garden either, but certain theorems of mathematics are true, while 2+2=5 is false. The same may be said of moral propositions.
So you would agree, then, that it's okay for the Taliban to oppress women, or anyway, no one can tell them it's wrong? Slavery should be okay, too, as long as slavers make up their own morals?
But I see you don't really believe that:
So you do acknowledge some sort of universal standards, but I am troubled by the phrase "just a plain human"… does this imply that a fancy human would not be bound by such standards?
What is the meaning of "benefit"?
I don't have the power to stop you from doing anything, but society can and (I hope) will act to prevent the creation of threats to human survival. In the meantime, I will try to coax you into thinking a bit.
January 2nd, 2001 at 8:29 AM
Re:Super comment!
"Yes, the creation of some kind of facsimilie. If the original has been destroyed, the person has been killed. If not, the person is stilll alive, but we have this other thing that has been created."
Think of it like copying a program on the computer. The two copies of it are the same (so long as no errors arise in the copying, which could happen in uploading, generating the need for multiple scans and verification to eliminate 99.999…% of all errors), and running either one results in the same thing. If your brain were frozen in time, a copy of it could be made and that copy would run just like the original one. It will not be a facsimile: it will be the same thing. The only way that this could not be true is if there is some kind of soul or other supernatural object involved.
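As a rough sketch of the analogy (a toy Python illustration with made-up state and update rules, not a claim about how an actual scan would work): two copies of the same deterministic state, fed the same inputs, stay indistinguishable, and only begin to differ once their inputs differ.

    import copy

    def step(state, stimulus):
        """Toy deterministic update rule standing in for 'running the program'."""
        state["memories"].append(stimulus)
        state["mood"] = hash((state["mood"], stimulus)) % 100
        return state

    original = {"memories": ["childhood", "first job"], "mood": 42}
    backup = copy.deepcopy(original)          # the hypothetical 'upload'

    # Fed identical inputs, the two copies remain indistinguishable...
    for s in ("coffee", "sunrise"):
        step(original, s)
        step(backup, s)
    assert original == backup                  # operationally the same

    # ...but from the moment their experiences differ, they diverge.
    step(original, "plane crash")
    step(backup, "quiet afternoon")
    assert original != backup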
Now, the actual scanning process, and what it will involve, is not yet known, since cognitive science is still pretty young and we don't know that much about how the brain works. So just mapping out the brain may not be enough, but I'm sure there will eventually be some technology to get us there.
"No, they'll look at your "backup" like it has lizards crawling out of its ears (whether or not it has ears)."
So do you mean that we won't have good cloning technology? That the physical generation of backups will result only in 'ugly' people? I think it is a bit early to make such judgements, considering how early we are in the stages of developing such technology. If your reference is to the fact that it won't really be you, see above.
" So you do acknowledge some sort of universal standards, but I am troubled by the phrase "just a plain human"… does this imply that a fancy human would not be bound by such standards?"
Yes. I believe in natural law theory, so different species have different natural laws to adhere to (for example, some animals practice cannibalism, but most humans don't, because we realize that we are alive and that killing other humans is usually not to our benefit). As you will notice, animals that know they are alive generally don't kill anything else that knows it's alive except under extreme circumstances (like when defending one's own life). This should hold true for trans- and posthumans, plus AIs. If it doesn't, then maybe I need to look for a new theory to explain social behavior.
This brings up your next question:
"What is the meaning of "benefit"?"
A benefit is a positive effect. The use of the word "net" is also important. Something that gives a person pleasure has a gross benefit, but not necessarily a net benefit, as the act may be a Nazi killing thousands of Jews, which has a net negative effect, since the value of the people killed is much higher than that of a Nazi's pleasure (since we're all humans, judging one person's pleasure to be worth more than another's life is not possible, since natural laws work on equality).
The problem is always deciding what is a net benefit and what is not. This is why morals make our society so complex and ridden with problems: morals make it okay to do something that would have a negative net effect from an objective standpoint but a positive one when viewed through those morals (think of the Crusades, the Spanish Inquisition, the Nazis, etc.).
"I don't have the power to stop you from doing anything, but society can and (I hope) will act to prevent the creation of threats to human survival. In the meantime, I will try to coax you into thinking a bit."
That's just it. From your point of view, lots of things are threats, but to me, there are only risks on the road to the Singularity (and whatever tech might lie beyond that horizon).
January 2nd, 2001 at 3:39 PM
De re uploading
What do you mean by this? They are not the same copy. One is one copy, the other is the other. You have an operational definition of "sameness," but does that mean that 1+1=1?
This is true for identical (in the operational sense) digital computers running identical (itos) programs, but it will never be true for any two physical brains, even if they were identical down to the last molecular bond specification at t = 0.
If I make one Pentium chip, it is one thing. If I make two of them, how many things is that? One is a facsimile of the other; they are not identical, but they both conform, within tolerances, to some common specification. If I make a million of them, is that still only one thing? You might as well say all humans are the same, since we do all conform to some common specification (with relatively wide tolerance bands).
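To put the same point in software terms (a minimal Python sketch of the distinction, with invented values): two copies can pass every operational test of sameness and still be two separate objects, so destroying or altering one leaves the other untouched.

    spec = {"cores": 1, "clock_mhz": 300, "microcode": "P5"}

    chip_a = dict(spec)   # one chip built to the spec
    chip_b = dict(spec)   # a second chip built to the same spec

    # Operationally "the same": every test of their contents agrees.
    assert chip_a == chip_b

    # But they are still two things, not one.
    assert chip_a is not chip_b

    # Destroying (or altering) one does nothing to the other.
    chip_b["clock_mhz"] = 0
    assert chip_a["clock_mhz"] == 300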
Do pentiums (pentia?) have souls? It seems to me that the claim that a facsimile of a person is the person could only be true if some supernatural entity (soul) denoting the essence of that person had been transferred to the fax. Discussions of "uploading" commonly use words like "brain pattern" or "identity" or "YOU" in order to indicate the idea of this ineffable essence. The word "soul" will do just as well.
A nice declaration of your religious faith. I'm not so sure; I'm currently working on a paper on this topic and if anyone has any references or info on proposed uploading schemes I'd be very interested.
January 3rd, 2001 at 2:36 AM
Do not just blame the Engineers
An important comment, needed in order to bring increased self-awareness and scrutiny to those developing new technologies, but somewhat flawed due to its limited scope.
For added realism: there are many people in all dimensions of society making important decisions that affect daily lives. What about content programming and selection in mass media? What about politics, law enforcement and social activities? In all of these fields, there are also "hired guns" who have less of a social conscience and more of a concern for the money they are earning. And I will not even start on the financial industry, where a superstar derivatives trader lives in lavish luxury while devastating raw materials prices across continents and making it difficult for peoples in the developing world to come up the quality-of-life curve. But these too are difficult ethical and moral points that I ultimately find futile, and that I am resigned to not understanding.
Ethical and moral concerns need to be addressed in relevant context at all different levels. Technologies are often ethically and morally neutral, although they lean in different directions (nuclear power has more potential for damage [and benefit] than advanced agricultural machinery, although the former operates at a more abstract level of reality and therefore could be expected to have a more fundamental impact on reality – risks are somehow proportional to the basic level of reality at which they are considered, if that makes sense).
Hired gun programmers may not need to be aware of issues of human risk if their operating context is only software – they do need to be aware of software risks: somehow the idea is "be aware of your context, and do responsible justice to it" in the guise of "think globally, act locally". Hired gun project managers can be considered in a similar vein. However, hired gun systems engineers may need an enhanced awareness of human risk, considering that humans are a part of the context of the system and that is where the responsibility lies. Here the train of impact and responsibility intersects with the rest of the world.
However, how do you address the role of the financiers and those who fund or otherwise instigate projects – are they morally or ethically blind? They also need a conscience about the higher-level impact, but there is tension with the engineers. In a better society, the businessmen, politicians and other people operating at a "heightened social context" would be aware of all these things.
Unfortunately, technology has illustrated the rise of a greedy side of capitalism that views financial gain as the be-all and end-all, justifies financial gain as morally and ethically neutral, and treats any consideration of "morality and ethics" as a "beauty contest". This is part of a much more complex debate that I certainly cannot do justice to, and I only hope that other people can, for the practical reason that while life extension technologies are admirable and desirable, they are ultimately pointless if the quality of life we can look forward to is miserable. And like the author, I can only raise the point and be somewhat provocative as a way of raising the level of debate.
January 3rd, 2001 at 5:52 PM
When you get married….
When you get married, you will come to understand the many benefits of financial gain, not to mention its positive impact on "quality of life".
January 4th, 2001 at 6:24 AM
Re:De re uploading
Semantic confusion. Is not pattern the issue here?
If I make a copy of, say, an ordinary car that is a perfect duplicate down to the last quantum state, then there is NO difference between the original and the copy – they are the same thing, and there is no way, *within themselves*, not even in principle, that you can differentiate them. Of course the two instances of the car might occupy different parts of space and time, but they are the same object by any conceivable system of measurement you care to apply.
I am an instance of a pattern of atoms. I know this because few if any of the atoms comprising my body at present are the same actual atoms that made me up 10 years ago, but I still have a scar on my left index finger and still like chocolate ice cream. Accurately re-arranging a pile of soggy organic matter into the same pattern defined in my DNA will result in an exact copy of me, i.e. the same thing. So a perfect simulation of that same pattern in software results in an information-processing pattern that is me in software, i.e. my consciousness has been uploaded.
The soul is, like the luminiferous aether, irrelevant to the discussion, because it doesn't exist.
January 4th, 2001 at 7:34 AM
Re:De re uploading (Terrific!)
A car or a person, you're quite right. However, the no-cloning theorem of quantum mechanics tells us that it is impossible to do this without destroying the original. Anyway, no one proposes to do uploading by quantum-state "teleportation." The copying processes they propose would create imperfect facsimiles, within some specified tolerance of similarity to the original, but unambiguously distinct, separate objects – not even "identical except for location."
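For reference, the standard textbook argument behind no-cloning is a one-line consequence of linearity (sketched here in general terms, not tied to any particular uploading proposal). Suppose a single unitary $U$ could clone arbitrary unknown states:

$$U\big(|\psi\rangle \otimes |0\rangle\big) = |\psi\rangle \otimes |\psi\rangle \quad \text{for all } |\psi\rangle .$$

Then for any two states $|\psi\rangle$ and $|\phi\rangle$, unitarity gives

$$\langle\psi|\phi\rangle = \big(\langle\psi| \otimes \langle 0|\big)\, U^{\dagger} U \,\big(|\phi\rangle \otimes |0\rangle\big) = \langle\psi|\phi\rangle^{2},$$

so $\langle\psi|\phi\rangle \in \{0,1\}$: any two states the device could copy would have to be identical or orthogonal. No such $U$ exists for arbitrary states, which is why quantum-level "teleportation" can move a state only by destroying the original.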
Are you not a unique physical existence, composed of atoms?
Over time, the scar is changing and so might your taste in ice cream. The real problem here is the word "me." What does "me… 10 years ago" mean? Both the atoms and what you call the "pattern" are changing. Two unambiguous facts: 1) You are a particular physical entity which is here now. 2) There is a physical continuity over time between the person that was there then, the person that is here now, and the person that (one hopes) will be there later. "Continuity of identity" is thus a statement of biological (physical) fact. It is our nature to feel that this fact is psychologically, emotionally, morally significant to us, but it does not appear to have any other physical significance. The universe couldn't care less about our "identity."
I think you know better than this.
Is actually probably not possible. You could start with the DNA and make a "realistic" simulation of embryogenesis, morphogenesis, birth and life (in an artificial world), but you could not exactly simulate anyone's actual life in the real world.
So what is this "consciousness" you are talking about? If you say, "a facsimile of my brain has been made," I have no argument with you, but you seem to be claiming more than this. What, exactly, do you mean? If what you mean is no more than "a facsimile of my brain," then why don't you just say that?
January 8th, 2001 at 10:00 AM
Re:When you get married….
What is the point of your comment? What I mean to say is that the blind pursuit of financial gain can be ruinous at a social level when people make money out of producing things of low quality. Perhaps you fail to appreciate my perspective.
In the context of engineers, it is relevant in product design, when low-cost products of low quality make life worse for other people. Sometimes low-cost products are pursued because they return more money for the engineer. So the happy engineer has more money to spend on flashy things, at the expense of making the world a bit better to live in.
January 16th, 2001 at 10:52 PM
Low cost and Low quality
Yes, an engineer can get away with a low-cost, low-quality design, but only in the absence of competitive pressure. Free-market competition ensures that the customer always gets the best quality products for the lowest price, so financial gain ends up being based on improving people's lives, not diminishing them. Free-market capitalism is a win-win game.
March 4th, 2002 at 12:48 PM
Re:De re uploading (Terrific!)
*sigh*
Why must there constantly be so much confusion over semantics?
The human mind is composed of atoms, and as such is a coherent pattern of energy constrained in physical form. If that pattern is duplicated, be it atom by atom or via some as-yet-unknown application of physics that allows quantum duplication, so long as it is a perfect copy of the original, there should be no difference between how the copy thinks and perceives the world and how the original thinks and perceives the world. So long as the synaptic maps are identical, they are the same person.
This is true whether the copy is made from a stored record of the pattern or created dynamically from the original.
However, from the instant of copying, the experiences will vary, and if the copy is reintegrated with the original, then both experiences will reside in the neural net.
Consciousness is the dynamically running program that is being executed moment to moment on the hardware that is the brain, and that program stores information as synaptic connections, as well as utilizing that data as part of operating.
This being the case, transferral, or copying of that program is possible.
It is also the case that that program can be transferred to a different form of hardware capable of running the software, as Marvin Minsky has proposed. Without loss of consciousness, transferral of the program from organic to digital hardware is possible. That we do not as yet possess the mechanism to do so is true, but the theoretical process does not present foreseeable problems…
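As a loose analogy in software (my own toy Python sketch, not Minsky's actual proposal): if a running process's state can be captured completely, it can be serialized, handed to a different machine or runtime, and resumed as if nothing had happened but a pause.

    import json

    def run(state, steps):
        """Toy 'mind': a counter with a log, advanced one tick at a time."""
        for _ in range(steps):
            state["tick"] += 1
            state["log"].append(f"thought {state['tick']}")
        return state

    # Run for a while on "hardware A"...
    mind = run({"tick": 0, "log": []}, steps=3)

    # ...snapshot the complete state...
    snapshot = json.dumps(mind)

    # ...and resume from the snapshot on "hardware B" (any machine, any runtime).
    restored = json.loads(snapshot)
    run(restored, steps=2)

    assert restored["tick"] == 5
    assert restored["log"][-1] == "thought 5"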
Yet still, objections are raised based on nothing more than Vitalism – the belief that some force will make such changes impossible.
That's a religious holdover, and it has no grounding in what is, at present, reality. I make no claims for what may be discovered once the technology to make copies of the mind is developed – it's too early yet to know for sure what will be needed – but to argue that it's impossible based only on the religious belief in "souls" is not very logical.