<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
		>
<channel>
	<title>Comments on: Singularity, part 2</title>
	<atom:link href="http://www.foresight.org/nanodot/?feed=rss2&#038;p=2959" rel="self" type="application/rss+xml" />
	<link>http://www.foresight.org/nanodot/?p=2959</link>
	<description>examining transformative technology</description>
	<lastBuildDate>Wed, 03 Apr 2013 18:23:47 +0000</lastBuildDate>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=3.0.4</generator>
	<item>
		<title>By: Nanodot: Nanotechnology News and Discussion &#187; Blog Archive &#187; Forward to the past</title>
		<link>http://www.foresight.org/nanodot/?p=2959#comment-818388</link>
		<dc:creator>Nanodot: Nanotechnology News and Discussion &#187; Blog Archive &#187; Forward to the past</dc:creator>
		<pubDate>Tue, 03 Mar 2009 15:15:42 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=2959#comment-818388</guid>
		<description>[...] As far as the Singularity is concerned, Charlie&#8217;s notion of it is not the same as mine, so arguing this point would be a case of talking past each other. But I will strongly claim that an AI/nanotech revolution that kicks the economy into a growth mode that looks like Moore&#8217;s Law, &#8220;is going to affect everything.&#8221; [...]</description>
		<content:encoded><![CDATA[<p>[...] As far as the Singularity is concerned, Charlie&#8217;s notion of it is not the same as mine, so arguing this point would be a case of talking past each other. But I will strongly claim that an AI/nanotech revolution that kicks the economy into a growth mode that looks like Moore&#8217;s Law, &#8220;is going to affect everything.&#8221; [...]</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: </title>
		<link>http://www.foresight.org/nanodot/?p=2959#comment-816283</link>
		<dc:creator></dc:creator>
		<pubDate>Tue, 24 Feb 2009 03:53:15 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=2959#comment-816283</guid>
		<description>I think this argument that the singularity can be grasped by looking at the world economy is a bit strained. A true singularity would rather quickly unleash MNT. With that, the market and the entire scarcity presumption behind traditional economics would be largely over. If that isn&#039;t an economic singularity, I find it difficult to imagine what would be. Simultaneously, the cost per unit of intellectual work, increasingly the only kind with much economic value, would drop precipitously given AGI. So costs of production quickly tend toward zero. The idea that these sorts of changes would result in only a factor-of-4 increase is utterly bizarre.</description>
		<content:encoded><![CDATA[<p>I think this argument that the singularity can be grasped by looking at the world economy is a bit strained. A true singularity would rather quickly unleash MNT. With that, the market and the entire scarcity presumption behind traditional economics would be largely over. If that isn&#8217;t an economic singularity, I find it difficult to imagine what would be. Simultaneously, the cost per unit of intellectual work, increasingly the only kind with much economic value, would drop precipitously given AGI. So costs of production quickly tend toward zero. The idea that these sorts of changes would result in only a factor-of-4 increase is utterly bizarre.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Nanodot: Nanotechnology News and Discussion &#187; Blog Archive &#187; Singularity, part 3</title>
		<link>http://www.foresight.org/nanodot/?p=2959#comment-814895</link>
		<dc:creator>Nanodot: Nanotechnology News and Discussion &#187; Blog Archive &#187; Singularity, part 3</dc:creator>
		<pubDate>Wed, 18 Feb 2009 20:14:36 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=2959#comment-814895</guid>
		<description>[...] In the previous essay in this series, I argued top-down, from historical and economic precedents, that the coming singularity might look approximately like the second half of the computer/internet revolution. Today I&#8217;ll argue the same conclusion from the bottom up: by looking at things from the point of view of the individual AI. [...]</description>
		<content:encoded><![CDATA[<p>[...] In the previous essay in this series, I argued top-down, from historical and economic precedents, that the coming singularity might look approximately like the second half of the computer/internet revolution. Today I&#8217;ll argue the same conclusion from the bottom up: by looking at things from the point of view of the individual AI. [...]</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: </title>
		<link>http://www.foresight.org/nanodot/?p=2959#comment-814672</link>
		<dc:creator></dc:creator>
		<pubDate>Wed, 18 Feb 2009 03:52:03 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=2959#comment-814672</guid>
		<description>&quot;Every 18 months to two years, the technology exists to put twice as many transistors on a chip. These new transistors are twice as fast as the prior generation. Hence we obtain 4X computer speed in two years. What follows is 2X computer speed every year.&quot;

That is actually not correct. This is a sentence you have lifted from Kurzweil.

Every 18 months, transistor sizes shrink by half. Period. That is why we get a doubling every 18 months. The new transistors are not twice as fast as the previous ones; they are the same speed.

The proof is in the pudding. Today, 1 GB of RAM is about $20. 15 years ago, in 1994 (10 doublings, or 1024X), 1 MB cost $20. 15 years before that, in 1979 (another 10 doublings, or 1024X), 1 KB cost $20. In 2024, 1 TB will cost $20.

So the 18-month doubling is quite exact. It is not 12 months.</description>
		<content:encoded><![CDATA[<p>&#8220;Every 18 months to two years, the technology exists to put twice as many transistors on a chip. These new transistors are twice as fast as the prior generation. Hence we obtain 4X computer speed in two years. What follows is 2X computer speed every year.&#8221;</p>
<p>That is actually not correct. This is a sentence you have lifted from Kurzweil.</p>
<p>Every 18 months, transistor sizes shrink by half. Period. That is why we get a doubling every 18 months. The new transistors are not twice as fast as the previous ones; they are the same speed.</p>
<p>The proof is in the pudding. Today, 1 GB of RAM is about $20. 15 years ago, in 1994 (10 doublings, or 1024X), 1 MB cost $20. 15 years before that, in 1979 (another 10 doublings, or 1024X), 1 KB cost $20. In 2024, 1 TB will cost $20.</p>
<p>So the 18-month doubling is quite exact. It is not 12 months.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: </title>
		<link>http://www.foresight.org/nanodot/?p=2959#comment-814324</link>
		<dc:creator></dc:creator>
		<pubDate>Mon, 16 Feb 2009 21:09:47 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=2959#comment-814324</guid>
		<description>Every 18 months to two years, the technology exists to put twice as many transistors on a chip. These new transistors are twice as fast as the prior generation. Hence we obtain 4X computer speed in two years. What follows is 2X computer speed every year.</description>
		<content:encoded><![CDATA[<p>Every 18 months to two years, the technology exists to put twice as many transistors on a chip. These new transistors are twice as fast as the prior generation. Hence we obtain 4X computer speed in two years. What follows is 2X computer speed every year.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: </title>
		<link>http://www.foresight.org/nanodot/?p=2959#comment-813828</link>
		<dc:creator></dc:creator>
		<pubDate>Sun, 15 Feb 2009 13:36:27 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=2959#comment-813828</guid>
		<description>Actually, it&#039;s easy to produce greater-than-human intelligence: give a human a
    pencil and paper. Such an augmented human can multiply 10-digit numbers,
    something an unaided human has a really hard time doing. Want an even more
    intelligent system? Give the human a computer connected to the internet and
    access to Google. We humans have been making ourselves more intelligent in ways
    like this for our entire history as a species, and — by Jove — yes, there has
    been a positive feedback loop and yes, there has been an exponential increase.

Although I agree with you, this doesn&#039;t capture the essence of what I, at least, consider to be the Singularity. It is a strong critical argument against the Singularity actually happening as proposed by SF writers.
Humans have been increasing their intelligence slowly over millennia; who is to say that a slightly more intelligent machine or man-machine symbiosis will increase its intelligence billions of times faster?

    To sum up: the &quot;Singularity&quot; should best be thought of as the second half of the
    information technology revolution, extending it to most physical and
    intellectual work. Overall economic growth rates will shift from their current
    levels of roughly 5% to Moore&#039;s Law-like rates of 70% to 100%. The shift will
    probably take on the order of a decade (paralleling the growth of the internet),
    and probably fall somewhere in the 20s, 30s, or 40s.

Wasn&#039;t part of the original singularity idea that an AI more intelligent than humans would emerge?

If this is possible, I&#039;d prefer to look at our current situation as something similar to when the first protozoan lifeforms emerged.
The elemental structures are there for further development, and most computers are connected nowadays. The existing lifeforms (data viruses) are, however, barely the equivalent of biological viruses.
If/when sentient beings arise from this stratum, then we will have what I would call the Singularity.

I think your post is a very interesting read, but the economic analysis doesn&#039;t have much to do with my image of the singularity.

Comparing the history of killer whales with that of humans would probably turn up more similarities than a comparison between humans and sentient computers.
(After all, both humans and killer whales eat other organisms to survive, are prone to mutations, and have sex, offspring, and different civilizations.)

I&#039;m not saying that your analysis is wrong, just that the artificial lifeforms will have an entirely different economy; it will interfere with our own and, if so inclined, dominate it within a century or two. This is a situation that I don&#039;t think can be compared to anything less than the emergence of Homo sapiens sapiens, if even that.</description>
		<content:encoded><![CDATA[<p>Actually, it&#8217;s easy to produce greater-than-human intelligence: give a human a<br />
    pencil and paper. Such an augmented human can multiply 10-digit numbers,<br />
    something an unaided human has a really hard time doing. Want an even more<br />
    intelligent system? Give the human a computer connected to the internet and<br />
    access to Google. We humans have been making ourselves more intelligent in ways<br />
    like this for our entire history as a species, and — by Jove — yes, there has<br />
    been a positive feedback loop and yes, there has been an exponential increase.</p>
<p>Although I agree with you, this doesn&#8217;t capture the essence of what I, at least, consider to be the Singularity. It is a strong critical argument against the Singularity actually happening as proposed by SF writers.<br />
Humans have been increasing their intelligence slowly over millennia; who is to say that a slightly more intelligent machine or man-machine symbiosis will increase its intelligence billions of times faster?</p>
<p>    To sum up: the &#8220;Singularity&#8221; should best be thought of as the second half of the<br />
    information technology revolution, extending it to most physical and<br />
    intellectual work. Overall economic growth rates will shift from their current<br />
    levels of roughly 5% to Moore&#8217;s Law-like rates of 70% to 100%. The shift will<br />
    probably take on the order of a decade (paralleling the growth of the internet),<br />
    and probably fall somewhere in the 20s, 30s, or 40s.</p>
<p>Wasn&#8217;t part of the original singularity idea that an AI more intelligent than humans would emerge?</p>
<p>If this is possible, I&#8217;d prefer to look at our current situation as something similar to when the first protozoan lifeforms emerged.<br />
The elemental structures are there for further development, and most computers are connected nowadays. The existing lifeforms (data viruses) are, however, barely the equivalent of biological viruses.<br />
If/when sentient beings arise from this stratum, then we will have what I would call the Singularity.</p>
<p>I think your post is a very interesting read, but the economic analysis doesn&#8217;t have much to do with my image of the singularity.</p>
<p>Comparing the history of killer whales with that of humans would probably turn up more similarities than a comparison between humans and sentient computers.<br />
(After all, both humans and killer whales eat other organisms to survive, are prone to mutations, and have sex, offspring, and different civilizations.)</p>
<p>I&#8217;m not saying that your analysis is wrong, just that the artificial lifeforms will have an entirely different economy; it will interfere with our own and, if so inclined, dominate it within a century or two. This is a situation that I don&#8217;t think can be compared to anything less than the emergence of Homo sapiens sapiens, if even that.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: </title>
		<link>http://www.foresight.org/nanodot/?p=2959#comment-813733</link>
		<dc:creator></dc:creator>
		<pubDate>Sun, 15 Feb 2009 09:27:59 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=2959#comment-813733</guid>
		<description>Kurzweil is consistently wrong, overstating the rate of progress by 50%. In &#039;The Age of Spiritual Machines&#039;, he says that &#039;computer power doubles every year&#039;. It simply does not.

This is why his 2009 predictions (made in 1999) have mostly not happened. Most of them will happen by 2014, but most will not happen by the end of 2009.

Kurzweil&#039;s 2045 estimate for the Singularity is absurd. That is just 36 years from now. The Singularity will be in 2060-65.</description>
		<content:encoded><![CDATA[<p>Kurzweil is consistently wrong, overstating the rate of progress by 50%. In &#8216;The Age of Spiritual Machines&#8217;, he says that &#8216;computer power doubles every year&#8217;. It simply does not.</p>
<p>This is why his 2009 predictions (made in 1999) have mostly not happened. Most of them will happen by 2014, but most will not happen by the end of 2009.</p>
<p>Kurzweil&#8217;s 2045 estimate for the Singularity is absurd. That is just 36 years from now. The Singularity will be in 2060-65.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: </title>
		<link>http://www.foresight.org/nanodot/?p=2959#comment-813458</link>
		<dc:creator></dc:creator>
		<pubDate>Sat, 14 Feb 2009 15:41:52 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=2959#comment-813458</guid>
		<description>First, Kurzweil uses the 18-month doubling time in his predictions; if you read The Singularity Is Near, you&#039;ll see this (although some related technologies do double in a year). Second, the 2045 date is not based on Moore&#039;s Law directly; it&#039;s the time when Kurzweil thinks a machine will be equivalent to many, many human brains (1,000 or 1 million, I forget) for $1,000. It&#039;s an ultra-conservative estimate, imo. Even a supercomputer capable of human-level AI (which may be possible as soon as a couple of years from now) should be able to kick-start the singularity, depending on things which are simply not known for certain yet. People just don&#039;t agree on these things, so what can you say.</description>
		<content:encoded><![CDATA[<p>First, Kurzweil uses the 18-month doubling time in his predictions; if you read The Singularity Is Near, you&#8217;ll see this (although some related technologies do double in a year). Second, the 2045 date is not based on Moore&#8217;s Law directly; it&#8217;s the time when Kurzweil thinks a machine will be equivalent to many, many human brains (1,000 or 1 million, I forget) for $1,000. It&#8217;s an ultra-conservative estimate, imo. Even a supercomputer capable of human-level AI (which may be possible as soon as a couple of years from now) should be able to kick-start the singularity, depending on things which are simply not known for certain yet. People just don&#8217;t agree on these things, so what can you say.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: </title>
		<link>http://www.foresight.org/nanodot/?p=2959#comment-813313</link>
		<dc:creator></dc:creator>
		<pubDate>Sat, 14 Feb 2009 00:11:42 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=2959#comment-813313</guid>
		<description>&quot;Overall economic growth rates will shift from their current levels of roughly 5% to Moore’s Law-like rates of 70% to 100%.&quot;

Wait, the world economy is growing at 5%? I thought we were in a recession!

Also, Moore&#039;s Law works out to about 58% a year, not 70-100%.</description>
		<content:encoded><![CDATA[<p>&#8220;Overall economic growth rates will shift from their current levels of roughly 5% to Moore’s Law-like rates of 70% to 100%.&#8221;</p>
<p>Wait, the world economy is growing at 5%? I thought we were in a recession!</p>
<p>Also, Moore&#8217;s Law works out to about 58% a year, not 70-100%.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: </title>
		<link>http://www.foresight.org/nanodot/?p=2959#comment-813312</link>
		<dc:creator></dc:creator>
		<pubDate>Sat, 14 Feb 2009 00:10:22 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=2959#comment-813312</guid>
		<description>This &#039;doubling every year&#039; exaggeration is why Kurzweil&#039;s predictions always come 50% sooner than the events actually happen.

Computing does NOT double every year. It doubles every 18 months. Thus, there are 20 doublings in 30 years, not 30.

Hence, the Singularity will happen in 2060-65, NOT in 2045 as Kurzweil says, nor even sooner as Storrs Hall says.</description>
		<content:encoded><![CDATA[<p>This &#8216;doubling every year&#8217; exaggeration is why Kurzweil&#8217;s predictions always come 50% sooner than the events actually happen.</p>
<p>Computing does NOT double every year. It doubles every 18 months. Thus, there are 20 doublings in 30 years, not 30.</p>
<p>Hence, the Singularity will happen in 2060-65, NOT in 2045 as Kurzweil says, nor even sooner as Storrs Hall says.</p>
]]></content:encoded>
	</item>
</channel>
</rss>