<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
		>
<channel>
	<title>Comments on: Early Retirement</title>
	<atom:link href="http://www.foresight.org/nanodot/?feed=rss2&#038;p=2985" rel="self" type="application/rss+xml" />
	<link>http://www.foresight.org/nanodot/?p=2985</link>
	<description>examining transformative technology</description>
	<lastBuildDate>Wed, 03 Apr 2013 18:23:47 +0000</lastBuildDate>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=3.0.4</generator>
	<item>
		<title>By: the Foresight Institute &#187; Blog Archive &#187; Faster, less expensive medical diagnostics through nanotechnology</title>
		<link>http://www.foresight.org/nanodot/?p=2985#comment-1128603</link>
		<dc:creator>the Foresight Institute &#187; Blog Archive &#187; Faster, less expensive medical diagnostics through nanotechnology</dc:creator>
		<pubDate>Fri, 23 Mar 2012 21:33:00 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=2985#comment-1128603</guid>
		<description>[...] general intelligence could produce an &#8220;early retirement&#8221; for the human race (see &#8220;Early retirement&#8221; and &#8220;Early retirement — how soon?&#8221;). Perhaps the issue of how transformative [...]</description>
		<content:encoded><![CDATA[<p>[...] general intelligence could produce an &#8220;early retirement&#8221; for the human race (see &#8220;Early retirement&#8221; and &#8220;Early retirement — how soon?&#8221;). Perhaps the issue of how transformative [...]</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Sharad Bailur</title>
		<link>http://www.foresight.org/nanodot/?p=2985#comment-1070731</link>
		<dc:creator>Sharad Bailur</dc:creator>
		<pubDate>Wed, 14 Sep 2011 00:51:23 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=2985#comment-1070731</guid>
		<description>I think retirement is really doing what YOU want, not what others want. If the work required to be done is something that I want to do, I don&#039;t look upon it as work. Most creative work, for example, falls in this category. I write articles for various publications. But I do this when I feel like it. If I had to do it because a publication wanted me to, it would be work. Now it is a hobby.</description>
		<content:encoded><![CDATA[<p>I think retirement is really doing what YOU want, not what others want. If the work required to be done is something that I want to do, I don&#8217;t look upon it as work. Most creative work, for example, falls in this category. I write articles for various publications. But I do this when I feel like it. If I had to do it because a publication wanted me to, it would be work. Now it is a hobby.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: the Foresight Institute &#187; Blog Archive &#187; Report on Fourth Conference on Artificial General Intelligence published</title>
		<link>http://www.foresight.org/nanodot/?p=2985#comment-1070685</link>
		<dc:creator>the Foresight Institute &#187; Blog Archive &#187; Report on Fourth Conference on Artificial General Intelligence published</dc:creator>
		<pubDate>Tue, 13 Sep 2011 22:36:58 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=2985#comment-1070685</guid>
		<description>[...] President J. Storrs Hall, which seems to continue the theme of his March 2009 Nanodot post &#8220;Early Retirement&#8221;. The future could indeed be wonderful, if we have the foresight to get from here to there. [...]</description>
		<content:encoded><![CDATA[<p>[...] President J. Storrs Hall, which seems to continue the theme of his March 2009 Nanodot post &#8220;Early Retirement&#8221;. The future could indeed be wonderful, if we have the foresight to get from here to there. [...]</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: </title>
		<link>http://www.foresight.org/nanodot/?p=2985#comment-826188</link>
		<dc:creator></dc:creator>
		<pubDate>Wed, 25 Mar 2009 08:54:01 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=2985#comment-826188</guid>
		<description>Re &quot;It should, if possible, reflect the fact that this will be a major liberating event for the human race — no longer need we spend our lives in forced drudgery, since we have built machines to do the necessary work. But it should also reflect the fact that we need to be planning for it. Chris Peterson, who also came up with the term “open source,” suggested “early retirement.” I can’t think of a better one. Can you?&quot;

I find this very naive. After the Singularity, if such a thing is coming, we will have more options to choose from in a more complex world, so it seems evident that we will have _more_ problems to solve, not fewer, but with the power to do something about them. I see the Singularity not as early retirement, but as the passage to adulthood.</description>
		<content:encoded><![CDATA[<p>Re &#8220;It should, if possible, reflect the fact that this will be a major liberating event for the human race — no longer need we spend our lives in forced drudgery, since we have built machines to do the necessary work. But it should also reflect the fact that we need to be planning for it. Chris Peterson, who also came up with the term “open source,” suggested “early retirement.” I can’t think of a better one. Can you?&#8221;</p>
<p>I find this very naive. After the Singularity, if such a thing is coming, we will have more options to choose from in a more complex world, so it seems evident that we will have _more_ problems to solve, not fewer, but with the power to do something about them. I see the Singularity not as early retirement, but as the passage to adulthood.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: </title>
		<link>http://www.foresight.org/nanodot/?p=2985#comment-825628</link>
		<dc:creator></dc:creator>
		<pubDate>Mon, 23 Mar 2009 22:09:50 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=2985#comment-825628</guid>
		<description>If the singularity happens, it will be a great thing for humanity. Can you imagine? All will be living in peace and abundance. I really hope it happens. Obviously I don&#039;t think it will happen in the lifetime of anyone living today, but maybe our great-great-grandchildren will see the beginning of it.
Most of what was predicted to happen by 2009 did not happen. Exponential growth is a myth. It only applies to the processing power of computers as they try to follow Moore&#039;s law.</description>
		<content:encoded><![CDATA[<p>If the singularity happens, it will be a great thing for humanity. Can you imagine? All will be living in peace and abundance. I really hope it happens. Obviously I don&#8217;t think it will happen in the lifetime of anyone living today, but maybe our great-great-grandchildren will see the beginning of it.<br />
Most of what was predicted to happen by 2009 did not happen. Exponential growth is a myth. It only applies to the processing power of computers as they try to follow Moore&#8217;s law.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Michael Anissimov</title>
		<link>http://www.foresight.org/nanodot/?p=2985#comment-824765</link>
		<dc:creator>Michael Anissimov</dc:creator>
		<pubDate>Sat, 21 Mar 2009 19:46:33 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=2985#comment-824765</guid>
		<description>This catch-phrase and presentation completely ignores the grave risk of human-indifferent self-improving AI.  

It also implicitly states that qualitative superintelligence is impossible.  Why should humans be the qualitatively smartest possible beings?  To assume we are is very un-Copernican.  We should expect that many superintelligences will be ineffable and routinely think thoughts beyond our understanding.

Calling the Singularity an &quot;early retirement&quot; is misleading because it glosses over the unique risks (recursive self-improvement, superintelligence) of the event, as well as risks related to getting what we wish for in ways we don&#039;t anticipate or want, and the unique benefits (intelligence enhancement, pleasure engineering, creating completely new minds to communicate with).  It fundamentally presents advanced AI as more of the same, when advanced AI will actually seem very magical, foreign, and unusual when it is created.</description>
		<content:encoded><![CDATA[<p>This catch-phrase and presentation completely ignores the grave risk of human-indifferent self-improving AI.  </p>
<p>It also implicitly states that qualitative superintelligence is impossible.  Why should humans be the qualitatively smartest possible beings?  To assume we are is very un-Copernican.  We should expect that many superintelligences will be ineffable and routinely think thoughts beyond our understanding.</p>
<p>Calling the Singularity an &#8220;early retirement&#8221; is misleading because it glosses over the unique risks (recursive self-improvement, superintelligence) of the event, as well as risks related to getting what we wish for in ways we don&#8217;t anticipate or want, and the unique benefits (intelligence enhancement, pleasure engineering, creating completely new minds to communicate with).  It fundamentally presents advanced AI as more of the same, when advanced AI will actually seem very magical, foreign, and unusual when it is created.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: J. Storrs Hall</title>
		<link>http://www.foresight.org/nanodot/?p=2985#comment-824372</link>
		<dc:creator>J. Storrs Hall</dc:creator>
		<pubDate>Sat, 21 Mar 2009 00:54:58 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=2985#comment-824372</guid>
		<description>The Moore&#039;s Law growth rate is composed of various components, some of which are faster than an 18-month doubling time, some slower. The whole business is not well defined unless you are looking at some specific figure of merit (such as VLSI density) -- but when you do that you lose the relevance to the economy as a whole. So you have to work with fuzzy, order-of-magnitude figures.  That&#039;s why I said &quot;give or take a decade&quot; in the original essay.

I do think, btw, that one of the effects of widespread AI will be to push the growth curve towards a one-year doubling time rate.</description>
		<content:encoded><![CDATA[<p>The Moore&#8217;s Law growth rate is composed of various components, some of which are faster than an 18-month doubling time, some slower. The whole business is not well defined unless you are looking at some specific figure of merit (such as VLSI density) &#8212; but when you do that you lose the relevance to the economy as a whole. So you have to work with fuzzy, order-of-magnitude figures.  That&#8217;s why I said &#8220;give or take a decade&#8221; in the original essay.</p>
<p>I do think, btw, that one of the effects of widespread AI will be to push the growth curve towards a one-year doubling time rate.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: </title>
		<link>http://www.foresight.org/nanodot/?p=2985#comment-824069</link>
		<dc:creator></dc:creator>
		<pubDate>Fri, 20 Mar 2009 04:50:45 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=2985#comment-824069</guid>
		<description>I am annoyed with how so many Singularitarians greatly overstate the rate of computing power doubling per dollar.

They say it doubles every year, and thus increases 1000-fold every decade.

It does not.  The real rate is a doubling every 18 months, and thus 100-fold improvement every decade.

Thus, J. Storrs Hall&#039;s earlier sentence should mean that if AI costs $1 million in 2025, it would be $1000 in 2040, and $1 in 2055.

Don&#039;t overstate the rate of progress.  All these futurists, in 1999, thought that by 2009 all sorts of advances would have happened.  Now that it is 2009, I don&#039;t see any of them owning up to their overstatements.</description>
		<content:encoded><![CDATA[<p>I am annoyed with how so many Singularitarians greatly overstate the rate of computing power doubling per dollar.</p>
<p>They say it doubles every year, and thus increases 1000-fold every decade.</p>
<p>It does not.  The real rate is a doubling every 18 months, and thus 100-fold improvement every decade.</p>
<p>Thus, J. Storrs Hall&#8217;s earlier sentence should mean that if AI costs $1 million in 2025, it would be $1000 in 2040, and $1 in 2055.</p>
<p>Don&#8217;t overstate the rate of progress.  All these futurists, in 1999, thought that by 2009 all sorts of advances would have happened.  Now that it is 2009, I don&#8217;t see any of them owning up to their overstatements.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: </title>
		<link>http://www.foresight.org/nanodot/?p=2985#comment-823646</link>
		<dc:creator></dc:creator>
		<pubDate>Thu, 19 Mar 2009 04:57:08 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=2985#comment-823646</guid>
		<description>&#039;Maybe more homicide than suicide, unless you count drug use, unhealthy sexual behaviors, and violent associations as a form of suicide.&#039; 

A lot of this would also change. We&#039;ll have other technologies also rising during these same time periods. If VR can mimic reality to a degree that we can&#039;t tell the difference, then let people engage in whatever form of &#039;unhealthy&#039; sex they like, so long as it&#039;s virtual. Let them enter into gigantic versions of GTA and blast the hell out of each other. Frankly, if they want to be wireheads, and if we have technology that prevents us from being overburdened by their irresponsibility, then let them bliss their lives away.

I&#039;ve never fully understood the hatred (or, perhaps to be fair, outrage) people feel towards people on welfare. Of course, to be fair, I&#039;ve always been able to afford to live in neighborhoods, even in New York, where my interaction with people from the projects has been extremely limited. What confuses me is the generalization: the idea that anyone on any of these programs is and always has been a free rider. I support the existence of safety nets, imperfect as they are, because I believe that to not have them would be worse. That the externalities associated with their removal would impact me either more greatly or more unpleasantly than my higher taxes.

However, if I was someone barely getting by, working my ass off and hovering on the knife&#039;s edge, I suppose my feelings towards someone doing far less but getting damn near the same comfort would be rather sour...</description>
		<content:encoded><![CDATA[<p>&#8216;Maybe more homicide than suicide, unless you count drug use, unhealthy sexual behaviors, and violent associations as a form of suicide.&#8217; </p>
<p>A lot of this would also change. We&#8217;ll have other technologies also rising during these same time periods. If VR can mimic reality to a degree that we can&#8217;t tell the difference, then let people engage in whatever form of &#8216;unhealthy&#8217; sex they like, so long as it&#8217;s virtual. Let them enter into gigantic versions of GTA and blast the hell out of each other. Frankly, if they want to be wireheads, and if we have technology that prevents us from being overburdened by their irresponsibility, then let them bliss their lives away.</p>
<p>I&#8217;ve never fully understood the hatred (or, perhaps to be fair, outrage) people feel towards people on welfare. Of course, to be fair, I&#8217;ve always been able to afford to live in neighborhoods, even in New York, where my interaction with people from the projects has been extremely limited. What confuses me is the generalization: the idea that anyone on any of these programs is and always has been a free rider. I support the existence of safety nets, imperfect as they are, because I believe that to not have them would be worse. That the externalities associated with their removal would impact me either more greatly or more unpleasantly than my higher taxes.</p>
<p>However, if I was someone barely getting by, working my ass off and hovering on the knife&#8217;s edge, I suppose my feelings towards someone doing far less but getting damn near the same comfort would be rather sour&#8230;</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: JamesG</title>
		<link>http://www.foresight.org/nanodot/?p=2985#comment-823432</link>
		<dc:creator>JamesG</dc:creator>
		<pubDate>Wed, 18 Mar 2009 19:43:31 +0000</pubDate>
		<guid isPermaLink="false">http://www.foresight.org/nanodot/?p=2985#comment-823432</guid>
		<description>&quot;Other possibilities - transhuman times? timeless times? retirement times? leisure times? technical times? &quot;

I always used &quot;Nano-age&quot; to refer to this time period, and variations like &quot;Nano-society&quot; to describe the people who would live there, from the original definition of nanotechnology.  

“I think that most people would be devastated, misery and suicide would skyrocket, and senseless crime would not be far behind, if we did not have productive labor.”

Yes, without the ability to sit in a cubicle all day answering angry phone calls I would just be absolutely devastated... :rolleyes: From real-life VR games, to sports, to watching movies and listening to music and having parties and enjoying human replica fembots, I don&#039;t think I or anyone else is going to be anything like devastated.  Nice emotional word use, though.</description>
		<content:encoded><![CDATA[<p>&#8220;Other possibilities &#8211; transhuman times? timeless times? retirement times? leisure times? technical times? &#8221;</p>
<p>I always used &#8220;Nano-age&#8221; to refer to this time period, and variations like &#8220;Nano-society&#8221; to describe the people who would live there, from the original definition of nanotechnology.  </p>
<p>“I think that most people would be devastated, misery and suicide would skyrocket, and senseless crime would not be far behind, if we did not have productive labor.”</p>
<p>Yes, without the ability to sit in a cubicle all day answering angry phone calls I would just be absolutely devastated&#8230; :rolleyes: From real-life VR games, to sports, to watching movies and listening to music and having parties and enjoying human replica fembots, I don&#8217;t think I or anyone else is going to be anything like devastated.  Nice emotional word use, though.</p>
]]></content:encoded>
	</item>
</channel>
</rss>