First, sorry about the long absence. Second, sure, most of us would like to, but who actually feels scientifically comfortable taking Ray Kurzweil at his word at all? Discuss.
LongeCity
Advocacy & Research for Unlimited Lifespans
Posted 06 November 2005 - 12:16 AM
Posted 08 November 2005 - 10:28 PM
Agree or disagree with Kurzweil's timetables, and BY HOW MUCH?
Posted 09 November 2005 - 12:16 AM
What do computer speed and storage capacity have to do with creating an artificial intellect?
Everyone here who honestly thinks "strong" A.I. is just around the corner is operating on assumptions that are based on further assumptions.
Posted 09 November 2005 - 12:36 AM
My main problem with Kurzweil is his convoluted values and his suspect motives. If you don't understand what I mean by this, then you too are a part of the problem....
pssstt....hey kids, can I sell ya on some of that new religion??
Posted 09 November 2005 - 03:03 AM
> I don't demand anything except that people stop operating on assumed outcomes of certain enterprises.

How else is there to operate? Not expecting the outcome sought? That's an irresponsible way to proceed. Indeed, there's a separate argument that enterprises can't proceed without having expected outcomes.
> Don't expect it to work ...

So basically you're saying it's better to engender X and not expect X than to engender X and expect X. What a trivial difference!
Posted 09 November 2005 - 04:24 AM
> ... I am simply saying that erroneous expectations are created by blind enthusiasm in the immortalist meme.

You were saying more than that, all of which was sufficiently invalidated. What you're "simply saying" is pointless.
> I really need to work on my articulatory skills.

Stop babbling and get to it, then.
Posted 09 November 2005 - 06:45 AM
Why have you decided to delete your posts, Justin?
Posted 09 November 2005 - 07:20 AM
> Justinb made a good point about a need for a hard science approach to the Technological Singularity theory.

What's this theory supposed to explain? That we can't be specific about post-Singularity states, that we need to be responsible, that transhuman intelligence is the last invention Homo sapiens needs to make? We already know this. What Justin wants is a theory that says no one's smarter than he is, and his vacuous high-IQ associates, and can't be.
> It would also be nice to hear more new voices, preferably via peer-reviewed research papers.

Do you know about the Journal of Evolution and Technology and the upcoming Future of Humanity Institute (among a number of other risk-management organizations which account for the Singularity)? Which new voices are you looking for, as opposed to whom?
Posted 09 November 2005 - 05:42 PM
> I believe much of what Kurzweil theorizes. However, there is more work to be done, whether or not the Singularity is inevitable.

Yes. I mean the same.
Posted 09 November 2005 - 06:53 PM
If all transhumanists and other technology progressives thought this was a new religion, then there might be a call for this suspicion. I cannot know what Kurzweil truly thinks, but when he talks about the spirituality of machines, it seems to me he is defending against those who insist on viewing technological progress as cold and unfeeling. Cast the products of technological progress in a spiritual light and you fend off one avenue of attack by critics.
I find much of value in Kurzweil's work, but there are a few beliefs he has in common with many transhumanists that bug me. One example: the insistence that humans merging with technology is a transcendent event in a spiritual, sum-greater-than-its-parts sense. There is nothing transcendent about any of this unless the definition of transcendence is restricted to "surpassing others." Transhumans and posthumans will be the result of rapid technological progression, not spiritual transcendence. That would be like calling complex weather patterns a transcendent phenomenon, when weather is instead a study in complexity. The posthuman historian will not need to resort to hand-waving to describe the evolution of humans and technology into posthuman forms.
However, I am not about to disregard everything the guy says.
Posted 09 November 2005 - 07:09 PM
1. Determinism is not a problem. Plenty of pleasurable experiences can occur in a deterministic universe.
2. Our personalities depend on a dynamic process of signals (information/forces), not necessarily on the material you associate with the human nervous system.
3. Cosmology does not take into account a universe saturated with intelligent processes. Don't assume entropy will kill us if you don't supply the rigor you demand.
Posted 09 November 2005 - 08:20 PM
> If you want to know what Kurzweil "truly thinks" then go to Barnes & Noble and read his new testament lying out there on the front display case.
>
> Ray Kurzweil: We need a new religion.
>
> Um, actually, no, we don't, Ray.

I think this is a bit out of context. Recall what he says on page 370:
> George Gilder has described my scientific and philosophical views as "a substitute vision for those who have lost faith in the traditional object of religious belief." Gilder's statement is understandable, as there are at least apparent similarities between anticipation of the Singularity and anticipation of the transformations articulated by traditional religions.
>
> But I did not come by my perspective as a result of searching for an alternative to customary faith. The origin of my quest to understand technology trends was practical: an attempt to time my inventions and to make optimal tactical decisions in launching technology enterprises. Over time this modeling of technology took on a life of its own and led me to formulate a theory of technology evolution. It was not a huge leap from there to reflect on the impact of these crucial changes on social and cultural institutions and on my own life. So, while being a Singularitarian is not a matter of faith but one of understanding, pondering the scientific trends I've discussed in this book inescapably engenders new perspectives on the issues that traditional religions have attempted to address: the nature of mortality and immortality, the purpose of our lives, and intelligence in the universe.
> Religion, an ambiguous term to be sure, but one that (to me at least) represents the large-scale unification of belief and the establishment of dogma; the ultimate subjugation of the will, the death of the free thinker. And for what? To satisfy the all too human psychological need for certainty and meaning?

If the death of the free thinker is subjugating the will to religion, then so is insisting on aimlessness. The free thinker needs wisdom and knowledge for direction. There must be some element of discipline and commitment here.
> But could not the amplification of the human mind to some unprecedented level of ultra-intelligence result in the radical reassessment of our values? Could this not, in a sense, be considered... transcendent?

No. Not really. There's a fundamental pattern in intelligence that we can all recognize. Intelligence is the process toward merging thought and being. Even those who are anti-intelligence are trying to be right about something. The mere act of trying to be right is an attempted step toward merging thought and being. One's values can probabilistically either move one away from, or move one closer to, the merging of thought and being. Anti-intelligence or ignorance is more likely to move one away. Intelligence or intellectual endorsement is more likely to move one closer.
> And also, if I may put forward one more rhetorical question, does transcendence necessarily demand a positive valuation?

It looks that way.
> Technologists often grant positive value to technological progress, but I have found that frequently their justification for a positive assessment is based on the satisfying of needs residing exclusively within the biological paradigm. This simply cannot do, for with a drastic redesign of the human mind, and the instantiation of various types of meta-programming, the current amalgam of urges, impulses, and base logic that are selected for or against in human societies (LL uses the term 'human selection') will finally give way to, or be eclipsed more fully by, the memetic paradigm. This is not to say that various forms of hedonism will not still be fashionable, only that such impulses' total influence over evaluative processes will be greatly reduced, if not eliminated entirely.

In this context "hedonism" would be more appropriately replaced with "eudaemonism." But even that's an insufficient description of the aims of responsible intelligence. I disagree that the needs of intelligence reside exclusively within the biological paradigm. We can reasonably anticipate that as we merge with nonbiological technology, and later become totally nonbiological, we will still operate with survivability, the concept of technology, and intelligence-enhancement paradigms.
Posted 09 November 2005 - 09:04 PM
Oh, and one more thing Justin.
Berkeley's inverted-monist ontology may be impenetrable, but it is also the laziest of philosophies.
Edited by justinb, 10 November 2005 - 07:42 AM.