If you could augment your intelligence, how far would you go?
#31
Posted 20 March 2008 - 06:35 AM
#32
Posted 20 March 2008 - 11:42 AM
I think that even if you know everything about a thing, you can still enjoy it. Maybe you enjoy it even more then. Take riding a bike as an example: if you are good at riding, you know pretty much exactly how the bike will behave and what happens if you do something. Still, it's much more fun to ride, do tricks, and play with it once you have mastered it than while you are learning to ride (at least for me). It's not all about acquiring knowledge; applying it is often much more fun.
I chose "smart enough to understand every known and unknown concept." And I agree with the person who said there will always be knowledge out there, so life should never get boring... if it does, then I think I would take away some of my genius abilities.
#33
Posted 21 March 2008 - 03:22 AM
#34
Posted 21 March 2008 - 03:55 AM
After getting the s**t kicked out of me by a hard final exam... yeah, I want to become as smart as possible without becoming unstable.
No, you don't want to do that. A lot of intelligent people are psychopaths or worse.
#35
Posted 21 March 2008 - 04:57 AM
After getting the s**t kicked out of me by a hard final exam... yeah, I want to become as smart as possible without becoming unstable.
No, you don't want to do that. A lot of intelligent people are psychopaths or worse.
But being intelligent doesn't necessarily mean the person will become "evil".
#36
Posted 21 March 2008 - 05:09 AM
After getting the s**t kicked out of me by a hard final exam... yeah, I want to become as smart as possible without becoming unstable.
No, you don't want to do that. A lot of intelligent people are psychopaths or worse.
But being intelligent doesn't necessarily mean the person will become "evil".
But it does mean an evil person can think of evil, viable schemes to fulfill their narcissistic needs and inflict far more damage than mere criminals.
#37
Posted 21 March 2008 - 05:44 AM
After getting the s**t kicked out of me by a hard final exam... yeah, I want to become as smart as possible without becoming unstable.
No, you don't want to do that. A lot of intelligent people are psychopaths or worse.
But being intelligent doesn't necessarily mean the person will become "evil".
But it does mean an evil person can think of evil, viable schemes to fulfill their narcissistic needs and inflict far more damage than mere criminals.
That's for sure. Intelligence is an expander of personal power and abilities. The more intelligent the person is, the more power this person has to influence reality and the environment. So of course, more intelligence to an evil person would mean more power to do evil.
#38
Posted 21 March 2008 - 07:45 AM
I think I will upgrade my intellect as soon as intelligence-modifying components are agreed to be safe to use. I will upgrade to become as intelligent as possible. I will not, however, tamper with my emotions using super-drugs or anything of the sort.
- Adam
It would seem that the drastic alteration of one's mental faculties to any degree would fundamentally alter the relationship between the complex, emergent phenomenon that is consciousness and your more primitive drives and desires. Your emotions, or affective states, are the perceptual artifacts of your limbic system. Your subjective understanding of those evolved substrates hinges greatly on the way that you alter, by way of learning and experience, the architecture of your higher cognitive capacities. If you integrate vastly superior wetware, it's naive to think that you will possess the ability to maintain a relationship with your emotional "self" that is similar to the one you have now.
Personally, I look forward to expanding my notion of consciousness to incorporate a networked sense of existence wherein I can still maintain an identity at will, but might also possess the capacity to distribute my mental capacity through decentralized systems, in essence becoming multiple links in other individuals' thought processes, and furthermore consciously evolving to incorporate reflections of these other intelligences into my own self-identity. To an extent this is what the net represents in a very impersonal instantiation, but I envision this being an experience that resembles gazing into every conscious being's eyes simultaneously. This would not be for the purposes of seeing their thoughts, but merely sensing the multitude of existing subjective perspectives.
Anyhow, I checked "other"
#39
Posted 21 March 2008 - 10:30 AM
After getting the s**t kicked out of me by a hard final exam... yeah, I want to become as smart as possible without becoming unstable.
No, you don't want to do that. A lot of intelligent people are psychopaths or worse.
But being intelligent doesn't necessarily mean the person will become "evil".
But it does mean an evil person can think of evil, viable schemes to fulfill their narcissistic needs and inflict far more damage than mere criminals.
That's for sure. Intelligence is an expander of personal power and abilities. The more intelligent the person is, the more power this person has to influence reality and the environment. So of course, more intelligence to an evil person would mean more power to do evil.
e.g. Hitler, Stalin, Mao, Fidel, Saddam, etc.
#40
Posted 21 March 2008 - 03:14 PM
It would seem that the drastic alteration of one's mental faculties to any degree would fundamentally alter the relationship between the complex, emergent phenomenon that is consciousness and your more primitive drives and desires. Your emotions, or affective states, are the perceptual artifacts of your limbic system. Your subjective understanding of those evolved substrates hinges greatly on the way that you alter, by way of learning and experience, the architecture of your higher cognitive capacities. If you integrate vastly superior wetware, it's naive to think that you will possess the ability to maintain a relationship with your emotional "self" that is similar to the one you have now.
In other words: you change and get a new view on reality. But that's what augmentation is about in the first place, right?
#41
Posted 21 March 2008 - 05:24 PM
After getting the s**t kicked out of me by a hard final exam... yeah, I want to become as smart as possible without becoming unstable.
No, you don't want to do that. A lot of intelligent people are psychopaths or worse.
But being intelligent doesn't necessarily mean the person will become "evil".
But it does mean an evil person can think of evil, viable schemes to fulfill their narcissistic needs and inflict far more damage than mere criminals.
That's for sure. Intelligence is an expander of personal power and abilities. The more intelligent the person is, the more power this person has to influence reality and the environment. So of course, more intelligence to an evil person would mean more power to do evil.
e.g. Hitler, Stalin, Mao, Fidel, Saddam, etc.
Yes yes. I just hope you got my point. More intelligence to people in general is good, because there are a lot of good people out there, hopefully more than evil people. So in general we would advance faster as a civilization.
#42
Posted 31 October 2008 - 12:19 AM
Normative Reasoning: A Siren Song?
From: Michael Wilson (mwdestinystar@yahoo.co.uk)
Date: Sun Sep 19 2004 - 08:02:45 MDT
Though speculation about post-Singularity development trajectories is usually futile, my recent research has thrown up a serious moral issue which I believe has important implications for CV. The basic points are that a normative method of reasoning exists, that it and close approximations thereof are tremendously powerful, and that any self-improving rational intelligence (artificial or upload) will eventually converge to this architecture unless its utility function explicitly prevents this action.
The problem here is that just about all the human qualities we care about are actually the result of serious flaws in our cognitive architecture, and many of these seem to have lossless translations into goal specifications for a perfectly rational substrate (the basis for Yudkowsky's really powerful optimisation processes). As humanity self-improves, normative reasoning (of which appropriately implemented Solomonoff induction is at the very least a good approximation) is a major attractor; adopting it makes you as effective as possible at utilising any finite amount of information and computing power. If there's any sort of competition going on, turning yourself into an RPOP is the way to win. Unfortunately it also appears to be the end of most of the stuff we place moral value on. A universe full of perfect rationalists is a universe where all diversity resides solely in people's goal systems (which may or may not converge); the qualities of 'insight', 'creativeness', 'willpower' etc. will all disappear as they are defined against flaws, and goal-system properties such as 'compassion' will revert to 'did this person have an initial utility function that was compassionate under renormalisation?'. This is on top of the already known issues with qualia and the illusion of free will; both are results of specific (adaptive) flaws in human introspective capability which would be relatively trivial for transhumans to engineer out, but at the cost of breaking the grounding for the actual (rather than theoretical) implementation of our moral and legal systems and creating something we can no longer empathise with.
The basic question here is 'can we create a Power we can care about?'. A Yudkowsky RPOP is at least potentially a Power, but it is explicitly designed to be one we don't care about, as it isn't sentient in a way we'd assign moral worth to (a decision currently made using our ad-hoc evolved neural lash-together). What do we need to add to make it volitional, and what further characteristics would we want to be present in the beings humanity will become? Are some inherent limitations and flaws actually necessary in an intelligence in order for it to qualify as something worthwhile? Less relevantly, is a Nice Place To Live likely to insist that its volitional sentients have some selection of reasoning flaws in order to create a diverse and interesting society? This is something of a blow for rationalists, in that perfect rationality may indeed be hopelessly inhuman, but isn't there a way to hybridise normative and non-normative reasoning into a cognitive architecture that is both powerful and morally relevant (ok, perhaps this is my desire for Cosmic Power coming through)?
The CV question could be glibly summarised as 'is there a likely incremental self-improvement path from me to a paperclip optimiser?'. While few people like paperclips /that/ much, it seems likely that many people would choose to become perfect rationalists without appreciating what they're losing. If there is a path to normative reasoning that looks locally good all the way and reports back that everything is fine when extrapolating, an implementation of CV that doesn't allow for this may lead us into something we should consider a disaster.
This issue is ultimately a comprehension gap; a universe of perfect rationalists might well be rated as having valuable inhabitants, but we have no way of mapping our conception of worthwhile and desirable onto this basically alien assessment. Along the wild ride that has constituted my seed AI research to date, my original engineering attitude (focus on practical stuff that works, everything can be fixed with enough technology) has had to expand to acknowledge the value of both the abstract (normative reasoning theory and relevant cosmology) and the humanist (despite all the hard maths and stuff you have to cover just to avoid disaster, Friendliness ultimately comes down to a question of what sort of universe we want to live in).
* Michael Wilson
http://www.sl4.org/b...i.pl?Starglider
http://www.sl4.org/a.../0409/9841.html
Edited by Savage, 31 October 2008 - 02:55 AM.
#43
Posted 01 November 2008 - 07:51 PM
It would seem that the drastic alteration of one's mental faculties to any degree would fundamentally alter the relationship between the complex, emergent phenomenon that is consciousness and your more primitive drives and desires. Your emotions, or affective states, are the perceptual artifacts of your limbic system. Your subjective understanding of those evolved substrates hinges greatly on the way that you alter, by way of learning and experience, the architecture of your higher cognitive capacities. If you integrate vastly superior wetware, it's naive to think that you will possess the ability to maintain a relationship with your emotional "self" that is similar to the one you have now.
I think it would be possible to design intelligence upgrades that would preserve our capacity for emotion. After all, we do seem to share emotions with small-brained animals. If we assume mice are conscious, a mouse's fear probably feels similar to human fear. The only difference is that we can better understand the situations we are afraid of and our minds can (meaningfully) generate the words "I'm scared".
#44
Posted 03 November 2008 - 10:44 PM
#45
Posted 04 November 2008 - 08:24 PM
Why would anybody not want to be able to move toward the understanding of every concept? Not understanding stuff leaves room for mistakes, and mistakes leave room for destruction on unforeseeable levels ranging from trivial to the end of the universe.
#46
Posted 05 November 2008 - 04:55 AM
It would look a little like a bunch of high functioning idiots doing hippy commune stuffs but without the patchouli and dirty feet. All dirty feet will be caned. All patchouli will be waved out of face.... because that's all we'll be willing to do about it...get on with things, you know?
We will turn the upper stratosphere into a wind-turbine meat grinder bringing us the freshest seasonal birds when they so choose to feed themselves to us.
We will blind all bees and our shorter brethren with our rampant application of solar panels (panel tilt angle).
We might as well ranch dolphins since they have such large brains, livers, and kidneys, which are far more efficiently nutritious than any other food on earth.
Cars. We got 'em. But they run on seawater. The fumes are captured as potable drinking water that we barter to inland fruitstand owners for some of those nearly extinct Polaroid photos of our trip destination: Inland Fruitstand. Wait, freshwater...? Who cares, it's a Polaroid! A pity all the salt we drop alongside the fruit-tree groves to get there...
We'll laugh and galumpfph and trip because we don't know what dancing is and who really cares when the lights are all on? Everyone looks like an ass when they're dancing, so we've won: loss of self-consciousness.
Corduroy is the national dress code for friction-harvesting technologies. Dance party shinkansen...all the way there. We got dolphin in our bellies, we can handle.
Anything I've forgotten? Oh that's right...all of it.
Edited by REGIMEN, 05 November 2008 - 05:05 AM.
#47
Posted 06 November 2008 - 06:43 AM
#48
Posted 06 November 2008 - 11:03 PM
I'd want to continue on toward understanding every concept; if along the way I found that limits in certain areas were more useful than not, then I'd want to implement that.
Why would anybody not want to be able to move toward the understanding of every concept? Not understanding stuff leaves room for mistakes, and mistakes leave room for destruction on unforeseeable levels ranging from trivial to the end of the universe.
That's a certainty, and on a practical level, we will need to have as many powerful means as possible at our disposal, if we are to survive our future.
But since I disagree with the idea that unweaving the rainbow doesn't make it any less beautiful (rather, it unveils a different kind of beauty, which may well be incompatible with the previous beauty we perceived in that rainbow), I think there's a pretty evident reason why one wouldn't want to grow intelligent beyond a certain point: because the being resulting from that vastening wouldn't be you anymore. It's a kind of death.
Of course, if nothing prevents you from duplicating yourself and having one part of you grow to post-singularity intelligence while the other(s) remain as you are now (or as you will be along the path, with as many of them as you need to snapshot all the significantly different versions of yourself), then this fear is as good as dispelled.
People don't just exist as problem solvers, though they have to be problem solvers before anything else, by necessity. They could certainly also enjoy idle time as mere human beings. It's a valuable state of being in itself.
#49
Posted 03 December 2008 - 03:54 AM
#50
Posted 31 December 2008 - 06:52 PM
#51
Posted 22 February 2009 - 08:15 AM
Let's say you do have a brain the size of a star. You know everything. You understand everything. You are in contact with every other mind in existence and share data simultaneously with all of them.
Why?
That person you share data with has the exact same data as you. Since you share all data, even the differences between your locales in the universe are meaningless, because you will have already shared those differences and analysed them completely. Logically they are your equal, so any conclusions they draw will be identical to yours; there will be no areas in which there is any difference between "you" and "them". In essence, talking to "them" is identical to sitting and thinking, but what is there really to think about? You know everything!
This is the end result of the rationalistic ideal. Personally, I view it as non-existence.
Do I want higher intelligence? Yes. Do I want to maintain my humanity while having it? You betcha. I don't want to control others, because if I did, where is the potential for interaction? How can I talk to them, learn new stuff from them, be forced to THINK by them when *I* AM THEM???
If I were all alone, I'd want to clone myself a few dozen times, with gender, color, and size variations, probably even differences in whether all of us are even fully human or mixed anthropomorphs, simply to give myself someone to talk to who DID NOT HAVE THE EXACT SAME VIEWPOINT as myself. Even if we started out similar, our differences would grow over time.
So I suppose you could say higher with limitations, because limitations leave potential for growth. Is it a risk? You betcha, but when the alternative is eternal stasis, I'll take my chances, thank you very much.
#52
Posted 02 March 2009 - 02:19 PM
I want to do that. I live to learn and learn to live. Once you know everything, you know how to live forever, and once you live forever, you have enough time to learn everything.
I do want to be able to understand it all; it has to be a challenge.
#53
Posted 02 March 2009 - 04:09 PM
Edited by ben, 02 March 2009 - 04:10 PM.
#54
Posted 24 June 2009 - 02:48 AM
Think about that: if we lived for 10,000+ years, we could learn so many things!! It could be boring if everyone augmented their intelligence.
Edited by Anonymous, 24 June 2009 - 03:23 AM.
#55
Posted 24 June 2009 - 04:41 AM
#56
Posted 24 June 2009 - 07:13 AM
I would give myself a photographic memory and maybe increase my ability to reason by 20-30 IQ points. That would be more than enough to satisfy my needs.
There have been a lot of books about this subject. Everyone here should read some Larry Niven. He wrote a book called Protector, which is about what happens to humans when they enter the fourth development stage of human life. I think he won the Nebula Award once or twice for his Ringworld series.
Edited by bobscrachy, 24 June 2009 - 07:14 AM.
#57
Posted 24 June 2009 - 04:49 PM
First, it depends on the track record of the procedure. I certainly will not volunteer to be first. The risk of having everything you were overwritten, in essence dying, is too great. I want to maximize what I have been given so far before embarking on something more, anyway. I've still got such a long way to go. I feel inferior in many ways. (It's not just a feeling. I know conclusively that I am deeply inferior, not merely to others, but to what I could be naturally.)
Mental superaugmentation may become a necessity if the augmented are so vastly superior as to enable them to collectively dominate all others on earth.
But how would that be different from a runaway superAI? Maybe we would need some heroes to step forward to keep the AIs in check, while ensuring that the ultraintelligent entities still have humanity. I get so jaded about the weaknesses of humans, but my innate love of people is so strong...
If we live for thousands of years, mental augmentation may be a necessity. Imagine being in the stone age among modern people.
#58
Posted 24 November 2009 - 05:27 PM
http://en.wikipedia....wisatz_Haderach
http://en.wikipedia....i/Pak_Protector
#59
Posted 06 December 2009 - 02:03 PM
No, you don't want to do that. A lot of intelligent people are psychopaths or worse.
A lot of unintelligent people are also psychopaths or worse.
In my professional life, I have seen sociopathy (which is, I presume, what you mean) in adults with learning disabilities. We're talking really, really low intelligence here, yet still all the same behavioural markers. The only difference is that a certain level of intelligence is generally required to be dangerous to society.
That said, I'd wager that a sociopath with intelligence above a certain level would be intelligent enough to not go on mass-murdering sprees, etc. The kind of person colloquially known as the "white collar psychopath".
#60
Posted 06 December 2009 - 02:55 PM
I would vote to know every known concept, and somehow incorporate fun theory by either slowing down my knowledge-accumulation process or just using that knowledge and my intellect to do cool and fun things that I couldn't really speculate on at this primitive stage in the game.
I would think throwing in ideas like willpower, competition, even deleterious and maybe slightly irrational mood states might benefit this intelligence, just as long as it doesn't become dangerous or sociopathic. I guess I also agree with David Styles in this sense and his argument that low intellect can yield psychopathic tendencies as well. Mental illness is a contributor too, and I'm pretty sure Hitler was mentally ill on some or many levels. However, I do think the stereotypical American Psycho psychopath would be far more dangerous than a more impaired mind with the same tendencies.
Finally, this overlaps with Portal's argument that imperfection leads to terrible, often catastrophic mistakes, and you can well imagine some of the scenarios. I would like to build my intelligence (assuming I would augment it with nano- or psycho-technology, and whatever else the future would yield) to have iterative gradients of well-being (as David Pearce comments) whilst also knowing every known concept... somehow the combination of the two could yield Yudkowsky's vision of true fun theory.
I also agree with Vyntager's argument that even though something will be lost as we morph from human to post-human intellect, the new intellect will be just as exciting, probably exponentially more so, and we probably wouldn't miss our old brains, just as long as we can incorporate some kind of emotional reasoning... or, simply put, feelings.
Edited by dfowler, 06 December 2009 - 03:06 PM.