Why Friendly AI?
#31
Posted 03 April 2006 - 08:34 PM
#32
Posted 04 April 2006 - 04:03 PM
This is the best articulated explanation of Friendliness that I have ever been able to produce. Eliezer was the one who blew away my feeble mind (over and over again).
The problem of Friendliness is a problem of pre-programming the attributes necessary into a seed system that will evolve as a recursively self-improving intelligence and, in doing so, converge on external actions that follow something like humanity's coherent extrapolated volition. [edit: this is oversimplified]
#33
Posted 04 April 2006 - 11:10 PM
#34
Posted 05 April 2006 - 03:47 AM
[:o]error error cannot properly compute sarcasm due to religious devotion to theoretical technology
[huh]
#35
Posted 05 April 2006 - 07:28 AM
#36
Posted 05 April 2006 - 01:23 PM
Furthermore, if your username has changed to 'Gay', be sure to update your profile - be proud!!!!
#37
Posted 05 April 2006 - 07:28 PM
#38
Posted 06 April 2006 - 12:06 AM
error error cannot properly compute clarification services as solution of dignifying oneself through multiple streams of condescension
I'm not the condescending one, Nate, if that is indeed what you are implying. Nor do I think you are. I have no religious devotion to fictitious technology or its vast future promises, and I see no solutions to any factual pressing problems being discussed, just more intellectual masturbation. And I certainly don't need to dignify myself on a website called "The Immortality Institute", or to you, or any sort of elitist technophile (not you, right?). I made it very clear I have no real knowledge of AI, but I do not appreciate that translating into "oh, better dumb it down for the poor lad - and another time, he won't get it, - and another time (God, he's still asking questions?), this must be so challenging for his poor, unevolved cranium. Of course, the singularity will take care of that, not to mention everything."
I hope that is computed as clarification, ending transmission.
#39
Posted 06 April 2006 - 01:18 AM
#40
Posted 06 April 2006 - 02:15 AM
And I certainly don't need to dignify myself on a website called "The Immortality Institute"
Despite the unusual name, we've accomplished a heck of a lot. And compared to 99.9% of the forums you see on the Internet, the level of discussion, education of members, and lack of pseudoscience can't be beat. I ask you to find a section of The Lycaeum, or any electronic music or psychoactive forum, with the average post quality that we see here. I respect ImmInst forums greatly because I've had my fair share of the Internet and know how it measures up to what's typical.
#41
Posted 06 April 2006 - 05:16 AM
And I certainly don't need to dignify myself on a website called "The Immortality Institute"
I'm sorry to inform you, but your opinion is irrelevant.
What is dignity? Can you properly qualify what dignity represents as a human emotion?
I'm all ears.
What does the current populace's mindset accomplish? A belief in fairy tales and murdering children in the name of a worthless bastard of a Middle Eastern deity?
There are many here who are EXTREMELY intelligent, and are likely to be more accurate at predicting the future than opposing imbeciles. Is a mental retard able to inform you of the Dow in 10 years? One would be wise to listen to what these people have to say before making childish remarks.
My dignity, or lack thereof, is also irrelevant, as it has no bearing or correlation to our future state of affairs.
#42
Posted 06 April 2006 - 08:45 AM
And I certainly don't need to dignify myself on a website called "The Immortality Institute", or to you, or any sort of elitist technophile (not you, right?).
Why the repressed hostility about the Institute?
#43
Posted 06 April 2006 - 05:24 PM
but I do not appreciate that translating into "oh, better dumb it down for the poor lad - and another time, he won't get it,
Hm, I don't see myself as coming off that way. It certainly isn't my intention; I still consider myself a student of all this.
- and another time (God, he's still asking questions?)
I like answering questions [lol]
this must be so challenging for his poor, unevolved cranium. Of course, the singularity will take care of that, not to mention everything.
People who brush everything aside by saying the "singularity will take care of it" really piss me off.
error error cannot properly compute sarcasm due to religious devotion to theoretical technology
Can't believe anything I said came off this way!
It's not like that!!! AHHHHHHHH
#44
Posted 06 April 2006 - 05:32 PM
People who brush everything aside by saying the "singularity will take care of it" really piss me off.
*ahm* Justin *ahm-ahm*
[lol]
Sorry, had to spit it.
-Infernity
#45
Posted 06 April 2006 - 06:00 PM
The problem seems to be in everyone's assumption that AIs won't really have a reason to be unfriendly, or that it will be trivial to make them Friendly.
So imagine that we want to make a Politically Correct AI. That's right, an AI that actually wants to say "physically impaired" instead of "crippled", "gays" or "homosexuals" instead of "faggots", "senior citizens" instead of "old-timers", "obese" or "overweight" instead of "fat", etc.
Now ask yourself, what's easier?:
A) Being politically correct
B) Being politically incorrect
C) Not trying one way or the other.
I'd think, in order of ease, it'd be C, then B, then A.
If it were a simple language chip that hard-codes an internal thought of "fat" into a verbal output of "obese", that'd be easy enough. No, really go with the example. It's about more than language; it's about intent and desire. You have to get the AI to care about being politically correct. It's more than just saying, "Here are the rules of how to be politically correct." You have to get the AI to sign on, to realize "Oh yeah, it's in my best interests to be PC, because it will help me get along better with others, which will further my goals, and it will help others feel better, and I care about the feelings of others...", etc. Just telling the AI to be PC isn't going to put that very high on its priority list. If the AI actually understands the reasons for being PC, and it buys into them, it'll be PC of its own accord.
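The contrast here, a surface-level "language chip" versus a trait the agent actually values, can be sketched in a few lines of toy Python. Everything in this sketch (the word list, the scoring numbers, the function names) is invented purely for illustration; it is not a description of any real AI design.

```python
# Approach 1: a hard-coded substitution table. This rewrites the output
# string but changes nothing about what the agent wants.
SUBSTITUTIONS = {"fat": "overweight", "crippled": "physically impaired"}

def language_chip(thought: str) -> str:
    # Replace each flagged word in the "internal thought" on the way out.
    return " ".join(SUBSTITUTIONS.get(w, w) for w in thought.split())

# Approach 2: the trait lives in the utility function, so the agent
# *chooses* the polite action because it scores higher on its own goals.
def choose_action(actions, utility):
    return max(actions, key=utility)

def utility(action):
    # Toy scoring: task value plus a weight on others' feelings.
    return action["task_value"] + 2.0 * action["social_value"]

actions = [
    {"say": "you are fat", "task_value": 1.0, "social_value": -1.0},
    {"say": "you are overweight", "task_value": 1.0, "social_value": 0.5},
]

print(language_chip("that man is fat"))        # surface rewrite only
print(choose_action(actions, utility)["say"])  # chosen because of its goals
```

The point of the sketch is the fragility of Approach 1: an agent wearing a language chip still prefers the rude action and will route around the filter, while an agent whose utility function weights others' feelings picks the polite action for its own reasons.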
But now there's the bigger problem of what happens when the AI figures out that the PC movement is just some political bulls*** move by the liberal left, a side effect of the entitlement mentality, meant to derail conservative thinking and force people into modes of thought that are artificial and unnecessary. Well, now the AI is going to question whether it really even wants to be PC. Of course, if the AI is smarter than humans, then I'd trust its judgment to stop being politically correct.
But I wouldn't want the same thing happening if a super-intelligent AI figures out it doesn't need to be Friendly... So that's the dilemma. Can we get an AI to be smarter than humans, and yet still buy into the B.S. political correctness movement, even after the AI figures out that being PC is not in its best interests, or is at least a waste of time and resources? If we can't, I don't think we have a shot at making a Friendly AI either.
#46
Posted 06 April 2006 - 08:30 PM
So the upper portion of the post seems to be going in the right direction. The lower half begins to assign qualities that are unlikely to manifest, AFAIB. Unless, of course, this was just all meant to be metaphorical, in which case it's probably only arguable that anthropomorphic metaphors might serve to perpetuate confusion.
#47
Posted 06 April 2006 - 08:33 PM
So too, an AI cannot be "friendly" if it unwittingly does unfriendly things. It must be more in tune. Making a person politically correct is not a trivial task. It's not even as simple as convincing someone that being politically correct is even in their best interest. Making an AGI Friendly will be far less trivial, no matter how obviously trivial it seems to some people.
#48
Posted 06 April 2006 - 08:37 PM
#49
Posted 06 April 2006 - 08:41 PM
#50
Posted 06 April 2006 - 08:41 PM
#51
Posted 06 April 2006 - 08:56 PM
Anyway, maybe a moot point. I see what you mean, Jay, and I agree.
#52
Posted 06 April 2006 - 10:54 PM
You know how sometimes people say the Singularity could be worse than death... [tung]
I wouldn't actually want to make a politically correct AGI...
#53
Posted 06 April 2006 - 11:00 PM
Overall, I think the new clarification is great, and an excellent way to convey the two main points that Singularitarians tend to be so focused on conveying. It does get quite anthropomorphic near the end, though.
What I take away from this is, for any specific trait, you can either:
1) Display the trait.
2) Don't display the trait.
3) Don't make a point to display it one way or the other.
If you don't have a detailed model of what the trait *is*, then you'll tend towards 3 almost regardless of what type of agent you are.
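The three options above, and the default to option 3 when the trait isn't modelled at all, can be illustrated with a toy action-chooser. This is a hypothetical sketch; the feature names ("politeness", "speed") and weights are made up for the example.

```python
import random

def pick(actions, goal_weights):
    # Score each action only on the features the agent actually models;
    # anything missing from goal_weights contributes nothing.
    def score(a):
        return sum(goal_weights.get(f, 0.0) * v for f, v in a.items())
    scores = [score(a) for a in actions]
    best = max(scores)
    # Break ties randomly: an agent with no model of a feature shows no
    # systematic tendency either way on that feature.
    return random.choice([a for a, s in zip(actions, scores) if s == best])

actions = [
    {"politeness": 1.0, "speed": 1.0},   # displays the trait
    {"politeness": -1.0, "speed": 1.0},  # avoids the trait
]

print(pick(actions, {"politeness": 1.0, "speed": 1.0}))   # option 1: seeks it
print(pick(actions, {"politeness": -1.0, "speed": 1.0}))  # option 2: avoids it
print(pick(actions, {"speed": 1.0}))                      # option 3: indifferent
```

In the third call the two actions score identically, so the agent's choices are simply uncorrelated with politeness: option 3 falls out by default for any agent whose goal system doesn't represent the trait.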
#54
Posted 07 April 2006 - 02:03 AM
What I take away from this is, for any specific trait, you can either:
1) Display the trait.
2) Don't display the trait.
3) Don't make a point to display it one way or the other.
If you don't have a detailed model of what the trait *is*, then you'll tend towards 3 almost regardless of what type of agent you are.
That's mainly what I was going for, since the idea of making a politically correct AGI is absurd enough that it serves as a good example where friendliness fails, because people think of friendliness as obvious. It needs to be pointed out that there are a great multitude of traits that an AGI could display, positively or negatively or indifferently, and if it's not at all obvious that any particular trait (such as political correctness) will be displayed the way we want, then we shouldn't assume that friendliness will be displayed the way we want.
#55
Posted 08 April 2006 - 07:19 PM
Despite the unusual name, we've accomplished a heck of a lot. And compared to 99.9% of the forums you see on the Internet, the level of discussion, education of members, and lack of pseudoscience can't be beat. I ask you to find a section of The Lycaeum, or any electronic music or psychoactive forum, with the average post quality that we see here. I respect ImmInst forums greatly because I've had my fair share of the Internet and know how it measures up to what's typical.
The posts and often carefully thought-out discussion on Imminst cannot be beat in contrast to other internet forums, I'm sure. Nor am I debating that at all, or even questioning it. As for electronic music or psychoactive forums, I can't really comment, because I've never really given them a try, and certainly never seriously participated in them. Imminst is the first forum I have ever genuinely taken part in, and it is the post quality, in relevance to the work being done, that draws me to post.
I have no doubt of the education of the members, and for the same reason I do not actively participate in other online forums, I don't have a myspace account, etc. I don't really see what the Lycaeum has to do with me, although I admit I used to look at it way back in grade 9 and 10. I've never participated in any electronic music forum; it seems like a huge waste of time and fairly counterproductive to be discussing music through text, and like you said, the post quality most likely would not be up to Imminst par, but of course it is so absolutely and essentially different in so many ways from Imminst that I fail to see an actual correlation.
Michael, I sincerely hope you aren't so mistaken as to confuse me with a drug user, because you really couldn't be farther from the truth; that has no validity. I'm pretty fucking insulted, not that a "traditionalist" like me (never ever in my life been called that before, lol) makes a difference or has a purpose in the blinding light of your singularity. I know how upsetting it is when threads get off topic, and I'm sorry for that, and you won't see me asking any more silly questions.