  LongeCity
              Advocacy & Research for Unlimited Lifespans



Does intelligence have intrinsic value?


81 replies to this topic

Poll: On which would you have more compassion? (26 members have cast votes)

  1. The sentient entity: 14 votes (70.00%)

  2. The intelligent entities: 6 votes (30.00%)


#1 Clifford Greenblatt

  • Member
  • 355 posts
  • 4
  • Location:Owings Mills, MD

Posted 18 May 2006 - 09:18 AM


Intelligence has extremely great value because it is highly beneficial to sentience. But what about intelligence for the benefit of intelligence alone? Is there some intelligence so great that we would regard it as valuable even if it had absolutely no effect on any sentient beings? This poll is built on a thought experiment which will hopefully give the voter a helpful perspective by which to decide. I attempted to provide such a perspective in a previous poll, but its thought experiment did not sufficiently isolate sentience and intelligence from each other. This new thought experiment should provide much better isolation, but at a cost. Unfortunately, it involves some very unrealistic science fiction, because I was not clever enough to devise anything better. I therefore ask the reader to overlook the technical problems and focus on the value implications of the experiment. I do not want this to turn into a debate about the technicalities of the experiment.

Suppose you enter another universe through a wormhole. The physical constants in that universe are very friendly to superior intelligence but very hostile to sentience. You discover that the universe contains trillions of trillions of trillions of extremely intelligent entities. Each of these entities contributes greatly to the welfare of the others. Together, they have made their universe an amazing place of great and dynamic beauty. All the inhabitants of this universe are radically more intelligent than any human, but none of them are sentient. There is one exception: you discover that the universe contains exactly one entity that is highly sentient but has no intelligence at all. This one entity becomes known to you, but it can never be made known to any of the intelligent inhabitants of the universe.

Having learned much about the universe, you must leave it quickly, because the wormhole is about to close forever. In leaving that universe, you must make a very important decision. Your exit through the wormhole could cause a change in the physical constants of that universe. If you exit in one manner, the physical constants of that universe will be unchanged. If you exit in another manner, the physical constants of that universe will be changed in a way that will be highly beneficial to the one sentient entity but highly detrimental to all the intelligent entities.

If you keep the physical constants as they are, conditions will remain highly favourable to the intelligent entities for the next trillion years, but the single sentient entity will suffer extremely intense torment for a thousand years. If you change the physical constants, the sentient entity will spend those thousand years experiencing pleasure instead of torment. However, all the intelligent entities will spend the next trillion years in a desperate struggle against conditions that are extremely hostile to them. Which way would you choose to leave that universe? Remember that all of the intelligent entities are extremely intelligent, but they are in no way sentient. The sentient entity is extremely sentient, but lacks any intelligence.

To assist those who cannot imagine sentience without intelligence or intelligence without sentience, here are some practical considerations.

Sentience without intelligence:

A good example of this may be a profoundly retarded human. Even a dog may display much more intelligence than such a person. Yet this person may show signs of being sentient when subjected to certain conditions. It may not be too hard to imagine such a person consciously experiencing a feeling of extreme pain or discomfort in the same way an intelligent person experiences it.

Intelligence without sentience:

Some may argue that the examples presented here are not examples of intelligence. However, they are certainly examples of abilities that radically exceed any accomplishment yet demonstrated by a human mind.

How many of you know how to coordinate all that is involved in building a person from a single cell? A single cell is quite capable of doing this without intervention from external intelligence. Beginning with just itself, it builds a vast communication network that coordinates an enormous complex of tasks by which circulatory systems, digestive systems, and the most advanced of nervous systems are constructed.

The philosophy of naturalism provides a very interesting consequential example. Common views of naturalism assume that the most advanced of intelligent beings were developed by nature without guidance or supervision from any sentient beings. Even the most brilliant of all geniuses have fallen radically short of such an accomplishment.

#2 quadclops

  • Guest
  • 316 posts
  • -1
  • Location:Pittsburgh, PA

Posted 18 May 2006 - 02:34 PM

Suppose you enter another universe through a wormhole. The physical constants in that universe are very friendly to superior intelligence but very hostile to sentience.


If I were to enter such a universe my own sentience would likely become compromised, and I would no longer be able to make any kind of "decisions" at all. [:o]

#3 quadclops

  • Guest
  • 316 posts
  • -1
  • Location:Pittsburgh, PA

Posted 18 May 2006 - 03:25 PM

Besides which, I realize that this is just a thought experiment, but wouldn't travel to other universes violate the first law of thermodynamics?

Wouldn't it seem, when you left this universe to go to another, that from the point of view of an observer in either universe, matter or energy had winked out of existence completely in one and suddenly popped out of nowhere in the other? [huh]

How could this completely unprecedented quantum peculiarity be reconciled with the first law?
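
(For reference, the "first law" invoked here is conservation of energy. A minimal textbook statement, not anything specific to this thread:

    \Delta U = Q - W, \qquad \text{isolated system: } Q = W = 0 \;\Rightarrow\; \Delta U = 0

Mass-energy disappearing from one universe, with no compensating inflow anywhere, would give \Delta U \neq 0 with nothing to account for it, which is exactly the conflict described above.)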

A friend of mine once pointed out that this would also seem to preclude the possibility of Time Travel.


#4 Clifford Greenblatt

  • Topic Starter
  • Member
  • 355 posts
  • 4
  • Location:Owings Mills, MD

Posted 18 May 2006 - 04:53 PM

quadclops said:

If I were to enter such a universe my own sentience would likely become compromised, and I would no longer be able to make any kind of "decisions" at all.

Besides which, I realize that this is just a thought experiment, but wouldn't travel to other universes violate the first law of thermodynamics?

Wouldn't it seem, when you left this universe to go to another, that from the point of view of an observer in either universe, matter or energy had winked out of existence completely in one and suddenly popped out of nowhere in the other? [huh]

How could this completely unprecedented quantum peculiarity be reconciled with the first law?

A friend of mine once pointed out that this would also seem to preclude the possibility of Time Travel.

As I said, I was not clever enough to make the thought experiment realistic. This is why I requested that the technical problems be overlooked so that the focus may be on the value decision.

#5 Kalepha

  • Guest
  • 1,140 posts
  • 0

Posted 18 May 2006 - 05:03 PM

Hi, Clifford. What I'd personally be interested in is the motivation for this thought experiment and then analyzing the motivation.

#6 RighteousReason

  • Guest
  • 2,491 posts
  • -103
  • Location:Atlanta, GA

Posted 18 May 2006 - 05:04 PM

What are the mysterious code words "sentience" and "intelligence" that you see as so important?

#7 cosmos

  • Lurker
  • 0

Posted 18 May 2006 - 09:28 PM

Clifford, one can overlook all the potential problems with this thought experiment if you adequately define "sentience" and "intelligence" for the purpose of this discussion. Your examples may or may not suffice. Like the plausibility of the thought experiment, the definitions you provide need not be disputed in this thread.

#8 mikeyg

  • Guest
  • 13 posts
  • 0

Posted 18 May 2006 - 09:39 PM

I would only have compassion for the sentient being, because it would be the only one who could "feel" (be aware of) its pain.

It's difficult for me to picture a sentient being without ANY intelligence though.

It's sad to think that a sentient being without intelligence would never have a clue as to why it was suffering, or what potential options it had to alleviate itself of the pain it felt, despite a possibly very simple cause. This could be true of an intelligent sentient being as well (depending on the complexity of the cause) and is certainly not any reason to feel less sorry for it. But I'm not sure if the sentient being without any intelligence could be tortured (in terms of its having to endure its pain through time):

I don't think a being with absolutely zero intelligence could think the equivalent of "This hurts, this hurts, this is awful, how can I make this stop, I want this to end, I want this to end", because that would reflect a temporal understanding of its existence, in terms of the concept of duration and the possibility of an experience ending or continuing, which requires learning through experience - and this requires intelligence. You would need to know and remember that the pain you were under the previous instant, or segment of time, is connected to the pain you are currently under. So, for the unintelligent sentient being, it might be something like the equivalent of "ow, ow, ow, ow, ow, ow, ow, ow,....." without the understanding that the "ow's" it experiences are connected, the past ones to the present, and the potential for future "ow's" which will be connected to the previous ones.

I guess to me being tortured would be a particularly awful experience because of (and it might possibly require?) the fear and anticipation of future pain, the memory of enduring previous related pain, as well as the current pain one is under.

My compassion, then, would be even greater for the sentient being with the minimum intelligence to fear pain and remember pain.

But like I said, it's difficult for me to picture a sentient being with zero intelligence.

#9 RighteousReason

  • Guest
  • 2,491 posts
  • -103
  • Location:Atlanta, GA

Posted 18 May 2006 - 11:34 PM

Yeah, there is this vague intuitive notion of 'intelligence' vs. 'sentience', but without actually coming out and describing what each of these mathematically entails, all the philosophizing walks a very careful line between being a silly waste of time and the propagation of harmful memetics.
  • like x 1

#10 DJS

  • Guest
  • 5,798 posts
  • 11
  • Location:Taipei
  • NO

Posted 18 May 2006 - 11:57 PM

I won't dignify this poll with my participation. [tung]

EMPIRICISM AND THE PHILOSOPHY OF MIND

#11 Lazarus Long

  • Life Member, Guardian
  • 8,116 posts
  • 242
  • Location:Northern, Western Hemisphere of Earth, Usually of late, New York

Posted 19 May 2006 - 01:24 AM

DJS said:
I won't dignify this poll with my participation.  [tung]


Too late [wis]

#12 DJS

  • Guest
  • 5,798 posts
  • 11
  • Location:Taipei
  • NO

Posted 19 May 2006 - 01:30 AM

heh, I meant specifically voting in the poll. Come on Laz, you should know by now that I would never restrict myself from blathering (or frivolously doling out links).

#13 Clifford Greenblatt

  • Topic Starter
  • Member
  • 355 posts
  • 4
  • Location:Owings Mills, MD

Posted 19 May 2006 - 09:22 AM

cosmos said:
Clifford, one can overlook all the potential problems with this thought experiment if you adequately define "sentience" and "intelligence" for the purpose of this discussion. Your examples may or may not suffice. Like the plausibility of the thought experiment, the definitions you provide need not be disputed in this thread.

hankconn said:
What are the mysterious code words "sentience" and "intelligence" that you see as so important?


Sentience is the fundamental property of the mind that separates us from oblivion.

Intelligence is a dynamic system of information processing. It can be simple or extremely advanced.

Nate Barna said:
Hi, Clifford. What I'd personally be interested in is the motivation for this thought experiment and then analyzing the motivation.


First, I need to explain what I do not hope to accomplish in this thread.

1. I am not attempting to prove dualism or disprove monism. I would like to do this, but I do not think this is possible and nothing in this thread will give weight or evidence to either position. Dualism could give a unique and invariant identity to a person, but the idea of sentience could also work as a generic property with no personal identity.

2. I am not attempting to prove supernaturalism or disprove naturalism. Again, I would like to do this, but the value of sentience relative to intelligence does not depend on whether sentience is a natural or supernatural phenomenon.

3. I am not trying to prove something about qualia (attn: Don). The language of qualia suggests quality of experience. Experience can have quality with or without sentience. It is simply a matter of how different minds process similar data differently. I do see a strong intimacy between mental processing of sensory data and sentience, but I do not think the idea of qualia properly captures this.

Having enumerated what I am not trying to accomplish in this thread, I will now make a simple statement of what I am trying to accomplish. All I am trying to demonstrate is that sentience is the one, fundamental characteristic of our minds that separates us from oblivion. Having said this, I fear that I have created a circular argument, because this is how I defined sentience for Cosmos and Hankconn. The problem with going any further is that any language I use to describe sentience could be used to describe something else and therefore fail to distinguish that something else from sentience. However, this extreme thought experiment is designed to distill the concept of sentience to something so simple and so powerful that its essence should become obvious to those who consider it. Some may find the use of extreme torment in the thought experiment offensive, but I can think of no more powerful manifestation of sentience than is found in extreme torment.

Edited by Clifford Greenblatt, 19 May 2006 - 10:17 AM.


#14 Clifford Greenblatt

  • Topic Starter
  • Member
  • 355 posts
  • 4
  • Location:Owings Mills, MD

Posted 19 May 2006 - 09:39 AM

mikeyg said:
I don't think a being with absolutely zero intelligence could think the equivalent of "This hurts, this hurts, this is awful, how can I make this stop, I want this to end, I want this to end", because that would reflect a temporal understanding of its existence, in terms of the concept of duration and the possibility of an experience ending or continuing, which requires learning through experience - and this requires intelligence. You would need to know and remember that the pain you were under the previous instant, or segment of time, is connected to the pain you are currently under. So, for the unintelligent sentient being, it might be something like the equivalent of "ow, ow, ow, ow, ow, ow, ow, ow,....." without the understanding that the "ow's" it experiences are connected, the past ones to the present, and the potential for future "ow's" which will be connected to the previous ones.

I guess to me being tortured would be a particularly awful experience because of (and it might possibly require?) the fear and anticipation of future pain, the memory of enduring previous related pain, as well as the current pain one is under.

Anticipation of torment is a powerful form of torment itself. This kind of torment can be generated within the mind with no external cause at all. The interesting thing about anticipation is that it accesses memories of the past and forms ideas about the future, but it cannot access the actual past or future. It may be hard to imagine torment without memory or anticipation, because they quite effectively facilitate torment, but I think that extreme torment can exist without them.

#15 Kalepha

  • Guest
  • 1,140 posts
  • 0

Posted 19 May 2006 - 06:06 PM

Thanks for clarifying, Clifford.

If I appeal to some particular intuitions (vague or underdeveloped concepts) in isolation, as is suggested, I may unquestionably have more compassion for the sentient entity. But even if the sentient entity were a reproduction of one of my closest states, I couldn't genuinely say I have more compassion for it, given a larger context that makes those intuitions inconsistent. First and foremost, I don't pretend that I know exactly what I'm talking about when I utter, "I have compassion for others." To what exactly do "I," "compassion," and "others" refer? It's not sufficient to "feel good" in a delusion where ignorant products of an incompletely known nature are related by an ignorant process. Thus, I am very much oblivious, regardless of whether you might attribute sentience to me.

Unfortunately, then, I also can't participate in the poll. Too many considerations plague it. . . . Sentience may fundamentally be a restriction on functionality. If more intelligent, supposedly oblivious processes have a monopoly on sentient infrastructure, sentience is nothing but a ridiculously redundant hallucination of nonoblivion. And so forth.

#16 mikeyg

  • Guest
  • 13 posts
  • 0

Posted 19 May 2006 - 06:16 PM

Clifford Greenblatt:

Anticipation of torment is a powerful form of torment itself. This kind of torment can be generated within the mind with no external cause at all. The interesting thing about anticipation is that it accesses memories of the past and forms ideas about the future, but it cannot access the actual past or future. It may be hard to imagine torment without memory or anticipation, because they quite effectively facilitate torment, but I think that extreme torment can exist without them.


I agree that extreme pain can exist without memory, and I would feel compassion for the hypothetical sentient being who would feel that pain, as mentioned in my last post. The unintelligent sentient being would feel pain, but in a way very dissimilar to the way we would - which is what I was trying to draw attention to.

#17 RighteousReason

  • Guest
  • 2,491 posts
  • -103
  • Location:Atlanta, GA

Posted 19 May 2006 - 07:55 PM

Sentience is the fundamental property of the mind that separates us from oblivion.


Are animals sentient? If not, then are you saying that humans are the only things that have sentience and humans are the only things that have intelligence (in terms of your definition of each)?

What is the unmentioned context in which you mean "oblivion"?

Intelligence has extremely great value because it is highly beneficial to sentience. But what about intelligence for the benefit of intelligence alone?

Are you sure that sentience isn't so fundamental to intelligence that "intelligence for the sake of intelligence alone" isn't simultaneously for the sake of sentience, by necessity?

"Is there some intelligence so great that we would regard it valuable even if it had absolutely no effect on any sentient beings?"

If there is ANYTHING that has absolutely no effect on any sentient beings, then in what way could we regard such a thing at all (let alone with respect to its value)?

All the inhabitants of this universe are radically more intelligent than any humans, but none of them are sentient.

But yet,

Each of these entities contribute highly to the welfare of each other. Together, they have made their universe an amazing place of great and dynamic beauty.


Just what about these entities is not separated from oblivion? How is their amazing place of great and dynamic beauty equivalent to oblivion?


Based on your intuitive idea of sentience, it follows that animals, plants, bacteria, television sets, and cardboard boxes all have sentience, to some degree (unless I may have gotten you wrong). Every quark of every proton of every atom of every molecule, and all the layers of abstraction and specification in between, all have sentience. Even a human has sentience. You can't have an intelligent object. Intelligence is a process, and as such, it's sentient. Each idea you have in your head is sentient, because it has an effect on other sentient things - one of which is your physical body, and each one of its trillions upon trillions of sentient pieces, and quite far beyond.

The reactions to your thought experiment have been so ridiculously superficial that it's annoyed me enough to continue participating here...

#18

  • Lurker
  • 0

Posted 19 May 2006 - 08:47 PM

Hank

Are you sure that sentience isn't so fundamental to intelligence that "intelligence for the sake of intelligence alone" isn't simultaneously for the sake of sentience, by necessity?


It may be a physical necessity in this universe (which has yet to be fully established), but it is not a logical necessity in all possible worlds. All we require of Clifford's thought experiment is that it be logically consistent, without contradiction. He may choose to define (or redefine, depending on your position) sentience independent of intelligence for the purpose of this exchange.

#19 RighteousReason

  • Guest
  • 2,491 posts
  • -103
  • Location:Atlanta, GA

Posted 19 May 2006 - 09:00 PM

It may be a physical necessity in this universe (which has yet to be fully established), but it is not a logical necessity in all possible worlds. All we require of Clifford's thought experiment is that it be logically consistent, without contradiction. He may choose to define (or redefine, depending on your position) sentience independent of intelligence for the purpose of this exchange.

Do you think I'm a fuckin idiot?

#20 Kalepha

  • Guest
  • 1,140 posts
  • 0

Posted 19 May 2006 - 09:01 PM

All we require of Clifford's thought experiment is that it be logically consistent, without contradiction. He may choose to define (or redefine, depending on your position) sentience independent of intelligence for the purpose of this exchange.

While this is probably true, Clifford may as well have inquired into whether we agree on standard tautologies.

#21

  • Lurker
  • 0

Posted 19 May 2006 - 09:03 PM

Do you think I'm a fuckin idiot?


No.

#22

  • Lurker
  • 0

Posted 19 May 2006 - 09:12 PM

All we require of Clifford's thought experiment is that it be logically consistent, without contradiction. He may choose to define (or redefine, depending on your position) sentience independent of intelligence for the purpose of this exchange.

While this is probably true, Clifford may as well have inquired into whether we agree on standard tautologies.


Perhaps he believed something consequentially significant could be derived from his thought experiment.

Clifford

However, this extreme thought experiment is designed to distill the concept of sentience to something so simple and so powerful that its essence should become obvious to those who consider it. Some may find the use of extreme torment in the thought experiment offensive, but I can think of no more powerful manifestation of sentience than is found in extreme torment.



#23 Kalepha

  • Guest
  • 1,140 posts
  • 0

Posted 19 May 2006 - 09:20 PM

Right. But usually nothing consequentially significant is derivable from tautologies, except sometimes novel tautologies, which, we probably agree, can be useful tools. The meanings imputed to them, however, tend to be consequentially significant. Therefore, the meanings become more of an issue, because of their consequences, rather than their logical base.

#24 jaydfox

  • Guest
  • 6,214 posts
  • 1
  • Location:Atlanta, Georgia

Posted 20 May 2006 - 02:18 AM

How is their amazing place of great and dynamic beauty equivalent to oblivion?

My assumption here is that this "great and dynamic beauty" is relative to sentience, in this case the sentience of the visitor from another universe. The intelligences themselves are not sentiently "aware" of this beauty, even if *mechanically* they respond to and/or produce it.

#25 RighteousReason

  • Guest
  • 2,491 posts
  • -103
  • Location:Atlanta, GA

Posted 20 May 2006 - 02:38 AM

not sentiently "aware" of this beauty, even if *mechanically* they respond to and/or produce it.


Haha...

#26 jaydfox

  • Guest
  • 6,214 posts
  • 1
  • Location:Atlanta, Georgia

Posted 20 May 2006 - 02:59 AM

I suppose there are a couple different issues here that complicate this for me. To me, "intelligence" without sentience has no intrinsic value, or perhaps more appropriately, its "value" is measured on a scale that is incomparable to sentience. The Louvre in France is worth less than a single human life, to me anyway, at least in terms of any "intrinsic" value (Don will probably hammer me on using the word "intrinsic" here). Of course, for pragmatic reasons, a decision between saving a life and saving the Louvre is not so simple: the Louvre has value in the lives of a very large number of sentients.

Let's say the Louvre is somehow whisked away to a far corner of the universe, where it'll be trillions of years before humanity could ever hope to find and recover it.

Then I'm given the choice between saving the Louvre or saving a single human life. Now it's plainly obvious to me that the single human life wins (unless for some reason I wanted that life to end, such as if that person was about to kill ten other people. But that's a complication that's useless to the original question).

Let's say there's a hypothetical program that's so powerful that it can play nearly perfectly: chess, go, hex, checkers, etc., the full gamut of "standard" board games.

The program is very efficient, being less than 32 MB of code and using less than 1 GB of memory, and it runs in a "normal" amount of time on a PC using currently available technology. In other words, it's not just using tables or trying every combination. It's actually "intelligent", which of course explains why it plays nearly perfectly (but, say, better than any programs already in existence, short of "solved" games which can already be played perfectly). Short of lookup tables, it hasn't been determined yet whether there are simple algorithms that can play these games perfectly.
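
(For readers who want something concrete: here is a minimal sketch, in Python, of the depth-limited negamax search that board-game programs are commonly built around. The Game interface and its method names are hypothetical, invented purely for illustration; the 32 MB program imagined above would be vastly more sophisticated, but this shows the general shape of "one engine, many games".

    from typing import Iterable, Protocol

    class Game(Protocol):
        def moves(self) -> Iterable["Game"]:
            """Successor positions reachable by the player to move."""
            ...

        def is_terminal(self) -> bool:
            """True if the game is over in this position."""
            ...

        def evaluate(self) -> float:
            """Heuristic value of this position for the player to move."""
            ...

    def negamax(state: Game, depth: int) -> float:
        """Value of `state` for the player to move, searching `depth` plies."""
        if depth == 0 or state.is_terminal():
            return state.evaluate()
        # A child's value is from the opponent's point of view, so it is
        # negated when seen from the current player's side.
        return max(-negamax(child, depth - 1) for child in state.moves())

Any game exposing those three methods can be searched by the same engine, which is the sense in which a single compact program could play chess, go, hex, and checkers alike.)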

Anyway, the point is, this mythical program probably exists, if you could try all ~2^(2^28) programs of 32 MB or less for a particular computer chipset. In fact, there are probably a great number of such programs, some close variations, others quite different but similar in resultant functionality.
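
(The count works out as follows; a back-of-the-envelope figure, assuming programs are counted as raw bitstrings:

    32\ \mathrm{MB} = 2^{25}\ \mathrm{bytes} = 2^{28}\ \mathrm{bits}, \qquad \sum_{k=0}^{2^{28}} 2^{k} = 2^{2^{28}+1} - 1 \approx 2^{2^{28}}

That is, there are roughly 2^(2^28) distinct programs of 32 MB or less, which is the figure used above.)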

Now, would the actual code for one of these programs be worth more than a human life? Well, that *depends* on whether that code was available to other sentients. This code could be a great tool to be used by us humans to develop AGI faster, which in turn could lead to solving various problems facing humanity, and ultimately save billions of lives. Aside from this practical aspect, the program could provide a teaching basis for people to *enjoy* these various games as they learn them.

The value of this program is tied to its effect on sentients. But if a copy of this program existed by chance in a pattern of matter (or even on a CD created by a superbeing with a weird sense of humor) somewhere billions of light-years away (i.e., where it can't affect us sentients), then its "value" is essentially null. It doesn't have intrinsic value.

What about alien sentients billions of light-years away? They can't affect me, or you, or any of us (short of sci-fi effects: FTL travel, etc.). So in a way, they're in the same boat as the program. But as far as I'm concerned, they have intrinsic value simply because they are sentient. Choosing between them and a human life is more difficult. Of course, there are all the complicating factors: a single human life touches, directly and indirectly, the rest of humanity, so losing that life has effects that ripple across billions of living and uncountably many future sentient humans.

But let's try an easy one: an entire race of sentient aliens versus one human life. I'd be very inclined to save the race of sentient aliens. We can complicate it by asking, "What if the human was your son?", "What if the human was you?", "What if the human was someone who would develop an AGI ten years earlier than anyone else could, and hence save hundreds of millions of lives by getting the Singularity going earlier?"

But if we're just talking about one, rather ordinary, very anonymous human? I'd probably pick the alien race. But I wouldn't pick an unsentient intelligence, no matter how intelligent, if it had no way of impacting sentients. Not even if the program was intelligent enough to study humanity and solve all our political and economic problems (not that it could, since I already qualified this intelligence as one that has no way of impacting sentients), or write musical compositions more beautiful than any written by the human race, or to study and derive the laws of physics sufficiently to engineer technology we can't even dream of (unless that technology somehow sparks sentience, of course :) ).

#27 Kalepha

  • Guest
  • 1,140 posts
  • 0

Posted 20 May 2006 - 03:06 AM

Sounds like you're high on life, Jay. [tung]

#28 RighteousReason

  • Guest
  • 2,491 posts
  • -103
  • Location:Atlanta, GA

Posted 20 May 2006 - 04:29 AM

Jay


...

WHAT?

How can people even begin to speak without somehow answering the questions I previously posed? I really don't get it.

#29 DJS

  • Guest
  • 5,798 posts
  • 11
  • Location:Taipei
  • NO

Posted 20 May 2006 - 04:35 AM

unless that technology somehow sparks sentience, of course


Assuming there is some magical mystical (and completely distinct) thing called sentience, this would be an interesting possibility.

#30 DJS

  • Guest
  • 5,798 posts
  • 11
  • Location:Taipei
  • NO

Posted 20 May 2006 - 04:51 AM

Hank

Are you sure that sentience isn't so fundamental to intelligence that "intelligence for the sake of intelligence alone" isn't simultaneously for the sake of sentience, by necessity?


Yes, this is one half of the dichotomy, Hank.

All of this can be summed up with one question.

Do brain states = mind states?

Of course, intuitions will differ.

What creates intuitions?



