  LongeCity
              Advocacy & Research for Unlimited Lifespans





What is consciousness?


55 replies to this topic

#31 braz

  • Guest
  • 147 posts
  • 0
  • Location:Los Angeles, USA

Posted 09 March 2007 - 03:23 AM

Let me sum up how I perceive consciousness. It is completely bound up with our self-created language, which expresses reactions to inner desires, which are in turn based upon our own human instincts. Our physical senses of perception constantly supply the brain with new information, which in turn creates a wider array of subconscious processes. The very fact that I am typing this right now simply expresses the fact that I am interested in this topic due to my human curiosity to learn facts about myself and the world.

To say it shortly: in my opinion, consciousness is the property of the brain which communicates and expresses the subconscious processes that occur within our minds. The subconscious processes are not controllable; the mere illusion that we are the masters of our bodies stems from another subconscious process that tells us to act so. Why does it exist? It just DOES. We can't say why the speed of light is the way it is, or why matter exists. These things just happen.

As to the question of the soul, it seems utterly ridiculous to extend our personalities beyond our physical bodies. As Einstein put it: "Since our inner experiences consist of reproductions and combinations of sensory impressions, the concept of a soul without a body seems to me to be empty and devoid of meaning." Indeed, we all have experienced a night of dreamless sleep when our consciousness is essentially shut off. Many have experienced lapsing out of consciousness due to physical trauma or mental stress. WHY is it so hard to accept the fact that after the physical death and disintegration of the brain, loss of consciousness is permanent? Our sheer human nature, which formed through the process of evolution and natural selection, is terrified and repulsed by the idea of death. The human mind is a very error-prone machine, and given a lack of information and knowledge, the brain will form an incorrect perception of reality simply to satisfy its curiosity and vast array of inner desires. This would explain why humans have formed so many different types of religions and shaped most beliefs in their own image.

So here we are, tiny parasitic organisms created by chance on an insignificant speck of dust in the universe, trying to gain knowledge and understanding. And at this point, only one thing is for certain: there lies no certainty ahead.

#32 ajar

  • Guest
  • 13 posts
  • 0

Posted 10 July 2007 - 02:00 PM

When I try to define or read definitions of consciousness, it feels like being on a 600 m shooting range, trying to hit the maximum score.

Impossible, some seem to say, and bust their stay in the same instant.
Others try and then, smiling, give up, knowing it is too difficult to master a gun that well.
Some have expensive sniper rifles, take their time with them, and hit the maximum score - but is it ethical to spend that much money or time on such a thing?

Then there are those who, after everyone has gone home, when it's almost a pitch-black night, take a spike, run the 600 meters and punch a hole in the target with a childish grin on their face - but isn't that cheating, too little effort spent on such a thing?

Perhaps you can't define consciousness well at all in this language, this culture and with these scientific methods... Perhaps we need to evolve and develop an economy first?

Edited by ajar, 10 July 2007 - 02:13 PM.


#33 diamondhead

  • Guest
  • 31 posts
  • 0

Posted 10 July 2007 - 03:59 PM

Very interesting topic


Imagine that in the future there is a way to connect your mind to a computer and see your thoughts, your dreams - actual images!

The future is coming! : )


#34 yoktar

  • Guest
  • 11 posts
  • 0

Posted 04 November 2007 - 05:21 PM

I believe there are layers of consciousness. And God represents the top layer. :)

#35 AdamSummerfield

  • Guest
  • 351 posts
  • 4
  • Location:Derbyshire, England

Posted 13 November 2007 - 12:15 AM

I believe that consciousness simply refers to an awake state, or possibly a state of unwakefulness in which the brain is still receiving information such as body temperature. This belief of mine comes down to this: consciousness is the processing of sensory information.

- Adam

#36 eternaltraveler

  • Guest, Guardian
  • 6,471 posts
  • 155
  • Location:Silicon Valley, CA

Posted 13 November 2007 - 12:23 AM

Consciousness is analogous to free will. For all practical intents and purposes we have it, but on the most fundamental level it doesn't exist.

#37 kent23

  • Guest
  • 146 posts
  • 1
  • Location:University of Arizona, Tucson, AZ, EEUU

Posted 13 November 2007 - 09:37 AM

This is all a waste of time until we have a full mechanistic understanding of human neurobiology. Probably.

#38 Luna

  • Guest, F@H
  • 2,528 posts
  • 66
  • Location:Israel

Posted 13 November 2007 - 09:57 AM

I believe consciousness is just an illusion.
We're nothing more than organic robots. Complicated, yes, but robots.

#39 platypus

  • Guest
  • 2,386 posts
  • 240
  • Location:Italy

Posted 13 November 2007 - 04:31 PM

I believe consciousness is just an illusion.
We're nothing more than organic robots. Complicated, yes, but robots.

So we're unconscious, as in "passed out"? Or do you mean that the "self" is an illusion? (being "awake" cannot be an illusion as one needs to be awake to have illusions of any kind)

#40 Cyberbrain

  • Guest, F@H
  • 1,755 posts
  • 2
  • Location:Thessaloniki, Greece

Posted 13 November 2007 - 04:31 PM

noun: an alert cognitive state in which you are aware of yourself and your situation

From the different definitions of consciousness I found in dictionaries and Wikipedia, I would have to say that it's our cognitive ability to be aware of our existence and to perform logic and reasoning processes. Yes, our brain is nothing but a complicated computer, but consciousness is the ability to obtain memory and to find a logical pattern in that memory in order to generate tasks for our bodies to perform so that we may survive and live. Except that this process of analyzing a logical pattern in memory is so complex that we have been able to generate a broad level of understanding and knowledge. This knowledge is then reanalyzed in the same way as the previous. But the analysis of this level of memory (since it is not the same as the previous type of memory - i.e. it is memory we generate in our minds) has somehow given us the ability to generate the logical processes we associate with the word "awareness". And thus we have been able to understand our own independent existence based completely on cognitive processes that are used in establishing order and pattern in the memory we have perceived from our five senses.

Just my two cents [lol]

#41 dangerousideas

  • Guest
  • 60 posts
  • 0
  • Location:Alberta, Canada

Posted 14 November 2007 - 03:37 PM

For what it is worth, here is an extract from a post that I made about 10 years ago on Cryonet: http://www.cryonet.o...sp.cgi?msg=8324.

While I now consider this analysis to be somewhat simplistic (as a meta-model), and while it was clearly focused upon robotics (and hence practical considerations of sensing, computation, and actuation given hardware and software as it was understood in the day), I still think that the 4 "necessary and sufficient" elements that I identified within a "definition of consciousness", as well as the idea that a "thermostat level feedback loop" represents a "quantum of consciousness", are worthwhile contributions to the discussion.
---

An Alternate Definition (of consciousness), drawn in part from my work on conceptual mapping
for autonomous systems:

A system is conscious if (and only if) it has all of the following 4
features:
1. It has the capacity to sense its environment (it can perform "Data
Acquisition").
2. It has a capacity for "situational awareness" (it can perform "Data
Interpretation")
3. It has the capacity for "anticipation" (it can perform "Data
Extrapolation and Inference")
4. It is capable of "behavior" (it can act upon the "Reality State" from
within it)

Terminology can be a frustrating source of confusion. I will attempt to
supply clear definitions of my terminology so interested parties can
match the definitions to their own terminology without loss or
distortion of the information (meaning and truth) being conveyed.

My terminology is defined as follows:

1. Reality State - the observer independent state of reality as defined
by the Critical Realist philosophical point of view: "There is a
physical world which exists objectively and independently of our minds.
Our sense data can be used to examine the world itself."
2. Sensor - a physical system incorporating a measuring transduction
structure and a transmission structure to output results of the system.
3. Sensor Capabilities - the limiting characteristics of the sensor
system. In general these are technical limitations resulting from the
imperfect state of the art and the laws of science. These limitations
may also be intentional so that undesired behavior can be avoided.
4. Data Acquisition - the activity of obtaining sense data, and
transforming that data into quantitative information about the measured
parameter.
5. Sensor Report - a data structure containing, directly or indirectly,
quantitative information on the state of measured parameters. In
practice, the Sensor Report takes the form of a mapping of the
independent variables of the Physical Model to their measured or
inferred values.
6. Intelligence - a mechanism characterized by its intrinsic cognitive
ability to resolve problems in complex reasoning by manipulating
abstractions in an algorithmic and/or intuitive manner.
7. Algorithmic Intellect - a processing mechanism that uses mathematical
laws and algorithmic procedures to interpret or reduce data. In
practice, the Algorithmic Intellect creates quantitative descriptions of
the dependent variables using the independent variables from the Sensor
Report, according to the deterministic equations of the Physical Model.
8. Intuitive Intellect - a processing mechanism that rejects the
universal validity of the logical law known as the "Law of the Excluded
Middle" (a fundamental law of the Algorithmic Intellect), and allows the
probabilistic resolution of that class of propositions that are
undecidable in the absence of adequate proof. In other words, this is
the intellect that resolves ambiguities and generates creative original
solutions by supplying theories and hypotheses (which may be tested by
other means) when complete information is unavailable.
9. Reality Model - a conceptual map of the idea of reality itself. In
particular, this conceptual map will contain a model of the physical (ie
sensible or measurable) aspect (a model of being) and a model of the
operational (ie. Behavioral ) aspect (a model of becoming).
10. Physical Model - a deterministic set of relationships between
measurable parameters based upon physical science. In practice, this
model will take the form of a more or less fully defined set of
mathematical equations describing the relationships between measurable
or inferable parameters.
11. Operational Model - a probabilistic set of relationships between
transformational entities based upon social and operational science. In
effect it is a probabilistic model of the steps (operations) required to
transform one Reality State into another.
12. Data Interpretation - the activity that transforms Sensor Reports
into Situational Awareness, through the application of Intelligence,
within the constraints of the Reality Model. In practice, this involves
first solving the equations of the Physical Model so that the Reality
State of being becomes more or less fully defined, and then assigning
qualitative and contextual significance to that Reality State of being
(ie. The solution matrix of the Physical Model) such that the driving
factors in the Operational Model become defined.
13. Situational Awareness - a data structure containing quantitative,
qualitative, and contextual information on the independent and dependent
parameters of the Physical Model, with tentative evaluations of the
undecidable propositions resolved by the Intuitive Intellect. All
driving factors (ie. Independent parameters) of the Operational Model
are declared or inferred in the Situational Awareness.
14. Anticipatory Control - the activity that generates a goal oriented
Instruction Set on the basis of an intelligent evaluation of the
situation. This evaluation is constrained by the Reality Model, which
identifies goals, and the set of operations needed to achieve them, on
the basis of its Operational Model.
15. Instruction Set - a data structure that contains a quantified
description of intentions that are ultimately reducible to sets of
controller impulses to mechanisms.
16. Entity - a coherent assembly of physical systems that acts upon the
Reality State from within it.
17. Entity Capabilities - the limiting characteristics of the Entity.
In general these are technical limitations which result from the
imperfect state of the art and from the laws of science. These
limitations may also be intentional so that undesired behavior is
avoided.
18. Mission Execution - the activity of engaging the entity to perform
tasks that alter the Reality State so that it corresponds to the goal
intended by the Instruction Set.


As a simple test, we can try to assign a very (very) minor level of
consciousness to a simple feedback mechanism such as an automatic
thermostat:
1. It senses temperature.
2. It forms a situational awareness (from its extremely simple
"thermostat level Reality Model" and "thermostat level Algorithmic
Intellect") about whether or not the temperature is below its set-point.
3. It decides to switch on (or not) based upon that temperature,
4. It demonstrates "observable behavior" by turning on the furnace or
the air conditioner.

Some will doubtless argue that at 3 there is no "anticipation"; however
I would argue that the anticipation is inherent in the simple
deterministic Reality Model of such a system. Others will argue at 3
that there is no opportunity for "wrong" choices (i.e. "free will"), and so there is no
consciousness because the outcome is pre-determined. I would argue
firstly that this supposes that an absence of determinism is a
pre-requisite for consciousness and there is no evidence for such an
assumption (for example, evidence that any system in the human organism
functions in a non-deterministic manner), and secondly that the
appearance of strict pre-determination is an illusion based upon the
macroscopic scale of the entity - if we consider a "nano scale
thermostat" then the possiblity of "wrong" choices becomes more
apparent.
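
For concreteness, here is a minimal, illustrative Python sketch of this four-stage loop (Data Acquisition, Data Interpretation, Anticipation, Behavior) applied to the thermostat example. The names used (Thermostat, read_temperature, furnace) are illustrative labels only, not part of the terminology defined above.

```python
# A minimal, illustrative sketch of the four-stage loop described above
# (Data Acquisition -> Data Interpretation -> Anticipation -> Behavior),
# applied to the thermostat example. Names here are illustrative only.

class Thermostat:
    def __init__(self, set_point: float):
        # The "thermostat level Reality Model": a single set-point.
        self.set_point = set_point

    def sense(self, read_temperature) -> float:
        # 1. Data Acquisition: obtain a measurement from the environment.
        return read_temperature()

    def interpret(self, temperature: float) -> bool:
        # 2. Data Interpretation / Situational Awareness:
        #    is the temperature below the set-point?
        return temperature < self.set_point

    def anticipate(self, too_cold: bool) -> str:
        # 3. Anticipation: choose the action expected to move the
        #    Reality State toward the goal implied by the set-point.
        return "heat" if too_cold else "idle"

    def act(self, command: str, furnace) -> None:
        # 4. Behavior: act upon the Reality State from within it.
        furnace(command)

    def step(self, read_temperature, furnace) -> None:
        # One pass through the feedback loop - a single "quantum".
        temperature = self.sense(read_temperature)
        too_cold = self.interpret(temperature)
        command = self.anticipate(too_cold)
        self.act(command, furnace)


if __name__ == "__main__":
    t = Thermostat(set_point=20.0)
    t.step(read_temperature=lambda: 18.5, furnace=print)  # prints "heat"
```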

My position is that the simple "feedback loop" discussed here represents
a possible "quantum" of conciousness, and that the high level
conciousness demonstrated by advanced entities (such as ourselves) is an
emergent "observable" created by the synergy of countless millions of
such quantums. The implication of this view for cryonics is that the
information concept of identity could theoretically be valid if it can
accurately simulate the function and synergy of these countless
quantums, many of which loop through the environment as well as the
entity, but that it is difficult to see how this might be done in a
practical sense without duplicating the entity physically at the scale
at which these quantums occur. Hence, I am inclined toward the concept
of reanimation via physical nanoscale repair, which I would consider
both necessary and sufficient to re-establish both consciousness and
identity.

#42 gashinshotan

  • Guest
  • 443 posts
  • -2

Posted 19 November 2007 - 05:07 AM

Consciousness is just a biological process. The evidence? You can change a person's attitudes, emotions, values, beliefs, even memories with drugs, suggestion, conditioning, and lobotomies ;).

#43 Lazarus Long

  • Topic Starter
  • Life Member, Guardian
  • 8,116 posts
  • 242
  • Location:Northern, Western Hemisphere of Earth, Usually of late, New York

Posted 02 April 2008 - 02:30 AM

Here is an interesting article on sensory perception and cognition.

What also interested me about it was that it arrived coincidentally in the next day's news, immediately after I had listened to Daniel Dennett talk on the same subject.

Can we know our own mind?


Blind to Change, Even as It Stares Us in the Face

http://www.nytimes.c...amp;oref=slogin

When Jeremy Wolfe of Harvard Medical School, speaking last week at a symposium devoted to the crossover theme of Art and Neuroscience, wanted to illustrate how the brain sees the world and how often it fumbles the job, he naturally turned to a great work of art. He flashed a slide of Ellsworth Kelly’s “Study for Colors for a Large Wall” on the screen, and the audience couldn’t help but perk to attention. The checkerboard painting of 64 black, white and colored squares was so whimsically subtle, so poised and propulsive. We drank it in greedily, we scanned every part of it, we loved it, we owned it, and, whoops, time for a test.

Dr. Wolfe flashed another slide of the image, this time with one of the squares highlighted. Was the highlighted square the same color as the original, he asked the audience, or had he altered it? Um, different. No, wait, the same, definitely the same. That square could not now be nor ever have been anything but swimming-pool blue ... could it? The slides flashed by. How about this mustard square here, or that denim one there, or this pink, or that black? We in the audience were at sea and flailed for a strategy. By the end of the series only one thing was clear: We had gazed on Ellsworth Kelly’s masterpiece, but we hadn’t really seen it at all.

The phenomenon that Dr. Wolfe’s Pop Art quiz exemplified is known as change blindness: the frequent inability of our visual system to detect alterations to something staring us straight in the face. The changes needn’t be as modest as a switching of paint chips. At the same meeting, held at the Italian Academy for Advanced Studies in America at Columbia University, the audience failed to notice entire stories disappearing from buildings, or the fact that one poor chicken in a field of dancing cartoon hens had suddenly exploded. In an interview, Dr. Wolfe also recalled a series of experiments in which pedestrians giving directions to a Cornell researcher posing as a lost tourist didn’t notice when, midway through the exchange, the sham tourist was replaced by another person altogether.


These two discussions are relevant in two ways to this thread: first, the debate over whether we can ever really understand the mind (Dennett); and second, if consciousness is the result of a summation of experience, and that experience is largely sensory and potentially flawed, why do we treat the idea of the self as immutable and somehow sacrosanct?

It seems to me that the debate over uploading depends on the issue of self-awareness as information rather than biology - and ultimately, so what if reassembled information is changed by the experience?

The only constant for living organisms is change in the first place.

I think the reason is that a common conflation of theories of mind with what people *feel* is the *soul* has corrupted the discussion.

BTW It appears there are a number of new threads I need to reference at the beginning of this one.

#44 Lazarus Long

  • Topic Starter
  • Life Member, Guardian
  • 8,116 posts
  • 242
  • Location:Northern, Western Hemisphere of Earth, Usually of late, New York

Posted 02 April 2008 - 03:31 AM

Bertrand Russell on the Mind-Body Problem


The Problem of Consciousness by Daniel Dennett (6 Parts)












After listening to a few of these lectures I find much of the content gets recycled but it is still valuable to listen to the whole lecture to appreciate the nuance of applicability in each instance.

#45 Lazarus Long

  • Topic Starter
  • Life Member, Guardian
  • 8,116 posts
  • 242
  • Location:Northern, Western Hemisphere of Earth, Usually of late, New York

Posted 02 April 2008 - 05:59 AM

Here is another great lecture by John Searle: "Beyond Dualism" on the Mind-Body problem. (7 parts)















#46 Custodiam

  • Guest
  • 62 posts
  • 3
  • Location:Hungary

Posted 17 June 2008 - 01:18 PM

Welcome Everybody!

This is my first post. :)

Very interesting thread!

I have a few questions...

What if the method of scientific language in itself is incapable of describing consciousness?

I mean what would be the "solution" of consciousness? 42? Psi=mc2?

Can there be a scientific result at all?

Do we have to understand consciousness, or is understanding the brain enough?

Edited by Custodiam, 17 June 2008 - 01:23 PM.


#47 Brafarality

  • Guest
  • 684 posts
  • 42
  • Location:New Jersey

Posted 17 June 2008 - 02:37 PM

Indeed, we all have experienced a night of dreamless sleep when our consciousness is essentially shut off. Many have experienced lapsing out of consciousness due to physical trauma or mental stress.


This is actually a tough sticking point for mind=brain models of consciousness. You wouldn't think it because it seems to bolster rather than weaken the model, but here is the problem:

If consciousness is shut off during sleep, where does it go? From where does it return when we awaken?

It has to be stored somewhere, for if consciousness is just an emergent property or byproduct or epiphenomenon of neural activity, then when that neural activity ceases to generate the consciousness, it should disappear forever. Gone! It need not happen on death alone, but at any time the brain stops generating it.
But it doesn't. It returns. We are ourselves after waking up from loss of consciousness. There is autobiographical memory/continuity/apparent permanence from a turbulent, flickering brain.
I think this should be called the 'Poof! and Poof! Effect' :)

Some (will locate links to articles/sites, etc. soon), with absolute certainty, conclude that this is proof of the mind surviving bodily death, the only problem being that, although consciousness seems to be stored somewhere beyond the brain, it only seems to function in the space time continuum when the brain is generating it, or summoning it or whatever term you prefer.

That is, consciousness, in some aspects, seems larger than the brain, but, apparently not in the basic conscious state we most often experience. Perhaps consciousness is not only a seemingly emergent brain property, it could also be an epiphenomenon of some other entity/abstraction.

Just a few impressions. :)

Edited by paulthekind, 17 June 2008 - 02:41 PM.


#48 E.T.

  • Guest
  • 183 posts
  • 3

Posted 17 June 2008 - 07:11 PM

Consciousness is the ability to receive input, process it, and expel consequential output. There are different types of consciousness, and different levels of algorithmic complexity. But, being anthropocentric (did I spell it right?), humans believe only their type of consciousness is real consciousness, which is fine, since people are free to define things however they want.

Edited by E.T., 17 June 2008 - 07:15 PM.


#49 kiriel

  • Guest
  • 53 posts
  • -0

Posted 21 June 2008 - 12:15 PM

The question "What is consciousness?" is, in my opinion, completely incorrect. The question that should be asked is: what structure is there to this phenomenon that might be called consciousness?

#50 Buddenbrook

  • Guest
  • 7 posts
  • 0

Posted 27 July 2008 - 11:10 PM

The hard problem of consciousness could very well be unsolvable. Imagine a universe that is exactly like ours, except for one difference: all beings are information-processing zombies (yet behave in exactly the same way we do). Are the laws of nature in that universe any different?
The question could be similar to asking why elementary particles or energy behave in certain manners. And why do they exist / keep on existing? And why do they have the qualities they do? They just do.

If consciousness were found to be irreducible, that would be huge in itself.


From what I have read and discussed, most modern neuroscientists consider consciousness a collective phenomenon that cannot be localized to any particular area(s) of the brain. The human brain consists of around 100 billion neurons. An average neuron can be connected to up to 10,000 other neurons via synaptic connections, and a synapse can fire up to 1,000 times a second. The "synaptic self" explains to an extent why we are the kinds of persons and personalities we are, but it doesn't explain consciousness.
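
As a rough, illustrative back-of-envelope calculation using the figures quoted above (these are upper bounds for order-of-magnitude purposes, not measurements):

```python
# Order-of-magnitude upper bound on synaptic events per second, using the
# figures quoted above. Upper bounds for illustration only.

neurons = 100e9               # ~100 billion neurons
synapses_per_neuron = 10_000  # up to 10,000 synaptic connections each
max_firing_rate_hz = 1_000    # a synapse can fire up to 1,000 times a second

upper_bound_events_per_sec = neurons * synapses_per_neuron * max_firing_rate_hz
print(f"~{upper_bound_events_per_sec:.0e} synaptic events per second")  # ~1e+18
```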

The theory of a collective phenomenon is supported by the fact that there is no known type of brain damage that shuts down the consciousness of a person who is not in a coma or a vegetative state.

#51 drus

  • Guest
  • 278 posts
  • 20
  • Location:?

Posted 20 December 2008 - 12:27 AM

(quoting dangerousideas' post #41 above in full)

This is one of the most interesting and intelligent answers regarding the nature of consciousness that I have ever read. I was going to post my thoughts, but dangerousideas seems to have taken the words right out of my mouth! Haha! Excellent post... we think alike; I too suspect quantum mechanics/processes play a larger role than is currently afforded.

#52 LET ME GET EM!

  • Guest
  • 60 posts
  • 0
  • Location:There

Posted 19 January 2009 - 06:47 PM

Consciousness is the unicameral mind, as it were. Speaking physically, it is the new unified communication network within the brain whereby the two hemispheres function synergetically to create a higher order of operation which is self-referential and introspective. Speaking metaphysically, it is the new integrated inventorying of reality in abstract concepts that organizes the myriad facts of experience in accordance with conceptually identified logic.

In addition, consciousness is an operative modality of the brain or the mind which is inherently non-automatic. That is, there is nothing that causes consciousness to operate automatically... Consciousness is the causative factor relative to its own existence. There is nothing in existence that can make an individual conscious but his own act of being conscious. Consciousness exists as an entelechy; when it exists it exists in its full manifestation, and when it exists not it exists not at all. That means consciousness never evolves. It is an individual's very act of being conscious that brings consciousness into being in its utter totality. Thus, in reality, when one is conscious, he is fully conscious, with nothing missing and nowhere to evolve...

Furthermore, in the beginning consciousness sought to operate within the lost matrix of the bicameral mind; that is, consciousness made the entire inventory of brain-stored information emulate the dominant hemisphere of the bicameral mind. Consciousness operating in the emulated bicameral mentality is perceptivity-centered consciousness. It was when consciousness developed a sufficiently integrated conceptual knowledge internally that consciousness made the shift from the subjective to the objective and became aware of reality, objective and in essence abstract. Consciousness operating in the context of abstract and objective reality is a conceptuality-centered consciousness.

#53 anderl

  • Guest
  • 10 posts
  • 0

Posted 01 March 2009 - 04:55 PM

I agree with gashinshotan and Winterbreeze. Consciousness is just the brain's need to cope with being a reactionary engine and its ability to store and communicate information. It's nothing more than the bouncing of particles off of each other through complex structures and the recording of those actions to reproduce them over and over again. We react to the events in our environment and filter them through these complex structures that are brains, and the brain responds to those events based on its collection of previous responses to similar or associated events and its subsequent reactions in the past.

Consciousness came into being when more and more events became virtual (within the mind), when the storage of information allowed the structure that is the brain to run simulations of events, which allowed for improved performance in similar events in the future. This mental internalization allows for a distinction between what the mind is doing and what the body is doing. This creates a self that is distinct from the environment. The added bonus was that mental simulations could introduce objects or concepts that did not normally happen in the real world. That sparked creativity and probably allowed for tool development, craft, language and artistry.

#54 scrappy

  • Guest
  • 1 posts
  • 0

Posted 01 March 2009 - 06:04 PM

But for consciousness to happen there must be a language. It starts there. In other life forms besides humans (including other animals, fungi, plants, protists, and prokaryotes) there are languages of all sorts, many of which are chemical, but some others are visual, auditory, or tactile. In humans we have a symbolic language with syntax. But it's only an "extended phenotype," in Dawkins' sense, which is what all forms of language and consciousness are to their biological entities. Human consciousness differs from others only in that we have a digitally symbolic language with syntax. A bird has a symbolic language for mating and other social activities in its song and dance, but it lacks syntax. It can communicate how desirable he/she is for mating, but he can't put it into syntax. A human can provide syntax when he says, "Mate with me 'cuz I love you." A sparrow can communicate his mating offer only by saying in song or dance, "Mate with me 'cuz I'm sexy."

From all I can gather, if you take away language you take away consciousness. It may be the one extended phenotype that is common to all life. And bear in mind that our genes have a digital language, too, according to Dawkins, Dennett et al., and it's symbolic (check out the genetic dictionary). Genes appear almost conscious in their deterministic strategies for survival (check out the Red Queen). And as far as immortality goes, they are as close to being immortal as anything can get. Some genes have survived for billions of years.

The key to immortality, I think, will come with recognizing that biological immortality will not happen at the organismic level. I'm curious about what's in store for us at the digital level (or eventually the quantum-mechanical level?), which has grown fantastically into useful dimensions. The Singularity will be only an issue of memory magnitude, not necessarily an advance in consciousness evolution, but it may be another phenotype extension that will rival anything so far.

Think about it. How many times a day do you go to a search engine for information? Then ask yourself, Is Google any better than Mrs. Mertz, your friendly and competent public librarian, was a few years ago? As such, I think consciousness is an extended phenotype and the Singularity will be only a better telescope.

Were you more conscious then with Mrs. Mertz's help? Or are you better off now with a search engine and your own fingertips? If immortality is an existential thing, this matters.

#55 Taelr

  • Guest
  • 29 posts
  • 0
  • Location:Sunnyvale, CA

Posted 11 June 2009 - 06:09 AM

lucid,

Artificial Intelligence is artificial.

Artificial simply means man-made; it doesn't mean the intelligence isn't real.

To the outside observer it may display symptoms of human consciousness, but it is just ones and zeros.

And the brain is not so different. It comprises some 200 billion tiny microprocessors (neurons), each of which accepts thousands of inputs from other neurons; at a certain voltage threshold, dependent on those inputs, a particular neuron will fire, and that is a digital function: to fire = 1, to not fire = 0. All neurons operate in parallel. While the average frequency per neuron is only about 300 Hz, the probable aggregate throughput of the brain is some 10,000 times that of our best high-end computers at this time. The brain is still a largely digital information-processing engine, just massively parallel.
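
A minimal, illustrative sketch of such a threshold unit ("to fire = 1, to not fire = 0"), in the spirit of a McCulloch-Pitts neuron rather than a biophysical model; the weights and threshold are invented for the example:

```python
# Illustrative threshold neuron: output 1 if the weighted sum of inputs
# reaches the threshold, else 0. A McCulloch-Pitts-style sketch, not a
# biophysical model; weights and threshold are invented for the example.

def neuron_output(inputs, weights, threshold):
    activation = sum(x * w for x, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# Three presynaptic inputs with different synaptic weights.
print(neuron_output(inputs=[1, 0, 1], weights=[0.6, 0.2, 0.5], threshold=1.0))  # 1 (fires)
print(neuron_output(inputs=[0, 1, 0], weights=[0.6, 0.2, 0.5], threshold=1.0))  # 0 (does not fire)
```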

I generally dislike arguing about definitions, so just read my post in the context of the definition I use, which is: consciousness is the feeling of awareness of oneself and the world in which one lives. Consciousness, according to my definition, only applies to beings with a biological mind, since it is a feeling or a state of the mind.

For any AI that becomes self-aware, it must also possess emotions if it is to become independent. If we accept that emotions are no more than processes in the brain, and the brain is essentially digital, then there is no critical reason why an AI - or perhaps we should use the newer term AGI - could not also exhibit emotions. If an AI has no emotions, then what would it do if set free? Without the emotion of desire it would do nothing; it would have no reason to do anything. Emotions are the key to what drives human action. Robots perform functions because they have no choice, and many lower life forms simply react to external stimuli or hard-wired instincts. But an independent AGI would likely do nothing without an emotional status.

In terms of a definition of consciousness, I see no reason why it should be limited to biology only. What you describe as "feelings" would also need to apply to an AGI, and there seems no inherent reason why not.

But really I do not believe consciousness exists; it is only a label we can apply to a cooperative group of semi-independent brain functions. For example, self-awareness is a result of achieving a threshold level of brain complexity. Is consciousness dependent on self-awareness? A newborn child is not self-aware, and will not be for several months; the brain has not grown sufficiently. But does the child possess consciousness during that time? We know such children have emotions and some degree of cognitive ability. So when does consciousness arrive? It is an arbitrary term we have created and doesn't really exist.

What we observe in healthy adults is that cooperative array of self-awareness, emotions, and cognitive abilities. Consciousness is simply the collective term we can apply to those functions.

Edited by Taelr, 11 June 2009 - 06:13 AM.


#56 Custodiam

  • Guest
  • 62 posts
  • 3
  • Location:Hungary

Posted 29 September 2009 - 12:26 PM

I think we have forgotten the results of Kantian philosophy. I have to point out that Kant made it clear that without consciousness there is no space, time, quantity or quality.

So we cannot successfully reduce or understand consciousness with any "hard" scientific method, because consciousness is the precondition of any science.

So I think mainly materialism (which is simply a false metaphysics) is to blame for the misunderstandings surrounding consciousness.

I think we only understand the shadows of real existence and we think that these shadows ("material objects and laws") are real.

We should make a paradigm shift in order to find the bases of a new science, which can understand consciousness.



