  LongeCity
              Advocacy & Research for Unlimited Lifespans





Brain beats all computers in memory capacity


59 replies to this topic

#1 celindra

  • Guest
  • 43 posts
  • 0
  • Location:Saint Joseph, TN

Posted 14 September 2003 - 07:54 AM


Brain beats all computers

Forecasters who predicted that computers are poised to become more powerful than the human brain have got it hopelessly wrong.

For the first time, researchers have calculated the power of a single brain in terms of memory capacity and discovered that it is greater than that of all the computers ever made.

While even the biggest computer has a capacity of around 10,000,000,000,000 bytes (10 to the power of 12), the human brain has a colossal 10 followed by 8,432 noughts, say the scientists who made the calculations in the journal Brain and Mind.

----------------------------

Egads ...

If this is true, then we could be a long way from any successful mind emulation.

#2 Mechanus

  • Guest
  • 59 posts
  • 0

Posted 14 September 2003 - 10:15 AM

[lol]

They've disproven the Bekenstein Bound!

10^8432 is a ridiculous number. I wonder how they calculated it.


#3 Thomas

  • Guest
  • 129 posts
  • 0

Posted 14 September 2003 - 10:18 AM

The number of bytes stored in the brain is below 10 to the power of 12.

#4 Mechanus

  • Guest
  • 59 posts
  • 0

Posted 14 September 2003 - 10:29 AM

How do you know?

(10^12 has to be a lot closer than 10^8432, though -- I'm suspecting the researchers forgot to take a logarithm somewhere, or something.)

(Or maybe the reporter just screwed up. Ve says 10^12 = 10,000,000,000,000 too.)

Edited by Mechanus, 14 September 2003 - 10:45 AM.


#5 Thomas

  • Guest
  • 129 posts
  • 0

Posted 14 September 2003 - 10:57 AM

300 years of reading at 100 cps, remembering every word, would be about one TB. In fact, nobody has done anything close to that. Visual data is more densely packed, but do we remember enough details from a picture seen for a minute to need 60 kB to store that information? I very much doubt it. 1 GB is a closer guess than 1 TB, I think.
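Thomas's back-of-the-envelope figure checks out; a quick sketch, using his assumptions of 300 years of reading at 100 characters per second and one byte per character:

```python
# Rough check of the reading estimate (assumptions from the post:
# 300 years of reading at 100 characters per second, 1 byte per character).
SECONDS_PER_YEAR = 365.25 * 24 * 3600
bytes_total = 300 * SECONDS_PER_YEAR * 100  # total bytes read
print(f"{bytes_total:.2e} bytes")  # about 9.5e11, i.e. roughly one terabyte
```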

#6 Mechanus

  • Guest
  • 59 posts
  • 0

Posted 14 September 2003 - 11:05 AM

But is that really the right way to count? I don't think you've mentioned nearly all things a human remembers. What about events, facts, languages, sounds, music? Or unused memory? (Does that make sense?) Not all memory needs to be things you can consciously reproduce when commanded to.

Moravec estimates:

The best evidence about nervous system memory puts most of it in the synapses connecting the neurons. Molecular adjustments allow synapses to be in a number of distinguishable states, let's say one byte's worth. Then the 100-trillion-synapse brain would hold the equivalent of 100 million megabytes.



But I have no idea whether he's right.
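Whether or not Moravec's premises hold, his arithmetic is easy to verify; a sketch using only the numbers quoted above:

```python
# Moravec's estimate as quoted: 100 trillion synapses at roughly one byte each.
synapses = 100e12          # 1e14 synapses
bytes_per_synapse = 1      # "one byte's worth" of distinguishable states
capacity_bytes = synapses * bytes_per_synapse
capacity_megabytes = capacity_bytes / 1e6
print(capacity_megabytes)  # 1e8, i.e. "100 million megabytes" (about 100 TB)
```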

#7 Thomas

  • Guest
  • 129 posts
  • 0

Posted 14 September 2003 - 11:17 AM

I am quite confident that I can't recall 1000 bytes for every second of my life. It is possible (though not likely) that this amount is stored but not reachable. From an evolutionary point of view, why would I store that?

#8 Mechanus

  • Guest
  • 59 posts
  • 0

Posted 14 September 2003 - 11:29 AM

Dunno. Maybe it's not in the form of facts you can consciously recall, but more in the form of attitudes.

It's certainly true that you remember more than you can choose to recall at any moment. Humans know a horribly large amount of things, and these are not necessarily stored as efficiently as in a computer. Making judgments about how many bits of information you have per second of your life seems very difficult to me.

On the other hand, these are all things I know/understand too little about.

#9 Thomas

  • Guest
  • 129 posts
  • 0

Posted 14 September 2003 - 11:39 AM

Even so, if it is just an attitude, I can hardly believe that the attitude I have is stored with 32-bit resolution. Even in that case, the vast majority of data has been lossily compressed into several bytes. I would be shocked if all my memories could be stored in a few megabytes. But I would recover from that kind of shock quite easily. ;)

#10 Mind

  • Life Member, Director, Moderator, Treasurer
  • 19,058 posts
  • 2,000
  • Location:Wausau, WI

Posted 14 September 2003 - 11:45 AM

10 to the power of 8432 cannot possibly be correct. Best guesses for the number of atoms in the entire universe are between 10^70 and 10^100. Even if the brain used quantum methods for memory, it could never store more information than the entire universe contains. The reporter must have gotten the number wrong.

#11 Mechanus

  • Guest
  • 59 posts
  • 0

Posted 14 September 2003 - 12:16 PM

(Observable universe. But still.)

If the Bekenstein bound holds, then a system of the mass and size of the brain can contain about 10^43 bits at most (and for that, it would need to be a black hole, I think; it will certainly still be a huge overestimate).

IIUC, you can get a brain with 10^8432 bits of memory by having a black hole in your head that's about 10^4200 times the radius of the visible universe; or by having a huge network of wormhole-connected baby universes (and exotic matter and whatnot) in your skull. I don't really see another way.
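The ~10^43 figure can be reproduced from the Bekenstein bound, I ≤ 2πRE/(ħc ln 2) with E = Mc². A sketch, where the ~1.5 kg mass and ~0.1 m radius for a brain are rough assumptions:

```python
import math

# Bekenstein bound on the information in a region: I <= 2*pi*R*E / (hbar*c*ln 2),
# with E = M*c^2. The brain figures (1.5 kg, 0.1 m radius) are rough assumptions.
hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s
M, R = 1.5, 0.1         # mass (kg) and radius (m) of a brain-sized region
bits = 2 * math.pi * R * M * c / (hbar * math.log(2))
print(f"{bits:.1e} bits")  # on the order of 10^42 to 10^43
```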

#12 Thomas

  • Guest
  • 129 posts
  • 0

Posted 14 September 2003 - 01:00 PM

Their point is that the brain is beyond physics. Even beyond mathematics. A medieval rigmarole, that is what brains are. This is their point, I guess. [lol]

#13 Mechanus

  • Guest
  • 59 posts
  • 0

Posted 14 September 2003 - 02:00 PM

It wasn't the reporter who goofed:

http://ipsapp007.klu.../5/abstract.htm

Abstract

Despite the fact that the number of neurons in the human brain has been identified in cognitive and neural sciences, the magnitude of human memory capacity is still unknown. This paper reports the discovery of the memory capacity of the human brain, which is on the order of 10^8432 bits. A cognitive model of the brain is created, which shows that human memory and knowledge are represented by relations, i.e., connections of synapses between neurons, rather than by the neurons themselves as the traditional container metaphor described. The determination of the magnitude of human memory capacity is not only theoretically significant in cognitive science, but also practically useful to unveil the human potential, as well as the gap between the natural and machine intelligence.


"to unveil the human potential"? As in, "we only use 10% of our brain"?

If this paper is right, then we only use approximately 0,000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000 00000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000001 % of our brains. (Assuming that what we actually use is less than 10^20 bits.)

;)

Am I missing something?

If not, how does this sort of rubbish get through peer review?

#14 Lazarus Long

  • Life Member, Guardian
  • 8,116 posts
  • 242
  • Location:Northern, Western Hemisphere of Earth, Usually of late, New York

Posted 14 September 2003 - 02:10 PM

Am I missing something?

If not, how does this sort of rubbish get through peer review?


Obviously you aren't missing anything; that is how it gets past peer review, because the reviewers are using such small amounts of their brains. Though I do prefer scientific notation to having to process so many zeros, please ;))

#15 Mechanus

  • Guest
  • 59 posts
  • 0

Posted 14 September 2003 - 02:20 PM

Heh. Sorry about that; I felt the need to show how tiny that percentage is, and "10^-8400" doesn't look nearly as small.

People should be confronted with enormous numbers now and then. Especially immortals, since they will one day be that old. ;)

Is there a name for numbers that are small enough to write down, but large enough that they fill screens when written out?

#16 Lazarus Long

  • Life Member, Guardian
  • 8,116 posts
  • 242
  • Location:Northern, Western Hemisphere of Earth, Usually of late, New York

Posted 14 September 2003 - 05:11 PM

Is there a name for numbers that are small enough to write down, but large enough that they fill screens when written out?


Not that I am aware of, but if there isn't, there should be - so let's invent one. [>]

I'll start out with minimaximals, or how about Zenumerials?

Zenotation [?]

I think we should name them after Zeno because the smaller they get the larger the expression of them ;))

This is an implied mathematical oxymoron: not a true paradox per se, but one that is constantly approaching one. Not to mention that is what we are doing when we apply Zeno's paradox, always halving the distance to the whole; the remainder grows constantly smaller. [lol]

#17 MichaelAnissimov

  • Guest
  • 905 posts
  • 1
  • Location:San Francisco, CA

Posted 16 September 2003 - 10:46 AM

Now now, let's not go calling big numbers ridiculous, they did find a use for Graham's Number, right? (Heh.) And now that you mention the Bekenstein Bound, it's actually interesting that numbers larger than the quantity of Planck masses in the universe have mathematical value.

The journal "Brain and Mind", hmm..? *Makes a mental note to never read it in the future.* Anyway, even if two pounds of meat really could memorize 10^8432 bits of information, then we could simply duplicate the relevant structural characteristics of that meat in a computer and upload ourselves into that. Or, if that proves to be impossible, we could just fab larger organic brains from scratch, preserving the characteristics giving rise to this Bekenstein Bound-smashing memory value, and upload ourselves into that...and so on.

#18 80srich

  • Guest
  • 33 posts
  • 0

Posted 17 September 2003 - 10:54 PM

Maybe that was just a really really clever bloke ;)

Bet he was British;)

#19 darktr00per

  • Guest
  • 52 posts
  • -1

Posted 21 October 2003 - 10:37 AM

Well, I don't see how they can even start to compare the memory capacity of computers to brains. The brain mainly uses visuals and visual triggers to store memory, while a computer stores bits of information as 1s and 0s, on or off. How can that even translate into the electro-chemical composition of the brain? As far as I know there is no universal unit for information. Also, I think that hybrid computers will come to light, with brain neurons molded onto conventional computer chips. Molecular computers will also open a new door for memory capacity and computer function.

#20 nefastor

  • Guest
  • 304 posts
  • 0
  • Location:France

Posted 02 November 2003 - 12:42 AM

Moravec's estimate (the 100-trillion-synapse brain) refers to raw "uncompressed" capacity of the brain and one can't really find a problem with that.

However, the brain is what, in computer science, we call a "processing memory". Today's processing memories are mostly limited to content-addressable memory (CAM) that allows you to make two-way translation tables (particularly useful in network routing equipment) or "best-match" CAM.

The brain also works like a CAM, but adds another (inherent) layer of processing: lossy data compression. Lossy data compression is very powerful; let's take a simple example:

Uncompressed video, in DVD quality, takes 26 MB/s.

The same video, compressed in MPEG-2 with 10 audio tracks (AC-3, 2 languages), takes about 1 MB/s. Video quality is almost impossible to distinguish from uncompressed video.

The same video, compressed in MPEG-4 with 4 audio tracks (WMA, 2 languages), will only take 100 to 200 KILObytes per second... and even if the quality starts lacking, the picture is still very watchable and unmistakable.

The type of data compression the brain uses is far more advanced than MPEG : it will adapt the level of data loss to the usefulness of the data, measured in relation to how much you use (think about) the data. Data loss will always happen on details, so that the most important feature(s) of your memory are "forgotten" last.

If you remember an entire movie by heart for having seen it 50 times over, then this memory alone is equivalent to about 5 or 10 gigabytes of data. And that is really far from everything you know (I suppose :) )
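The sizes implied by those data rates are easy to work out; a sketch, assuming a two-hour movie and the per-second figures from the post:

```python
# Rough sizes for a two-hour movie at the data rates quoted in the post
# (26 MB/s uncompressed, ~1 MB/s MPEG-2, ~0.15 MB/s MPEG-4).
seconds = 2 * 3600
rates_mb_per_s = {"uncompressed": 26, "MPEG-2": 1.0, "MPEG-4": 0.15}
sizes_gb = {codec: rate * seconds / 1000 for codec, rate in rates_mb_per_s.items()}
for codec, gb in sizes_gb.items():
    print(f"{codec}: {gb:.1f} GB")
# uncompressed ~187 GB, MPEG-2 ~7 GB, MPEG-4 ~1 GB
```

The MPEG-2 figure is the one in the same ballpark as the "5 or 10 gigabytes" for a movie remembered by heart.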

Back to synapses: according to my old neurology / neurobiology lessons, I'd say the state of a synapse cannot be satisfyingly coded in 8 bits, as Moravec suggests, but rather needs 16 bits at the very least. 32 bits would seem more than adequate.

By "synapse state" I assume Moravec means "synaptic coefficient", as those used in neural networks simulation (programming). That would be, in C terms, a signed dword type of data (32-bit signed integer).

Jean

#21 ocsrazor

  • Guest OcsRazor
  • 461 posts
  • 2
  • Location:Atlanta, GA

Posted 25 January 2004 - 10:09 PM

Hi Gang,

Just returning, and scanned this real quick; I haven't read this article yet, Michael, but great find. That number seems pretty extreme, BUT... a couple of points from the neuroscientist's perspective:

1. The information contained in the structure of the informationally unique connections of the human brain represents the combined information from the 4 billion years of evolution required to produce it and the experience of all the sensory data processing a human has ever done in their lifetime. As an individual physical system, it is THE most complex object human science has ever encountered. The level of organization found in it far exceeds anything on the astronomical scale.

2. All estimates by computer scientists about the processing power of the brain greatly underestimate its processing power. If you want a true estimate, you have to consider dendritic spines (which undergo constant change), neuromodulators, glial effects, and a few other variables that aren't being considered in these calculations. All of these directly affect processing power and greatly increase the complexity and capabilities of the brain.

#22 ocsrazor

  • Guest OcsRazor
  • 461 posts
  • 2
  • Location:Atlanta, GA

Posted 26 January 2004 - 10:11 PM

Follow Up - I just read this paper and presented it in a journal club today. The summary is that the mathematics is fairly rigorous and there is no major flaw, but the number does represent an upper bound, based on a very straightforward calculation in which every neuron in the central nervous system can possibly connect to every other neuron, with a total of 1000 connections per neuron. They are probably a few orders of magnitude high, but not tremendously far off the mark. I know their starting assumptions very well and they are accurate, and this estimate is probably much closer than anything that has been proposed before. I'm not sure about their estimates of computational memory storage, though.
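For what it's worth, a number of exactly this magnitude falls out of a simple combinatorial count. This is an assumption about the paper's method, not a quote from it: count the ways of choosing 1000 connection partners for a neuron out of roughly 10^11 neurons.

```python
import math

# One way to reproduce a number of the paper's magnitude (an assumption about
# their method, not a quote): the number of ways to choose 1000 connection
# partners out of ~10^11 neurons, i.e. C(10^11, 1000).
n_neurons = 10**11
k_connections = 1000
combos = math.comb(n_neurons, k_connections)
digits = len(str(combos)) - 1  # floor of log10
print(f"C(10^11, 1000) is about 10^{digits}")  # on the order of 10^8432
```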

#23 Omnido

  • Guest
  • 194 posts
  • 2

Posted 10 April 2004 - 10:57 PM

Indeed, the human mind's biological functionality is quite amazing.
However, we must also remember that our synapses are dying daily, and their deaths number in the thousands.
Perhaps it is because of their efficiency that we are able to continue with each passing day, our minds making up for the loss without any immediately noticeable effects.

If you look at it, scientists have not yet quantified exactly how many neurons and neural connections are required to represent even a word, or a number, or the various other pieces of information that are crosslinked to a single word or number.
In terms of sheer computational power, the human mind no doubt exceeds even the fastest computer. But one must remember that even the slowest computer still computes with exactitude and without flaw, whereas the human mind's analog interactions are built upon action potentials and probabilities.

#24 ocsrazor

  • Guest OcsRazor
  • 461 posts
  • 2
  • Location:Atlanta, GA

Posted 13 April 2004 - 04:34 PM

Hi Omni,

Synapses don't typically die. Some of them are constantly reorganized, yes, extended and retracted, but there is much evidence to show that the total number of stable synapses continually increases through most of a mammal's adult lifespan. During childhood there are a few waves of great increases in synapses, then a paring-down process, but during adulthood it appears to be a somewhat linear increase.

The way mammalian brains store information is tremendously different from what our current computational systems do, but they still have to follow the same laws of informational physics that the rest of the universe does. There is much evidence to suggest the storage method being used is holographic, i.e., that new information is layered on top of old information networks. We don't have a strong handle on how words are stored or how human brains do mathematical calculations, but we do understand quite well how sensory data is abstracted, processed, and stored in a number of neural systems, so we are not completely in the dark about the storage capacity of neural systems.

I take great exception to the statement that computers always compute with exactitude. This is a matter of degree, it is problem-type specific, and no system is ever "perfect". Computers can solve particular classes of mathematical problems well because they are very good at repetition and exhibit stability of computation for that class. Gates are probabilistic devices as well; we just choose a large enough voltage-difference definition that they become a binary system in our perception. This is very energy- and material-inefficient, though, and represents clumsy engineering which we will hopefully refine in the future. Mammalian brains are currently much faster, more energy efficient, and more robust at solving problems in environments of high complexity and nonlinearity. You could structure neurons to do fast linear mathematical problem solving, and be very stable in this processing as well, but there hasn't been an evolutionary need for an organism to solve these types of problems. (A biological neural computer that exceeded the precision, energy efficiency, and size requirements of current computational systems would be possible to build, but it's probably not worth the effort because of the legacy problems associated with cellular life support - respiration, feeding, etc. The properties of biological network computation do, however, need to be studied intensely so they can be abstracted to engineered systems.)

The reorganization-of-connections problem is referred to as the stability/plasticity problem for neural systems. For the class of problems your system should be able to solve, this is a critical variable; linear mathematical computation is easy. We have not yet solved this problem well (either materially or mathematically) in artificial computational systems for highly complex problem spaces.

#25 Omnido

  • Guest
  • 194 posts
  • 2

Posted 13 April 2004 - 11:50 PM

Hey there Ocsrazor. :)

Yes indeed, I believe I misquoted it. I was referring to neurocellular death, not the death of the synapse(s) themselves. My apologies.
I agree with you that the computer is highly inefficient, yes. By comparison, the amount of power needed to operate a 2 GHz silicon CPU in most of today's computer systems is indeed vastly greater than the energy used by a human brain. On that, I couldn't agree more.
But as you stated, the legacy problems would tend to discount the idea as not cost-efficient, and ultimately futile.

However, indulge me in a sci-fi idea. Do you think it would be plausible to develop a genetically engineered "super neuron" of sorts: one that could act as a universal type of neuron, capable of connecting to 1000 other identical neurons, for a process whose sole purpose was raw calculation, designed to match if not exceed today's silicon-medium standards?

#26 Mind

  • Life Member, Director, Moderator, Treasurer
  • 19,058 posts
  • 2,000
  • Location:Wausau, WI

Posted 14 April 2004 - 02:29 PM

but this number does represent an upper bound based on taking a very straight forward calculation of all the neurons in the central nervous system being able to possibly connect to every other neuron, for a total of 1000 connections per neuron. They are probably a few orders of magnitude high, but not tremendously far off the mark.


I would say this is a major flaw in the argument. Every neuron in the nervous system does not, as far as we know, communicate with every other neuron (certainly not for simultaneous calculation). "If every computer chip on the planet was connected with every other chip...etc" then the computing power of silicon would greatly outpace that of any one human brain.

And I must come back to the point that their memory estimate is over 8,000 orders of magnitude larger than the number of atoms in the entire universe. Theoretically, then, one human brain could simulate 10^8332 universes (assuming 10^100 atoms in one observable universe) within itself. That just sounds wacky. I know the math of their calculation is correct, but their initial assumptions seem off the mark.
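The 10^8332 figure is just exponent subtraction; a trivial sketch of Mind's arithmetic:

```python
# Mind's point in exponent form: 10^8432 bits of claimed memory versus
# ~10^100 atoms per observable universe leaves 10^(8432-100) universes.
memory_exponent = 8432
atoms_per_universe_exponent = 100
universes_exponent = memory_exponent - atoms_per_universe_exponent
print(f"10^{universes_exponent} universes")  # 10^8332
```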

#27 ocsrazor

  • Guest OcsRazor
  • 461 posts
  • 2
  • Location:Atlanta, GA

Posted 14 April 2004 - 05:31 PM

Hi Omni - on the superneuron - I do believe something like this would be possible, but I'm not sure genetically engineering biological neurons to do the job would be the most efficient way to build it. As I mentioned above, though, abstracting biology to engineered systems will be a strong possibility as the materials (MEMS, nano) catch up to what cells are capable of.

Hi Mind - A couple of clarifications - the initial assumption was for 1000 connections per neuron, with each neuron able to make those 1000 connections to any other neuron in the system. As I pointed out, this is obviously not the case; there is local structure in a brain. That said, from the neuroscience literature I think 1000 connections per neuron is a little on the low side - it is probably closer to 10,000 - and this means the local connection density is higher than anything else in the universe. What I am trying to get at is that they may not be too far off the mark.

As to the hyperastronomical numbers, you have to realize that the connectivity of an atom is tremendously lower than that of a neuron. There is something extremely interesting here I haven't finished working out yet, but the initial assumptions to play with are the possible number of configurations a system can be in and the speed at which you can alter those configurations. What this is really about is how fast a system can explore state spaces. I don't want to give away too much, because I have a feeling I may be on to something critical about information flux density in different systems, and may try to write something of publication quality this summer.

The really interesting thing to think about is what happens when brains can model universes in totality, which is not so far-fetched looking at these calculations - something John Smart and I have had a few discussions about; lots of interesting philosophical questions to chew on. Also, you are absolutely right that if all the silicon chips in the world were connected to each other they would greatly outpace a single brain (given that the connections also had fast reconfigurability), but this is a HARD (material engineering) problem. Connectivity density is a key variable being ignored by AI/GI right now. The other, HARDER (mathematical) problem is HOW to connect them. If you can solve these two problems, you will have created superintelligence. My guess is that human minds alone won't be able to solve this problem analytically; we are going to have to use machine-assisted directed evolution (and a strong dose of information from neurobiology) to produce these systems if we want to create them.

Best,
Peter

#28 Cyto

  • Guest
  • 1,096 posts
  • 1

Posted 16 April 2004 - 07:10 AM

Making sense of the brain's mind-boggling complexity

Leading scientists in integrating and visualizing the explosion of information about the brain will convene at a conference commemorating the 10th anniversary of the Human Brain Project (HBP). The conference, "A Decade of Neuroscience Informatics: Looking Ahead," will be held April 26-27 at the William H. Natcher Conference Center on the NIH campus in Bethesda, MD.
Through the HBP, federal agencies fund a system of web-based databases and research tools that help brain scientists share and integrate their raw, primary research data. At the conference, eminent neuroscientists and neuroinformatics specialists will recap the field's achievements and forecast its future technological, scientific and social challenges and opportunities.

"The explosion of data about the brain is overwhelming conventional ways of making sense of it," said Elias A. Zerhouni, M.D., Director of the National Institutes of Health. "Like the Human Genome Project, the Human Brain Project is building shared databases in standardized digital form, integrating information from the level of the gene to the level of behavior. These resources will ultimately help us better understand the connection between brain function and human health."

The HBP is coordinated and sponsored by fifteen federal organizations across four federal agencies: the National Institutes of Health (NIMH, NIDA, NINDS, NIDCD, NIA, NIBIB, NICHD, NLM, NCI, NHLBI, NIAAA, NIDCR), the National Science Foundation, the National Aeronautics and Space Administration, and the U.S. Department of Energy. Representatives from all of these organizations comprise the Federal Interagency Coordinating Committee on the Human Brain Project, which is coordinated by the NIMH. During the initial 10 years of this program 241 investigators have been funded for a total of approximately $100 million.

More than 65,000 neuroscientists publish their results each month in some 300 journals, with their output growing, in some cases, by orders of magnitude, explained Stephen Koslow, Ph.D., NIMH Associate Director for Neuroinformatics, who chairs the HBP Coordinating Committee.

"It's virtually impossible for any individual researcher to maintain an integrated view of the brain and to relate his or her narrow findings to this whole cloth," he said. "It's no longer sufficient for neuroscientists to simply publish their findings piecemeal. We're trying to make the most of advanced information technologies to weave their data into an understandable tapestry."

The conference will feature neuroscience opinion leaders on the first day, followed by HBP grantees on the second day. There will also be a poster session at the working lunch and at the reception at the end of the first day.

"The presentations will highlight what is now possible because of these ten years of research in Neuroscience Informatics," added Koslow.



#29 Mind

  • Life Member, Director, Moderator, Treasurer
  • 19,058 posts
  • 2,000
  • Location:Wausau, WI

Posted 19 April 2004 - 02:46 PM

In the news article:

A cognitive model of the brain is created, which shows that human memory and knowledge are represented by relations, i.e., connections of synapses between neurons, rather than by the neurons themselves as the traditional container metaphor described.


I kind of understand what you are saying about connectivity and state spaces, Peter. Are you sure that they are not confusing the memory capacity of one state of neural connectivity with the number of potential states available?

Tell me if I am correct: If one neuron changes a connection within the brain then the memory situation has changed (albeit very slightly) and the previous state is no longer perceptible. With each change in connectivity we get a different set of memories. Therefore, while the number of possible states of neural connectivity could be 10^8432, we can only use one state at a time, or to say it differently, we are only consciously aware of the present state of our mind and memories.
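If that is the confusion, the scale of the error is easy to see: having 10^8432 possible wiring states is very different from storing 10^8432 bytes, because specifying which one state you are currently in takes only log2(10^8432) bits. A quick sketch:

```python
import math

log10_states = 8432  # the article's claimed state count, as a power of 10

# bits needed to name one particular state out of 10^8432
bits_per_state = log10_states * math.log2(10)

# A few kilobytes suffice to label one state -
# nothing remotely like 10^8432 bytes of storage.
print(f"{bits_per_state:.0f} bits ≈ {bits_per_state / 8 / 1024:.1f} KB")
```

In other words, a huge state space says how many distinct configurations the system could be in, not how much data any single configuration can hold.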


#30 ocsrazor

  • Guest OcsRazor
  • 461 posts
  • 2
  • Location:Atlanta, GA

Posted 19 April 2004 - 05:58 PM

Hi Mind,

Quick reply on this one -> So yes, our memories (I'm assuming it's connectivity) represent the total base of states our brain can be in at any one time. Consciousness, though, is but one process running at any one time among millions(?) and uses only a limited subset of the activity in the brain. It requires a great deal of lower-level activity to exist on top of, but it is directly connected to very little of that activity. A tremendous amount of abstraction goes on to produce consciousness, and it is the information contained in these 'reality filters' which represents a great deal of the complexity we see in brains. Think about the astronomical number of details your brain knows (its specified connectivity, acquired either through evolution or learning) about how your world works. The state your brain is in is an incredibly detailed picture of the outside world, and the chance of it being such a good representation, as opposed to a random set of connections, probably justifies the level of connectivity they are talking about. That ONE state represents a huge amount of searching of the possible state space.

So states of consciousness are a different subject (just at a different level) from the underlying complexity required to produce them. Put another way, consciousness is the emergent property of LOTS of specified complexity.

Hope this was clear, will try to clarify if necessary.



