  LongeCity
              Advocacy & Research for Unlimited Lifespans





Brain beats all computers in memory capacity


59 replies to this topic

#31 randolfe

  • Guest
  • 439 posts
  • -1
  • Location:New York City/ Hoboken, N.J.

Posted 21 April 2004 - 11:55 PM

I might be missing something here but everyone seems to be thinking mainly in mathematics and connections.

Can a computer distinguish between the smell of a rose and a gardenia? Perhaps some technology can "hear" insofar as translating the spoken word into typed text. But can a computer "understand" what it hears?

How do you measure the memory required to remember certain smells? For that matter, is more memory used to remember or recognize a sound as opposed to remembering or recognizing a smell?

And how can someone measure the "memory" required to make a judgement as to whether music is harmonious or not?

It seems to me that this discussion of memory is a bit too linear....

#32 ocsrazor

  • Guest OcsRazor
  • 461 posts
  • 2
  • Location:Atlanta, GA

Posted 22 April 2004 - 12:53 AM

You aren't missing anything, randolfe; mathematics and connections are what memory is about.

Yes, a computer could certainly tell the difference between a rose and a gardenia; that is just separation of odors by molecular identity, and it has been done for a while now. The amount of storage needed for a sound or a smell will likely depend heavily on how complex the sensory information you are trying to encode is. Smell is certainly more primitive in its appearance in the biological world, but sound tends to be very simply encoded in the brain. Quantification of memory in the way you are suggesting is only now starting to be understood in large-scale neural systems, and it is made difficult because the changes that store a memory can happen across the entire cortex in some instances.

The type of memory required for harmony recognition would be a great deal of learning spread over several subsystems in the brain. To quantify something like this, we will have to get a good handle on what is changing when an animal learns something new. Nicolelis' experiments with robotic arms controlled by motor cortex are getting at quantifying the output side of learning and memory, and there are a few groups working on artificial sensor systems who are studying the input side.

One of the points I am trying to make in my posts above is that biological computation is not linear at all, but tremendously more complex than anything we currently engineer. If by "linear" you mean that consciousness is something non-mathematical, you would be suggesting that it is supernatural, for which there just aren't any grounds for discussion. If you are talking in the linear vs. nonlinear sense, then I fully agree that the brain is nearly completely nonlinear in its operation, but it is still quantifiable, albeit highly complex.

Best,
Peter


#33 diadulus

  • Guest
  • 6 posts
  • 0

Posted 26 April 2004 - 12:33 PM

Perhaps the simplest way to understand how a brain really works is to consider its ability to extrapolate the identity of individuals, among other things, from what would seem on analysis to be a small amount of data. Another point to consider is just how inaccurate the brain can be, and how perception of an image is something related to the self and not necessarily shared with another individual.

"Beauty is in the eye of the beholder"
"Computers compute, we perceive"

#34 ocsrazor

  • Guest OcsRazor
  • 461 posts
  • 2
  • Location:Atlanta, GA

Posted 26 April 2004 - 07:14 PM

Hi Diadulus,

The brain uses a tremendous amount of data to do tasks such as face recognition. Data from experience becomes hard coded (through synapse and cell/network physiology changes) into filters which allow rapid processing of large volumes of information. I'm not sure what you mean about the brain being inaccurate - this is highly context dependent. All tasks which are important to brains (in the evolutionary sense) are exceedingly accurate, much more accurate than any artificial computational system in existence.

Best,
Peter

#35 diadulus

  • Guest
  • 6 posts
  • 0

Posted 28 April 2004 - 01:01 PM

Hi Peter,


"Inaccurate" relates to how we as individuals perceive the world: some people who hear music also see colors associated with those sounds, and some people may believe they see a familiar face in a crowd when they have not. In other words, we don't all see the same picture or perceive the exact same things in the exact same way. The point being, we use a common language to explain, even though what we are explaining may be completely different from what the other person understands, and even though we will both agree and find nothing to disagree about. OK, that's a bit philosophical, but just consider that the PC and the Mac both do the same things and are both computers, yet neither works from the same architecture.

I'm familiar with fuzzy logic, ANNs, and GAs, along with the way in which the human brain functions through both electrical and chemical signalling. It's relatively slow when working on inline arguments, but when dealing with arguments like facial recognition it multitasks all that data through its tremendously parallel neural greyness, and for the most part it is highly accurate even when the provided data set is small and it is forced to extrapolate or assume. But the mind can play tricks on us, and we can filter out those inaccuracies and pretend we are exceedingly accurate, along with selective forgetfulness.

As for context: every situation is context. The light may play tricks, a distraction may occur, our body chemistry may be slightly inebriated, but somehow we muddle through. The other point about context is that certain situations can awaken strong emotions within people (re: life-changing situations), so that one person who listens to Wagner may feel happy whilst another might be trying to jump out of the window. There are many other less extreme examples, and maybe even some more extreme examples, which would indicate that the mind does not always filter out but in some cases filters in.

#36 jestersloath

  • Guest
  • 15 posts
  • 0

Posted 30 April 2004 - 11:30 PM

[glasses] I understand that the brain is so small and the universe is so big; the question, then, is how can something that barely makes up a fraction of the universe have more room to store information than the universe itself? Well, I know this sounds crazy, but since the brain is SO mysterious and supposedly unused: just because we don't know what a part of it does doesn't mean we don't use it. The universe is busy all the time doing everything there is to do, never stopping, just working. Now, the other part of our brain is sitting there doing what people call nothing, when really that "nothing" provides lots of time and space and room to store information. Like someone said earlier, just because we can't remember something on command, or can't ever remember it, doesn't mean it's not there, floating around or whatever, doing nothing. We probably do remember it at times, but we don't recognize that we remember it.

It's simple... [sfty] (I am so not cool, and I am fat too, lol)

#37 ocsrazor

  • Guest OcsRazor
  • 461 posts
  • 2
  • Location:Atlanta, GA

Posted 03 May 2004 - 12:03 AM

Diadulus - the inaccuracies you are referring to are not those inherent in the brain's computation, but those in the communication protocols between humans. We do NOT have a robust common language, and this has very little to do with the accuracy of brain computation.

If you had an organism whose sole existence depended on rapid linear computation, it would likely be faster than current computational structures, because of the ability of biological systems to parallelize problems and because single neurons can do highly parallel computation much faster than current transistors. Biological brains are tuned for rapid reaction to messy real-world situations, which are extremely complex and highly nonlinear.

The situations you mention really don't have anything to do with accuracy of computation, but speak more to the differences that exist between individual brains which have developed under different conditions.

Jestersloath, I am still trying to parse what you wrote :)

#38 jestersloath

  • Guest
  • 15 posts
  • 0

Posted 04 May 2004 - 10:44 PM

10 to the power of 8432 cannot possibly be correct. Best guesses on the number of atoms in the entire universe are between 10 ^70 and 10^100.






What I meant was pertaining to this:
I meant that even though the universe is so big, that doesn't mean our brain can't have more memory space than the number of atoms in the universe. I mean, like I said, look how much of our brain does nothing and just sits there; it has to be there for a purpose.

Sorry, I know I made the first one confusing; I reread it, and without the quote it doesn't make sense.


I have a question: the empty space in space has nothing, no breathable air or anything; would it still have matter and atoms in it? [mellow] [hmm]

#39 Mind

  • Life Member, Director, Moderator, Treasurer
  • 19,058 posts
  • 2,000
  • Location:Wausau, WI

Posted 04 May 2004 - 11:07 PM

Peter, I have attached a little scribbling below to better illustrate what I meant in my previous post. I know this is very simplistic...there are different strengths of connectivity and such things...but I just wanted to get a thought across. This is just a hunch about the astronomical numbers involved, because I did not have access to the original paper, just the news reviews.

Attached Files



#40 Mind

  • Life Member, Director, Moderator, Treasurer
  • 19,058 posts
  • 2,000
  • Location:Wausau, WI

Posted 10 May 2004 - 10:09 PM

Here is a related story about the available computation in the universe (given its current expansion).

Accelerating universe will limit technology

The authors come up with a limit.

The duo calculated that the total number of computer bits that could be processed in the future would be less than 1.35x10^120.


Obviously (wry smile) we need to put these fellows in touch with the brain researchers of this thread. With their "brain" numbers we could easily simulate another 10^8332 universes (assuming 10^100 computing particles in the universe from my previous post) and keep computing to our heart's desire. In fact with 10^9 brains on the earth we might as well make it 10^8341.
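The exponent juggling above can be checked in log10 space (a sketch using only the figures quoted in this thread - 10^8432 bits per brain, 10^100 computing particles, 10^9 brains - none of which I am endorsing as correct):

```python
# Work entirely in log10 to avoid astronomically large integers.
brain_bits_log10 = 8432          # the disputed per-brain estimate
universe_particles_log10 = 100   # upper guess for computing particles in the universe
brains_log10 = 9                 # ~10^9 brains on Earth

# Universes one brain could "simulate":
universes_per_brain = brain_bits_log10 - universe_particles_log10
# All brains on Earth combined:
all_brains = universes_per_brain + brains_log10

print(universes_per_brain, all_brains)  # 8332 8341 -- the figures in the post
```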

Just having some fun Peter.

#41 ocsrazor

  • Guest OcsRazor
  • 461 posts
  • 2
  • Location:Atlanta, GA

Posted 10 May 2004 - 11:18 PM

Hi Mind,

There are two separate issues being considered here as well. The authors only talked about static storage, and in this respect they are correct as far as I can tell -> the brain could store a hyperastronomical number of bits of information about the real world. You are correct that we only have access to one particular state at any one time, but this doesn't change the fact that the structure of a brain actually represents that much information.

The second question you are trying to get at is the ability to alter and retrieve the stored information. Much of that information is statically stored structural information encoded by evolution and by our environment during childhood development, but that doesn't mean it isn't information (it is processed information about how the world works). What you want an answer to is 'how much dynamic storage does a brain have?'

My guess is that it is still a hyperastronomical number (i.e. greater than 10^100 bits) because the cortex is incredibly plastic in its connectivity and has very fast dynamics for altering that connectivity.

I would be very curious how they calculate the energy required to process a bit in the physics paper you mention. My guess is that they are assuming you always have to move particles to process; this is an incorrect assumption. All you need is a minimal relationship change between nodes in a network to compute, and large, highly connected networks provide huge computational savings through nonlinear processing methods.

#42 A941

  • Guest
  • 1,027 posts
  • 51
  • Location:Austria

Posted 30 November 2006 - 02:25 AM

Some time ago I read that we have a mere 250 MB of memory.
Sounds scary!

250 MB - that's smaller than a very bad game from the early '90s.

One would think we should be able to store more information in our brain.
How is it possible that we can speak many different languages (grammar, pronunciation, tongue movement, ...) and so on?
And we are not only storing things we see or hear, but also how to move, etc.
Are 250 MB really enough for that?

#43 A941

  • Guest
  • 1,027 posts
  • 51
  • Location:Austria

Posted 30 November 2006 - 07:53 PM

No one interested in this discussion anymore?

#44 eternaltraveler

  • Guest, Guardian
  • 6,471 posts
  • 155
  • Location:Silicon Valley, CA

Posted 30 November 2006 - 09:23 PM

Are 250 MB really enough for that?


No, that's why we have dramatically more.

#45 veryfaststuff

  • Guest
  • 20 posts
  • 0
  • Location:Infinity

Posted 22 March 2007 - 02:42 AM

I think people need to get familiar with the world to understand the ability of the brain. Power can be defined as something with much more versatility, functionality, efficiency, and ability to store many types of data. We could store all the types of data that the computer can store in millenary (a base-~1000 number system). Plus, we store certain types of data much more efficiently and compactly than computers do. Think about what you did yesterday: you would probably remember a whole stream of information simultaneously.

First of all, I'll give a comparison between the brain and the computer:
1) The brain is under a much higher load than the computer at any time of life. Even in sleep, our brain reorganizes neurons, restores chemical imbalances, releases hormones, etc. Our brain never "shuts down" unless we die. The computer requires rest every once in a while. Have you ever used a 40-year-old operating system? Even after 10 years, it slows down to less than 10% of its initial efficiency.

2) The brain is power efficient.

3) The brain can't store numbers easily only because people are trying to 'memorize' a foreign data type. Our brain is familiar with the 'taste', 'touch', 'smell', 'sight', and 'sound' data types, so people try to memorize a number through an interpretation of a long string of data split into individual bits. Mnemonic users know better: simply compress the data into our native data types to store information effectively and to recall it easily in a nonlinear fashion. Computers recall linearly at this moment. At this link (http://www.recordhol...ist/memory.html), you will see things that "seem impossible." As a beginner mnemonic user, I can only recall six thousand digits of pi and memorize only 500 digits in real time over a timespan of 20 minutes. The champions, however, do much more: a Japanese man could recall over 90,000 digits of pi with little to no error. This leads into my next point.

4) The brain is supposed to err many times.
Think about the surroundings of a human. We see, smell, and hear things around us almost all the time. We are also constantly in a "guard state", a continual "firewall and antivirus" that blocks out many types of foreign invaders, such as the reflexive slap at that annoying fly that sits on your neck while you type on ImmInst's forum. While the computer can utilize 100% of its power at one time, we have to allocate most of our attention to other things. In the end, we only have about 5% of our maximum "CPU time" available for the conscious task at hand.

(running out of time... will update later)
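Point 3's mnemonic idea - recoding a long digit string into fewer, richer items - can be sketched as simple chunking (a minimal illustration of the recoding principle, not the actual techniques record holders use):

```python
# Minimal chunking sketch: 16 digits become 4 four-digit "items",
# far fewer things to hold in working memory at once. Real mnemonic
# systems go further, mapping each chunk to a vivid word or image.
digits = "3141592653589793"
chunk_size = 4
chunks = [digits[i:i + chunk_size] for i in range(0, len(digits), chunk_size)]
print(chunks)        # ['3141', '5926', '5358', '9793']
print(len(chunks))   # 4 items to remember instead of 16
```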

#46 crayfish

  • Guest
  • 31 posts
  • 0

Posted 22 March 2007 - 01:54 PM

Very interesting discussion - one point that came to me is: Are all neurons and interconnections involved in memory formation and storage, potentially or actually?

#47 Heliotrope

  • Guest
  • 1,145 posts
  • 0

Posted 02 August 2008 - 08:17 AM

Do you guys realize the implications of this finding? This sounds too wacky, crazy, and way too good to be true. Human brain capacity: 10 followed by 8,432 noughts, 10 to the 8432nd power, 10^8432. Is that number measured in bits, bytes, megabytes, or what? Geez, the number is so huge that it's well on its way to approaching infinity.

I didn't think the brain could, or would even have needed to, become this powerful, this massive, this limitless in capacity by mere evolution alone - even 3.5 billion years of evolution of life. This number, 10^8432, representing brain memory capacity, does not seem parsimonious; it is a waste of memory storage. What human would need that much storage space? The answer: an immortal, probably. If I were immortal and had "eternity" to spend, I'd read all the books, music, and movies that ever existed; even after memorizing every letter and number in every library and knowing the sum of our civilization's knowledge, it would likely be a tiny fraction of 10^8432.

According to the wiki, the cumulative amount of data stored in the brain over a 70-year lifetime is on the order of 125 MByte, and I conservatively estimate my brain probably has a capacity of 50 to 100 terabytes based on conservative scientific sources - certainly more than enough to last me many lifetimes.

#48 Shoe

  • Guest, F@H
  • 135 posts
  • 1

Posted 02 August 2008 - 06:27 PM

Holy shit! Talk about ancient thread! BTW, I think HYP86 says some interesting things ^

#49 Mind

  • Life Member, Director, Moderator, Treasurer
  • 19,058 posts
  • 2,000
  • Location:Wausau, WI

Posted 03 August 2008 - 12:21 AM

There are a lot of GREAT ImmInst threads that need to be brought back into the light of day, because they could be updated and compared with the newest research results, societal trends, and philosophies. There is a fantastic laboratory of discussions waiting to be tested by time. So if you are ever trawling through the old discussions and find something interesting, be sure to update it with the latest thoughts.

#50 nanostuff

  • Guest
  • 17 posts
  • 0

Posted 30 August 2008 - 01:28 PM

This is the problem with neuroscientists making computational comparisons: the results end up being hilariously wrong. They apparently don't have a clue what a meaningful bit is. Similar values could be derived for 15-year-old EDO memory, or even an old rock; everything in nature has an incredibly high information capacity.

However, I don't think even that is enough to explain their out-of-whack 'estimate'. It seems as if they took their initial ridiculously high value and multiplied it by the same ridiculous value, perhaps a number of times equal to that multiple. Conveniently, they seem to have used the number of connections as the flawed basis of their estimate - as if they thought, not being the computer scientists they should have been, that increasing the number of connections increases data capacity to the number of available states.

This is beyond flawed; this is plain old 'point and laugh'.

#51 Mind

  • Life Member, Director, Moderator, Treasurer
  • 19,058 posts
  • 2,000
  • Location:Wausau, WI

Posted 30 August 2008 - 07:36 PM

As I mentioned earlier, it boils down to the researchers mistaking ALL POSSIBLE BRAIN STATES for USABLE BRAIN STATES. The brain can only use one state at a time. The state is constantly changing. We can't consciously "use" every possible neuronal connection at once.

#52 Luna

  • Guest, F@H
  • 2,528 posts
  • 66
  • Location:Israel

Posted 06 October 2008 - 05:43 AM

If the brain has only about 50 to 100 Tera Bytes then it is really soon to be "last decade's" hard drive.

#53 Heliotrope

  • Guest
  • 1,145 posts
  • 0

Posted 06 October 2008 - 08:49 PM

If the brain has only about 50 to 100 Tera Bytes then it is really soon to be "last decade's" hard drive.


wikipedia says on longterm memory:

Since the brain has approximately 10^15 synapses, one can argue that brain has a maximum capacity of about 100 TByte, possibly more if one synapse can store more than 1 bit of information.



I guess some people suffering from hydrocephalus, with water compressing and "shrinking" their brain, or suffering from other memory problems related to structure, or with fewer neurons and synapses, could have way less than 100 TB, but they should still be on the order of terabytes.
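The quoted estimate is easy to reproduce as a back-of-envelope check (a sketch using only the figures in the Wikipedia quote - 10^15 synapses at 1 bit each; the arithmetic actually gives 125 TB, the same order of magnitude as the quoted "about 100 TByte"):

```python
# Back-of-envelope capacity from the synapse count quoted above.
synapses = 10**15        # approximate synapse count in the human brain
bits_per_synapse = 1     # the conservative 1-bit-per-synapse assumption
total_bytes = synapses * bits_per_synapse // 8
terabytes = total_bytes / 10**12
print(terabytes)  # 125.0 -- same order as the quoted "about 100 TByte"
```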

#54 Luna

  • Guest, F@H
  • 2,528 posts
  • 66
  • Location:Israel

Posted 07 October 2008 - 01:15 PM

An assumption is also that the brain works by linking: for example, if you think about a bicycle, it will link all the relevant neurons, such as:
"Mine"
"wheels"
"transport vehicle"
"danny's"

Which could make those terabytes much more efficient, but this still does not put it at an advantage.

#55 Heliotrope

  • Guest
  • 1,145 posts
  • 0

Posted 17 October 2008 - 04:05 AM

An assumption is also that the brain works by linking: for example, if you think about a bicycle, it will link all the relevant neurons, such as:
"Mine"
"wheels"
"transport vehicle"
"danny's"

Which could make those terabytes much more efficient, but this still does not put it at an advantage.



Yeah, doesn't the linking create much more storage, and so many more different states for the brain?

The human brain's maximum capacity of 100 to 125 terabytes is based on storing 1 bit of information per synapse, but I suppose if humans really live extremely long and start pushing the envelope on long-term memory, we will find ways to adjust/"evolve" to storing many more terabytes, exabytes, yottabytes... whether still using our 100% raw meatware or implanting organic memory chips/drives in the brain.

Right now, the situation seems like an awful waste of "space" - like having a hard drive with 1 TB capacity and never using more than 1% of it until the disintegration of the drive: storing only a few megabytes/gigabytes of various files, constantly erasing and replacing them, keeping the essence but making only some important parts "permanent", like your self-identity/self/OS and really, really important information, to last for many years and decades.

Edited by HYP86, 17 October 2008 - 04:12 AM.


#56 Heliotrope

  • Guest
  • 1,145 posts
  • 0

Posted 25 January 2009 - 02:10 AM

Here's the wiki on the memory capacity of the brain:

http://en.wikipedia....ong_term_memory

The brain stores long term information by growing additional synapses between neurons.[1] Since the brain has approximately 10 ^ 15 synapses, one can argue that brain has a maximum capacity of about 100 TByte, possibly more if one synapse can store more than 1 bit of information. By no means do humans store that much information. Experiments in the mid 1980s showed that humans can store only 1-2 bits/second in their long term memory.[2] The cumulative amount of data stored in the brain over a 70 year lifetime is therefore only in the order of 125 MByte.[1][2]


Duration
Studies undertaken by Bahrick et al. predict that long-term memory can indeed retain certain information for almost a lifetime. However, some factors can reduce or extinguish information completely. Childhood amnesia is a factor affecting the duration of long-term memories; there are very few people who can remember information or events from before the age of 3 or 4.






cumulative amount of data stored in the brain over a 70 year lifetime is therefore only in the order of 125 MByte. ...

Edited by HYP86, 25 January 2009 - 02:11 AM.


#57 lunarsolarpower

  • Guest
  • 1,323 posts
  • 53
  • Location:BC, Canada

Posted 25 January 2009 - 04:07 AM

cumulative amount of data stored in the brain over a 70 year lifetime is therefore only in the order of 125 MByte. ...


I came up with about 200 MB when I did my own calculation of this. The thing people forget when they talk about this is how much the brain can extrapolate to "fill in the blanks". No one keeps a 6 megapixel image of their mother's face in their memory. However someone who was a good artist could likely use 50-100 bits of memory to draw a picture of their mother's face that was nearly as accurate as a 6 megapixel image. This fantastic ability to compress data is one way we are able to do so much more than the raw number (200 MB) would suggest and also explains why rote memorization of word-for-word data is so incredibly slow and hard.
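As a rough sanity check on the compression ratio implied above (the 24-bits-per-pixel encoding is my assumption, not the poster's):

```python
# How much a "6 megapixel image" compresses down to a 50-100 bit "recipe".
pixels = 6 * 10**6          # a 6 megapixel snapshot
bits_per_pixel = 24         # 8 bits per RGB channel (an assumed encoding)
raw_bits = pixels * bits_per_pixel
sketch_bits = 100           # the post's upper estimate for the artist's recipe
print(raw_bits // sketch_bits)  # 1440000 -- over a million-fold compression
```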

#58 Heliotrope

  • Guest
  • 1,145 posts
  • 0

Posted 02 February 2009 - 07:53 PM

cumulative amount of data stored in the brain over a 70 year lifetime is therefore only in the order of 125 MByte. ...


I came up with about 200 MB when I did my own calculation of this. The thing people forget when they talk about this is how much the brain can extrapolate to "fill in the blanks". No one keeps a 6 megapixel image of their mother's face in their memory. However someone who was a good artist could likely use 50-100 bits of memory to draw a picture of their mother's face that was nearly as accurate as a 6 megapixel image. This fantastic ability to compress data is one way we are able to do so much more than the raw number (200 MB) would suggest and also explains why rote memorization of word-for-word data is so incredibly slow and hard.



Can you post the calculation? Your 200 MB capacity differs a lot from 125 MB, but it is still in the same order of magnitude. Did you come up with 200 MB for continuously storing info at a rate of 1 to 2 bits per second? After you account for sleeping + eating + other things, especially sleep (where no new info can be taken in - about a third of your life), then maybe, if you condense it, a person may only use 100 MB.


EDIT: the wiki article now says a person can store as much as 15 GB of memory over a lifetime.

But the human brain is amazing, though - condensing and eliminating (some say a person doesn't forget anything; it's just hard to recall everything) and, like you say, maybe even reproducing high-definition 10-megapixel pictures of Mom. On a computer, a few video clips would eat up the 125-200 MB; that amount is not even a fraction of a movie.
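For what it's worth, the rate-times-time arithmetic can be sketched like this (assumptions, taken from this thread rather than any definitive model: a 70-year lifetime, two-thirds of each day awake, and the low end of the quoted 1-2 bits/s into long-term memory):

```python
# Lifetime long-term storage at a fixed encoding rate.
seconds_per_year = 365.25 * 24 * 3600
lifetime_s = 70 * seconds_per_year         # ~2.2e9 seconds
waking_s = lifetime_s * (2 / 3)            # subtract roughly 8 h of sleep a day
bits_per_second = 1                        # low end of the quoted 1-2 bits/s
total_mb = waking_s * bits_per_second / 8 / 10**6
print(round(total_mb))  # 184 -- between the 125 MB and 200 MB figures above
```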

Edited by HYP86, 02 February 2009 - 07:58 PM.


#59 valkyrie_ice

  • Guest
  • 837 posts
  • 142
  • Location:Monteagle, TN

Posted 22 February 2009 - 07:29 AM

I have always thought that the mind's amazing memory capacity is due in large part to a process of symbolization of data. In other words, we categorize data and use symbols to replace the actual data.

For example, take a can. A can can vary in form, size, contents, etc., but the basic CONCEPT of a can is a constant. So what you remember is not so much a memory of the can you are looking at as the concept CAN, with identifying tags that distinguish it from other cans you have known.

By replacing real-world data with conceptual representations, the mind can compress the daily input of data into much smaller and more manageable chunks. Language is simply a derivative of the symbolic processing that the brain carries out: an extremely low-bandwidth serial data stream that is capable of only partially conveying the totality of data processed by the brain.

There is probably a name for this school of thought, since I doubt it's unique, but I cannot at the moment recall where or if I have read about this previously.
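The "concept plus distinguishing tags" idea can be sketched as a tiny data structure (an illustrative sketch only; the names and tag values here are hypothetical, chosen just to mirror the CAN example above):

```python
# A hypothetical memory: the shared concept plus a few distinguishing
# tags, instead of the full sensory stream for this particular can.
remembered_can = {
    "concept": "CAN",
    "tags": {"size": "330 ml", "label": "soup", "dented": True},
}

def recall(memory):
    """Reconstruct a description from the compressed representation."""
    tags = ", ".join(f"{k}={v}" for k, v in memory["tags"].items())
    return f"a {memory['concept']} ({tags})"

print(recall(remembered_can))  # a CAN (size=330 ml, label=soup, dented=True)
```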


#60 mentatpsi

  • Guest
  • 904 posts
  • 36
  • Location:Philadelphia, USA

Posted 12 October 2009 - 05:16 AM

I have always thought that the mind's amazing memory capacity is due in large part to a process of symbolization of data. In other words, we categorize data and use symbols to replace the actual data.

For example, take a can. A can can vary in form, size, contents, etc., but the basic CONCEPT of a can is a constant. So what you remember is not so much a memory of the can you are looking at as the concept CAN, with identifying tags that distinguish it from other cans you have known.

By replacing real-world data with conceptual representations, the mind can compress the daily input of data into much smaller and more manageable chunks. Language is simply a derivative of the symbolic processing that the brain carries out: an extremely low-bandwidth serial data stream that is capable of only partially conveying the totality of data processed by the brain.

There is probably a name for this school of thought, since I doubt it's unique, but I cannot at the moment recall where or if I have read about this previously.


That's a pretty interesting theory, and it makes a lot of sense. I believe I learned an aspect of it in Cognitive Processes, where a very similar concept is called a schema.



