tommus wrote:
zenmonkey wrote: we process language as sounds, not as text.
There are lots of people who learn a second language only to read it (and perhaps to write it) but rarely if ever hear or speak it, so sound is not necessarily involved in storing the information. Yes, we process the sounds, but the knowledge and memory of a language are not stored as sound; there aren't thousands of little sounds buzzing around in our memories. The necessary knowledge must be stored in the brain as some sort of code, and storing code requires a fairly well-defined number of "bits" of data. So I think the amount of "memory space" it takes to store whatever we learn as language (even if we learn it by sound) is interesting.

Besides the data for the vocabulary, we also store knowledge of what things look like (an elephant, a tree, etc.), but that is not counted in the memory required for fluency: a person would have all that recognition of things even if they grew up totally isolated and never learned a language.

Agreed, there would be a difference between the number of bits required to store the sound of a word and the number required for only its spelling. But there would be a correlation (long sounds correlate to long written words). So if a person learns to read, write, listen and speak a language, perhaps the storage requirement is something like 4 x 1.5 MB, plus overhead for grammar, expressions, etc.
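The 4 x 1.5 MB guess can be sanity-checked with a back-of-envelope calculation. The numbers below (lexicon size, average word length) are illustrative assumptions of mine, not figures from the paper, and the per-letter figure is an upper bound since real text has lower entropy:

```python
import math

# Illustrative assumptions, not figures from the paper:
VOCAB_SIZE = 40_000               # assumed adult lexicon size
AVG_WORD_LEN = 5                  # assumed letters per written word form
BITS_PER_LETTER = math.log2(26)   # ~4.7 bits; upper bound for English letters

spelling_bits = VOCAB_SIZE * AVG_WORD_LEN * BITS_PER_LETTER
spelling_mb = spelling_bits / 8 / 1_000_000   # 8-bit bytes, decimal megabytes

print(f"Spelling alone: ~{spelling_mb:.2f} MB")   # roughly 0.12 MB
```

On these assumptions, raw word forms account for only a small fraction of 1.5 MB, which suggests that most of any such estimate has to go to word meanings, grammar, and the like rather than spellings or sound forms.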
Maybe, but the article is not about those people. We don't store sound either; we store electrochemical potentials (which are understood to be managed as sounds, even when reading... there is a thread here somewhere. Edit: see dave's post.). Bits are binary representations of information, and once again, the brain does not store bits. This is what you get when you ask a brilliant mathematician to theorise about how the brain functions: the von Neumann paradigm was a nice theoretical exercise in how the mind might be considered as a computer, except that, physiologically speaking, the model is nonsense. There are those today (although I am not a strong believer in this) who hold that the mind functions at a quantum level and that, yes, some function is potentiated there. If you are interested in that, then a more modern reading would also include those estimations, which would throw the entire paper's figures off by several orders of magnitude.
That brain model is the one Penrose and Hameroff developed in the early 1990s, which they call Orchestrated Objective Reduction (Orch OR). Penrose's book The Emperor's New Mind talks about this.
When you write MB... do you mean bytes? Are we talking about units of 8 bits, 16 bits or 64 bits? A byte can be any of those depending on the hardware; I have devices here whose bytes are 64 bits and 8 bits respectively. So saying "1.5 MB" makes no sense if the brain's "hardware" (synaptic potential firing) is sometimes tristate, or higher...
As to whether a person is able to recognise all things without language... that's a totally different subject, and a fascinating one, to which, if memory serves, Oliver Sacks answered with a qualified "no". The experience of feral children suggests that in many cases individuals who do not use language also lose some primary abilities to form general inferences, and it is often impossible to teach them language later in life.
VyingEye wrote: I think it's worth pointing out that this paper is written from an information-theoretic approach.
Exactly. And it is not about how information is actually managed by the brain/mind.