BRAIN VS MOTHER TONGUE
The human brain is remarkably systematic. It consists of roughly 100 billion neurons, each capable of making about 1,000 connections, representing around 1,000 potential synapses that largely do the work of data storage. Multiplying each of these 100 billion neurons by the approximately 1,000 connections it can make gives 100 trillion data points, or about 100 terabytes of information.
A duo of researchers, one at the University of California and the other at the University of Rochester, has found that it takes approximately 1.5 megabytes of storage to hold and use a native language. The paper, by Francis Mollica and Steven Piantadosi, was published in the journal Royal Society Open Science. In it, they describe applying information theory to add up the amount of data needed to store the various parts of the English language.
As infants, we begin to acquire and speak the language of those around us. Have you ever wondered how you learned your native language without making a separate effort to study it? It remains largely a mystery, but scientists know it entails much more than storing words alongside definitions, as a dictionary does. Words carry associative clues: the concept of swimming, for example, links to the word fish, and to water and aquatic animals. The brain also stores information about how to pronounce a word, how it can and cannot be combined with other words, and the sounds that make up a word when spoken.
Mollica and Piantadosi undertook the task of converting the brain capacity used to store a language into an amount of data. They applied information theory, a branch of mathematics that focuses on how information is encoded as sequences of symbols.
The researchers took quantifiable estimates of various aspects of the English language into account. They began with phonemes, the sounds that stack together into spoken words. On average, speakers use about 50 phonemes, each of which takes around 15 bits to store.
Turning to vocabulary, the average person knows about 40,000 words, which takes approximately 400,000 bits. Next, looking at semantics, they calculated the memory used for the meanings of those 40,000 words at 12 million bits. Word frequency is another important factor, adding 80,000 bits. Finally, they calculated 700 bits to store the rules of syntax. Summing everything up, the researchers concluded that it would take approximately 1.56 megabytes, about the size of a single digital picture.
Similarly, a study of newborn babies, published in Developmental Science, has revealed that humans are born with the innate skills needed to pick out words from language. Another study, published in Frontiers in Psychology, says that by the age of 20 a native English-speaking American knows about 42,000 dictionary words.
You have probably heard the claim that humans can access only 10% of their brains while the rest of our cognitive capacity goes untapped. What if humans used 100% of their brains? Readers, it's your turn to answer this question. Please share your answers in the comments section.
– Saranya Nagarajan