Memory – which, ironically, is popularly symbolized by the elephant – is what really sets us apart from the rest of the animal kingdom. Not learning by rote or repetition until it hardens into instinct over time, but memory as a conscious, intellectual point of reference in our decision-making process, one that leads to better long-term planning and advancement for the species.
All animals, even microscopic one-celled creatures, have memory. But about one million years ago, our hominid ancestors experienced something extraordinary that would play a critical role in turning them into modern human beings: they created a second, artificial form of memory that existed outside their skulls – one that could persist beyond their presence, even beyond their existence.
Now, those two forms of memory are again about to converge and become one – and the implications for humanity may be as profound as those of that initial schism. This transformation began with a creature called Homo ergaster. H. ergaster was an impressive character. Literally: the males stood over six feet tall, with the kind of brow we associate with the much later Neanderthals. H. ergaster was also among the most accomplished hominids in our family tree. For example, he discovered fire. He also led mankind out of Africa for the first time. But H. ergaster’s greatest achievement was one over which he had little control: he learned to talk, and to do that he first learned how to listen.
These new skills, as profound as any in human history, were the product of three seemingly minor evolutionary changes. The first was a widening of the cervical vertebrae in the neck, which realigned the larynx and tongue and enabled H. ergaster to introduce unprecedented complexity into the sounds he could make. The second was a new configuration of the middle and outer ear that allowed H. ergaster to hear sounds, especially vowels, that other apes still struggle to hear. Finally, and most important, H. ergaster began to develop a thin new layer of nerve cells, the cerebral cortex – wadded up, it filled all of the folds of the brain; stretched out, it made a sheet about the size of a Hermès scarf – that allowed for the powerful processing of this new verbal data.
Unfortunately, Homo ergaster’s brain wasn’t big enough to take full advantage of these new tools, so while he likely did invent human speech (fire, exploration, and speech – not a bad legacy), it could not have progressed far. Rather, the task of inventing language would fall to his successor, H. heidelbergensis. Heidelberg Man is one of the most impressive hominids of all (ourselves included). Standing as tall as seven feet (the average Heidelberg Woman was well over six feet), Heidelberg Man was the first true toolmaker; he buried his dead; and he appears to have invented both clothing and adornment – all suggesting a creature with a strong sense of self, of life and death, of the cosmos, and, most important for our purposes, language.
Why is language so important? Four reasons. First, it enables one to convey information over large distances, even beyond line of sight – a skill that is vitally important in travel and hunting. Second, it acts as a social glue, further tying members of the family or tribe into larger social structures. Third, it creates metaphor, the bundling together of two diverse ideas into something new, and thus an addition to knowledge. Fourth, and most important, it allows knowledge to be conveyed through time, from one generation to the next, thus accumulating human intellectual capital. The human oral tradition begins here; the acquired wisdom of one generation no longer dies with that generation.
Evolving alongside language was a gestural tradition, much of it related to hunting. Even today, modern hunters like the Bushmen of the Kalahari, so as not to spook prey, will organize themselves with a combination of hand gestures and images drawn with a stick or finger in the dirt. At some point in human history – we used to think with early Homo sapiens (Cro-Magnon Man), but now perhaps late Neanderthals – those two traditions merged. The picture became the representation of the spoken word. Painted deep in caves, these pictograms became the language of ritual and religion; the keeper of those images, and the leader of the rites surrounding them, was the shaman or priest – the first separate class in human society and the beginning of the division of labor that led to modern society. Just as important, those images, whether painted on a wall or etched into bone or rock, gave mankind its first medium of memory that could carry into the indefinite future without depending on the survival of human intermediaries.
It is probably not a coincidence that it was during this same period that what we think of as human consciousness, that great divider between humans and animals, likely first appeared. Certainly we thought before language, but language enabled us to think in ways, and about ideas, that no living thing on Earth had ever thought before.
Writing had begun, and with it the split between ‘human’ and ‘artificial’ memory had begun as well.
From this point on – between 50,000 and 20,000 years ago – human and artificial memory take wildly divergent paths, yet still remain tethered to each other by their need to operate through living beings. The human brain spends the succeeding millennia exploring its capabilities and limitations. Its facility is made manifest through art and literature, religion, commerce, politics, and science – and its first efflorescence, about 5,000 years ago during the Bronze Age, is one of mankind’s greatest achievements . . . one that can be read in the first epics: the Pentateuch of the Bible, the Egyptian Book of the Dead, the Mahabharata, the Iliad, and, earliest of them all, Gilgamesh, itself the story of the creation of human civilization. At one point, each of these epics was part of the oral tradition, their thousands of lines evidence of the power of human memory, and the need to safeguard them outside the heads of a few poets and minstrels was an important force in the creation of national languages.
By the time of the Roman Republic, the power of human memory was being pushed to limits rarely reached before or since. On a daily basis in this bureaucratized state, scribes and priests carried around in their heads massive amounts of records and rituals. But it was the orators – spurred by treatises like the Rhetorica ad Herennium and the works of the greatest practitioner of all, Cicero (it was he who called memory ‘the guardian of all things’) – who learned to use mnemonic devices and other tricks to memorize speeches that could run for hours, or to hold vast documents in their heads. This tradition, all but alien to us now, survived almost to the twentieth century, both as feats of memory and as the mysterious syncretic tradition of mystics, numerologists, kabbalists, and alchemists – the likes of Giordano Bruno and the Renaissance ‘memory theater.’ By comparison, the history of artificial memory is the story of technological innovation as it was applied in daily life.
Whereas spoken language has largely evolved through the collision of different cultures – and, more recently, the need to coin new words for new ideas or phenomena – written language has mostly been changed by its medium and application. In Sumer, where modern writing is generally considered to have been invented, wet clay was the most common medium on which to write, first as counting marks on jars used for commerce, and eventually on clay tablets. As everyone knows, writing in mud is difficult, so the Sumerians used reed styluses to impress straight indentations into the clay – cuneiform. Happily, when those tablets dried in the sun, they became hard and durable; unhappily, they also cracked, or melted in the rain. On the other hand, if they happened to be in a library that burned down, they were fired rock hard and could last centuries – which is why we have Gilgamesh.
And human memory? The twentieth century saw the greatest advances in our understanding of the brain in millennia. Psychoanalysis, behaviorism, and psychoactive drug research produced enormous gains in brain science in the first decades of the century. Then, thanks largely to the miniaturization and data acquisition made possible by new digital devices and the large-volume memory storage they demanded, research into how the brain worked took off. That revolution continues to this day. Meanwhile, thanks to Moore’s Law, innovation in the digital world has raced ahead even faster, at an exponential pace. The computer that sent a man to the moon is now dwarfed by the cell phone in your pocket. Add to this the Internet, which puts all of the world’s memories at the fingertips of every living person, and suddenly the two tracks of human and artificial memory seem to be converging, after all these centuries, at a shocking – some would say alarming – rate. Today, we have accumulated, addressed, and stored many times more of mankind’s memories than ever before. We are poised on the brink of a Great Convergence. But with our magnetic memories beginning to fade, and our billions of microprocessors at risk of erasure from some unpredictable but likely Carrington Event of the sun, we have to ask ourselves whether we are so busy looking to the future that we have forgotten our duty to the past – to safely preserve the present.
By Michael S. Malone