I often feel bad that I don’t have a better memory. My memory isn’t bad, but it isn’t good either. I have never had an easy time with memorization; memorizing Bible verses and lines for plays throughout high school was always difficult for me. It was something I could accomplish, but it took a lot of time and patience, and I never reached the point where I could easily commit things to memory. Now my memory is good in some respects, spotty in others. If I want to remember a birthday or anniversary, it has to be in my phone or I have to see it on Facebook; otherwise everyone outside my immediate family is forgotten. This makes me feel bad, because remembering things carries a high social value in our culture.
And yet I also believe that memory, specifically in the form of memorization of facts, figures and narratives, is no longer a necessary technology. It has been surpassed numerous times, in the same way that being able to walk long distances is no longer a necessary technology, having been rendered, for the most part, obsolete. This is not to say that memorization, much like walking long distances, isn’t good for humans and doesn’t serve a purpose, but rather that I’d like to examine my idea that because my memory is not good, I am not succeeding as a person or working hard enough; if I were, I would remember things better.
I would like to first posit that memorization is a function of technology. A more apt description might be technique, and I think the shared root of the two words, ‘techn-’, reveals that memorization on a grand scale is not an inherently human trait but a skill learned through the application of a method or system devised by humans to artificially increase our capacity for the long-term storage of information. Viewed from this angle, the act of repetition, which I would posit is the most basic form of memorization (as opposed to the more advanced technique of mnemonics), is strikingly similar to a computer writing information onto its hard disk; the difference being that the computer is thousands of times more efficient.
In this sense memorization is not a signal of high moral character but merely the adoption of a technology for use in day-to-day life. What’s more interesting is that the first use of this memorization technology that comes to my mind is that of ancient storytellers, who memorized their respective canons and passed them down to the next generation through a combination of repetition and their own homebrew mnemonic devices. These stories weren’t just stories in the way that contemporary culture has demeaned the concept (an imaginary or fictional tale bearing little or no relevance to the actual world except through its social value of humor, education or moralization); they were the stories of where the storytellers had come from, what they had survived and what they had accomplished. These stories were the oral history of the people who were telling them, and the technology of memorization was their link to the past, to their heritage and their culture. Thus the technology of memorization is indistinguishable from the art of storytelling and the science of history.
But not all of humanity was content with memorized oral histories. Some humans altered the technology of information storage; they upgraded their memory capacity through the use of symbols and, ultimately, the development of writing. The cave paintings at Lascaux are some of the first efforts at upgrading memory capacity, and although primitive and shortsighted (they ran out of memory very quickly!), they opened the door to improving the technology of memory. Eventually the pictures evolved into symbols and the symbols into words, a huge technological advancement. Though writing, and its counterpart reading, were neither popular nor ubiquitous until contemporary times, the creation of a new form of memory meant that a record of events or a group of stories (really, what a contemporary distinction I’ve made) could be kept without memorizing it. This had its advantages and disadvantages. Records could survive even if the people whom they happened to and who first recorded them didn’t, living long lives in libraries or etched into the walls of tombs. The flip side is that all writing is coded, hidden from those who don’t understand the lexicon, until someone with the patience, skill and desire to unlock the code does so. Written records are also much more cumbersome than the human record: fleeing a monastery in the midst of a Viking raid is fairly easy compared to fleeing one while carrying the painstakingly recorded multivolume history of the area and its people. Thus the development of writing, though a very useful technological leap, likely didn’t have a great effect on the memory of most individuals for a very long time, though it did have a great effect on the survival of the memories of groups of people who would otherwise have long ago been forgotten or overlooked, which is strikingly valuable today.
Until the spread of literacy, writing and reading were a very limited technological skill set, but in today’s society that’s no longer the case. In the modern world literacy is ubiquitous (99% of Americans can read), and there is more recorded information than any one person could ever hope to read in a hundred lifetimes, much less one. Memory is cheaper than it’s ever been, and information is flowing faster than our mechanical memory can record it. I no longer need to memorize statistics because, for instance, I can pause my writing, flip over to Firefox, type ‘American Litera’ into Google, and Google will finish my typing with suggestions. Our technology anticipates our needs, further reducing the need for the memorization of facts, figures and even spelling. This is, however, an interface; the technology needs a user, and it cannot determine what we need on its own.
The outsourcing of memory to machines and the ubiquity of literacy have decreased the need for the personal technology of memory and increased the need for something called meta-literacy, media literacy or simply filtration. The personal technology of the information elite is no longer memorization or literacy, the two technologies on which we have built, but filtration. This technology, even at its lowest level, is far from ubiquitous. For instance, according to a 1993 study by the US government, 21% to 23% of Americans were ‘not able to locate information in text’, could not ‘make low level inferences using printed materials’ and were unable to ‘integrate easily identifiable pieces of information’. So while almost everyone can read what is written down, many don’t understand what it means or what its significance is. Memory and literacy are the requisite building blocks of this pyramid, but the new peak is filtration. The first requisite is the memory of the millions of computers that store the information we’re accessing, conveniently termed the World Wide Web; the second requisite is the ability to decode the symbol system the information is written in, i.e. literacy. But these two alone are not enough to get good information from the Web.
To extract useful, relevant data we must not only have access to the information and know how to read the actual, basic language; we must also be able to navigate a loosely codified matrix of symbols and signs that have nothing directly to do with the English (or Chinese or Urdu…) in which the information is encoded, but everything to do with the authenticity and authority of the information we’re examining. These codes are likely infinite and constantly evolving, since the code the Oxford scholar uses to navigate the jungle of information is not the same code the Oakland crunker uses to do the same thing. In a world where memory is cheap and literacy is ubiquitous, to be successful in using the Internet, watching television or even navigating a large metropolitan area, we must be able to discern useful information from useless information. We must actually read less in order to learn more, and to read less we must construct our own personal filters, our own personal structures of authenticity and authority, to help us ‘read’ the metacontextual data that is always attached to the information being offered and let it guide us to new and relevant information.
This is new. This is not how it has always been. Our generation has access to more (conflicting) information than any generation before it. Thus memorizing the 50 states and the 194 countries (Google isn’t sure about this one) isn’t nearly as important as it used to be. The Internet will take care of that for us. What the Internet will not take care of is teaching us to discern useful and trustworthy information from useless and untrustworthy information. The Internet will not teach us how to do this; in fact, it will encourage us not to, as it is composed of sites that would prefer us not to be meta-literate, not to have filtration systems, since the most profit, legal and illegal, scrupulous and unscrupulous, comes from taking advantage of the unaware. Certainly, on a basic level, it is unlikely that even the most media-unsavvy will fall for a phishing scheme or be lured into divulging personal information more than once, since this powerfully and directly affects our ability to survive, but will most people take the next step and determine what information can be trusted and what can’t? Will they, in short, develop their own filter, their own critical thinking skills, to examine the information being presented? Not likely. An example from the information technology blog ReadWriteWeb.com is telling about the low meta-literacy levels of most Internet users, and I think another good example is software installation. How many people do we know who are smart, thinking individuals, very successful in other realms of society, who still can’t, no matter how many times they install a program on their computer, uncheck the “Install Company X’s toolbar” box, so their web browsers are riddled with useless, potentially harmful, information-gathering novelty toolbars? In my life, I’ve known a lot of those people.
They have been taught to trust the companies whose products they purchase or use, and as such have no concept that they are being given something they don’t need. Just as basic literacy was slow to catch on, so too will meta-literacy be.
For me, meta-literacy is what I’m good at: extracting information from the flood of data that threatens, on a daily basis, to overwhelm me (and you). So, no, I don’t have a good memory. I will likely never memorize another sonnet or soliloquy, much less the spelling of soliloquy (thanks, MSWord!), but in the world I inhabit, those skills are a waste of my time and energy. Sure, the one time a decade that I decide to handwrite a letter I have to do a couple of drafts, but in the long run I save a lot of time at the computer by letting the more efficient machines take care of that for me, so that I can spend my time and energy sorting through the useless information to find and interpret the useful.