Knowledge representation issue in learning
(from P. A. Wozniak, Economics of Learning)
It has long been known that the way knowledge is represented affects the way it is remembered and, consequently, how easily it can be retained in memory over a longer period.
The art of mnemonics is as old as the art of learning, and professional mnemonists, with their trained memory capabilities, can truly leave an average mortal speechless in the face of top mnemonic feats. Indeed, mnemonic techniques are very easy to apply, and most capable students, more or less consciously, use them in their daily routine. However, through a conscious understanding of the underlying rules and principles, even the best students can gain a great deal.
The basic principle of mnemonics, from the neurobiological point of view, is to build memory images from as many previously stored engrams as possible. As visual processing in the human brain seems to involve much more sophisticated circuitry than, for example, verbal processing, extensive use of visual imagery is a key to success. Instead of memorizing a meaningless telephone number, the student can memorize a collection of visual scenes unequivocally mapped to numbers, and generate a unique, easily memorizable sequence of graphic events that serves as an effective representation of the number. Recalling the phone number is then equivalent to invoking the stored visual event and translating it back into the sequence of digits or, more often, into a sequence of two-digit numbers. As I will try to argue in later paragraphs, minimizing the number of synaptic connections involved in storing memories is the key to maximizing retention over a longer period. Representing new memories as easily recoverable composites of old memories serves exactly that end.
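To make the idea concrete, the sketch below (in Python) encodes a digit string as a sequence of vivid images drawn from a small peg table and decodes it back. The peg images, the two-digit grouping, and the function names are hypothetical choices made purely for illustration; they do not reproduce any particular mnemonic system.

```python
# A minimal sketch of a peg-style mnemonic encoder. The peg table below is
# entirely hypothetical and only illustrates the digit-to-image mapping.

PEGS = {
    "14": "a tower struck by lightning",
    "32": "a red bicycle",
    "58": "a melting snowman",
    "77": "two crossed swords",
}

def encode(number: str) -> list[str]:
    """Split a digit string into two-digit groups and map each to an image."""
    digits = "".join(ch for ch in number if ch.isdigit())
    groups = [digits[i:i + 2] for i in range(0, len(digits), 2)]
    return [PEGS.get(g, f"<no image for {g}>") for g in groups]

def decode(images: list[str]) -> str:
    """Recover the digit string from the sequence of images."""
    reverse = {image: group for group, image in PEGS.items()}
    return "".join(reverse[img] for img in images)

if __name__ == "__main__":
    scene = encode("14 32 58 77")
    print(" -> ".join(scene))   # the "graphic event" standing for the number
    print(decode(scene))        # recalling the images yields 14325877
```

The point of the exercise is that the learner stores one coherent visual scene built from familiar images, rather than eight unrelated digits; recall of the digits becomes a translation step rather than a retrieval step.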
To simplify the discussion of knowledge representation issues with respect to the complexity of the neural connections involved in storing a particular engram, I will briefly introduce the concept of a synaptic pattern.
Ever since the introduction of sensitive techniques for measuring neural activity, it has been known that memories can be associated with spatiotemporal patterns of synaptic activity, or synaptic patterns for short. There is substantial terminological confusion around the naming of this concept. It is therefore worth noting that in the relevant literature the notion of the synaptic pattern, often devoid of its temporal component, may be used more or less synonymously with terms such as cell assembly, neural structure, synaptic structure, synaptic net, synaptic activity pattern, etc.
As I will try to show in the chapter devoted to the biological aspects of memory, the complexity of synaptic patterns is likely to correlate strictly with item difficulty (e.g. as expressed by the A-factor). The concept is therefore central to understanding the principles of effective representation of knowledge in self-instruction systems based on active recall and repetition spacing.
Items that do not comply with the minimum complexity of synaptic patterns will, in the course of repetitions, gradually lose some of their components. In other words, memory will accomplish a natural selection of the core synaptic pattern, eliminating all additional connections that are not uniformly stimulated at repetitions. In later sections, I will use the term pattern extraction to describe this phenomenon of selecting the core synaptic pattern in the course of repetitions.
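The following toy sketch (again in Python) illustrates this selection process in a deliberately simplified way. The connection names, strengths, decay rule, and pruning threshold are arbitrary assumptions introduced only for illustration; the sketch makes no claim to biological accuracy.

```python
import random

# Toy illustration of "pattern extraction": connections stimulated at every
# repetition keep their strength, while connections stimulated only
# occasionally decay cumulatively and are eventually pruned. Stimulation here
# merely prevents decay at that repetition (a deliberate simplification).

random.seed(0)

CORE = {"c1", "c2", "c3"}            # connections hit on every repetition
EXTRA = {"e1", "e2", "e3", "e4"}     # connections hit only sometimes
strength = {c: 1.0 for c in CORE | EXTRA}

DECAY = 0.5        # multiplier applied when a connection is not stimulated
THRESHOLD = 0.1    # below this strength the connection is considered lost
P_EXTRA = 0.3      # chance that an "extra" connection is stimulated

for repetition in range(1, 11):
    stimulated = CORE | {c for c in EXTRA if random.random() < P_EXTRA}
    for c in list(strength):
        if c not in stimulated:
            strength[c] *= DECAY
        if strength[c] < THRESHOLD:
            del strength[c]          # connection eliminated
    print(f"after repetition {repetition}: {sorted(strength)}")

# With these settings, occasionally stimulated connections usually fall below
# the threshold within a few repetitions, leaving only the core pattern.
```

The surviving set of connections corresponds to the extracted core pattern; everything that is not consistently reactivated at repetitions drops out of the stored engram.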