The New Old Way of Learning Languages
Now all but vanished, a once-popular system of reading Greek and Latin classics could revitalize modern teaching methods
By Ernest Blum
September 1, 2008
In this country and abroad, there is a sense of malaise if not crisis about the state of foreign-language education. Critics note that it takes too long to acquire a foreign language, that the results hardly justify the investment of time and effort, and that students are unable to apply what they learn. But this kind of disgruntlement was already rife at the end of the 17th century. John Locke, in Some Thoughts Concerning Education (1692), wrote: “How . . . is it possible that a child should be chained to the oar, seven, eight, or ten of the best years of his life, to get a language or two, which, I think, might be had at a great deal cheaper rate of pains and time, and be learned almost in playing?”
Such discontent did not die down in the succeeding years, and by the early 19th century an obscure, peripatetic businessman by the name of James Hamilton interjected himself into the controversy in both the United States and his native Britain. His influence—his attacks on the dominant method of teaching foreign languages and the publication, by his disciples, of classical texts embodying his methodology—extended well into the 20th century. With a fiery, polemical style, he made waves: “Mankind are thirsty for real knowledge and will not long put up with the shadow of it. Either the teacher will find out a mode of communicating a knowledge of the learned languages in a shorter time and more efficaciously than has been hitherto done, or the study of these languages will be relinquished altogether.”
Hamilton is the one who popularized interlinear translations of Greek and Latin classics. The system attracted a large following, and the technique was applied to the teaching of French, Italian, and German as well. John Stuart Mill tells us in his Autobiography (1873) that he learned German through the “Hamiltonian System,” a term that had become synonymous with interlinear translations. Virtually out of print today, “interlinears” in Hamilton’s time and beyond were derogatively branded by their critics as “crutches,” “cribs,” “ponies,” and a number of unmentionable terms.
As for Hamilton, he is a figure on the edge of oblivion. Although he accurately predicted the decline of the “learned languages,” his 70-page tract, The History, Principles, Practice, and Results of the Hamiltonian System, which enjoyed a controversial renown when published in 1829, has all but vanished. Today there appear to be only four copies left—three in libraries in Great Britain and one in the United States, at the library of Amherst College.
Hamilton (1769–1831) is important because he was one of the last major proponents of a pedagogical tradition, extending from antiquity, that made the study of texts the dominant focus of the teaching of foreign languages. In this method, teachers explicated the literal meanings of the words, phrases, and sentences of those texts. But by the 18th century, such disclosure was under frontal attack. Teachers had settled on grammar as the main subject matter, and students were expected to provide the meanings of texts by themselves, aided by a dictionary. Today there is an almost total absence of interlinear translations, since the transparency of such texts would preempt students from their main task of parsing the grammar.
The rigorous new demands on language students have not been accompanied by corresponding results. In the last half of the 20th century, an explosion of computer-based studies of large texts, called “corpora,” has demonstrated that the number of words needed to read foreign-language books exceeds by several multiples the amount of vocabulary that is acquired by most foreign-language students. This huge vocabulary gap explains why it is impossible for most students to read extensive, sophisticated materials in foreign languages. Even many who are academically involved with foreign languages must depend heavily on dictionaries, consult translations, and accept reading with blind spots because of time constraints.
In view of these endemic problems, the demise of Hamilton’s interlinear books leaves an untimely lacuna in our educational system. In essence, the Hamiltonian book was designed as a formatting scheme to maximize the amount of information available to the reader of a foreign language. Hamilton’s interlinear format offered a “royal road” to the great texts of Greece and Rome. His format could serve as a template for access to all of the world’s important texts in an era when these texts are in precipitous decline.
Unlike most bilingual books, where the translation is on one page and the original text on the opposite page, the Hamiltonian system brought both texts into close contact. For example, Hamilton adjusted the typesetting of the two texts so that a word or phrase of the original fits just above the English equivalent. Hamilton also revised the word order of the original text to conform to the word order of modern languages, overcoming perhaps the greatest difficulty for modern students of classics. Although Hamilton was inheriting a well-established system—in 1703, Locke himself produced an interlinear translation of Aesop’s Fables in Latin and English—he was the first to promote interlinears as a consumer product. In newspaper advertisements and through unabashed public relations, Hamilton developed a mass market for his texts and his language courses for adults.
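Hamilton's word-above-word layout can be approximated mechanically. As a rough illustration only (the word pairings below are hypothetical glosses, not taken from any Hamiltonian edition), a short Python sketch that pads each original word so it sits directly above its English equivalent:

```python
def interlinear(pairs):
    """Format (original, gloss) pairs so each original word
    sits directly above its English equivalent."""
    top, bottom = [], []
    for orig, gloss in pairs:
        width = max(len(orig), len(gloss))  # widen column to the longer word
        top.append(orig.ljust(width))
        bottom.append(gloss.ljust(width))
    return "  ".join(top).rstrip() + "\n" + "  ".join(bottom).rstrip()

# A hypothetical Latin line with word-for-word glosses; Hamilton would
# also have reordered the Latin to follow modern word order.
print(interlinear([("arma", "arms"), ("virumque", "and-the-man"),
                   ("cano", "I-sing")]))
```

The alignment is what distinguishes this from a facing-page translation: the reader's eye never has to travel to find the equivalent.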
A gifted promoter, Hamilton attacked the prevailing grammar-and-dictionary methods, which he called “the common plan.” As could be expected from his acerbic style, he was himself attacked as a charlatan by outraged schoolmasters. But Hamilton tells us that he relished such public denunciation, since it fanned his publicity. During several years, Hamilton and a staff of instructors taught classes of as many as 100 adults in Philadelphia, Baltimore, Washington, Princeton, New Haven, Hartford, Boston, and in Montreal and Quebec, where he taught French to Englishmen. When Hamilton returned to Great Britain in 1823, he offered classes in London, Liverpool, Manchester, Edinburgh, Dublin, and Belfast.
Hamilton died two years after the publication of his tract, by which time, by his own reckoning, he had enrolled and taught some 10,000 adults. Hamiltonian interlinear translations in Latin, Greek, German, Italian, and French had been distributed in Great Britain, the United States, and the West Indies. On the Continent, the books were sold under the “Système Naturel” label, and sales reached as far as Calcutta.
Only some of the preeminent Greek and Latin classics identified with Hamilton were published during his lifetime. The rest were published by his American followers. The 11 volumes, published by D. H. Silver & Son of London, comprise works by Homer, Xenophon, Virgil, Cicero, Livy, Horace, Ovid, Caesar, Sallust, Juvenal, and Cornelius Nepos. The first eight volumes were also published, until 1966, by the now-defunct David McKay and Co. of New York. (Hamiltonian-style Greek and Latin texts in interlinear Italian translations are still available in Italy.)
Besides inspiring this legacy of accessible Greek and Latin classics, Hamilton left in his tract a series of notable pronouncements on foreign-language pedagogy, although the pamphlet itself is a loose pastiche of reminiscences, lectures, newspaper clippings, and letters to editors.
What is the main point explicated in Hamilton’s tract? The common plan then current emphasized the preliminary rote study of grammar and a dependence on dictionaries. In contrast, Hamilton proposes extensive reading of foreign-language texts from day one, with interlinear translation disclosing all information required to understand each sentence.
“Reading,” he writes, “is the only real, the only effectual source of instruction. It is the pure spring of nine-tenths of our intellectual enjoyments. . . . Neither should it be sacrificed to grammar or composition, nor to getting by heart any thing whatever, because these are utterly unobtainable before we have read a great deal.”
Hamilton goes on, “As reading is the source of all real instruction, so it is also the sole, the only means by which the words of a language can be acquired. . . . The man who has not learned to read knows only those words which he has learned in conversation; his vocabulary is smaller than can well be imagined.” Nevertheless, Hamilton did not oppose the study of grammar, only its timing. “The theory of grammar should be taught only when the pupil can read the language, and understand at least an easy book in it,” Hamilton wrote, in agreement with Locke. Contemporary corpora studies have also identified vocabulary recognition as the main variable in reading success.
Hamilton speculated that the method of teaching languages in his era was not the method of previous times. Rather, he claimed, in the Middle Ages “it is highly probable that they taught by some such process as I have adopted.” We know today that this is essentially correct. For example, starting in the 12th century, pupils studying the Latin classics had access to the kind of basic didactic information embodied in Hamilton’s interlinear translations. In hundreds of extant manuscripts used as textbooks, canonical Latin texts are packed with glosses in the interlinear spaces and page margins, suggesting to modern scholars that medieval schoolmasters actively explained the most elementary details of these texts.
According to Robert Black, who made a census of 324 such glossed textbooks used in Florence from the 12th to the 15th century, the interlinear glosses provide Latin synonyms, explain word order in terms of modern languages, spell out figures of speech, explicate basic grammar, and supply missing but understood words. All of this “revealed a uniformly rudimentary level of comment” to assist novice students in their understanding of the philological meanings of the texts. Medievalist Virginia Brown, in her 2006 lecture at the University of Sydney, “Getting Down to Basics: Elementary Teaching of Virgil in the Renaissance,” relates that many manuscripts with extensive commentaries were written by copyists for beginning students and even for self-study. For example, around 1500, Hermannus Torrentinus states that his copies of Virgil are “user friendly” (familiariter) and “can be understood easily by all” (ab omnibus facile possint comprehendi).
This level of linguistic transparency is in contrast to the withholding of information by language teachers that was dominant already in Hamilton’s time and remains largely in force today. Although rarely commented on, the current practice constitutes an extraordinary anomaly in the contemporary educational system. In no other classrooms on campus is basic information systematically withheld as a matter of policy and principle. What is withheld is the information on the meanings of words, phrases, and sentences the students are reading. Students are expected to parse these for themselves, as a kind of perennial homework, after they have memorized grammatical rules and vocabulary lists. So students persevere, doing the translating and the explaining in the classroom, and it is the teacher who listens—a complete role reversal. Of course, teachers posing questions and students performing exercises are time-honored procedures to stimulate thought, but it is taken for granted that students will have access to a textbook with complete information on every detail of the course. In the case of the modern foreign-language curriculum, comprehensive information on the meanings of the words, phrases, and sentences the students are reading is routinely hidden in the teacher’s manual.
Few, if any, university presses in this country today publish a word-for-word, line-by-line interlinear translation of a literary classic in any foreign language. It should be noted that Harvard’s bilingual Loeb Classical Library contains texts on opposite pages—and thus doesn’t allow neophyte learners to read the texts, since the connections between corresponding words on the two pages are not readily apparent to novices. Virtually the only interlinear translations of classic texts published by any American publisher in recent years are the Greek New Testament, parts of the Hebrew Old Testament, and line-by-line interlinear translations of Chaucer.
One of Hamilton’s motivations in promoting his system was to reverse the dependence of students on dictionaries, a practice that he believed was inadequate to solving the problem of vocabulary ignorance. But it was only in the 20th century that the depth of this problem was disclosed in the form of an intractable vocabulary gap.
This vocabulary gap remains disconcertingly large regardless of what measurement of vocabulary is used, whether “word types” (dictionary words plus their inflected forms); “lemmas” (dictionary words); or “word families” (bundles of related lemmas).
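The three measures can be made concrete with a toy example (the hand-built groupings below are purely illustrative, not drawn from any corpus or standard lemmatizer):

```python
# Three ways to count the vocabulary of a tiny text.
text = "the runner runs and the runners run"
tokens = text.split()

# Word types: distinct surface forms, with inflections counted separately.
word_types = set(tokens)

# Lemmas: dictionary headwords (mapped by hand here for illustration).
lemma_of = {"the": "the", "runner": "runner", "runners": "runner",
            "runs": "run", "run": "run", "and": "and"}
lemmas = {lemma_of[t] for t in tokens}

# Word families: bundles of related lemmas ("run" and "runner" share one).
family_of = {"the": "the", "runner": "run", "run": "run", "and": "and"}
families = {family_of[l] for l in lemmas}

print(len(tokens), len(word_types), len(lemmas), len(families))  # 7 6 4 3
```

The same seven-word text thus yields three different vocabulary counts, which is why coverage figures must always state which unit they use.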
The computer studies of corpora, which put into relief the vocabulary challenge for the general reader, are based on what is known as Zipf’s law, proposed in the 1949 book Human Behavior and the Principle of Least Effort by a Harvard linguist and statistician, George Kingsley Zipf (1902–1950). Zipf discovered his empirical law, which has been confirmed in many languages, without the aid of a computer, by manually tabulating the 29,899 word types and 260,430 total words in James Joyce’s Ulysses.
Zipf’s law tells us that the frequency with which distinct vocabulary words occur in book-length texts and larger corpora declines in a generally regular, fixed, and simple way, as the number of vocabulary words in the text increases. For example, the second most frequent word in a text occurs, by and large, one-half as often as the most frequent word. The third most frequent word occurs one-third as often as the most frequent word; the fourth word, one-fourth; the 10th word, one-tenth; the 1,000th word, one-thousandth, etc. In other words, Zipf’s law states that the frequency with which any given vocabulary word occurs in a text is inversely proportional to that word’s rank in the vocabulary list, starting from the most frequent word.
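Under an idealized Zipf distribution, a word's expected share of the text follows directly from its rank. A minimal sketch (assuming a pure 1/rank law, which real texts only approximate):

```python
def zipf_share(rank, vocab_size):
    """Expected share of all tokens for the word at a given rank,
    under an idealized Zipf (1/rank) distribution."""
    # Normalizing constant: the sum of 1/r over all ranks.
    h = sum(1.0 / r for r in range(1, vocab_size + 1))
    return (1.0 / rank) / h

# With a 10,000-word vocabulary, the 2nd-ranked word occurs half as
# often as the 1st, and the 10th-ranked word a tenth as often.
v = 10_000
print(zipf_share(1, v) / zipf_share(2, v))    # 2.0
print(zipf_share(1, v) / zipf_share(10, v))   # 10.0
```

The ratios are exact because the normalizing constant cancels; only the absolute shares depend on the vocabulary size.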
There are two overriding consequences of Zipf’s law, each with a contradictory impact on language learners. Only a few words account for the overwhelming bulk of words used in a language. In English, for example, the word the accounts for 7 percent of all the words used, and only 10 words account for up to 23 percent. For many languages, a rule of thumb is that a mere 100 to 150 words account for around half of any text. In the Greek New Testament, only 319 words account for just under 80 percent of the text. This is, of course, a welcome situation for getting one’s feet wet in a language.
But the second consequence of Zipf’s law is troubling for those who would master the reading of languages. Beyond the high-frequency words, the overwhelming majority of vocabulary words we encounter are low-frequency words. In the case of the Greek New Testament, for example, 319 words account for almost 80 percent of the text, but it takes 5,118 infrequently used and unfamiliar words to cover the remaining 20 percent. In large corpora, the words that occur only once (so-called hapax legomena, Greek for “said once”) account for around half of the total vocabulary.
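This two-sided character of the distribution is easy to reproduce. Under the same idealized 1/rank law (the vocabulary size and cutoffs below are illustrative, not the New Testament counts above), a handful of top-ranked words covers much of a text while the long tail remains enormous:

```python
def coverage(top_n, vocab_size):
    """Fraction of all tokens covered by the top_n most frequent words,
    under an idealized Zipf (1/rank) distribution."""
    harmonic = lambda n: sum(1.0 / r for r in range(1, n + 1))
    return harmonic(top_n) / harmonic(vocab_size)

v = 50_000  # a hypothetical corpus vocabulary
# The 100 most frequent words already cover nearly half the tokens,
# yet reaching 95 percent coverage demands most of the vocabulary.
print(coverage(100, v))
print(coverage(30_000, v))
```

The asymmetry is the whole story: the first few hundred ranks are cheap, and every percentage point after that grows steadily more expensive.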
As authors Christopher D. Manning and Hinrich Schütze put it: “The basic insight (of Zipf’s law) . . . is that almost all words are rare.” Take the pioneering Brown University Corpus of U.S. word usage, published in 1967 and considered one of the most influential corpora in the history of corpus linguistics, even though its million total words are now dwarfed by other corpora. The Brown Corpus was taken from the common types of contemporary American written texts any literate person could come across. Of its vocabulary of 53,076 word types, 36,135 occur only once, twice, or three times—comprising 68 percent of the vocabulary. While these low-frequency words as a whole account for only 5 percent of the tokens in a typical Brown Corpus textual sample, this is equivalent to some 20 or more words on a printed page—all popping up from 36,135 relatively uncommon word types. Such infrequent words are particularly salient and rich in “surprise value,” and they enhance a sentence’s informational value, according to Henry Kucera, co-author of the Brown Corpus. “The less frequent the word,” he says, “the more important it will be . . . for an understanding of the communication.”
This kind of vocabulary challenge is troublesome, but there is some relief. Researchers on vocabulary acquisition have determined that virtually all literate people acquire an extensive vocabulary through reading and simultaneously inferring the meaning of unknown words by their contexts. For example, it is estimated that students enter primary school with 5,000 word families, or bundles of related words, and acquire an additional 1,000 or more a year through this method, attaining a vocabulary of 20,000 word families in college. But there are serious constraints on inferring meanings from contexts. Leading researchers in this field, such as Paul Nation and Batia Laufer, have determined that for readers to be able to guess words from context, they must recognize at least 95 percent of the words in a text; and Nation himself has recently emphasized 98 percent as the threshold for reading with pleasure, without stopping to consult a dictionary or glossary.
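The gap between the 95 and 98 percent thresholds is starker than it looks. Assuming a 300-word printed page (a rough illustrative figure, not one from Nation or Laufer), the arithmetic runs:

```python
# Unknown words a reader meets per page at different coverage levels,
# assuming a hypothetical 300-word printed page.
words_per_page = 300
for known in (0.95, 0.98):
    unknown = words_per_page * (1 - known)
    print(f"{known:.0%} known -> about {unknown:.0f} unknown words per page")
```

At 95 percent coverage the reader stumbles roughly 15 times per page; at 98 percent, about 6 times, which is why the higher threshold marks the boundary of reading for pleasure.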
However, attaining even a receptive knowledge of 98 percent of the words in the texts we read represents a formidable challenge. For example, one must know 15,851 lemmas to cover 97.8 percent of the million words in the Brown Corpus, and even that amount is just 42 percent of the total of 37,851 lemmas found in the Corpus. In a text-coverage study of Time magazine, researchers Kiyomi Chujo and Masao Utiyama discovered that 14,000 lemmas provided only 96.9 percent of text coverage. Although text-coverage data on other languages is relatively limited, the minimum vocabulary needed to read only 95 percent of the words in any Dutch text is 10,000 to 11,000 lemmas, according to J. H. Hulstijn. In French, Serge Verlinde and Thierry Selva of the Modern Language Institute (Belgium) found that 22,000 lemmas of the Dictionnaire du français account for only 94.14 percent of the words in issues of Le Monde and Le Soir.
This amount of vocabulary appears to be far in excess of the number of words that most students of foreign languages acquire. In a study in Japan, Kiyomi Chujo determined that Japanese high school students acquire only 3,200 English lemmas from textbooks. In the United States, the number of Spanish words studied by first-year college students using the three most popular textbooks ranges from 1,965 to 3,217, report Mark Davies of Brigham Young University and Timothy L. Face of the University of Minnesota. In Great Britain, students of French do not acquire more than 3,300 words on average, and national tests do not set higher word standards, according to James Milton of Swansea University. And at the University of Leipzig, 79 percent of students of English do not attain a receptive knowledge of 5,000 lemmas after eight years of study, according to Erwin Tschirner of the university’s language department, and 72 percent cannot recognize 3,000.
These vocabulary deficits help explain why test results of student achievement in reading foreign languages, in both the United States and worldwide, are far from impressive. Indeed, based on data from a 2006 joint report of the Educational Testing Service and the Council of Europe, it appears that, worldwide, most students studying English as a foreign language do not have the proficiency to read English academic books and other diverse writings. For example, during a recent 15-month period, only 15 percent of those taking the Test of English as a Foreign Language scored high enough to demonstrate an ability to read “a wide range of demanding, longer texts,” as determined by new minimum performance test scores set by the council through its international protocol, the Common European Framework of Reference for Languages.
Students studying English as a foreign language are not the only ones whose reading competence is in some doubt. American students of foreign languages demonstrate precarious reading skills as well. Even though universities do not publish test results in this area, the College Board released results for six Scholastic Aptitude Test foreign-language reading tests (in French, German, Italian, Latin, Spanish, and Modern Hebrew) administered to high school seniors between 2000 and 2002. The results were for typical groups of test takers—3,750 persons in Spanish, 3,607 in French, 255 in Italian—who had taken language courses over a two-to-four-year period. For five of the six tests, students on average answered a notably low percentage of the questions correctly—between 50 and 60 percent. Attempting to read lengthy texts in a foreign language would likely be an exercise in futility for these students.
This is why researchers in applied linguistics are recommending new strategies to fill the gaps in vocabulary—for instance, massive rote word learning, guessing word meanings by context, and using graded textbooks. But the efficacy of these fixes is problematic. The rote learning of words is not palatable to many students and deprives them of contact with actual language. Making students read texts without an adequate vocabulary induces frustration and defeatism. And while graded readers are a valuable resource, they are limited in vocabulary and intellectual interest.
Only extensive reading of texts with interlinear translations, as advocated by Hamilton, can offer a comprehensive solution to the problems involved. Most student readers of long foreign-language texts must have broad vocabulary assistance. We are talking here about reading several million words of running text, since thousands of unfamiliar and infrequent words must be encountered multiple times to be retained in the mental lexicon. And few long texts of any kind even exist with accessible, comprehensive information on the meanings of their words. Surely a dictionary is not up to this repetitive, Sisyphean task, whose full solution remains the use of interlinear translations. A few such translations are now being developed digitally in audio-visual, database, and Internet formats, but these should not replace print as long as we live in a print world.
In the absence of full disclosure in the great texts, the educational system continues to deny these texts their place as intrinsic knowledge, equal to, indeed superior to, the knowledge embedded in translations. As the Italian aphorism elegantly puts it, “Traduttore, traditore” (a translator is a traitor) to the original text. By leaving the great texts largely inaccessible, however unwittingly, the university denies unfettered access to them that it affords to other knowledge. Strangely, few on campus acknowledge that students, and all others, are freely entitled to information about the words of great texts—based on the principle that these words are part of the knowledge for which universities are stewards.
The walling off of great texts, with the exception of those written in English, is not a good development for the literate public, for students, or for our educational system. For millennia, the study of original classic texts constituted the core of education. The decline of these texts thus poses a challenge to the universities, to liberal education, and indeed to the future of civilization.
It is a challenge for which Hamilton, in his time, provided a feasible plan of action. In our time, the plan should be even more ambitious. For each major literary language, publishing fewer than a hundred different classic texts in interlinear and audio formats would help reverse the decline of interest in reading the world’s great texts, expand the learning of foreign languages, and enhance informed communication among peoples.
Ernest Blum is a writer and holds a Master of Arts degree in comparative French and German literature from the University of Chicago.