
Efficacy vs intuition - phonics goes for a knockout blow in the late 19th century but is floored by science


By the late nineteenth century in the United States, the vice-like grip of the whole-word method on initial reading instruction, and the reliance on the McGuffey Readers, were beginning to weaken as the long-term effects of the method became evident. Pupils made apparently huge initial strides in reading fluency in the early grades, but as Smith notes from a contemporary journal, ‘…pupils in the higher grades are not able to read with ease and expression; they have so little mastery over words that an exercise in reading become a laborious effort at word-calling…’ (2002, p124). There is no doubt as to where the blame is placed: ‘…the inability of pupils in the higher grades to call words is the legitimate outgrowth of the teaching of the word method…’ (2002, p124).


There were further problems. By teaching reading through symbol recognition and ignoring the alphabetic code, the word method, by definition, ignored the letters that formed words. From its first adoption, the method had raised concerns about its capacity to promote accurate spelling. In the introduction to his primer, ‘The New Word Method’, John Russell Webb includes a glowing account of its effectiveness from a teacher that contains the flippant yet portentous warning, ‘It was soon discovered that the children could not spell…’ (1846, p4). By the 1890s the concerns had grown to such an extent that Joseph Rice (an early proponent of evidence-based practice in education) conducted a survey of pupils in the public schools of the United States. Thirty-three thousand pupils were given reading and spelling tests, with the conclusion that ‘phonics led to better results in reading than word methods…’ (1893, p144). Rice (1912) also concluded that the best spelling results were obtained where the phonic method was used.


As a result, there was a significant and explicit shift towards phonic teaching methods, particularly in early reading instruction. Phonics had never actually gone away. It was an integral part of the word method; however, it was always employed after word learning, in an analytical form, often many months after reading instruction had begun. This analytic approach was predicated on word recognition and was therefore taught not as a decoding strategy but as a spelling tactic. Synthetic phonics programmes developed apace as the word method proved ineffective, and although they still privileged Webster’s (1785) alphabetic order rather than codifying sound relationships, the failure of the word method led to some clear pedagogical conclusions. Rebecca Pollard, in her ‘Synthetic Method’, was adamant that: ‘Instead of teaching the word as a whole and afterwards subjecting it to phonic analysis, it is infinitely better to take the sounds of the letters for our starting point,’ (1889, p3). She then states with resounding prescience, ‘There must be no guess work;’ (1889, p4).


There was, nonetheless, a shadow being cast on phonetic methods by the considerable influence of George Farnham and his tome, ‘The Sentence Method of Teaching Reading’ (1881). Farnham argued that reading instruction material should be neither letter-based nor word-based but should focus on the smallest unit of sense: the sentence (Diack, 1965). This was the word method extrapolated to the extreme. Children learned whole sentences and repeated them until they had mastered the meaning and sense with all relevant stresses and emphases. The sentence was then analysed and fractured into words and then into letters. Farnham’s method was opportune and chimed with the period’s general goal of developing an interest in literature, as the learning of whole stories became integrated into the method (Smith, 2002). Farnham’s method produced spectacular results when children’s reading, writing and spelling were tested against those of children taught by the word method and phonics instruction, such that he maintained in the foreword of his book that the problem of how to teach reading, writing and spelling successfully ‘had been solved’ (1881, p2).


The results of the experiments were chronically invalid. Children who had been taught by Farnham’s method were doing no more than reciting a learned script in which all the stresses and syntax had been drilled to perfection. They were being compared with children who were decoding unseen text using phonic strategies but had not yet reached automaticity, and with children who had been taught the word method and were encountering words they had not yet memorised. As for spelling and writing, Farnham’s charges were merely encoding a script they had learned to spell and write to perfection. Nevertheless, the results were taken at face value and the sentence method became an established part of elementary teacher instruction in the training colleges of the eastern United States.


Although there is no concrete evidence of Farnham’s (1881) book crossing the Atlantic (Diack, 1965), England was not immune to the sentence method, sourcing the pedagogy instead from Belgium and the work of Ovide Decroly with disabled pupils. Decroly used whole sentences to teach reading but favoured commands, which were learned verbatim, then broken down into words, then syllables, and subsequently into individual sounds as phonetic analysis (Hamaide, 1977). This differed from Farnham’s method, which explicitly eschewed phonic analysis. Nevertheless, the effect was the same: the privileging of word learning with later phonic analysis was not decoding instruction (McGuinness, 1999).


The sentence method was a further projection of the word method’s attempt to solve a perceived, intuitive yet non-existent problem: that early readers were not reading with fluency – they sounded stilted and wooden, read slowly, without prosody, and with numerous stutters and corrections. In England, Holmes, of Her Majesty’s Inspectorate, noted in his criticism of phonic methods of instruction ‘the dismal and unnatural monotony of sound which pervades every classroom…’ (quoted in Diack, 1965, p77), and Russell Webb’s ‘The New Word Method’ delights in the absence of ‘unpleasant tones and drawling’ (1846, p3) as a result of his whole-word teaching. To this day, the sound of a novice reader decoding can be painful, and it was dismissed by children’s poet Michael Rosen as ‘barking at print’ (2012, michaelrosenblogspot.com).


What these critics ignored was the necessary decoding phase through which a novice reader must pass as they associate letters and combinations of letters with the sounds they represent, articulate those segmented sounds and then synthesise them to form a word. This phonological neurological route is further complicated as the reader’s brain attempts to access meaning (which may or may not exist for a young child) through a lexical neurological route (Dehaene, 2014). The oral expression of a developing reader practising within this phase will necessarily be slow, stuttering and often monotone. By attempting to leapfrog this essential phase of reading and reading practice, favouring prosody over decoding and automaticity, the word method and sentence method advocates privileged the appearance of reading fluency over the actuality of it, which is why, as discussed above, the reading of older students plateaued at such a low altitude of expertise. This pedagogical misconception infected reading instruction for almost a century, ultimately mutating into the disastrous whole language approach to reading (Goodman, 1973) and condemning generations of English speakers to semi-literacy.


Nevertheless, phonic instructional models were proving increasingly popular, especially among teachers (Diack, 1965), although all continued to organise their teaching chronology alphabetically rather than by sound until a schoolmistress at Wimbledon High School designed the first truly ‘linguistic’ approach to reading at the turn of the twentieth century (Diack, 1965). Dale’s (1902) ‘On the Teaching of English Reading’ based the teaching of reading on the forty-two sounds of the English language; once the sounds were mastered, children were taught to match each sound to its letter or digraph (McGuinness, 1999). Dale insisted that no letter names were taught initially – indeed, she used no letters at all whilst sounds were being learned – and she made the vatic link between decoding and spelling to avoid ‘unnecessary and fruitless labour’ (1902, p10) in the future.


Her masterstroke was almost accidental. Sounds were organised by the location of their formation in the mouth – children felt the movement of their mouths, tongues and jaws as they created each sound – so that related sounds were naturally grouped together and the letters that represented them were ordered accordingly. The result was the first recorded move away from Webster’s (1832) alphabetic structure of phonics towards an approach that codified phoneme-to-grapheme correspondence systematically by sound. Children recorded these correspondences on a frame on the inside of their school desks. When learning to decode a word, children came to the front of the class and held up cards bearing the letters that represented the sounds in the word. The sounds were spoken individually by the whole class, then the cards were drawn together and the sounds blended to create the word. The first linguistic phonics programme had been created.


Although phonetic instruction techniques were in the ascendancy at the turn of the century, the word and sentence methods were too well rooted to wither. All they required was a chance watering to flourish once more. They did not have long to wait, and, ironically, it was scientific research that rescued them.


In 1885, the American PhD student James Cattell carried out a series of laboratory studies in Wilhelm Wundt’s laboratory at the University of Leipzig in Germany. Using a tachistoscopic technique, he gauged that a literate adult needed ten milliseconds to read four individual, unrelated letters. In further testing he found that, in the same amount of time, a literate adult could read a short sentence of four words, or approximately twenty-four letters. He reasoned that if four was the limit for unrelated letters in ten milliseconds, then words could not be perceived in terms of their letters, and that humans therefore read not by letters but by whole words. Cattell’s findings were replicated in 1969 by Reicher and remain undisputed to this day. His conclusion, however, is not only disputed, it holds not a shred of veracity: the brain recognises all the letters of a word in parallel rather than taking the word in as an indivisible whole (Dehaene, 2009).


None of this would have mattered had not Edmund Burke Huey, an American educator, stumbled upon Cattell’s findings and written the bestselling ‘The Psychology and Pedagogy of Reading’ (1908). Huey’s book stopped the phonic method of teaching in its tracks. The word method now had apparent scientific proof that it was the only true word and a powerful advocate proselytising its gospel.


This blog is number 9 in a series.

You may be interested in reading:

1. https://www.thereadingape.com/single-post/2018/01/07/Phonics---the-lever-of-civilisation

2. https://www.thereadingape.com/single-post/2018/03/30/Why-English-is-just-so-darned-difficult-to-decode---a-short-history

3. https://www.thereadingape.com/single-post/2018/04/09/Phonics-2000-years-ago---Quintilianthe-accidental-phonics-genius-with-a-toxic-legacy

4. https://www.thereadingape.com/single-post/2018/04/29/Martin-Luther---the-unshackling-of-literacy---but-a-fatal-flaw

5. https://www.thereadingape.com/single-post/2018/06/01/Universal-literacynot-in-England---the-historical-roots-of-a-divided-education-system

6. https://www.thereadingape.com/single-post/2018/06/10/How-America-helped-ruin-English-reading-instructionwith-some-help-from-the-Germans

7. https://www.thereadingape.com/single-post/2018/07/08/Short-term-gains-long-term-pain-The-Word-Method-starts-to-take-hold

8. https://www.thereadingape.com/single-post/2018/08/06/How-assessment-and-inspections-ruined-reading---England-1870

Follow the reading ape on Twitter @thereadingape


