
Analyse this…synthesise that – a pedagogical focus.




This is not a blog about phonics, but let’s start with phonics…


When the mathematician, philosopher and polymath Blaise Pascal developed a rudimentary systematic synthetic phonics system in 1655, he (probably unknowingly) undermined the only other phonics approach that existed at the time: reading a word aloud to an emergent reader and then analysing its letters and the sounds they represent, an analysis that, most of the time, would have to be done by the reader alone.


The arguments over the relative efficacy of analytic phonics and synthetic phonics have a long and bellicose past. The fundamental pedagogical difference lies in the point at which the sounds represented by the letters are identified by the emergent reader when identifying an unknown word (that is, a word that cannot be read instantly as a whole). With analytic phonics, the unknown word is first identified as a whole word by a third party (usually the teacher, though this is often left to the emergent reader) and the letter-sound correspondences are then analysed for the phonics learning. With synthetic phonics, the emergent reader is taught letter-sound correspondences systematically and then uses this knowledge to identify unknown words (which initially will be the vast majority, until orthographic processing develops) by blending (synthesising) those sounds together. Consider the word ‘hyperandrogenic’, which you may never have encountered before. When first seeing this word, you may slow down and use your phonic knowledge to sound out the word, synthesise (blend) those sounds and read the word; or you may get someone who has prior knowledge of the word to read it to you and then analyse the sound-pattern relationships in the hope that you will remember them the next time you encounter it. (Much more on this at the excellent https://www.parkerphonics.com)


Most of the research (Chall, 1967; NICHD, 2000; Johnston and Watson, 2004; Torgerson et al., 2006) suggests that for initial reading instruction, synthetic phonics is the more efficient approach. There have been dissenting voices and studies (Glazzard, 2017; Clark, 2017; Bowers, 2020), but the teaching of synthetic phonics in England is all but statutory (Mills, 2020 – good overview here). Tunmer et al. (2015) suggested that the theoretical framework for synthetic phonics instruction for emergent readers is more robust because it is systematic: new readers learn the English alphabetic code systematically and then apply that learning through effortful practice, which results in successful decoding and word reading. Analytic phonics, on the other hand, leaves much to chance and is, like whole-language approaches, far more reliant on statistical learning (Treiman, 2018). First, the misread word has to be identified as such by a teacher; then the teacher has to identify the phonic pattern that has been misidentified; then further teaching has to follow. All of these stages can be avoided or ignored if the teacher so wishes (or lacks the expertise or code knowledge, or just wants to get on with the text), with the likelihood that the misidentified pattern is not encountered again for some time. That is not to say that analytic phonics cannot work; it can. It just leaves much room for missed learning, has no systematised approach and is bemusing for struggling readers.


Putting aside the arguments over which method is the more effective, there is perhaps much we can learn from the debate about the instruction of emergent and developing learners beyond early reading.


The terms analytic and synthetic have their philosophical roots in Kant’s (1787) ‘Critique of Pure Reason’, where he considered analytic judgements to be those in which the predicate belongs to the subject and which are thus true by definition (‘All bachelors are unmarried men’), and synthetic judgements to be those in which the predicate lies outside the subject, amplifying the concept, and which can only be shown true by experience (‘All bachelors are unhappy’). Analytic statements thus contain a priori knowledge (they are knowable without experience), and synthetic statements contain a posteriori knowledge (they are knowable only through experience).


By taking a few liberties with Kant, we may find some pedagogical illumination. Where analytic phonics requires the word to be identified initially, the process starts with a priori knowledge: the word ‘paint’ is made up of the letters p, a, i, n, t, but it is always identified first and is thus true by definition. Synthetic phonics, by contrast, requires that the letters indicate the sounds necessary to identify the word: the letters p, a, i, n, t represent the sounds p, ā, n, t. The word might therefore be misidentified as ‘pant’ or ‘plant’, or mispronounced as ‘pa-int’, so it is not true by definition but by experience, and the knowledge is thus a posteriori.


Thus, knowledge that requires experience also demands effortful activity for its application. It may be this definitional requirement of effortful activity that makes synthetic approaches to learning so effective, particularly in the early stages of study and particularly when acquiring procedural knowledge. As Willingham (2009) suggests, the human brain is not designed for thinking and eschews hard work; this is why the brain attempts to automatise as many activities as possible. Paradoxically, it is the effortful activities that lead to most learning. Cunningham et al. (2011) suggested that the cognitive effort of unsuccessful decoding attempts was far more useful in developing instant word recognition than having the word identified for the reader by a third party. Quintilian’s approach to early learning put considerable emphasis on effortful repetition of syllables, to the point where he was accused by Capella of developing a ‘fatigue-march’ progression towards reading (Bennet, 1991: 135). Only after complete mastery of word reading, grammar and the elements of rhetoric did Quintilian allow his Roman charges to move on to the analysis of texts for their rhetorical value. In other words, he utilised systematic synthetic approaches, which involved significant, effortful practice until mastery, before he moved on to an analytic approach.


In terms of reading comprehension development, this suggests that when readers create meaning from the words alone by utilising a bottom-up reading strategy (Carrell, 1989), they are applying a synthetic approach to the identification of the words but an analytic approach to meaning, by assuming that knowledge is created from the words and not from experience; this is what skills-based reading comprehension programmes encourage. Conversely, when readers apply a top-down approach to meaning creation (Millis et al., 2006), they bring an analytic perspective to their reading, with an assumption that there exists a priori meaning in the mind of the writer but that they, as readers, must synthesise their own knowledge and experiences to attempt to identify this a priori knowledge.


This is perhaps what lay at the heart of Corson’s (2019) unfavourable comparison of Doug Lemov’s concept of close reading with Derrida’s ideas. Lemov, because he is a pedagogue who seeks systematic, applicable techniques to empower students, utilises synthetic, effortful approaches to complex meaning creation, which must assume a priori knowledge; in other words, the reader works on the assumption that there is a priori meaning embedded in the text by the writer. Derrida, on the other hand, takes a post-structuralist approach that assumes all knowledge is synthetic, a posteriori and dependent on experience and interpretation. This is far more difficult to build a pedagogical approach around because, like Humpty Dumpty, the reader can always argue that a word means ‘…just what I choose it to mean – neither more nor less.’ And Humpty goes on to nail the big question here, which he suggests is ‘which is to be master – that’s all?’ (Carroll, 1871).


So, in terms of how we teach most effectively, which is to be master, a synthetic approach or an analytic approach?


The answer is, inevitably, nuanced. It depends on the phase of learning the pupil is in, and perhaps this is where phonics instruction can help. When a pupil is learning to read, it appears far more effective to take a synthetic approach to phonics: a comprehensive understanding of letter/sound correspondences, taught systematically, enables pupils to apply those learned correspondences, recognise regular patterns, and identify words. Through regular reading, the identification of more and more words is then self-taught through statistical learning (Share, 1999), enabling teachers to take a more analytic approach to the creation of meaning from text. Once readers gain greater knowledge and develop high-quality schematic and experiential frameworks, they can be encouraged to question an author’s a priori intent and, finally, to consider other influences that the writer may have little control over, and how those words are interpreted and synthesised by individual readers.


Afterthought

It is interesting to consider the above in relation to the teaching of writing.


We often rush through the initial synthetic approaches to teaching early writing, spending negligible time on letter formation, word writing and sentence practice, preferring a more analytic approach driven by the belief that, as writing always has an a priori foundation rooted in the writer’s intent, we must focus on that final outcome. More interestingly, most writing programmes utilise the ‘exemplar text’ as the paradigm for text creation. This is an entirely analytic approach and ignores the above assumption that emergent learners benefit from an initial synthetic approach. Perhaps text creation, rather than analysis, is the more effective option, however rudimentary the results. It is difficult to learn to play tennis merely by watching great players and analysing what they are doing; that is much more effective once you are a competent player. Is this why one of the few approaches to the teaching of efficient writing for young writers that has a significant research base, sentence combining, is so roundly ignored?


