Michael Lawrence

Testing 3, 2, 1


of the flood of interest in Finland’s education system: every year hundreds of delegations comprising teachers and policymakers from all over the world pour into Helsinki to see this nirvana for themselves.

      So popular has it become that international visits are strictly regulated and have to be paid for: a presentation costs €682 (£588 as at November 2019) per hour and a school visit €1240. (Weale, 2019)

      Back in Tampere, I did not dare mention that the school I work at (like most other secondary schools) also runs two rounds of examinations in each of years 9, 10 and 11, on top of the NAPLAN tests in years 7 and 9. Each ‘round’ can stretch over two or three weeks of examinations and of study time spent preparing and revising for the big day; all told, students sit no fewer than 12 rounds of examinations from year 3 through to year 12.

      And I nearly forgot the AGAT (the ACER General Ability Test, ACER being the Australian Council for Educational Research). This is designed to help teachers assess learning potential and aptitude in years 2 to 10. Finland makes do with only the final matriculation exam.

      ‘Does it work?’ they asked, unable to suppress their shock.

      The answer was a simple ‘No.’

      ‘Well, why do they do it? Why does the teacher allow the students to do it?’

      The last question really hit home. There is no reasonable answer other than to say that this is the way we have always—well, at least for the last decade and a half—done things, which seemed terribly inappropriate even as the words left my mouth. I wanted to be able to say that when the test helps us to identify students who have a weakness in their learning we are then able to provide suitable support to enable them to overcome this and continue succeeding in their education.

      But this would have been a lie.

      The truth is that when weaknesses are identified through NAPLAN testing there is no set policy for addressing the issue. In fact, we (well, the My School website) encourage parents to remove their sons and daughters from the poorly performing schools (or not to send them there at all) by making the schools’ results public.

      The suggestion appears to be that the resultant drop in student numbers and public shaming will somehow encourage the poorer-performing schools to ‘lift their game’. Because school funding is based on student numbers, the poorer-performing school will also be punished financially. Parents in a position to move their children will do so, but what of those who cannot afford the money or time to relocate them to a more distant school? They remain in a classroom where many of their peers have also not done so well on the NAPLAN test.

      In a school whose funding has been cut, the better teachers, probably feeling somewhat disillusioned by all of the above, will be looking out for another school if possible, just like the better-off parents.

      If there is little to be gained by students in the NAPLAN test, then how do their teachers and principals fare? One can only imagine the stress of being principal at a school with the lowest NAPLAN score in its town or city.

      Public shaming, likely loss of student numbers and funding, parent responses and students (not to mention their school) labelled the ‘worst’ in town. All for a standardised testing system that really does no favours for the students, who are almost certainly from the most disadvantaged part of town. Feeling somewhat guilty, I stayed silent on the fact that not only were we administering world-record numbers of tests, but we were also teaching towards these tests, making their content the curriculum and judging the merits of entire schools, teachers, principals and individual students according to the results they yield.

      Teachers in Australia are trained in the ‘mandatory reporting’ of anything resembling child abuse in any form. How could we have done this to so many thousands of students?

      How many students and adults now loathe mathematics, science or English (perhaps there is a bright side to the fact that NAPLAN omits the arts!) because of early NAPLAN experiences? If the one-third of the class to whom I put the question is any indication, we are talking about many thousands of students.

      I have often compared the obsession with increased NAPLAN-style testing to presuming you could change the temperature by looking at the thermometer more frequently.

      The discussion rarely turns to why we should expect improvement. Indeed, the only logical reason to expect any is that schools are now teaching for the NAPLAN tests, effectively making them the curriculum (at the expense of many far more useful faculties such as creativity). The fact that results are still not improving should be a cause for further concern.

      According to a study published in the Australian Journal of Language and Literacy, reporting on a 2017 survey of more than 200 year 7 and 9 teachers across NSW, ‘nearly 60 percent disagreed with the statement that NAPLAN provides important information on the literacy skills of students’.

      ‘NAPLAN’s out of control,’ said Chris Presland, president of the NSW Secondary Principals’ Council: ‘The problem with it goes beyond teaching to the test, there’s certainly an over-obsession with data and pressure on schools to perform because of the comparative nature of the data.’

      Why has it taken a visiting American professor to tell us this?

      The US has seen similar results with its national testing program; after more than a decade of NCLB (No Child Left Behind) testing, results indicate no changes in the ‘achievement gap between poor and wealthy students and gains on achievement tests are small, even after extensive time has been allocated in schools across the nation for direct preparation for the tests’. (Berliner, 2014)

      In Victoria, the My School website appears to suggest that one of the main criteria for selecting a school should be NAPLAN results. The system encourages students to move from lower NAPLAN-scoring schools as if this will magically improve educational outcomes. As previously mentioned, the students remaining in the lower-scoring schools will supposedly have any deficiencies identified by the test rectified by means unspecified. This is an unlikely outcome: the school concerned will most likely forfeit some of its funding through loss of student numbers, and its quality teachers may elect to move elsewhere rather than continue in a school depleted of both funding and its most talented students.

      In Australia it can be difficult to find an educator who is not caught up in the standards movement—though I suspect many are not there by conscious choice.

      The movement is, at its core, the idea that the best we can do is ensure every student has a minimum standard of certain skills. Apparently, it follows that if they have these skills they can work out the rest from there. I have witnessed situations where teachers have worked countless hours trying to fit the government-recommended curriculum into the allotted hours for a given subject, only to be told that the entire subject had been scrapped by the school or, worse, that much of the content was now deemed unnecessary as it was not relevant to the VCE examination (and therefore the ATAR score) component of the subject.

      A recent report from libertarian think tank the Centre for Independent Studies pushed back against criticism of NAPLAN, stating: ‘A test cannot be blamed for a lack of improvement—this would be analogous to blaming a thermometer for a hot day or criticising scales for a lack of weight loss.’ (Joseph, 2018) The report at no point addresses exactly how looking at the thermometer (or scales) with greater frequency can improve results. The assumption is maintained that if we teach the ‘basics’, whatever they may be, creativity and all other necessary capacities will follow.

      Acclaimed educationalist Sir Ken Robinson addresses this need for the basics. ‘The old