27 Nov 2012

The Trouble With Testing

By Chris Bonnor
The findings of a new report on the Gillard Government's NAPLAN test are nothing new for education experts. The test was always bound to be hijacked as a measure of school achievement, writes Chris Bonnor.

The only thing worse than a Cassandra, warning of impending disaster, is the person who emerges from the wreckage saying "I warned you".

I'm asking for trouble because when it came to the application and abuse of NAPLAN testing, I was both. I was hardly alone — perhaps in the very best of company now that the Whitlam Institute, the Melbourne Graduate School of Education, and the Foundation for Young Australians have issued their report, "The Impacts of High Stakes Testing on Schools, Students and their Families: an Educator's Perspective".

The Sydney Morning Herald featured the report on its front page, ironic given the Fairfax press led the charge to create league tables out of NAPLAN scores in the first place.

As The Whitlam Institute's Eric Sidoti says in the foreword to the report, the NAPLAN testing regime is plagued by unintended consequences well beyond its stated intent and represents a shift to "high stakes" testing:

"As the NAPLAN results become linked with funding and policy decisions, pressure for improving scores has vastly impacted on teachers, their practices and the curriculum. Educators are increasingly speaking out about the associated work pressures, higher workloads, narrowing pedagogy and diminishing time for quality teaching and learning."

Not only that, but the report cites research which shows how the tests "dominate the curriculum to the neglect of rich and important areas such as history, geography, physical education and music".

Federal School Education Minister Peter Garrett was quick to counter the conclusions of the report, claiming that principals and teachers are telling him all is well with NAPLAN — proving he thinks a good one-liner will trump evidence every time.

Then the Australian Curriculum Assessment and Reporting Authority (ACARA) collectively washed its hands of the impact of NAPLAN on students and schools, claiming "excessive test preparation is not useful". Try telling that to the hapless school principals who get beaten up by school comparisons.

We all know that teachers and principals shouldn't react to NAPLAN by ramping up test practice, teaching to the test and reducing time for other studies. But this was always going to happen when the stakes were raised and the test scores used to sum up the worth of each school and its teachers. Eric Sidoti kindly refers to such consequences as unintended, but they were certainly well known.

And the school comparisons and league tables reached absurd heights in a very short time. The Fairfax press added reading + writing + spelling + grammar + punctuation + numeracy, then divided all by six to produce a score and a rank for each school. Columnist Paul Sheehan then used the ranks to come up with all manner of half-baked conclusions, declaring heroes among some schools and making thinly veiled assumptions about the rest.

None of this would matter to Julia Gillard who conjured up the perfect storm of political advantage by making NAPLAN the essential core of the My School website. She pressed all the right buttons: openness, accountability, information — who could be opposed to any of this? She frequently recounts it as her greatest achievement as the then Education Minister. For this very reason not much if anything will change in the short term. The educators have now had their say, the government will just plough on regardless.

The other problems with NAPLAN and school performance, as reported on My School, are well known. Jane Caro and I thought they were worth a whole chapter in our book, What Makes a Good School. Gillard referred to NAPLAN and My School as akin to exposing schools to the cleansing rays of sunshine — the reality is that NAPLAN is a useful but narrow spotlight and leaves everything around it in the dark.

The problem is that, as the report demonstrates, less time is being spent on things outside this narrow spotlight. These include the subjects and activities that make a substantial contribution to the academic and cultural value of what schools do. They include the essential depth and diversity that hook kids into learning. Sometimes this requires innovative approaches to teaching and to school organisation — but this is far less likely and deemed to be more risky in a NAPLAN-driven testing regime.

The hijacking of NAPLAN to sum up and compare school achievement has always been fraught. It was ameliorated to some extent by giving each school an Index of Community Socio-Educational Advantage (ICSEA), which claimed to enable comparisons between schools. The initial ICSEA was a standing joke; more recent versions are much better, but they still don't enable accurate comparisons of individual schools. We are still being told it is okay to compare NAPLAN scores between schools that have to be inclusive and others which select their enrolments.

Because when all is said and done it is each school's enrolment which is the main determinant of what is billed as school achievement. NAPLAN scores rise or fall between 30 and 40 points for each 100 point rise or fall in ICSEA. The few exceptions can be explained by problems with the measure as much as by anything schools may or may not do.

This most recent report on NAPLAN joins a string of criticisms of the Howard and Rudd-Gillard governments' addiction to reform that makes little difference to student achievement — but commonly widens the gaps between the advantaged and disadvantaged. Sure, it has created a headline or two but nothing much will change.

Discuss this article

butlerad
Posted Tuesday, November 27, 2012 - 22:36

My eldest sat the dreaded test this year... she was "sick" on the day of the test... not to worry, they made her take it in the school office the next day.
This seemingly endless quest to rank and score education institutions (from primary to tertiary) is one of the biggest mistakes we have made in our "modern" society.
Learning, the quest for knowledge, for solutions to problems, is not found on a test sheet; it is found by doing. It is found by testing your boundaries and theories... I mourn for my kids' future... their worth is measured with a test score.

meski1
Posted Wednesday, November 28, 2012 - 12:49

What's the problem with NAPLAN, in particular? It'll be used to measure school performance, but before that year 12 exams (used for university entrance) were being used to measure school performance.
Kids, get used to exams. You'll encounter them in universities, you'll encounter them as a preselection for apprenticeships, recruiters will throw them at you, employers will. Being "sick" on the day won't usually let you take it the next day, it'll rule you out for the job/place you were after. And they never get to be easy or fun to do, because many are designed to not be finished in the available time.

fightmumma
Posted Wednesday, November 28, 2012 - 16:17

My younger son's naplan was so different from his usual school report that I was very concerned. I enquired at the school and they just said that the naplan doesn't tell much, it isn't worth much. (Mind you, for BOTH my children they tried to convince me to sign the exclusion forms - my elder due to his autism and my younger due to learning difficulties - even though the school never offered any assistance and resisted my desperate pleas to give them special needs help!!). So I carted him off to a child psychologist for a proper test, and he discovered my son's HIGH IQ alongside a learning disability (school was useless in any form of help). The next year his dot on his school report in maths actually went BACKWARDS. I thought this very suspicious and enquired again about his school report (this time it was more in line with the naplan results). The teacher's specific explanation was that the school had had some "gaps" in its maths curriculum and was now in the process of catching up the students. My lingering question is - how can the school and teacher in Gr5 claim one level of competency when it was clearly a straight-out LIE - as shown by the naplan and by the following year's school report/admission by the Gr6 classroom teacher?

I have a knee-jerk reaction against naplan and such formal stressful testing at such a young age - but it definitely helped me to see problems with my son's education that the school was dishonest in telling me about. I have worked in schools that were preparing for naplan - and much of the school day is spent repeatedly doing mock tests solely for the purpose of naplan. Is this how we want our children's school days being spent?

I believe naplan is helpful if parents use it as I did - however it should be done outside of school hours and by a non-biased third party.