The Trouble With Testing


The only thing worse than a Cassandra, warning of impending disaster, is the person who emerges from the wreckage saying "I warned you".

I’m asking for trouble because, when it came to the application and abuse of NAPLAN testing, I was both. I was hardly alone — perhaps in the very best of company, now that the Whitlam Institute, the Melbourne Graduate School of Education and the Foundation for Young Australians have issued their report, "The Impacts of High Stakes Testing on Schools, Students and their Families: an Educator’s Perspective".

The Sydney Morning Herald featured the report on its front page, which is ironic given that the Fairfax press led the charge to create league tables out of NAPLAN scores in the first place.

As the Whitlam Institute’s Eric Sidoti says in the foreword to the report, the NAPLAN testing regime is plagued by unintended consequences well beyond its stated intent and represents a shift to "high stakes" testing:

"As the NAPLAN results become linked with funding and policy decisions, pressure for improving scores has vastly impacted on teachers, their practices and the curriculum. Educators are increasingly speaking out of the associated work pressures, higher workloads, narrowing pedagogy and diminishing time for quality teaching and learning."

Not only that, but the report cites research which shows how the tests "dominate the curriculum to the neglect of rich and important areas such as history, geography, physical education and music".

Federal School Education Minister Peter Garrett was quick to counter the conclusions of the report, claiming that principals and teachers are telling him all is well with NAPLAN — proving he thinks a good one-liner will trump evidence every time.

Then the Australian Curriculum Assessment and Reporting Authority (ACARA) collectively washed its hands of the impact of NAPLAN on students and schools, claiming "excessive test preparation is not useful". Try telling that to the hapless school principals who get beaten up by school comparisons.

We all know that teachers and principals shouldn’t react to NAPLAN by ramping up test practice, teaching to the test and reducing time for other studies. But this was always going to happen when the stakes were raised and the test scores used to sum up the worth of each school and its teachers. Eric Sidoti kindly refers to such consequences as unintended, but they were certainly well known.

And the school comparisons and league tables reached absurd heights in a very short time. The Fairfax press added each school’s scores for reading, writing, spelling, grammar, punctuation and numeracy, then divided the total by six to produce a single score and rank for each school. Columnist Paul Sheehan then used the ranks to come up with all manner of half-baked conclusions, declaring heroes among some schools and making thinly veiled assumptions about the rest.

None of this would matter to Julia Gillard, who conjured up the perfect storm of political advantage by making NAPLAN the essential core of the My School website. She pressed all the right buttons: openness, accountability, information — who could be opposed to any of this? She frequently recounts it as her greatest achievement from her time as Education Minister. For this very reason, not much if anything will change in the short term. The educators have now had their say; the government will just plough on regardless.

The other problems with NAPLAN and school performance, as reported on My School, are well known. Jane Caro and I thought they were worth a whole chapter in our book, What Makes a Good School. Gillard referred to NAPLAN and My School as akin to exposing schools to the cleansing rays of sunshine — the reality is that NAPLAN is a useful but narrow spotlight and leaves everything around it in the dark.

The problem is that, as the report demonstrates, less time is being spent on things outside this narrow spotlight: the subjects and activities that make a substantial contribution to the academic and cultural value of what schools do, and the essential depth and diversity that hook kids into learning. Sometimes this requires innovative approaches to teaching and to school organisation — but such innovation is far less likely, and deemed more risky, in a NAPLAN-driven testing regime.

The hijacking of NAPLAN to sum up and compare school achievement has always been fraught. It was ameliorated to some extent by giving each school an Index of Community Socio-Educational Advantage (ICSEA), which claimed to enable comparisons between schools. The initial ICSEA was a standing joke; more recent versions are much better, but they still don’t enable accurate comparisons of individual schools. We are still being told it is okay to compare NAPLAN scores between schools that have to be inclusive and others which select their enrolments.

That matters because, when all is said and done, each school’s enrolment is the main determinant of what is billed as school achievement: NAPLAN scores rise or fall by between 30 and 40 points for each 100-point rise or fall in ICSEA. The occasional exceptions can be explained as much by problems with the measure as by anything schools may or may not do.

This most recent report on NAPLAN joins a string of criticisms of the Howard and Rudd-Gillard governments’ addiction to reform that makes little difference to student achievement — but commonly widens the gaps between the advantaged and disadvantaged. Sure, it has created a headline or two, but nothing much will change.

