Getting Smarter About School Testing


Almost as soon as the MySchool website was launched in February, so many visitors logged on that the site went into complete meltdown. The traffic was put down to parents being more interested than ever in the education of their children.

I’ll admit that I was part of the crowd. I went to the site, plugged in the name of my children’s school, and I wasn’t surprised. The school — located in an inner-city suburb and populated by the children of mainly white (upper) middle-class professionals and those of recently arrived immigrants, refugees and asylum seekers — performed as expected. I can’t tell you how well or badly because I simply didn’t take enough notice. No doubt, had there been a subtle red marking indicating that the achievements of the school’s students were below standard, I might have taken more notice.

On MySchool, I can compare my children’s school to "similar" schools. Similarity here is based on an index devised for the site, the Index of Community Socio-Educational Advantage. I can also, with one click, compare the performance of my children’s school to up to 20 other schools within an 80 kilometre radius.

The problems with the data as it is presented on the site have been well documented elsewhere.

The most contentious issue is with how the Index of Community Socio-Educational Advantage is used to help measure similarities between schools. The Index relies on census data about the geographical location of the school — and not the socio-economic data about the actual population of the school.

The other glitch in the presentation of the data which has received plenty of attention concerns the way in which non-selective entry schools are being compared with selective entry schools. For the average mum or dad, the nuances are difficult to discern. Let’s face it, for most parents, national test results are used as an "at a glance" reference point so that they can see whether little Ignatius and Scarlett are performing as they should be.

Parents need to be educated about data analysis and the initial purpose of the test results upon which the MySchools data is based. While parents certainly have a right to access this data, I think they — and the media — also have a responsibility to understand the various ways in which the data can be interpreted.

The NAPLAN (National Assessment Program — Literacy and Numeracy) tests, which most Australian students in years three, five, seven and nine will sit next month, are under threat of a boycott by the Australian Education Union (AEU), which fears that the results will be used on the MySchool website to create league tables.

The AEU has made it clear that it does not oppose the publication of test data, nor does it dispute the purpose of the tests themselves. Its stance that schools should be accountable for the teaching and learning that goes on within them is on record. At the time of publication, the AEU had not yet reached a clear position on whether the tests would go ahead.

Yet when it comes to reporting the NAPLAN tests and the possibility of a boycott, the AEU’s reasoning is often over-simplified into a straightforward argument against league tables.

The popular press is doing plenty to perpetuate the hysteria associated with league tables. Mild cases of Schadenfreude erupt across the land when the Murdoch press reports that a $6000-per-term school performs worse than the local government school — all in the name of "reporting the facts". These stories can have enduring — and unfair — effects on school communities, as teachers’ professionalism and skills are scrutinised first by the media and then by parents, on the basis of simple data beneath which far more complex issues lie.

There’s a much more important use for the test data, though, than reassuring parents and selling newspapers. Its primary use should be to help governments ensure that all kids in Australian schools get the chance to become literate and numerate at the most basic level for their stage of learning. This is not to endorse that basic level as the accepted standard for all students — it is the minimum to which children should be entitled.

I am in a fortunate position to see whether my child is attaining and maintaining acceptable levels of literacy and numeracy and I also have more time to interpret the subtleties of the data presented. And while I am by no means dismissive of test data, I am aware that there are various limitations to such testing. I realise that my children are fortunate: they are happy, healthy kids who demonstrate all the developmental characteristics for their age. But what about those kids whose parents are not in the same position — or those students who are not attaining the basic levels of literacy and numeracy for their age-group?

These questions were highlighted in a report by Helen Hughes last week, which made it obvious that basic literacy and numeracy are still not being achieved in schools with high Indigenous populations. What about students from low socio-economic areas (where — like it or not — the NAPLAN data shows that literacy and numeracy skills are lower than in higher socio-economic areas) or, god forbid, those students in middling classes who are not attaining the basic level of literacy and numeracy?

I just hope that once the community is better educated about what the data can tell us regarding basic standards of literacy and numeracy, the hysteria surrounding league tables — and the media’s perception that this is what parents want — will diminish, and real improvements can be made to children’s education in this country.

Analysing the data can be difficult, though, and first impressions can be deceptive. For example, a test result showing that your child is "at the national benchmark" does not mean there is no cause for worry — the benchmark indicates the most basic level of literacy and numeracy that is acceptable for that stage of learning. And really, aren’t the parents who are going to scrutinise the data going to want to find out that their kid is in the upper band of achievement?

Victorian parents pleased that their year three student was attaining the national benchmark might be alarmed to find out that the national benchmark corresponds only to year two level under Victorian curriculum standards. It’s this kind of information that parents need to make informed decisions.

In its crusade to provide information that every parent should know and has a right to know, The Australian has not let the subject rest: last weekend it published a league table of the "Best 100 schools".

This is exactly the kind of league table that the AEU opposes. The barrage of justificatory articles also published in The Australian is not surprising — and seems designed to ward off any guilt about the consequences of publishing such information. Until the media stops reducing the analysis of test data to "good and bad" and catastrophising education issues, change is not going to occur. Similarly, until parents let go of the idea that education is always about competition, misperceptions of the purpose of testing will remain.

A few weeks ago I briefly considered whether or not to withdraw my children from the test. Testing, after all, was irrelevant to their learning. There is a bigger picture to consider, though, and it involves all Australians — especially those young Australians who, for whatever reason, may be disadvantaged. The data, if used for good and not for evil, may — just may — help improve school education in this country.

So, next week, when my children’s Victorian school offers the tests, I hope that parents are informed enough to have a fuller picture of what the data actually means, and are able to look beyond their own playground to support all teachers and students across the nation.

New Matilda is independent journalism at its finest. The site has been publishing intelligent coverage of Australian and international politics, media and culture since 2004.