School Testers Deny NAPLAN Side Effects


The Australian Senate’s Education, Employment and Workplace Relations Committee is currently holding an inquiry into the effectiveness of the "National Assessment Program – Literacy and Numeracy", better known as NAPLAN.

Over 70 submissions have been received. Of particular interest and significance is the submission from the Australian Curriculum, Assessment and Reporting Authority (ACARA), the custodian of NAPLAN. (The submission is number 58 on the inquiry's list of submissions.)

One of the focus questions in the inquiry’s terms of reference refers to the “unintended consequences” of NAPLAN’s introduction. This is an important question given widespread but mainly anecdotal reports in Australia of test cheating; schools using test results to screen out “undesirable enrolments”; the narrowing of the curriculum; NAPLAN test cramming taking up valuable learning time; Omega 3 supplements being marketed as able to help students perform better at NAPLAN time; and NAPLAN test booklets hitting the best seller lists.

Here is what ACARA’s submission to the Senate inquiry has to say about the issue:

“To date there has been no research published that can demonstrate endemic negative impacts in Australian schools due to NAPLAN. While allegations have been made that NAPLAN has had unintended consequences on the quality of education in Australian schools there is little evidence that this is the case.”

The submission goes on to refer to two independent studies that investigated the unintended consequences of NAPLAN.

One research project, led by Greg Thompson, focussed on changes to pedagogical practices as a result of the existence and uses made of NAPLAN. ACARA dismissed its findings, saying that changes to teaching practices were driven by teachers, not by NAPLAN. Yet Thompson's report makes clear that teachers feel pressured, by NAPLAN's role as a school accountability measure, to make changes they do not agree with. In other words, ACARA rules “out of court” any unintended consequences of NAPLAN that relate to changes in teachers' practice.

ACARA also dismisses a survey undertaken by the Whitlam Institute because it “suffers from technical and methodological limitations, especially in relation to representativeness of the samples used”. The survey was completed by nearly 8500 teachers throughout Australia and it was representative in every way (year level taught, sector, gender, years of experience) — except for oversampling in Queensland and Tasmania.

In relation to this sampling concern, the writers even reported that they weighted the responses to compensate for the differential. For ACARA to dismiss the survey on the strength of a spurious sampling critique is intellectually dishonest at best.

ACARA’s dismissal of these two sound research projects is in contrast to its treatment of unsupported statements by education stakeholders about enrolment selection concerns.

In response to the claims that some schools are using NAPLAN results to screen out “undesirable students”, ACARA states that it is aware of these claims, but appears willing to take at face value comments from stakeholders who represent the very schools accused of unethical enrolment screening:

"It is ACARA’s understanding that these results are generally requested as one of a number of reports, including student reports written by teachers, as a means to inform the enrolling school on the strengths and weaknesses of the student. The purpose for doing so is to ensure sufficient support for the student upon enrolment, rather than for use as a selection tool. This understanding is supported by media reporting of comments made by peak bodies on the subject."

There is a history to this issue that began in 2008, almost as soon as the decision to set up MySchool was announced by Kevin Rudd as part of his school transparency agenda.

Three years ago this month I wrote an article about the importance of evaluating the impact of the MySchool website, and about the emergence under FOI of an agreement in September 2008 by State and Commonwealth Ministers of Education to:

“[C]ommence work on a comprehensive evaluation strategy for implementation, at the outset, that will allow early identification and management of any unintended and adverse consequences that result from the introduction of new national reporting arrangements for schools.”

It was clear from the outset that this evaluation should have been managed by ACARA, the organisation established to manage the school transparency agenda. In 2010, in response to my inquiry to ACARA about this ministerial agreement, the CEO of ACARA stated that it was not being implemented at that time because early reactions to hot-button issues are not useful and because the website did not yet include the full range of data planned.

One of the vital elements of any useful evaluation is the collection of baseline data that would enable valid comparisons of any changes over time. Why wasn't data collected as part of the initial MySchool process?

Has time allocated to non-test based subjects reduced over time? Has teaching become more fact based? Has the parent demographic for different schools changed as a result of NAPLAN data or student demographic data? Are more resources allocated to remedial support for students who fail to reach benchmarks? The answers to these questions and more are harder to discern because of a lack of foresight. 

What evaluation did take place was driven by the possibility of schools being shamed through media responses to NAPLAN, and by other concerns. Again, such correctives should have been built into the design of MySchool. There is no real value in waiting years before deciding that corrections are needed.

Anyone who seriously believed that MySchool measured the full range of educational data was dreaming. Waiting for the full range of data meant, in reality, an indefinite delay. There are still data items in development today.

So now, five years on from the ministerial directive that there was a need to actively investigate any unintended consequences, there is still no comprehensive evaluation in sight. One suspects that ACARA finds this quite convenient and hopes that its failure to act on this directive stays buried.

However, education ministers still had concerns. In the Ministerial Council for Education, Early Childhood Development and Youth Affairs April 2012 Communiqué (pdf), the following was reported:

“Ministers discussed concerns over practices such as excessive test preparation and the potential narrowing of the curriculum as a result of the publication of NAPLAN data on My School.  Ministers requested that ACARA provide the Standing Council with an assessment of these matters.”

On the basis of this statement I wrote to ACARA on 27 April 2013 requesting information on action in response to this directive — then over 12 months old. To date I have received no reply.

If ACARA knows of no information regarding the extent of NAPLAN's unintended consequences, then one might conclude that it has twice ignored ministerial directives. ACARA is certainly aware of ministerial concerns over the unintended negative consequences of a program it manages.

It is astonishing that ACARA would then write a submission decrying the lack of information on this issue. It also gives new meaning to a throwaway line in its submission about the negative findings from the Whitlam Institute survey: “Further research with parents, students and other stakeholders is needed to confirm the survey results on well-being.”

Further research is indeed needed; unless it exists and is not being made public, it should have been initiated by ACARA five years ago.

