The Gonski 2.0 Review did not address its terms of reference, lacks clear evidence for its key recommendations, and has left as many questions unanswered as it set out to clarify, according to a new policy paper from think tank The Centre for Independent Studies.
Co-authored by senior research fellow Dr Jennifer Buckingham and policy analyst Blaise Joseph, What the Gonski 2 Review Got Wrong argues that the Review ducked its remit to provide direction for cost-effective and evidence-based use of resources.
Instead, the paper says, the panel provided a number of sweeping key recommendations, many of them uncosted, unsupported by research, and lacking detail on how they might be implemented.
The policy paper comes as Minister for Education and Training Simon Birmingham prepares to meet his state counterparts on Friday to secure support for the government’s education reform package.
Speaking to EducationHQ, Buckingham says she thought the Review “really missed the mark”.
“It’s really difficult to know why the panel took the approach they did.”
“…for some reason that the rest of us don’t know, this ended up being their set of recommendations – which I think was quite different to what people would have expected, particularly given the terms of reference.”
Popularly termed Gonski 2.0, the Review to Achieve Educational Excellence in Australian Schools was originally commissioned by the Federal Government to provide advice on “how school funding should be used to improve school performance and student outcomes”.
With Australian business identity David Gonski AC at the helm, the committee was asked to “examine evidence and make recommendations on the most effective teaching and learning strategies to be deployed”.
One year on, however, Buckingham and Joseph suggest the Review provided little guidance on the next steps for Australian education.
“The inescapable conclusion is that the Gonski panel did not carry out the task entrusted to it by the Turnbull Government,” they say.
The authors characterise the 23 recommendations in the Review as ranging from “the non-controversial” – a Unique Student Identifier, for example – to the more “far-reaching”.
And it’s these broader recommendations that are fraught with problems, the paper says, including calls for schools to focus on “developing general capabilities” through learning progressions.
While ’21st century skills’ and general capabilities might be important for students moving into an increasingly uncertain world, the authors contend that the concepts are still not well understood.
Unlike the teaching and assessment of “foundational skills” like literacy and numeracy – the subject of “lengthy, rigorous and detailed” research for centuries – the jury is still out on whether skills like critical thinking, creativity and communication can be effectively taught and measured independently, the paper says.
Critical thinking, for one, requires content knowledge, and cannot be categorised as a generic skill that transfers from one area to another.
“…to think critically about the impact of population growth on society [for example], a student must be knowledgeable about immigration, demography, welfare, education, multi-culturalism and economics – at a minimum.”
The Gonski Review has given little consideration to the state of the evidence on teaching and assessing general capabilities, the paper says, and its willingness to ignore the research would appear to leave the government in a double bind: it must respond to the recommendations of a report it commissioned, yet those recommendations come with little supporting evidence and little guidance on implementation.
The panel’s recommendation to “develop a new online and on demand student learning assessment tool based on the Australian curriculum learning progressions” is equally problematic, Buckingham and Joseph say.
The online learning tool was recommended by the panel as a means to ensure “growth in learning” rather than provide “absolute measures of achievement”, allowing teachers to “tailor their teaching practices at both the class and the individual student level”.
Central to this was the recommendation that each child should be expected to achieve “at least one year’s learning growth” every year.
But the paper argues that this would appear to contradict the very problem the panel set out to address.
A “pre-determined standard … minimum expectation for every student” is an ill-considered solution to the problem of a “one-size-fits-all approach to student achievement”, the paper suggests.
Nor does the Review explain what a year’s growth would look like.
“Would one year’s learning growth be the same for a typically developing child and a child with learning difficulties or disabilities? What does a year’s learning growth in history look like as compared to maths? These are not trivial questions, and the report gives no serious consideration to them in making its recommendations,” the paper says.
Those recommendations around continuous individual assessment are also made more complicated by the Review’s “frequent reference to the concept of a growth mindset”, Buckingham and Joseph say.
They point to 20 instances in which the Review refers to the concept of mindset, but argue that the panel provides little conclusive research to support the concept as an effective measurement of learning and achievement.
Of two recently published meta-analyses examining the relationship of mindset to student achievement, the paper says, one pointed to null findings across the sample and the other found that “only 12 per cent of the effect sizes obtained for mindset interventions were positive”.
“If mindset is a personal trait that cannot be intentionally developed – or it is not yet known how to do it – it is futile to place this responsibility and expectation on schools, and reckless to use it as a key assumption underpinning a wholesale redesign of student assessment processes.”
Buckingham says the panel’s reluctance to confront the evidence points to a wider trend in the education sector.
“There’s a tendency to do that a lot in education – for an idea to become quite appealing to people. And they’re willing to go with that idea, even on the flimsiest of evidence, because it sounds right and it sounds plausible. But when you peel back the layers of that you realise that there’s not a great deal of substance to it. And I think that that’s what seemed to be lacking in the committee – a willingness to be sceptical about what is the latest big idea in education,” she says.
Even amongst the “good ideas”, such as the independent education evidence institute, Buckingham says the devil will ultimately be in the details, and across many of the recommendations the details are simply not there – including the proposed governance structure and funding arrangements for such a body.
“…there are lots of things there where it’s not clear who should be doing that and when, and, also, who bears the cost. So, in a lot of cases with these recommendations… does that money come out of the allocation that’s already in the Commonwealth budget, or is this money that’s expected to be spent in addition to what’s already been allocated in the Commonwealth budget?”
The lack of detail in the key recommendations and a reluctance to confront the evidence behind them is one thing, but the paper argues it will ultimately be the schools, teachers and students who have to contend with the consequences down the track.
“The recommendations discussed in the Review … are potentially expensive and disruptive to the work of teachers and the lives of students…” the paper says.
“…[it’s] a recipe for educational disaster.”
While Buckingham sees some recommendations coming to fruition, including the Unique Student Identifier, and, eventually, the evidence institute, she’d like to see “a great big pause” on the recommendations around teaching and learning assessment.
“[We need to] really examine whether or not this is feasible, and if it is going to happen, at least do some trials. Instead of inflicting something that’s completely new across the whole country.”
In a statement, Minister Birmingham called the Review “a landmark report”, and said that he had been encouraged by widespread support for the panel’s recommendations from “primary and secondary education stakeholders … government, Catholic and independent school sectors and from various parent, welfare and business organisations”.
“…[The Review] presents a significant blueprint for change, including changes to the Australian curriculum to better measure student progression.”
“The Turnbull Government wants Australian school students to reach high levels of knowledge and achievement, which is why we expect the record funding for our schools to be used in ways that extend each student to their maximum potential,” he said.
Posted by Jamie Ramsey