Some thoughts on the Gonski "2.0" report

The second Gonski Review was publicly released this week to a storm of controversy and a diversity of opinions among educators, policy wonks and researchers.

The panel had a hard task. It was asked to focus on the school and classroom factors that can make the biggest, sustained difference to educational achievement, while setting aside the many meaty structural issues that also influence schooling outcomes, such as funding allocations, residualisation, federalism and system coherence. These had been explored in the earlier review chaired by David Gonski (and in my own work). Despite these limitations, the panel did a pretty good job, outlining a vision of where Australian schooling should be heading (spoiler: a student-centred school system which values and supports educators) and some of the tools and changes needed to get there.

I was pleased to read that the priority reforms put forward in the Mitchell Institute submission were endorsed as recommendations in Gonski 2.0. And I was particularly enthused to see the report put learning growth over time, personalised learning and student agency at its centre, along with additional time and evidence-based tools to support teachers and principals in their vital work as educators and instructional leaders.

Of course, many of the key recommendations are already being put into practice in schools around Australia, including schools I've had the pleasure of working with over the years. (Check out Templestowe College, Rooty Hill High School and Marlborough Primary.) But such approaches are not systematically supported or encouraged by current policy, accountability and regulatory frameworks, nor are they made easy for already over-stretched schools or teachers.

One of the biggest obstacles - recognised in this report - is the absence of timely, fine-grained and usable classroom-level data on teaching impact, and of tools to put such insights into practice in ways tailored to individuals and their different contexts. Such data is in many ways the missing link, connecting teaching with learning in real time.

Pivot, the organisation I've just joined, works with schools and systems to gain these vital insights into teaching effectiveness using student perception data and peer feedback. It uses this evidence to provide teachers with confidential reports and curated resource packs, and school leaders with aggregated reports, on their greatest strengths and development areas. These insights and tools are keys to unlocking greater effectiveness and learning growth.

Student and peer feedback data can rightly take emphasis away from NAPLAN, which has been misused and whose purpose and importance have been conflated, with perverse effects at the individual, school and system levels. NAPLAN should be put back into perspective - a nationally comparable, point-in-time assessment of a few essential learning areas, to be used alongside other data sets and, most importantly, formative assessments to guide decisions on programs and resource allocation.

Want more?