NAPLAN – Selective Data Interpretation vs Transparency

Education has been at the forefront of the nation’s media in recent months due to the government’s much publicised Education Revolution. The focus has been on a range of issues associated with curriculum and student and school performance. The launch of the My School website generated so much interest that the site was unable to manage the demand on its first day of operation. While it is gratifying to see that Australians are interested in the education of their children, the implications of labelling schools as “good” or “bad” on the basis of their performance on the National Assessment Program – Literacy and Numeracy (NAPLAN) tests invite robust debate about the purpose and nature of education.

The Weekend Australian (May 1-2, 2010) published lists of the nation’s top 100 primary and secondary schools. Such lists are seen by many people as a natural extension of the My School website and, depending on one’s views, may be regarded either as a move towards desirable transparency in the education system or as an inappropriate use of data generated by specific assessment instruments, namely the NAPLAN tests. Regardless of one’s point of view, the lists generated by The Weekend Australian require significant scrutiny.

Brisbane Girls Grammar School’s absence from both lists, Reading and Numeracy, prompted the School to investigate the methodology used to create them. Examination of the information available on the My School website revealed that the average score attributed to each school for each of the tests considered in the article was arrived at by adding that school’s Year 7 score on the relevant test to its Year 9 score on the same test. Clearly, schools such as ours, which do not have Year 7 students, cannot be ranked in this way. A comment beneath the listing indicates that schools without a Year 7 cohort were assigned a “nominal ranking based on (an) estimated score calculated from comparable schools”. Such a ranking would account for the inclusion of schools which do not have a Year 7 cohort; however, when one analyses the data available on the website alongside the schools included on the list, it is difficult to understand how this nominal ranking was assigned.
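To make the difficulty concrete, the following is a minimal sketch in Python of the combination described above, using entirely hypothetical school names and scores (not actual My School data): a combined “Year 7 plus Year 9” figure can be produced for schools that enrol both cohorts, but it is simply undefined for a school, such as ours, that has no Year 7 students.

# A minimal sketch of the ranking approach described above, using entirely
# hypothetical scores (not actual My School data). It illustrates why a
# school with no Year 7 cohort cannot be placed by a "Year 7 + Year 9" sum.

# Hypothetical mean NAPLAN Reading scores per school: (Year 7, Year 9).
# None marks a cohort the school does not enrol.
schools = {
    "School A (Years 7-12)": (560, 610),
    "School B (Years 7-12)": (575, 600),
    "School C (Years 8-12)": (None, 643),  # no Year 7 cohort
}

def combined_score(year7, year9):
    """Combine the two cohort scores as the article describes.

    Returns None when either cohort is missing, because the sum is
    undefined for a school that does not enrol that year level."""
    if year7 is None or year9 is None:
        return None
    return year7 + year9

# Rank schools by the combined score; schools that cannot be scored fall out.
ranked = sorted(
    ((name, combined_score(y7, y9)) for name, (y7, y9) in schools.items()),
    key=lambda item: (item[1] is None, -(item[1] or 0)),
)

for name, score in ranked:
    print(f"{name}: {'unrankable' if score is None else score}")

On these invented figures, School C records the highest Year 9 result yet receives no combined score at all, which is precisely the position in which a Years 8–12 school finds itself under this methodology.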

Our School’s Year 9 results on the 2009 NAPLAN Reading Test averaged 643. When this is compared with the Year 9 results on the same test achieved by the schools listed as the top 100 secondary schools, it is perplexing to note that the Brisbane Girls Grammar score was higher than that of half the schools listed. Similarly, this School’s Year 9 Numeracy results for 2009, when compared with those of The Weekend Australian’s top 100 schools in the country, would warrant our inclusion in the “list”.

A simple comparison of scores does not allow for the many differences between states, schools and student populations. Also important is the impact such simplistic analysis may have on the perception of the performance of Queensland’s schools, or indeed any state’s schools, when compared with the rest of the nation.

As educators, we are deeply suspicious of any document which ranks schools on the basis of limited (or reconstructed) data, whether it be performance on an academic test, the number of OP 1s achieved, the success of a sporting team or the performance of a musical group. When the interpretation of that data is clearly flawed, the danger of such a practice is only heightened. People who read The Weekend Australian last Saturday and believed they could rely on the accuracy of its judgements about the top performing schools in the country, based on the 2009 NAPLAN results, need to be aware that the methodology used to create these listings does not stand up to scrutiny.

Ms S Bolton

Acting Dean of Curriculum

Reference:
Ferrari, J. (2010, May 1-2). The nation’s top 100 secondary schools. The Weekend Australian, p. 7.