Year 5
Science Support Materials
Science snapshot: Evaluating a science program
Science/Science Understanding and Science Inquiry Skills
| Content Description | Relevant aspects of the Achievement Standards |
| --- | --- |
| Biological sciences<br>Chemical sciences<br>Questioning and predicting<br>Planning and conducting<br>Processing and analysing data and information<br>Evaluating | Students classify substances according to their observable properties and behaviours. They analyse how the form of living things enables them to function in their environments. Students pose questions for investigation, predict what might happen when variables are changed, and plan investigation methods. Students construct tables and graphs to organise data and identify patterns. They use patterns in their data to suggest explanations and refer to data when they report findings. |
Nature of the assessment
The school used a commercially available standardised science assessment which provided data similar to that provided for NAPLAN (i.e. the school mean and distribution, the national sample mean and distribution, the skill assessed by each item, and performance profiles). The teachers used their understanding of NAPLAN data to inform their approach when analysing the science data.
Purpose of the assessment
To assess Year 5 and Year 6 students’ science knowledge, scientific literacy and understanding of scientific inquiry, to:
- inform end-of-year reporting to parents
- evaluate the Year 5 and 6 science programs
- provide baseline data to the Year 6 and Year 7 teachers.
Analysing the data
The teachers reviewed:
- Comparative information
  - How did the school mean compare with the Australian reference group mean provided by the assessment?
  - How did the school distribution compare with the Australian reference group?
  - How did the class means and distributions compare?
  - How did the Year 5 and Year 6 distributions compare?
- Student performance on each item or question
  - What did the students need to know or understand to answer each question correctly?
  - What misconceptions did an incorrect answer indicate?
  - What percentage of the class answered the question correctly?
- Development of science understanding as indicated by the test
  - Which questions were lower on the developmental continuum (the easier questions)?
  - Which questions were higher on the developmental continuum (the harder questions)?
  - At what point in the test did students' performance start to fall away?
- Spread of ability within a cohort and across cohorts
  - What skills did the less able students (those with the lower total scores) demonstrate?
  - What skills did the more able students (those with the higher total scores) demonstrate?
  - What did this mean for differentiating the curriculum?
- Diagnostic information
  - Were there misconceptions that needed to be addressed?
  - Were there students at risk of falling even further behind?
  - Were the teachers extending all ability groups, including the most able students?
  - What refinements did the teachers need to make to their teaching programs for these students, and for next year?
Using the information
The teachers used their evaluation to refine their science programs for the subsequent year. They used the student data to inform end-of-year reporting.