Analysing Your Literacy Data to Drive Instruction

What do we look for? What do we do now? What happens next?




Let’s take a moment to think about the purpose of assessing students and the intention of assessments in our classrooms. We want to know where our students are at, what they CAN currently do and what they haven’t YET been able to achieve. 


We’ve all had that thought, ‘Why are we STILL doing this assessment?’ purely because we’ve been told to add a score into a spreadsheet and because ‘it’s just what we do’. But what’s the purpose of the assessment? Because if it’s just to add data into a spreadsheet, then we might as well not be doing it. Assessments are evidence of learning at any given point, used to inform, direct and plan for improvements in learning. 





If we think of the types of assessments and data we gather in our literacy practice, it’s whatever we can measure and observe. The list is endless:

  • PAT R data (reading assessment)

  • NAPLAN results

  • Progression Points

  • Achievement Standards

  • Observations

  • Anecdotal Evidence

  • Achievement of the Success Criteria 

  • Rubrics

  • Quizzes

  • Exit tickets

  • Thinking Routines

  • Brightpath (writing assessment)

  • DIBELS (reading assessment) 

  • Student Conferences

  • Student Interviews

  • Pre and Post Tests/Unit Assessments

  • Writing samples

  • Tests (spelling, grammar and punctuation, etc.)

  • Responding to questions (written or verbal)




Individual Level

  • How has Student X gone compared to the average? Is Student X working above or below the average in this assessment? 


Group Level

  • How is my class/cohort trending in comparison to the mean in this assessment?

  • Are they working above or below the average, collectively?


History

  • What has been the consistency in Student X's achievements? Has Student X usually been above the average or below the average in this assessment?

  • What else do we know about Student X from a holistic perspective?

  • What’s the cohort’s consistency in their achievements? Have they usually been a cohort who sits above the average or below the average?

  • What else do we know about the cohort from a holistic perspective? 
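The ‘compared to the average’ questions above are simple arithmetic once the scores are out of the assessment tool. As a minimal sketch (the student names and scores here are made up for illustration, not from any real data set):

```python
# Hypothetical scores exported from an assessment; names and numbers are invented.
scores = {
    "Student A": 72,
    "Student B": 58,
    "Student C": 65,
    "Student D": 81,
}

# The class mean is the reference point for 'above or below the average'.
class_mean = sum(scores.values()) / len(scores)
print(f"Class mean: {class_mean:.1f}")

# Flag each student relative to the mean.
for student, score in scores.items():
    position = "above" if score > class_mean else "below"
    print(f"{student}: {score} ({position} the class mean)")
```

The same comparison is easy to set up in a spreadsheet; the point is simply that the individual question (where does Student X sit?) and the group question (where does the cohort sit?) both come from the one calculation.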





Individual Level

  • What has Student X achieved and done well in this assessment? 

  • What CAN Student X do? (skills)

  • Where was Student X most successful in this assessment? 


Group Level

  • What was this class/cohort able to achieve in this assessment?

  • Is there a pattern?

  • What can this class/cohort do? (skills)

  • Where has this class/cohort been most successful in this assessment? 


History

  • Are these areas of achievement for Student X consistent with previous like-assessments? 

  • Are these skills demonstrated by Student X consistent with what we see in the classroom?

  • What else do we know about Student X’s ability to *insert skill*?


  • Are these areas of achievement for the class/cohort consistent with previous like-assessments?

  • Are the skills demonstrated by the class/cohort consistent with what we’ve seen in previous like-assessments?

  • What else do we need to take into consideration around the achievements of the class/cohort?




Individual Level

  • Where was Student X not as successful in this assessment? (the gaps)

  • What are the next steps for Student X?

  • What does Student X need from me for their next steps?


Group Level

  • Where was the class/cohort not as successful in this assessment? (the gaps)

  • Is there a pattern? 

  • What are the next steps for the class/cohort? 

  • What does the class/cohort need from me for their next steps?

History

  • What steps have been implemented in the past with Student X?

  • What kind of support or approaches have been used with Student X?

  • As a class/cohort, what has previously been embedded?

  • As a class/cohort, what was the success of what was previously implemented?




Once we’ve analysed the literacy data, it’s time to observe the trends and patterns, looking for similarities and outliers.


What if the majority of my students have similar next steps?

In your teaching practice, you’ll be guided by the information in the data set as to what your students need from you. If the majority of your students need to improve the same skill, or the majority of them didn’t achieve success in a similar area, this directs you to create whole class opportunities for them to be explicitly exposed to the teaching of this skill.


What if there's a handful of students who have similar next steps?

If there’s a handful of students who require similar next steps within the same area or skills, work with them in a small focus group, using an explicit approach. 


What happens with the outliers?

If the data is informing you of outliers (there may be a handful together, or perhaps only one or two students), this is where you’ll need to differentiate their learning and your teaching.
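The three responses above (whole class, small focus group, differentiated individual support) fall out of grouping students by their shared next step. A hypothetical sketch, where the skill labels and the one-half threshold are illustrative assumptions rather than prescribed cut-offs:

```python
from collections import defaultdict

# Invented next steps for an invented class; replace with your own data.
next_steps = {
    "Student A": "paragraph structure",
    "Student B": "paragraph structure",
    "Student C": "paragraph structure",
    "Student D": "inference",
    "Student E": "inference",
    "Student F": "decoding multisyllabic words",
}

# Group students who share the same next step.
groups = defaultdict(list)
for student, skill in next_steps.items():
    groups[skill].append(student)

class_size = len(next_steps)
for skill, students in groups.items():
    share = len(students) / class_size
    if share >= 0.5:          # most of the class: whole-class explicit teaching
        response = "whole-class explicit teaching"
    elif len(students) >= 2:  # a handful: small focus group
        response = "small focus group"
    else:                     # an outlier: differentiate individually
        response = "differentiated individual support"
    print(f"{skill}: {', '.join(students)} -> {response}")
```

However you record it, the grouping itself is the decision-making step: the size of each group relative to the class tells you which teaching response fits.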



The literacy data is telling you what you need to be doing in your literacy practice to support your students in achieving success. It’s directing you to think about the strategies and approaches to help them do this. If there’s something you’re doing that’s working well, based on the results you’re seeing, continue using that strategy. If the students aren’t progressing, or there isn’t a shift in their achievement, use this as an opportunity to attempt different literacy approaches. (This is where The Instructional Literacy Coaching Experience can support you!)


Ensure that students are involved in the analysis and goal-setting process. Unpack THEIR data with THEM! Allow them to analyse their results, provide them with feedback so they can set their goals, create success criteria and establish habits to achieve the goal, provide opportunities to check in and reflect on the goal, and allow time for them to work towards achieving success. 





What kind of literacy data do you collect and analyse to inform and drive your teaching?



What else do you focus on when interpreting student data?
