Student Data Analysis

Every time I watch this Ok Go video I wonder how many times it took them to get it right: 

Judging from the swear words screamed in the outtakes, it probably took a whole lot of adjusting to make the machine work.

THIS is feedback at its finest.  Not fake.  Not contrived.  Real tips that lead to progress.  Tips that invite tinkering.  Each part of this Rube Goldberg machine that did not work physically demanded that the maker(s) fix it so the whole system would work.  Feedback pointed to an area of need/adjustment/TLC.  Feedback did not fix the error; it drew attention to it.

Sounds like the kind of feedback I’d want for my students.

Last year my students got into a really awesome groove.  They would give each other feedback, then we would examine that feedback with a critical lens.  What feedback was the most useful?  What feedback did they need further clarification on? What feedback was not useful at all?  

(Examples in lego-language and math.)



After they had reflected on their feedback, they would write themselves a quick note to articulate their plan of attack. 

(Reflections on a photography assignment.)



Then they’d tinker with the parts of their work that they wanted to improve.

Critical examination of feedback empowered students to really own their learning.  It gave them ideas to play with, not mistakes to fix.  

In Leaders of Their Own Learning, Ron Berger spends an entire chapter discussing student use of data.  In Berger’s opinion, empowering students to look for trends in their feedback helps them own their learning.

“Then I borrowed a strategy from a colleague and everything changed.  We began using a data tracking form with categories of error types.  Students analyzed tests and assignments and assigned each error to a category (e.g., copying error, computation error, using the wrong operation) and noted it on the form.  Because the nature of their errors wasn’t always clear, students worked together to understand what went wrong.  They became intrigued by the patterns of their errors.  Over time they had a great deal of test and assignment data to draw on.  They created charts with distribution patterns of errors and of changes in performance over time.”

All I could think of was how onerous and time-consuming this kind of data analysis would be.  I do think that assessment and evaluation need to be demystified for students to truly own their learning.  I also see the merit in knowledge of self as learner (what are my common mistakes?)… but holy cow, this takes a lot of time!
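If the tracking form went digital, some of that time could come back.  Here is a minimal sketch of what a digital version of Berger’s error-category tally might look like, using his example categories; the logged errors themselves are made up for illustration:

```python
from collections import Counter

# Hypothetical list of errors a student logged over several assignments,
# using Berger's example categories (copying error, computation error,
# using the wrong operation). The data here is invented for illustration.
errors_logged = [
    "computation error", "copying error", "computation error",
    "wrong operation", "computation error", "copying error",
]

tally = Counter(errors_logged)

# Print a simple text distribution chart, most frequent category first.
for category, count in tally.most_common():
    print(f"{category:20s} {'#' * count} ({count})")
```

Even this crude chart surfaces the pattern Berger describes: the student sees at a glance which kind of error keeps recurring.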

I think Berger was reading my mind because a few pages later he wrote, 

“Rather than flooding the classroom with data, teachers should design a focused data routine that can be expanded and supplemented over time.”

So… what’s the routine…?

Maybe tracking and collating sticky notes and student reflections would be a good start.  Students could draw a bar graph or tally of success criteria inside their work portfolios (or create a digital alternative).  Before answering critical questions about the feedback they have received, students could spend a minute filling in the tally/graph… the good, the bad, and the ugly.  When the time came to work on something independently, that chart/graph would serve as a strong, visual reminder of things to consider while working.
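The digital alternative could be just as simple.  Here is a sketch of a per-criterion tally, with success criteria and feedback notes that are entirely hypothetical:

```python
# Hypothetical success criteria and the sticky-note feedback logged
# against each one; a digital stand-in for a tally chart drawn inside
# a work portfolio. All names and notes here are invented.
feedback = {
    "uses strong verbs": ["met", "met", "not yet", "met"],
    "varied sentence openers": ["not yet", "not yet", "met"],
    "clear conclusion": ["met", "not yet"],
}

# One tally row per criterion: how often feedback said "met" vs "not yet".
for criterion, notes in feedback.items():
    met = notes.count("met")
    not_yet = notes.count("not yet")
    print(f"{criterion:25s} met: {'|' * met}  not yet: {'|' * not_yet}")
```

A minute of filling this in before reflecting, and the student has a visual reminder of exactly which criteria still need tinkering.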

If a picture is worth a thousand words, this graphed data might be worth a thousand stickies and reflections.

Something to try.


