A couple of weeks ago I was reading through my edu feed and came across a post about assessing curricular competencies in the new BC science curriculum. The post discussed feedback cycles on the competencies. After reading it I felt kind of anxious, which isn’t uncommon for me when I read about something I know I could improve on, or should be doing better with. Later in the day I was still feeling bothered, and it finally dawned on me that whenever I read about assessing curricular competencies, I end up feeling crappy.
Over time I’ve been playing with the format of my SBG tracking sheets. The biggest change for me is the addition of tracking curricular competencies. Tracking curricular competencies is pretty tricky, in my opinion. While I fully and enthusiastically believe in practicing and recognizing curricular competencies, I’m much less interested in grading them. Why? Because many competencies cannot be taught directly. For example, I can teach a student to factor a polynomial, but I cannot teach a student “to use logic.”
I’ve written about my usual SBG scheme here. It works fine, and many students take advantage of learning at a slightly different pace while still getting credit for what they know, once they know it. However, I’m interested in keeping small quizzes primarily in the formative domain, while using an assessment tool that is based on clear learning objectives, re-testable, and flexible. This post talks about a possible transition from using a few dozen learning objectives in quizzes to a new, larger-goal assessment tool.
The Task I recently sent out a survey on Twitter in which 50 respondents were presented with a series of scores for students. The scores were for individual learning objectives, and all the scores were based on a 3-point or 4-point proficiency scale. Each score was indicated by one of four different colours. Respondents were asked to come up with an overall letter grade and percent for each student based on these learning objective scores.
Last night I was at a district meeting on Communicating Student Learning. There are a few different CSL projects going on in our school district, and these meetings are good places to share our individual school experiences and collaborate on new ideas. At one point in the meeting, two concerns about proficiency-based assessment and reporting came up. I wanted to write about them because these are two issues I see raised about assessment and Standards Based Grading (SBG) quite often, and they are great questions.
Kids these days don’t know as much because of grade inflation. That makes no sense to me. Kids may, or may not, know as much as they used to, but what they “know” is a result of the teaching that happens in the classroom. After the lessons, learning, and practice, a student is assessed and typically given some number. Whether that number is 70 or 90, the learning has already happened.
In an earlier post I wrote about how I tend to move slowly through curriculum. One of the things I do that slows things down is frequent quizzing and post-quiz self/group assessment. Usually once every 5 classes (or fewer) we will have a quiz that can take anywhere from 10 to 25 minutes. Once everyone is finished, the quizzes are handed back to the students and we go over the solutions.
I’ve been interested in assessment and reporting since the start of my teacher training, and it’s been the biggest focus of my professional development as a teacher. In the past two years I’ve been involved in Communicating Student Learning (CSL) working groups with my school district. One of the motivators for these working groups is to get our pedagogy and procedures in line with the new curriculum. In this blog post I want to briefly discuss the intersection of CSL and pedagogy.
One thing I’ve been trying to implement more and more in my units is Performance Tasks. McTighe and Wiggins, in their Understanding by Design framework, say that a Performance Task is an authentic assessment where students demonstrate the desired understandings. In my context, I currently use small SBG quizzes for the bulk of my assessments. Jay McTighe, whom I had the pleasure and privilege of having lunch with, would probably call my quizzes “supplementary” evidence.
As It Stands After two months I decided to return to my previous system of SBG objectives. Read on to see what I had tried but ultimately didn’t continue with. This year was once again fairly successful with SBG. I managed to work the Transfer Tasks into my system OK, which made me feel better about students who get all “mastered” on their learning objectives. However, I’m still not satisfied with how this works out.