As a nation, we’re undergoing a sort of collective madness about teacher effectiveness. The public release of teacher rankings in New York City and the latest reports indicating that teachers nationwide may have altered standardized tests are just the most recent outbreaks.
How did we get here? Is it just that we lack a common measure of effective teaching?
I think it runs deeper. I think we lack a common definition of teaching itself. And until we know what we’re talking about when we talk about “teaching,” we’ll continue to churn. If I’m right, we have one major challenge ahead of us: agreeing that, at base and at best, teaching has always included:
Data-driven instruction: Common misperceptions about data-driven instruction are that 1) it’s only about assessment, and 2) it’s new. Data-driven instruction is much richer and older than that. Properly understood, the “data” in question encompasses everything from teacher observation of students’ level of classroom engagement, to teacher analysis of student work, to simple exit-ticket questions at the end of lessons, to, yes, information gleaned from assessments.
Formative assessments: A 2010 article by Margaret Heritage of the National Center for Research on Evaluation, Standards, and Student Testing offers a reminder that formative assessments are not limited to formalized assessment instruments or programs, but are rather a part of a “process that is fundamental and indigenous to the practice of teaching and learning.” Heritage describes the process of formative assessment succinctly. It encompasses:
- “Teachers making adjustments to teaching and learning in response to assessment evidence”
- “Students receiving feedback about their learning with advice on what they can do to improve”
- “Students participating in the process through self-assessment”
Formative assessment, she argues, “should be regarded as a key professional skill for teachers.” And she cites compelling evidence to back her point of view: An early, well-regarded study of these practices found that “student learning gains triggered by formative assessment were amongst the largest ever reported for educational interventions.” (Moreover, formative assessments provide, or should provide, the vast bulk of the data used to inform data-driven instruction.)
The opportunity: Consensus, documentation, rigor
It’s time to stop couching these basic techniques as new initiatives to be debated, interrogated, and viewed with suspicion. Serious efforts at school reform demand that we accept these practices as standard operating procedure.
The silver lining in our current fixation with effectiveness-measurement is that it offers an impetus for widespread consensus on (and documentation of) a skills-based definition of teaching.
In fact, that work is already happening. Specifically: As districts and charters craft baseline rubrics to evaluate teachers and to identify areas for further professional development, they are also drafting practice-based definitions of teaching that include descriptions of both formative assessment and data-driven instruction. Baltimore City Schools has created an exemplar instructional framework and rubric that clearly communicate formative assessment as core to teaching. Baltimore has defined as critical teacher skills the ability to:
- “Engage students in standards-based lesson objectives”
- “Use questioning to bring students to higher order thinking”
- “Check for understanding and respond to misunderstanding”
- “Facilitate student-to-student academic talk”
- “Analyze student progress”
- “Modify instruction in response to data”
- “Partner with students and families to reflect on student’s progress”
Per the rubric, the importance of these skills will be reinforced through teacher observation and feedback, and the district will develop professional development offerings in each of these skills. In doing so, Baltimore will build a teacher corps that views formative assessment and data-driven instruction as what teaching is – rather than as yet another initiative. By following suit, other schools nationwide – both district and charter – can bring some rigor and reason to the emotional churn around effectiveness measures.