GUEST BLOGGER: Dr. William L. Heller, Using Data Program Director, Teaching Matters*

As a facilitator of the TERC Using Data institute, I try not to play favorites among the different stages of the process. Every link in the chain is important to improving student outcomes. But I must confess that I always look forward to the item-level analysis with a little extra enthusiasm. This is where the school-based data teams I work with are most likely to achieve a breakthrough and gain new understanding of the problems their students are having. Even the teachers who arrive most skeptical about the importance of data have “Aha!” moments when they actually look at the questions their students got wrong on the exams and can pinpoint a cause and a solution.

Item-level analysis is a crucial step in interpreting what test scores mean. New York State provides a performance indicator for each question, but these indicators can be inaccurate or misleading. Sometimes, students in a particular school may have had trouble with a question for reasons other than lacking the skill described by the performance indicator. Perhaps a Social Studies question contained a distractor answer that caught students’ attention. Maybe an ELA question required students to notice a detail in a passage that carried over onto the top of the second page. Or a Math question may have required literacy skills the students did not have. Until the teachers who work with these students every day have a look at the questions, we won’t really know why students got them wrong.
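To make the idea a bit more concrete for readers who like to tinker, here is a minimal, hypothetical sketch (in Python) of the kind of roll-up a data team might start from: percent correct for each question, grouped by its performance indicator, with an item flagged when it lags far behind other questions on the same indicator. Every question ID, indicator label, and cutoff below is invented for illustration; the real work, as always, happens when teachers sit down with the actual questions.

```python
# Hypothetical item-level roll-up: percent correct per question, grouped by
# performance indicator, flagging items that lag far behind sibling items.
# All data here is invented for illustration only.
from collections import defaultdict

# Each record: (question_id, performance_indicator, list of 1/0 student scores)
item_results = [
    ("Q12", "Geometry: vertical angles", [0, 0, 1, 0, 0, 1, 0, 0]),
    ("Q13", "Geometry: vertical angles", [1, 1, 1, 0, 1, 1, 1, 0]),
    ("Q21", "Geometry: area of rectangles", [0, 1, 0, 0, 0, 0, 1, 0]),
    ("Q22", "Geometry: area of rectangles", [1, 1, 1, 1, 0, 1, 1, 1]),
]

def percent_correct(responses):
    return 100.0 * sum(responses) / len(responses)

# Roll item scores up by performance indicator.
by_indicator = defaultdict(list)
for qid, indicator, responses in item_results:
    by_indicator[indicator].append((qid, percent_correct(responses)))

# Flag items that trail other items on the same indicator by a wide margin --
# a hint that something besides the indicated skill (vocabulary, layout,
# a tempting distractor) may be getting in students' way.
GAP_THRESHOLD = 25.0  # percentage points; an arbitrary cutoff for this sketch
for indicator, items in by_indicator.items():
    best = max(pct for _, pct in items)
    for qid, pct in items:
        if best - pct >= GAP_THRESHOLD:
            print(f"{qid} ({indicator}): {pct:.0f}% correct vs. {best:.0f}% "
                  f"on a sibling item -- look at the question itself")
```

A report like this only tells you where to look; it is the teachers’ reading of the flagged questions that reveals whether the culprit is the skill, the vocabulary, or the question itself.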

In one middle school in Queens, the data team I was working with found that several of the problems students seemed to be having with geometry were actually academic vocabulary problems. One of these issues was with the term “vertical angles,” which refers to a pair of angles that are opposite each other across a vertex. You might even call them opposite angles. In fact, that’s exactly what the teachers in this school had done. Concerned that students might be confused by the term “vertical angles,” teachers taught the concept of “opposite angles.” Students learned what opposite angles were, they learned that opposite angles are equal, they did problems involving opposite angles, they were assessed on their understanding of opposite angles, and they successfully demonstrated comprehension of opposite angles. Then an exam question asked them to identify a pair of vertical angles, and they were lost. The issue here was not a lack of mathematical understanding. The problem was that the vocabulary being used in the classroom was not aligned with the statewide assessments. Once you realize that, it’s easy to fix.

Another item-level examination revealed a very different vocabulary issue. Teachers deconstructed a question in which students were given the length and width of a rectangular portico and asked to find the area. Many students, not knowing what a “portico” was, were confused by the question and were unable to answer it. This time, the issue was not that the teachers had failed to teach the word “portico” to their students. I freely admit that I did not know what a portico was (it’s a porch with a roof), but I still could have found its area. The key point was that the data team now knew to teach strategies for helping students distinguish between relevant and irrelevant information in a word problem.

These are two very different examples, but they have something important in common. Students were having trouble on questions with performance indicators in the Geometry strand. Math teachers learned through item-level analysis that the real issues were with unfamiliar vocabulary, rather than with mathematical understanding. This allowed them to target their research, planning, and interventions where they would have the most impact on student outcomes.

*Teaching Matters is a non-profit organization that partners with educators to ensure that all students can succeed in the digital age.  They are an official TERC Using Data partner organization, conducting the Using Data for Meaningful Change institute for New York City schools.