If the primary reason for testing students this Spring and at the end of the year is to diagnose the COVID-19 learning slide, states can save the dollars expended to assess students’ learning deficits. The vast majority of schools have been collecting and using benchmark and formative assessment results since all-remote, hybrid, or in-person schooling resumed in the Fall. With few exceptions, school districts at every level spend precious dollars on benchmark assessment systems. Teachers collect and analyze student learning results. Students routinely engage with online assessment systems in core subjects, log in to adaptive practice instruction, and complete student work samples.

Rather than viewing annual post-mortem data, teachers use classroom data to make adjustments to instruction. Meeting weekly, often daily, in common planning meetings conducted through video conferencing, teachers analyze diagnostic reports and student work samples to determine what gaps exist from the abrupt pivot to remote instruction in the Spring of 2020, as well as to identify deficits persisting as students and their families navigate routine swings between hybrid and in-person schooling and back to fully remote due to ongoing quarantining requirements.

Teachers are on it! Their analyses go well beyond Below and Above Proficiency cutoff rankings and growth measurement calculations, as they take into account levels of student engagement when learning is remote and in class. Data include levels of Internet interruptions, number of siblings sharing bandwidth and devices, food insecurity, and other factors impacting the capacity to deliver and participate in instruction.

An equally important part of teachers’ analyses focuses on identifying the specific foundational concepts where students are struggling, in order to help with their day-to-day planning for both in-person and remote learning activities. What should the focus be for those with unfinished learning and for those ready to extend their learning? How do you deliver the instruction needed when all students must sit six feet apart, facing forward, with no movement within the classroom permitted, and when manipulatives and other tools have to be disinfected and organized in advance for every iteration of a lesson or lab, then re-collected, disinfected, and organized for the next class? It’s a daunting agenda and it’s EVERY DAY.

Instead of eating up precious instructional time with modified versions of state testing (fewer questions, fewer concepts being measured, and at best a snapshot of student learning at a given moment), why not simply ask schools to share the whole year’s record of diagnostics, which could furnish a much deeper understanding of where our kids are now? Use the dollars expended on modifying assessments and re-purpose state systems to receive existing results that can provide a time-lapse picture of student learning for 2020-21.

And if we really wanted to make a difference for our current PreK-12 students and for future generations, why not pivot to what needs to be done to build on what COVID is starkly revealing about the vast inequities between poor neighborhoods and wealthier ones when it comes to the buildings teachers and students occupy six to eight hours a day? We would all be better off if we spent the dollars needed to diagnose that “learning slide” and to determine how our school funding mechanisms must be prioritized and reformed to make our kids’ educations as big a priority as our military budget.

How much data is your school / district already generating that is more useful than End-Of-Year testing?

As educators look at End-of-Year state assessments, End-of-Course results and other data points used to track progress toward achievement and growth goals, teachers are getting more experienced and comfortable identifying gaps in learning. What they aren’t as comfortable and confident about is knowing what to do next: how to use what they’ve learned to take action beyond student grouping decisions, and how to create lesson plans with specific instructional strategies aimed at engaging students more effectively in the concepts and skills needed. An even further reach is expanding their investigations to verify why students are experiencing the challenges revealed.

Continuous, effective use of data by teachers requires that all three components of learning from our data are part of professional routines in a culture where we believe it is our moral responsibility to help all students thrive.

  1. Examine student learning and other relevant data to identify who is getting it, who isn’t, what they aren’t understanding, and what they aren’t able to do successfully.
  2. Use that information to plan specific instructional strategies to engage students in the areas needing additional learning and plan how to measure the impact of those strategies.
  3. Continue to broaden the analysis to identify and verify the root causes of student learning gaps.

No. 2 above is perhaps the most critical component of the cycle, and it speaks to a topic that in itself warrants careful attention and requires additional analyses beyond the classroom. What’s working? And what isn’t? Which students are we leaving behind?

The reality in most schools is that as teachers focus on their students, they are also often implementing new materials and new intervention programs (Social Emotional Learning, Growth Mindset, Brain-based Learning, etc.), developing higher-level questioning strategies, and identifying ways to scaffold rigorous instruction for ALL students. The list goes on, and it’s all happening concurrently. But when we see improvements or our state report card ranking moves up a level, we aren’t sure which programs or combinations of changes contributed to the outcomes, or which ones had no impact whatsoever, except on our budget.

New programs and initiatives should, from the outset, include the following questions: What is the change we want to see? What will it look like when it’s implemented successfully? How will we know that it is successful? What evidence will we gather to help us know if it’s working?

Are you asking these questions?

We’ve written here multiple times about collaborative inquiry and how it is the most essential ingredient in using data to achieve real results. The two foundational books that came from the Using Data Project at TERC, Using Data Getting Results* and The Data Coach’s Guide, contain a great number of pages, templates, and tools describing step-by-step processes to guide instructional coaches, administrators, grade level team leaders and others. The guidance focused on how to begin developing the trust, confidence, and skill of practitioners who can bring about collaborative inquiry in the service of learning to use data to guide instructional changes.

Our work began in the late ’90s, just as data began its ascendancy as a flash point for improving educational outcomes. The underlying research was initially sparked by requests from schools, districts, and state department of education leaders, who were themselves overwhelmed and seeking help to understand how best to use data in schools. The growing demand for technical assistance led Nancy Love, then at TERC, to wide-ranging research into a nascent body of literature and a very short list of other educators working in the field: Bob Garmston and Bruce Wellman, Ruth Johnson, Mike Schmoker, Victoria Bernhardt. But in fact, much of what was developed and disseminated through professional development came from schools themselves, where teams of early explorers worked with the Using Data Project team, in partnership with WestEd, to shape the ideas and build the tools.

One result, nearly twenty years later, is that you can’t pick up a book from the educators’ book list without encountering the essential ingredient for any change process to take hold: collaborative inquiry. And it matters little what the main topic of the book is. To get the work done, it starts with teachers shifting their culture from one based on being individually “highly qualified” to being part of a collaborative team, where inquiry draws on multiple perspectives and sets of skills to craft more rigorous lessons, differentiate instruction so that all children grow from their own starting points, use multiple forms of formative and summative assessments and other data to measure growth, provide timely feedback to students, and identify new areas of content and pedagogy needing professional development, coaching, or peer feedback.

We know better than ever how powerful collaborative inquiry is in the hands of classroom teachers. We have hands-on professional development, online courses and tools, and videos from the Teachers’ Channel and YouTube to assist schools in shifting their culture toward one of collaborative inquiry. We have rubrics to help us gauge our progress toward achieving the goal and to show us what the end goal will look like. What we don’t have, perhaps, is a clear idea of what it looks like when schools don’t make this journey.

Based on our work over the past twenty years, we recognize what practice without collaborative inquiry looks like, and what it sounds like in schools with no culture of collaborative inquiry at the heart of how teachers go about the business of teaching our children.

Without collaborative inquiry, some if not all of the following attributes are in clear evidence. Same old, same old…

  • There is no weekly schedule that devotes common planning time to teacher teams who need to bring their student work or other assessment results to the table for investigation.
  • Administrators rarely sit in on team meetings to learn and to support teachers’ work.
  • Grade level and subject level team conversations sound like past years’ “teachers’ room talk” where students or their parents are routinely blamed for students’ poor behavior or performance.
  • Cursory examination of end-of-year or interim assessment results is focused on what’s wrong with the wording of the items.
  • From grade to grade, subject to subject, there is no common vocabulary related to assessments, content or pedagogy.
  • Interventions and specialists’ work with students are not aligned or in step with classroom instruction – no coordination of scaffolding to support students’ needs.
  • Students have no idea where their own learning is in relation to the learning standards. Grades are just marks on their work.
  • Instruction is focused on helping students learn “basic skills”, develop fact fluency, use “key words” to determine what algorithm to use.
  • Rigorous instruction is restricted to high achieving students until the rest master the previous bullet items.
  • Team talk rarely if ever digs into the concepts embodied in learning standards, or explores teachers’ own content knowledge or the pedagogy that could support different learners.
  • Instructional coaches or assistant principals meet individually with teachers to tell them about the data they are seeing and to suggest needed changes.
  • There is no coherence of content vertically (or horizontally) and little knowledge of what instruction in previous grades looked like relative to current instruction.
  • There is little evidence that current research about best practices or case studies are brought to the table to provide new insights.
  • Administrators select professional development for staff based on what other schools nearby are doing.
  • There is no routine monitoring to determine the level of impact of newly implemented “fixes”.
  • No time is devoted to enabling teachers to reflect regularly, both individually and collectively, on their practice: what would excellent instruction look like to help their students learn this, what would it take, and how far are we from that?

We have spent years and years in schools recognizing the attributes above at the outset of the work, and we’ve experienced the satisfaction and real joy that comes when we see these characteristics disappear as teachers themselves begin to surface the important questions and go after the data to help them find the answers.

Collaborative Inquiry is happening! Read about the experience in Metro-Nashville Public Schools working with REL Appalachia to support teachers building their craft together through Collaborative Inquiry.


Johnson, M. (2018). Empowering educators to make data-informed decisions: A district’s journey of effective data use. In E. G. Mense & M. Crain-Dorough (Eds.), Data leadership for K-12 schools in a time of accountability (pp. 158-183). Hershey, PA: IGI Global. MNPS Collaborative Inquiry Toolkit website: www.mnpscollaboration.org/about.html


*Using Data Getting Results was published by Christopher Gordon Publishers, which no longer exists; the book is out of print.

*The Data Coach’s Guide is still in print and available from Corwin Press, a SAGE Publishing company.

In this space we present all the ways data can be used by teachers and administrators to fine-tune or dramatically reinvent how teaching and learning happen in classrooms. We share processes and techniques teachers can use to zero in on students’ learning challenges, the gaps and misconceptions they may be experiencing, in order to rethink how the next lessons need to be orchestrated. We counsel administrators at both the school and district levels on how to initiate and support the rigorous use of data to inform decisions addressing immediate needs, as well as how to collect data to monitor the impact of programs and policies.

In both cases, whether working directly with teacher leaders and teams or with administrative councils, we always stress the importance of continually raising the question “Why?” Why are we seeing these results? If our data protocols are not opening the door to a relentless search for the causes of student learning gaps, we are missing the greatest opportunity to question the most fundamental contributors to our students’ low growth levels: curriculum and instruction. And we’re not opening the door for teachers to develop deeper knowledge of the content and learning progressions associated with acquiring the concepts, knowledge, and skills in a domain, knowledge that adds to our repertoire of instructional strategies and techniques for engaging learners every day in every lesson.

So what does effective data use in schools have to do with pre-surgery checklists? This past week Boston has been a swarming hive of thinking and exchange of ideas in an event called HUBweek 2017. At historic Faneuil Hall, Dr. Atul Gawande, a surgeon at Brigham and Women’s Hospital and a professor at the Harvard School of Public Health, discussed with Malcolm Gladwell (author of Blink, The Tipping Point, Outliers and more) many aspects of rethinking health care and “introducing innovative systems to help save more lives” (Boston Globe, 10/14/17, Metro section). Here is what grabbed my attention in the article.

“Despite the rush to find new and innovative ways to save patients, the medical profession must also ensure that basic medical protocols are followed. The extreme complexity of modern medicine has exceeded our ability to handle it.” (emphasis mine) These two sentences resonate deeply with what teachers are experiencing every day.

The innovation Dr. Gawande and his colleagues developed, which is saving thousands of lives around the world, is a simple pre-surgical checklist used by surgeons and their teams before surgery begins on a patient. “The goal, he said, has been to create a new kind of science, one that defines where failures occur, where complexities overwhelm teams and find prototypes to fix these problems” (Boston Globe, 10/14/17, Metro section). See the connection? They, too, are focusing on the “why?”

Many schools where we work have introduced or are introducing Instructional Rounds, another practice borrowed from the medical world, where teams of physicians in hospitals go from patient to patient to review patient status. In schools, teams of teachers, administrators, and specialists visit classrooms to observe instruction and provide feedback to teachers. Is there something we could learn from the pre-surgery checklist that could help us save student learning in classrooms? It’s a question I invite you to think about, and to share your insights.

In the version of the checklist furnished by the World Health Organization (WHO), there are three instances when surgeons take stock: before anesthesia is administered, before the start of the surgical intervention, and before any member of the team leaves the operating room. As educators, we drill down into assessment results after the teaching has occurred (summative) or while the teaching is in motion (formative), and in doing our causal analysis we bring a wider range of evidence to the table. Does this checklist from the medical field offer us an opening for what evidence we might confront in the moment in our classrooms?

What questions could we ask our students before instruction begins? During instruction? At the conclusion of instruction, before moving to another subject or classroom? What interactions with students would this require to help us agree on what the learning needs are and what is going to take place, and to make students real engineers of their own learning? What data would this generate to assist us in planning and adjusting lessons?

Is there an opportunity here for us to develop a Pre-Instruction Checklist to help ensure we’re getting ready to teach with absolute confidence in the lessons coming up? What would we include in that checklist? What are excellent teachers already checking, moment by moment, during instruction? Could such a checklist focus our attention on the most vital elements required for ensuring that all of our instruction produces the desired outcomes?

In facilitating teams of teachers who are focused on using their data to figure out next steps for instruction (or school-level teams focused on teaching and learning), Using Data facilitators introduce processes and protocols to support genuine inquiry. There are the 5 phases of continuous improvement (or the 6, or the 8). And frequently schools implement cycles of improvement. What they so frequently miss is the one element that makes it all work. In music, it’s “all about the bass”.

In data analysis it’s all about discovery: being open, being in exploration mode, which means leaving assumptions at the door. The tension here is that, as humans, we aren’t that comfortable holding out in uncertainty. We want to solve problems quickly. We want to feel confident that we know what we’re doing. And any suggestion to the contrary renders us incapable of doing anything but sticking to what is familiar, instead of taking the risks that high performing schools have come to relish.

If we extend the notion of being open a little further, it isn’t too far a stretch to realize that along with discovery and exploration goes one of the 7 Norms of Collaboration: “Presuming Positive Presuppositions”. In other words, assume that everyone at the table only wants what’s best for our students. And most importantly, when looking at our students’ results, presume that every student wants to learn and to be successful. If we can presume positive presuppositions about them while we stay in discovery mode to learn more about their strengths and their sometimes hidden or buried aspirations, we can figure out how to design instruction that overwhelms the effects of poverty, learning disabilities, and language differences.

In other words, explorers don’t let students’ historical and demographic profiles bias their instruction. Instead they are continuously open to the possibilities that are within every student we teach. Teacher teams who have learned how to confront their low expectations for student learning use the data to surface the questions leading to the next great discovery rather than jumping to premature conclusions that typically result in same old, same old – cycles of reteaching, assigned interventions and test prep.

On another note, with this week’s announcement by President-Elect Donald Trump that his nominee for Secretary of Education is Betsy DeVos, a strong advocate of education vouchers and charter schools in Michigan, perhaps we could slow down any rush to judgment and instead benefit by using some of the same processes for using data effectively (be in discovery mode, triangulate the data, search for root causes, monitor progress toward goals) before we draw conclusions about the implications of this appointment.

By Mary Anne Mather, Managing Editor
TERC’s Using Data For Meaningful Change Blog
TERC Using Data Senior Facilitator

Many districts are heading into spring state-level testing. It’s irrefutable that the opinions surrounding the pros and cons of such assessments make for heated discussions in many circles. Not the least among the disputes is the time spent on what some call “teaching to the test.” The high-stakes value placed on these tests can make even the best of us do things we don’t really embrace as best practice.

At TERC, we try to look at it from a different angle. What if our day-to-day work as professional-level instructors set the stage for students to perform better on the standardized tests because we intricately understood the ins and outs of what students do and do not know? Armed with that knowledge, we can plan classroom instruction that closes the gap between misconception and success. It’s most likely going to influence test scores, while addressing essential grade-level learning goals. That’s where looking at student work samples comes in!

Guest Blogger: Jennifer Unger

When TERC’s Using Data facilitators work with schools and districts, we assist with establishing a continuous improvement culture that is grounded in collaborative inquiry. In the process, a lot of chart-paper-sized posters are created. There’s a sound and productive reason for this large-format paper trail!

Anyone who has engaged in data analysis with Using Data’s protocols and processes knows that we value public recording on chart paper because it gets everyone literally “on the same page” and talking together as they uncover learning challenges, own them, and identify strategic solutions.

By Mary Anne Mather, Managing Editor
TERC’s Using Data For Meaningful Change Blog

Photo: A boy and girl lean over folders on a table, working on independent student projects.

Photo Credit: Clyde Gaw, TAB Educator

Too often, when people think about using data, they limit their thinking to consulting test and assessment data, from state tests to district benchmarks to classroom assessments. And while consulting this level of data has its merits, being truly data-informed requires so much more! As teachers, we can come closer to “data-genius” if we tap the treasure-trove of data that a classroom genius hour reveals…

Guest Blogger: Jennifer Unger

I have worked with so many districts and schools where the leadership proudly points to their “data binders”—most recently I recall a three-inch D-ring binder. Not that binders filled with data aren’t helpful or good, but I caution that if they are not being used to guide instructional and programmatic decisions, well, then they can be a waste of precious time and money. More importantly, if they are not connected to a shared ownership of the questions a group of educators has about instruction and programs and similar concerns, then they can serve no meaningful purpose.

So how do we get from just having data to using data for meaningful change and improved results?

Introduction by Mary Anne Mather, Managing Editor
TERC’s Using Data for Meaningful Change Blog
…with a link to Data Quality Campaign’s Flashlight blog on
How Educators Use Data: A Four Step Process

Effective Use of Classroom Data: It’s a topic that weighs on the minds of many educators these days. It’s also the title of a workshop that TERC Using Data recently facilitated at MESPA (the Massachusetts Elementary School Principals’ Association). The educators who attended were seeking strategies and resources to bring back to their schools that would help them build a culture of data use that is continuous, meaningful, manageable, sensible, and effective. Who isn’t?

There is little doubt that, in the news, education-related data are routinely discussed, bandied about, and sometimes applied in ways that are not efficacious for supporting effective teaching and learning. TERC is dedicated to making data a sweet and welcomed word, not a dreaded mandate. That’s why we were so excited that Rebecca Shah (@rebecca_shah) from Data Quality Campaign was a surprise workshop attendee! Rebecca took one of the teacher-level data analysis processes shared during the workshop and used it to reflect on the session and its outcomes. Her thoughts and related resources are posted on the Flashlight, Data Quality Campaign’s blog: How Educators Use Data: A Four Step Process. Enjoy!

And if you’d like to learn more about Four-Phase Data Dialogue, visit our Data Tips (see Tips 2-5).