Data Analysis


When facilitating teams of teachers focused on using their data to figure out next steps for instruction (or school-level teams focused on teaching and learning), Using Data facilitators introduce processes and protocols to support genuine inquiry. There are the 5 phases of continuous improvement (or the 6, or the 8), and schools frequently implement cycles of improvement. What they so often miss is the one element that makes it all work. In music, it’s “all about the bass.”

In data analysis it’s all about discovery: being open, being in exploration mode, which means leaving assumptions at the door. The tension here is that, as humans, we aren’t that comfortable sitting with uncertainty. We want to solve problems quickly. We want to feel confident that we know what we’re doing. And any suggestion to the contrary renders us incapable of doing anything but sticking to what is familiar, instead of taking the risks that high-performing schools have come to relish.

If we extend the notion of being open a little further, it isn’t too far a stretch to realize that along with discovery and exploration goes one of the 7 Norms of Collaboration: “Presuming Positive Presuppositions.” In other words, assume that everyone at the table only wants what’s best for our students. And most importantly, when looking at our students’ results, presume that every student wants to learn and to be successful. If we can presume positive presuppositions about them while we stay in discovery mode to learn more about their strengths and their sometimes hidden or buried aspirations, we can figure out how to design instruction that overwhelms the effects of poverty, learning disabilities, and language differences.

In other words, explorers don’t let students’ historical and demographic profiles bias their instruction. Instead, they remain continuously open to the possibilities within every student they teach. Teacher teams who have learned how to confront their low expectations for student learning use the data to surface the questions that lead to the next great discovery, rather than jumping to premature conclusions that typically result in same old, same old: cycles of reteaching, assigned interventions, and test prep.

On another note, with this week’s announcement by President-Elect Donald Trump that his nominee for Secretary of Education is Betsy DeVos, a strong advocate of education vouchers and charter schools in Michigan, perhaps we could slow down any rush to judgment and instead benefit from using some of the same processes for using data effectively (be in discovery mode, triangulate the data, search for root causes, monitor progress toward goals) before we draw conclusions about the implications of this appointment.

By Mary Anne Mather, Managing Editor
TERC’s Using Data For Meaningful Change Blog
TERC Using Data Senior Facilitator

Many districts are heading into spring state-level testing. It’s irrefutable that the opinions surrounding the pros and cons of such assessments make for heated discussions in many circles. Not the least among the disputes is the time spent on what some call “teaching to the test.” The high-stakes value placed on these tests can make even the best of us do things we don’t really embrace as best practice.

At TERC, we try to look at it from a different angle. What if our day-to-day work as professional-level instructors set the stage for students to perform better on the standardized tests because we intricately understood the ins and outs of what students do and do not know? Armed with that knowledge, we can plan classroom instruction that closes the gap between misconception and success. It’s most likely going to influence test scores, while addressing essential grade-level learning goals. That’s where looking at student work samples comes in!

Guest Blogger: Jennifer Unger

When TERC’s Using Data facilitators work with schools and districts, we assist with establishing a continuous improvement culture that is grounded in collaborative inquiry. In the process, a lot of chart-paper-sized posters are created. There’s a sound and productive reason for this large-format paper trail!

Anyone who has engaged in data analysis with Using Data’s protocols and processes knows that we value public recording on chart paper because it gets everyone literally “on the same page” and talking together as they uncover learning challenges, own them, and identify strategic solutions.

Guest Blogger: Jennifer Unger

I have worked with so many districts and schools where the leadership proudly points to their “data binders”—most recently I recall a three-inch D-ring binder. Not that binders filled with data aren’t helpful or good, but I caution that if they are not being used to guide instructional and programmatic decisions, well, then they can be a waste of precious time and money. More importantly, if they are not connected to a shared ownership of the questions a group of educators has about instruction and programs and similar concerns, then they can serve no meaningful purpose.

So how do we get from just having data to using data for meaningful change and improved results?

Introduction by Mary Anne Mather, Managing Editor
TERC’s Using Data for Meaningful Change Blog
…with a link to Data Quality Campaign’s Flashlight blog on
How Educators Use Data: A Four Step Process

Effective Use of Classroom Data: It’s a topic that weighs on the minds of many educators these days. It’s also the title of a workshop that TERC Using Data recently facilitated at MESPA (Massachusetts Elementary School Principals’ Association). The educators who attended were seeking strategies and resources to bring back to their schools that would help them build a culture of data use that is continuous, meaningful, manageable, sensible, and effective. Who isn’t?

There is little doubt that, in the news, education-related data are routinely discussed, bandied about, and sometimes applied in ways that are not efficacious for supporting effective teaching and learning. TERC is dedicated to making data a sweet and welcomed word, not a dreaded mandate. That’s why we were so excited that Rebecca Shah (@rebecca_shah) from Data Quality Campaign was a surprise workshop attendee! Rebecca took one of the teacher-level data analysis processes shared during the workshop and used it to reflect on the session and its outcomes. Her thoughts and related resources are posted on the Flashlight, Data Quality Campaign’s blog: How Educators Use Data: A Four Step Process. Enjoy!

And if you’d like to learn more about Four-Phase Data Dialogue, visit our Data Tips (see Tips 2-5).

 

GUEST BLOGGER: Kevin Dwyer, Education Consultant, LearningDesigns
Email: kevin@learningdesigns.net          
Twitter: @marketeducate

Education reform is an ongoing topic of public comment and debate in many states. Our guest blogger, Kevin Dwyer, a long-time education consultant and Connecticut resident, fills us in on the Connecticut news that he’s been following…

A fiery public debate about education reform in Connecticut has been ignited by first-term Governor Dannel Malloy. Twitter is alive with back and forth 140-character points and counterpoints (see #ctedreform and #wecantwaitct). Data is at the heart of arguments on both sides.

The driver for the debate is a fact disputed by no one: Connecticut has the largest achievement gap in the country. The purported excellence of its suburban schools serves to highlight the gross inadequacies of Connecticut’s urban districts. Students in urban districts are simply not being given the same access to quality education as their suburban neighbors.

Compounding the problem is that Connecticut has failed three times to secure Race to the Top (RTTT) money. Billions of dollars have been awarded in three rounds of funding, yet Connecticut has yet to earn a dime. Lack of an adequate evaluation system to promote effective teaching practices has been a key shortcoming in the state’s RTTT applications.

After the most recent RTTT application failure, the Governor drew a line in the sand. At the State of the State address in February, Malloy said, “Let’s be honest with ourselves, and let’s speak bluntly: many parts of our system of public education are broken.” In essence, he said that schools in Connecticut must change—not just urban schools, ALL schools. He added, in reference to the issue of equitable teacher quality, “In today’s (public education) system, basically the only thing you have to do is show up for four years. Do that, and tenure is yours.”

Opponents and proponents of the Governor’s comprehensive school reform plan have readily lined up on either side of the issues of teacher quality, funding for charter schools, and definitions of education reform. Interestingly, both camps reference common data sources and are able to make data interpretations that selectively support their opposing views.

A memo leaked from the Connecticut Education Association (CEA) outlined the union’s strategy to delay the Governor’s reforms for at least another year. The CEA has also launched a television ad campaign which says that the Governor doesn’t get reform right. They attest that the Governor’s plan “takes away district control and places it in the hands of the state education commissioner; allows principals to decide which teachers are certified; and siphons tax dollars from neighborhood schools.”

On the other side of the argument, reform advocacy group ConnCAN has taken dead aim at the union, telling it to “Come Clean” with its membership about the role of student achievement data in teacher evaluations. ConnCAN is one of a host of groups representing businesses (CT Business and Industry Association), superintendents (CT Association of School Superintendents), school boards, and principals who are supporting Governor Malloy’s vision for school reform.

In the end, the question of what’s right for students and educators is being lost. Everyone agrees that something needs to be done. But the message is being buried by claims and counterclaims buttressed by the same data, often manipulated to support opposing viewpoints. It leaves the public increasingly polarized about education funding and teacher performance, and wondering whose data interpretations to trust. The REAL challenge remains: figuring out how to move beyond special interests and use the data meaningfully and accurately to move together toward educational excellence.

GUEST BLOGGER: Mary Anne Mather, Using Data Senior Facilitator & Social Media Liaison on Twitter & Facebook

If you want to tap one of the most powerful uses of data, disaggregate! Disaggregation means looking at how specific subgroups perform. Typically, formal student achievement data come “aggregated,” reported for the population as a whole—the whole state, school, grade level, or class. Disaggregating can bring to light critical problems and issues that might otherwise remain invisible.

For example, one district’s state test data indicated that eighth-grade math scores steadily improved over three years. When the data team disaggregated those data, they discovered that boys’ scores improved while girls’ scores actually declined. Another school noticed increased enrollment in their after-school science club. However, disaggregated data indicated that minority students, even those in more advanced classes, weren’t signing up. These are just some of the questions that disaggregated data can help answer:

• Is there an achievement gap among different demographic groups? Is it getting bigger or smaller?

• Are minority or female students enrolling in higher-level mathematics and science courses at the same rate as other students?

• Are poor or minority students over-represented in special education or under-represented in gifted and talented programs?
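
For teams that work directly with student-level data exports, here is a minimal sketch of what disaggregation can look like in practice, written in Python with pandas rather than with any particular Using Data protocol. The file name and the "year," "gender," and "math_score" columns are hypothetical stand-ins for whatever fields your own data system provides.

import pandas as pd

# Load a student-level results export (hypothetical file and column names).
scores = pd.read_csv("grade8_math_results.csv")

# Aggregated view: one average score per year for the whole grade.
print(scores.groupby("year")["math_score"].mean())

# Disaggregated view: the same averages broken out by subgroup, which is
# where diverging trends (boys improving, girls declining) become visible.
print(scores.groupby(["year", "gender"])["math_score"].mean().unstack())

The same grouping pattern extends to any of the questions above; swap in a demographic, course-enrollment, or program-placement column to surface gaps that the aggregate numbers hide.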
