October Scott, Content Specialist

Reflective practice is central to teachers’ professional growth and to their ability to adjust their teaching to meet the needs of their specific students. Yet teachers are seldom offered the opportunity to engage in structured, deep inquiry into their own instructional techniques.

For Lori Calhoun, a sixth-grade math teacher in South Carolina, that opportunity came through Teacher as Researcher.

Calhoun was among the teachers who participated in this innovative professional learning series, offered in South Carolina by Marzano Research in partnership with the South Carolina Department of Education and Education Analytics.

When it comes to determining the effectiveness of instructional strategies, Marzano Research believes teachers have the best vantage point for finding out what works in their unique classroom contexts. Teacher as Researcher rethinks instructional strategy research by equipping teachers to test strategies’ impact and adjust their day-to-day teaching based on their findings.

“In going through the cycles, I noticed that I became more intentional about using the research to guide my instruction and make adjustments as needed,” Calhoun reflected. “It allowed me to identify strengths and weaknesses that assisted me in making informed decisions regarding the teaching and learning that was taking place in my classes.”

The “cycles” Calhoun refers to are called Instructional Improvement Cycles—a core component of the Teacher as Researcher model. In these structured experiments lasting roughly four weeks, teachers select an evidence-based instructional strategy, implement it with their students, gather data on the outcomes, and analyze that data to evaluate the strategy’s effectiveness. Throughout the process, participants receive personalized coaching from Marzano Research experts.

Testing an “Error Analysis” Strategy

Using South Carolina’s Strategy Clearinghouse, participants can choose from hundreds of instructional strategies to test. Strategy Clearinghouse is a publicly available, comprehensive database of evidence-backed and standards-aligned strategies, curated by Marzano Research from What Works Clearinghouse practice guides and other reputable resources.

The strategy Calhoun decided to put to the test was, “Select solved problems that reflect the lesson’s instructional aim, including problems that illustrate common errors.”

“Students should be able to begin to recognize and classify their mistakes themselves,” Calhoun explained. “They can think through what types of mistakes are being made and, in turn, learn from them.”

Calhoun conducted two full Instructional Improvement Cycles implementing this strategy—first in a sixth-grade math class focused on equations, and then in an advanced sixth-grade class covering statistics. This allowed her to evaluate the approach across different student groups and content areas.

Qualitative Insights and Reflections

Throughout the cycles, Calhoun assessed student learning through pre-, mid-, and post-test data and kept detailed qualitative notes on how the instruction progressed. These notes revealed valuable insights that helped shape her reflections and future instruction.

“When implementing the error analysis [strategy], I noticed that the majority of my students were able to identify the errors,” said Calhoun. This finding aligned with the strategy’s intended goal of boosting error recognition.

Her notes also revealed opportunities for refinement. By more precisely categorizing the mistakes students made, Calhoun said she could adapt her lessons to provide targeted support for the underlying misconceptions or skill gaps.

“I noticed that some students would identify the error incorrectly but then still get the answer correct,” she said. “I feel like it would be important to keep track of specific errors students are making. For example, were the errors computational, careless, or conceptual?”

Evaluating Effect Sizes

In addition to her qualitative reflections, Calhoun used the Teacher as Researcher tools to calculate effect sizes based on the student assessment data.

“My intervention effect size and the baseline effect size were both positive. And by looking at their scores, there was growth between the pre-, mid-, and post-assessments,” Calhoun explained. “The confidence level based on the scores was a yes, meaning there is some confidence that I could get similar results with different students.”
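
The article does not detail how the Teacher as Researcher tools compute these effect sizes, but a common choice for comparing assessment scores is a standardized mean difference such as Cohen’s d: the change in mean score divided by a pooled standard deviation. The sketch below, using hypothetical pre- and post-test scores, illustrates the idea.

```python
# Minimal sketch of a standardized mean difference (Cohen's d).
# The exact calculation used by the Teacher as Researcher tools is an
# assumption here; the scores below are hypothetical, for illustration only.
from statistics import mean, stdev

def cohens_d(pre_scores, post_scores):
    """Difference in means divided by the pooled standard deviation."""
    pooled_sd = ((stdev(pre_scores) ** 2 + stdev(post_scores) ** 2) / 2) ** 0.5
    return (mean(post_scores) - mean(pre_scores)) / pooled_sd

# Hypothetical pre- and post-test scores for one class
pre = [55, 60, 62, 70, 48, 66, 58, 73]
post = [68, 72, 70, 80, 60, 75, 66, 85]

print(f"Effect size: {cohens_d(pre, post):.2f}")
```

A positive value, as Calhoun describes, indicates growth over the cycle; comparing the intervention-phase value against the baseline value is what lets a teacher judge whether the new strategy added anything beyond their usual instruction.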

Interestingly, the effect sizes Calhoun observed were somewhat smaller during the ‘intervention’ phase when implementing the error analysis strategy, compared to her baseline instruction. She considered possible explanations for this: “I did use another strategy that may have been helpful to students when solving [problems]. I used a four-step process to help them organize their steps when solving. The majority of my students used this method when solving their post-test.”

In other words, Calhoun was already incorporating effective techniques in her baseline instruction, which set a high bar for the new strategy to produce dramatically larger gains. Her findings suggested that an ideal approach may be to combine multiple complementary strategies.

A Path for Improving Instruction

Based on her two Instructional Improvement Cycles, Calhoun identified key takeaways to apply going forward.

“I plan to continue to use error analysis next year in my classes when teaching these topics,” she said. “However, I plan on spending more time on how to find the quartiles followed by more error analysis examples with this specific topic.”

She also wants to run a cycle with other classes to compare against her original results.

“I plan to use the same strategy again with a [more advanced] class and a different unit in math to be able to compare my results from both after cycle two,” said Calhoun.

Throughout the Teacher as Researcher series, participants not only tested strategies to determine their feasibility and impact but also used the experience to reflect on instructional practices, research design, and student learning. With recent enhancements to the program, South Carolina participants now also have access to Strategy Workshop, a newly launched web tool that provides all the resources and analysis tools in one place.

As Calhoun’s case shows, there is value in honoring teachers’ knowledge by giving them opportunities to engage in research in systematic, rigorous, and supported ways. By embracing an evidence-based, reflective approach to refining her pedagogy, Calhoun exemplifies how the Teacher as Researcher model can help educators continually improve their craft. And the ultimate beneficiaries are the students, whose learning experiences are enriched through reflective, data-driven instruction.

Special thanks to Enid Rosario-Ramos for contributing to this blog.