Douglas Van Dine, Senior Researcher

When state education agencies and school districts set out to build new systems, programs, and tools, it’s helpful to have a trusted partner to help pilot, validate, and refine them to ensure they’re working as intended.

Following statutory changes to teacher evaluation systems in 2018, the Wyoming Department of Education (WDE) partnered with REL Central at Marzano Research; the Region 11 Comprehensive Center; and a committee of teachers, principals, and central office staff from across the state to develop professional practice standards for teachers and resources for teacher evaluation.

Between 2018 and 2020, this group developed a state-specific evaluation system districts could adopt: the Wyoming Sample Educator Evaluation System (Wyoming SEES). After an initial draft was completed and reviewed by Wyoming stakeholders, the group agreed it was important to pilot the system to establish its reliability and validity. Without this assurance, they believed districts would be unlikely to adopt the new system.

To support the pilot, we partnered with the WDE and its stakeholder committee during the 2020–21 school year to:

  • develop and refine research questions,

  • develop a survey instrument and focus group protocols,

  • administer the survey to all teachers and administrators in pilot districts,

  • conduct focus groups with teachers and administrators,

  • collect de-identified rubric data from all evaluated teachers,

  • analyze data and interpret results, and

  • refine the evaluation system based on results.

Overall, teachers and administrators in our focus groups provided positive feedback about the teacher evaluation system. Educators shared that the system “really did look at the teacher as a whole” and “led to a lot of self-reflection in how to improve as a teacher.”

Data analysis included evaluating the functionality, reliability, and validity of the Wyoming SEES rubric. Overall, there were many rubric items that every teacher had mastered, suggesting that these items may not be needed. Additionally, several teachers had mastered every item on the rubric, suggesting that additional indicators could be added to the top end of the rubric to better differentiate varying levels of effectiveness.
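For readers curious about the mechanics, the two ceiling effects described above can be checked directly from a teacher-by-item mastery table. The sketch below is a minimal, hypothetical illustration; it assumes de-identified rubric data shaped as a mapping from teacher IDs to 0/1 mastery flags per item, which is not necessarily the format the WDE used.

```python
def flag_ceiling_effects(scores):
    """Identify rubric items and teachers showing ceiling effects.

    scores: dict mapping teacher ID -> list of 0/1 mastery flags,
    one flag per rubric item (hypothetical data shape).
    """
    teachers = list(scores)
    n_items = len(scores[teachers[0]])
    # Items every teacher mastered: candidates for removal or rewording.
    saturated_items = [
        i for i in range(n_items)
        if all(scores[t][i] == 1 for t in teachers)
    ]
    # Teachers who mastered every item: evidence the rubric may need
    # additional top-end indicators to differentiate effectiveness.
    topped_out_teachers = [
        t for t in teachers
        if all(flag == 1 for flag in scores[t])
    ]
    return saturated_items, topped_out_teachers

example = {
    "teacher_1": [1, 1, 1, 0],
    "teacher_2": [1, 1, 0, 1],
    "teacher_3": [1, 1, 1, 1],
}
items, teachers = flag_ceiling_effects(example)
# Items 0 and 1 are mastered by everyone; teacher_3 mastered everything.
```

A full analysis would also examine reliability (e.g., agreement across evaluators), but item-level mastery rates like these are often the first signal that a rubric needs rebalancing.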

Following a co-interpretation of the results of the pilot, we helped guide the WDE and its stakeholder committee in using the results to inform revisions and refinements to the system. For example, indicators in the rubric that were confusing or too easy were removed or reworded, and indicators that were not a good fit were moved to increase alignment within a benchmark.

Information gathered through the survey and focus groups also helped highlight additional supports teachers and administrators might need to better implement the system. These supports include annotated rubrics for teachers of different grade levels or content areas (e.g., physical education, band), training materials, and tools to help evaluators calibrate their understanding of rubric indicators to better align evaluations completed by different evaluators.

The WDE and Wyoming stakeholders gained additional confidence in the system through the pilot data analysis and are excited to see the system adopted by other districts in the state. Seven Wyoming districts have chosen to use the Wyoming SEES during the 2021–22 school year. The agency will use what it learned from this process—and what it will learn from these seven districts’ experiences—to continue to evaluate the reliability and validity of the Wyoming SEES and make additional improvements to the system.

If you have a new system, tool, or program that you would like to pilot and validate, reach out to Douglas Van Dine at douglas.vandine@marzanoresearch.com.