Important message to our readers and subscribers

This week, we will be transitioning the McREL Blog to a new platform. If you have bookmarked our site, you might encounter page errors, which you can fix by refreshing the page and updating your bookmarks. If you subscribe to our RSS feed or have commented on any recent posts, you might receive an automatic email alerting you to updates on our blog. We apologize in advance for any inconvenience this might cause, and invite you to re-subscribe to our RSS feed after the transition.

We’re confident that this transition will provide a more robust social experience for our readers, and we thank you for your patience as we make this move.

Posted by McREL International.


Do teacher evaluations really help teachers improve?

In recent years, annual performance reviews for teachers have become ubiquitous. Between 2009 and 2012 alone, the number of states requiring them jumped from 14 to 43. But do teacher evaluations make a difference in how teachers teach? Do they really help teachers improve?

Most research to date, write Bryan Goodwin and Heather Hein in the May issue of Educational Leadership, has not focused on this question. For example, in 2013, a three-year, $45 million study funded by the Bill and Melinda Gates Foundation found that it is possible—by taking into account student achievement data, student surveys, and classroom observations—to accurately evaluate effective teaching. What it didn’t show was whether accurate evaluations lead to better teaching.

However, one of the few studies to look at the performance trajectory of teachers, conducted in Cincinnati, found a spike in effectiveness among midcareer teachers who participated in evaluations based on multiple, structured observations conducted by experienced peers from other schools. Gains were largest for teachers with previously low levels of student achievement.

The reason, Goodwin and Hein write, seems to be that teachers had internalized the feedback from their peers, and that this micro-level, high-quality feedback—which they received before they were given summative ratings—was perhaps the most important factor in their improvement.

Indeed, not long after the Gates Foundation released its 2013 study, Microsoft, the company Bill Gates founded, announced it was moving away from rating and ranking its own employees and toward real-time feedback and coaching focused on professional growth. Perhaps, the authors suggest, schools, too, would benefit from measuring less and motivating more.

When it comes to evaluating teachers and principals, the authors conclude, school systems would do well to remember that the real benefit of performance appraisal lies not in the rigor of the rating system but in using goal-setting and feedback to support professional growth.

Read the entire column.

Posted by McREL International.


GreenSTEM Model: Steps for an instructional approach

The 5th-grade class gathered by the creek that ran between their school and neighborhood, reminiscing about years past when it was safe to play in and around this water. The creek was now stagnant, cloudy, thick with algae, and foul-smelling. Thus began their yearlong GreenSTEM project that used STEM concepts and processes to investigate the problem with the creek, and inspired students to design and carry out a solution.

GreenSTEM is an engaging project-based approach that uses science, technology, engineering, and math (STEM) content and practices to investigate local environmental problems and to design and implement solutions. GreenSTEM projects like the creek example comprise all eight steps (see graphic), which are woven throughout the lessons, and can last a week, a month, or an entire school year. Authentic and relevant GreenSTEM lessons that focus on a single step can be explored in just one or two class periods.

Steps we took in the GreenSTEM Creek Mitigation Project

Identify a local green challenge. We identified the poor water quality of the neighborhood creek as the challenge.

Frame the challenge with driving questions. In our first lesson, the 5th-graders asked compelling questions like, “What is making the creek behind our school look and smell bad?”, “What is causing the pond to be covered with algae?”, and “What effect do the large rocks on the lower edge of the pond have on water flow between the pond and channel?” This led to a series of ecosystem investigations that helped identify the problem and design solutions.

Ground all actions in STEM content. Using science, math, and technology (i.e., the modification of a natural resource to meet the needs or wants of the community), the 5th-graders investigated and gathered data to describe the problems with water and soil quality, macroinvertebrate diversity, erosion and deposition, and invasive plant species. This evidence became the guideposts for designing solutions to modify the creek.

Design and implement a solution. In a dynamic engineering design process, students design a solution, test it, modify the design based on test results, and test again. This is not always easy in real-world GreenSTEM projects. In the creek mitigation case, the 5th-graders discovered that any of their design solutions addressing erosion, eutrophication, and current placement of large rocks in the creek channel could only be implemented by, or with permission from, the local water district authorities because of environmental protection laws. This made collaboration with these community experts essential to the project’s success. Students gathered the data on their designs through stream table channel modeling and presented their evidence and design solution recommendations to the authorities. At this point, if the authorities chose to implement their design, students would then be able to gather comparison data, allowing them to enter another improvement cycle in the engineering design process.*

Evaluate the solution, with the option to redesign. The effects of any creek mitigation would be evident immediately after implementation of the design solution, allowing a longitudinal study to continue for months and possibly years. Evaluation of various stream table channel design models would also be possible. With no creek mitigation, the evaluation could continue with the current creek channel design.

Share with others affected by the challenge and solution. Because students had very limited control in testing any solutions to the creek’s water quality, they became engineer liaisons, bringing their evidence-based solutions to the attention of the legal decision-makers. Students prepared and gave presentations to key stakeholders from the city and county water authority, school and district administrators, and interested community members.

Provide choices for students throughout the project. Real-world projects like this creek project are equally exciting for both teachers and students, who become co-learners by exploring, discovering, designing, and testing together. There is no preset answer or “right” solution. Given the latitude to make choices in GreenSTEM projects, students often discover new aspects of the challenge and come up with more innovative ideas than the teacher could have imagined.

Connect beyond the classroom throughout the project. School and community stakeholders were not aware of the creek quality issue and its effect on community activities until students met with them. Because their suggestions were data-driven and evidence-based, the water district professionals and community members initiated high-level discussions with students about their findings and solutions, and then took the information back to their agencies and leadership for further discussions about implementation.

GreenSTEM projects can continue to inspire students far beyond the classroom—even over a lifetime—through ongoing student engagement in community issues and the pursuit of “green” jobs and higher-education opportunities in STEM. This revelation became clear to a GreenSTEM educator workshop participant, who expressed it so powerfully:

“The human mind continues to process unresolved questions long after the question is asked. Even if a classroom never solves a GreenSTEM issue, those 30 minds will continue to churn on the problem. Perhaps someday, one or two of these minds will find an answer to the problem.”

*Note: The water authorities were grateful to the students for bringing this problem and some solutions to their attention, but chose not to act on the recommendations, citing flood-control engineering, which requires an unimpeded channel, as a higher priority than improved water quality. This lesson in civics enhanced the students' authentic community GreenSTEM project.

McREL consultant Laura Arndt taught science and GreenSTEM education at the elementary and high school level for 16 years. Now at McREL, Laura develops science curriculum and professional development models, and offers expertise on STEM and GreenSTEM program development in both formal and informal education settings. For more information about GreenSTEM, contact Laura at



Additional resources

Green STEM: STEM as It’s Meant to Be

Greening STEM resources


Looking on the bright side for school improvement

Recently, I’ve had some enlightening discussions with colleagues about the concept of an inside-out approach to school improvement. Many of the meaningful exchanges in these conversations have centered on opportunities to learn from bright spots within our schools and districts. Often in school improvement planning, we limit ourselves to discussing challenges, ignoring the bright spots. By doing this, we’re missing a great opportunity to expand and replicate the greatest aspects of our schools: our existing strengths.

While I was serving as a high school administrator in Michigan, our school improvement team was charged with turning around student outcomes within some of our identified gap areas, in this case, 9th-grade math and 9th-grade English. At the time, our school was still structured much like it had been since the 1950s, with each department operating independently in its own silo. Attempting to shift this paradigm was like turning around a freighter ship in a canal. However, during one of our routine school improvement work sessions, this anti-change trend actually changed.

Here’s what happened.

While following a typical template for a Department of Education-designed comprehensive needs assessment for school improvement planning, we worked through a dialogue guided by data protocols. The “challenges” side of our T-chart became our focus and grew far more elaborate than our “strengths” side. Much of the prior training that our school, along with others across the district, had received on school improvement planning was focused on problem identification, a flavor-of-the-month program remedy, monitoring implementation, and evaluating its effectiveness. This trend seemed to be going viral in school improvement teams across the district and the state.

However, in the midst of our gap-focused meeting, a hidden bright spot was exposed when a bold member of the team asked: “What is our school really good at right now?” An awkward silence followed. The squeak of markers on chart paper suddenly ceased as our focus shifted from gaps to bright spots. Soon, though, we started talking about the positives: When people think of our school, what would they say we are good at right now… ten years from now… ten years ago?

This new series of questions required some higher-order reflection—reflection that expanded beyond our T-chart template and beyond our initial gap-focused mindset. The responses rolled out rapidly: Band! Girls’ basketball! Football! It was fascinating to hear the collective agreement in these three responses. The attention of each member of the group shifted to a bright spot, and the responses steered everyone to a path of shared vision. All eyes focused on a space within schoolwide efforts. We derived encouragement from our ability to consistently attain our desired outcomes in these strength areas.  

Eventually, the follow-up question surfaced: Why? Why were these programs so good at getting desired outcomes, year in and year out? Why were we able to have consistent success with our students in these programs, year in and year out? This discovery process led to meaningful professional learning for every member of our team and helped reveal and replicate the “DNA” of these bright spots. In particular, we found our most powerful assets resided in:

  • High-leverage instructional strategies. Effective demonstration of high-quality instruction was consistent in our bright-spot programs. Reciprocal teaching, cooperative learning, peer tutoring, and time-on-task were just some of the many routine practices revealed by dissecting the programs. In addition, each program—whether it was band, basketball, or football—was highly effective with the use of formative assessment during practice to drive both instruction and instructional design and planning; this led to success at game or performance time.
  • Effective use of video to analyze and drive strategy. The most successful sports programs in our school used video to enhance the instructional program, providing students and players with an excellent example of how to fail forward, not just as individuals but also collectively.
  • High-quality transition programs from middle to high school. One common thread we discovered in each of the bright-spot programs was an excellent transitional program for middle school students entering into the high school program. These summer programs and camps helped to engage students and families with the core principles and expectations of the program. In addition, the middle and high school coaches and teachers collaborated and co-designed the program of study for the students and players, ensuring the transition was relatively seamless.

Our analysis of the “DNA” of these school programs revealed many of the same findings commonly gleaned through extensive workshops on high-quality classroom and school practices. However, this powerful practice of self-reflection created an additional sense of ownership among our team members. Encouragement came from our own school’s success—something that couldn’t be replicated from external findings or external teachings—dispelling any myths that team members might have believed regarding potential outcomes with “our kinds of students.”

Encouragement transformed into motivation, which created engagement. The rippling effects on our school culture and climate included:

  • Schoolwide collaboration. The department silos began to break down and faculty began to observe promising practices across departments.
  • Focus on the positive. A tighter, more engaging focus on closing gaps allowed us to fill the gaps with strengths.
  • Ownership and shared vision. We weren’t asking faculty to buy in to a new program as a remedy. Rather, we already collectively owned the promising practices of our successful programs and could use them as a model for wider success.

What might you find if your school improvement team focused on your school’s bright spots and began replicating them? How much further down the road toward school improvement might you go?

A former principal and federal programs manager/academic officer for the Hawaii State Charter School Commission, McREL consultant Ben Cronkright works with Departments of Education in Guam, the Marshall Islands, and Federated States of Micronesia on increasing capacities related to teacher effectiveness and college- and career-readiness for students.


Looking at student work: Are you snorkeling or scuba diving?

Teachers looking together at student work seems like a surefire way to improve teaching and learning, as teachers look at real artifacts and reflect on expectations, practices, and results. However, as with most things in education, success depends not on what teachers do but on how they do it, write Bryan Goodwin and Heather Hein in this month’s Research Says column in Educational Leadership.

When teachers look at student work, are they just skimming the surface or diving deeply into what the work shows students—and even teachers themselves—are thinking about during the learning process? To take that dive, research shows three conditions need to be in place:

  • Be tough on practices, not people. Teachers must trust each other in order to expose their struggles and failures, and they must be willing to be truthful in order to see results.
  • Focus on student thinking. Many collaborative conversations end up being about proving students learned and teachers did their jobs; in contrast, effective collaborative inquiry moves beyond whether students “got it” to what they were thinking.
  • Encourage self-reflection. Analyzing student work has little benefit unless teachers also step back and reflect on their own work and assumptions.

Read the entire column.

Posted by McREL International.