Our Blog

Measuring School Performance without a Strategy

Posted June 26, 2011 5:16 PM by Dylan Miyake

An article in the Saturday, June 25 Wall Street Journal, "School Reform, Chicago Style," painted an interesting picture of performance measurement.

The schools were gathering a lot of data. "Two number crunchers at Marshall [High School] digested tens of thousands of data points, from the frequency of fights to cheerleaders' GPAs." After a year's worth of data collection and analysis, some schools in Chicago were seeing "promising trends."

According to Ascendant's 2011 strategy management survey--which we conducted in conjunction with our 2011 strategy execution summit--data quality was one of the top ten barriers to performance management and strategy execution. By that measure, Chicago was succeeding with one of the harder issues.

Our strategy management survey found the number one barrier to performance management and strategy execution was funding. Chicago's worst-performing schools were doing okay in the funding department as well--at least well enough in these days of declining budgets--because they received $20 million in federal money and an extra $7 million in local money.

However, even though Chicago Public Schools were able to overcome two of the biggest obstacles to performance management, the article presented a system that was achieving mixed results, at best.

In fact, the attitude of teachers and administrators toward the data collection and analysis was ambivalent. At one school, staffers complained about data collection and entry requirements, according to the Journal. One Chicago school administrator "embraced the process but worried the school was applying and measuring too many incremental changes without determining what works."

I think that points to the crux of the problem for Chicago Public Schools--they are measuring too many things. They decided they needed to measure their performance, but that measurement was not linked to a strategy. They hadn't taken the time to decide which issues weren't strategic and, therefore, which data they didn't need to track on a regular basis.

"It's very hard to get any sense of cause and effect," assistant principal Matt Curtis said. "We need to home in on a strategy or two and make the commitment to run with that."

As Chicago may find out, without the strategy piece, gains from measuring performance will be hard to maintain.

Filed Under Measurement