How well does Rhee’s “Broom” work?

So how well does Rhee’s broom actually work? You know, the one she is supposedly using to get rid of incompetent, veteran* administrators and teachers? To hear her tell it, her policy has been a smashing success. But the truth is quite different.

You might remember the 2008 TIME magazine cover with a photo of Michelle Rhee, frowning, dressed in black, holding a broom in a classroom. Some people simply thought it meant that Rhee was an old-fashioned witch, about to get on her broom and ride around. I understand that sentiment, but I suspect that the author of the article meant that Rhee planned to clean out the school system by getting rid of veteran teachers and administrators.

According to Rhee’s numerous public statements, the only thing that really matters is standardized test scores. (Though they are probably among the LEAST important things in the lives of most students; and it is very ironic that Rhee herself is unable to produce any test scores whatsoever to back up her mythical “Baltimore Miracle.”)

Taking this at face value (even though many people have shown that most standardized tests produce little data that is actually valuable), I have done a little bit of analysis to see whether the schools where Rhee fired or replaced the principals actually did better on the DC-CAS than the schools where the principals were not replaced. Recall that Rhee has claimed many times that her approach has been extremely successful. (I was about to say, “a sweeping victory.”)

What I found is that both groups included schools where the scores went up a lot from SY 2007-8 to SY 2008-9, and both groups included schools where the scores dropped by about as much. To me, using Rhee’s own yardstick, it’s hard to find any big difference between the two groups of schools.

Don’t believe me? Look at the graphs, below.

This first graph, with a light green background, shows the changes in the percentages of students scoring “proficient” or better on the MATH portion of the DC-CAS from SY 2007-8 to SY 2008-9 (that is, from the year before last to last year), at schools that had the same principals both years. Each bar represents a single school, but there was no room to label them all. This is for ALL DC public schools for which there is data for both school years – elementary and secondary, but not charter schools, since Rhee is not in charge of them. I have no way of tracking individual students’ scores. These graphs just give the changes in percentages of students “passing” the DC-CAS.
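(If you wanted to redo this kind of graph with a script rather than a spreadsheet, a rough sketch in Python might look like the following. The file name and column names are made up for illustration; they are not the actual data files.)

```python
# Rough sketch only: plot each same-principal school's change in the percent
# of students scoring "proficient" or better in math, SY 2007-8 to SY 2008-9.
# The CSV file and its column names are made up for illustration.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("dc_cas_math.csv")  # one row per school; same_principal is True/False
df["change"] = df["pct_proficient_2009"] - df["pct_proficient_2008"]

same = df[df["same_principal"]].sort_values("change", ascending=False)

plt.figure(figsize=(12, 4))
plt.bar(range(len(same)), same["change"], color="seagreen")
plt.axhline(0, color="black", linewidth=0.8)
plt.ylabel("Change in % proficient (math)")
plt.title("Schools that kept the same principal, SY 2007-8 to SY 2008-9")
plt.tight_layout()
plt.show()
```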

As you can see, one school increased its percentage of ‘passing’ students by about 43 percentage points. In fact, something like 2/3 of these schools showed an increase in the proportion of students passing in math. A couple of schools had declines of around 24 percentage points in the proportion of students ‘passing’ the math test.

Now let’s compare that graph with one that shows the progress (or lack thereof) of the counterparts to these schools —  the ones where Rhee put in a new principal for the second year (SY 2008-09, i.e. last school year).

I see three main differences between these two graphs:

  • The second graph has a background that is light orange or the color of canned salmon.  The first one has a light green background. (I changed the colors so that you and I can tell them apart; otherwise, it’s pretty tough.)
  • The second, light orange graph has fewer bars. That’s simply because more schools kept their principals than had them replaced by Rhee for SY 2008-9.
  • One of the bars in the first graph is a lot longer than any of the other ones. That was Birney ES. (Questions were raised by the DC State Board of Education about those scores.)

Now, for the similarities:

  • In both graphs, which means in both groups of schools, somewhere near 1/3 of the schools showed declines from the first year to the second year; those are the schools whose bars point downwards into negative territory. (Counting and using a calculator, I find that 12 out of 33, or 36%, of the schools with NEW principals had drops in scores, and 24 out of 70, or 34%, of the schools that did NOT change principals had drops in scores. Not a significant difference in my opinion, but if you think it matters, it is one that favors the schools that did NOT change principals. A small sketch of that tally appears right after this list.)
  • The bars are generally very similar in length in the two graphs.
  • In both graphs, the scale on the vertical axis is the same, so you can compare the graphs without bias.
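Here is the little tally from the first bullet above, written out so you can check the arithmetic yourself; the counts are the ones I read off the two math graphs.

```python
# The counts below are the ones tallied from the two math graphs in the post.
dropped_new, total_new = 12, 33     # schools with NEW principals whose scores dropped
dropped_same, total_same = 24, 70   # schools with the SAME principals whose scores dropped

print(f"New principals:  {dropped_new}/{total_new}  = {dropped_new / total_new:.0%} had drops")
print(f"Same principals: {dropped_same}/{total_same} = {dropped_same / total_same:.0%} had drops")
```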

So, not really much difference at all in the two groups of schools in terms of changes in math scores. Maybe reading tells a different story? Take a look.

First, changes in reading scores over the same period of time as the other two graphs, in schools that kept the SAME principal for both years:

That graph has sort of a blue-gray background. Notice that a little less than half the schools improved, and a little more than half the schools did worse during the second year. I also estimate that at about 25% to 30% of the schools (the ones near the center), there was very, very little change.

Let’s now look at the changes in reading scores at schools where the principal was REPLACED:

Differences?

  • This last graph has a bright yellow background, not a pale blue-gray.
  • The bars on this second graph are significantly shorter, in both the positive and the negative directions.
  • One bar on the gray graph is a lot taller than any of the other ones. (Guess which school that was!**)
  • In the yellow graph, the fraction of schools that did worse the second year (i.e. the ones that have bars pointing downwards) is a bit under one-half, not a bit over one-half. (17 out of 45, or 38%, of the schools with new principals had drops in reading scores, versus 37 out of 70, or 53%, for the schools that kept their principals.)
  • The fraction of schools that made essentially no change at all, one way or the other, appears to be smaller in the yellow graph.

So, all in all, even on Rhee’s own terms (where the only important thing is standardized test scores), her broom has so far made very, very little difference.

Which is not so surprising. After all, a broom is rather a blunt instrument.

=========================

* To some people, the terms “veteran” and “incompetent” are synonymous.  I disagree. It takes several years for a teacher to begin to become competent. (I’ve had my differences with a lot of local school administrators, but they also need a lot of experience to begin to become competent.)

** Yes, Birney again.

========================

My data came from two sources.

  • Mary Levy had very carefully compiled a spreadsheet with the names of all of the principals in all of the DC public schools going back to at least 2002, and emailed that list to me. I spot-checked it to make sure it was accurate, and found that it was. Mary’s compilation saved me many hours of labor.
  • For the scores, I used the data that you can find for yourself, for each school, on the OSSE-NCLB-DCPS website, which gives, among other things, the percentages of students scoring at various levels on the DC-CAS and its predecessors at all of the publicly-supported schools in DC. I cut-and-pasted the data on AYP scores for the past 8 years into Mary’s spreadsheet. Then I had the spreadsheet do a bit of subtraction (the 2009 scores minus the 2008 scores), some sorting (first by whether the same principal was kept, then by the change in scores), and finally a bit of bar-graphing. That’s all. (A rough sketch of that workflow, for anyone who prefers a script to a spreadsheet, appears below.)
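Here is that sketch, in Python. The input files and column names are made up for illustration; the real spreadsheet columns will differ.

```python
# Rough sketch of the spreadsheet workflow described above.
# Both input files and all column names are made up for illustration.
import pandas as pd

principals = pd.read_csv("principal_roster.csv")   # school, principal_2008, principal_2009
scores = pd.read_csv("dc_cas_scores.csv")          # school, subject, pct_proficient_2008, pct_proficient_2009

data = scores.merge(principals, on="school")

# The "subtraction": 2009 percent proficient minus 2008 percent proficient.
data["change"] = data["pct_proficient_2009"] - data["pct_proficient_2008"]

# The "sorting": first by whether the same principal was kept, then by the change.
data["same_principal"] = data["principal_2008"] == data["principal_2009"]
data = data.sort_values(["same_principal", "change"], ascending=[False, False])

# A quick summary in place of the bar graphs: average change and the share of
# schools with drops, for each subject and each group of schools.
summary = data.groupby(["subject", "same_principal"])["change"].agg(
    mean_change="mean",
    share_with_drops=lambda s: (s < 0).mean(),
)
print(summary)
```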
Published on February 26, 2010 at 2:05 am



8 Comments

  1. Thanks for this. Nice job.


  2. […] Access Portal, a dashboard-style site with data on the capital projects pipeline. DCist has a look. And GGW notes new online system for emergency no parking […]


  3. There are a couple of stalwart Rhee defenders in the Post comment section (Anne-Darcy, axolotl, bbcrock) who would accuse you of making this up.

    Otherwise, great work.


  4. Thanks for all the analysis, Guy. To me, this confirms that the unsubstantiated claims Rhee was making (*see below) about the effects of her new principals back in July and August of ’09 were false. Keep in mind that when she made these comments, the results of the DC-CAS had not been released, so she knew she could essentially say anything and not have it challenged.

    Although she has been known to lie blatantly about statistics (e.g., about Shaw Middle School), in this case it looks like she just made the whole thing up, never figuring anyone would put together the information needed to check her claim. You obviously went to a lot of trouble to determine the non-effect of Rhee’s new principals. It is greatly appreciated.

    *Rhee quotes in July-August ‘09 about her new principals:
    “New principals outperform the district average so that says a lot about the talent that we were able to bring into the system.”

    “It is one piece of compelling evidence that we’re making the right decisions and that reforms are working. …If you look at the new principals that we brought in, their achievements gains, in general, across the district, looked really, really strong.” American University Radio Metro Connection, 7/17/09 wamu.org/programs/mc/09/07/17.php

    “Test scores of schools with new principals outscored the District average,” she [Rhee] says. “We did the right thing by replacing principals.” 7/19/09 “Chancellor Rhee settles in for the siege” “Washington Examiner”, Harry Jaffe http://www.washingtonexaminer.com/local/Chancellor-Rhee-settles-in-for-the-siege-7988873-51052702.html

    “If you look at the gains, the academic gains that we’ve seen over the last two years, the data shows that our new principals are actually seeing better than average gains, which means we really have succeeded in bringing in some really, really strong new leaders who are making the difference in our schools.”
    8/17/09, PBS Podcast, “Two Years of Talks with Michelle Rhee & George Parker – Snowball Effect” http://learningmatters.tv/blog/podcasts/michelle-rhee-in-dc-podcast-snowball-effect/2510/

    “As part of our aggressive human capital strategy, DCPS recruited over 49 proven instructional leaders for the 2008-2009 school year to replace principals who were unable to increase student performance. Our new principals went on to outperform the district-wide averages on the DC-CAS this year. One of these new principals, Dwan Jordon, assumed leadership last year of Sousa Middle School in Ward 7, one of the city’s highest poverty wards. In just one year he galvanized his staff to move students up 17% points in reading and 25% in math, meeting AYP for the first time in Sousa’s history.”
    7/23/09, “Education Reform in the District of Columbia” US Senate Testimony of Michelle Rhee, Chancellor, Meeting of the Subcommittee on Oversight of Government Management, the Federal Workforce, and the District of Columbia, page 7.
    http://docs.google.com/gview?a=v&q=cache:Bs-5Zn19PTYJ:hsgac.senate.gov/public/index.cfm%3FFuseAction%3DFiles.View%26FileStore_id%3D5db18d21-1e2c-47ce-bd40-e6aac23e3ff8+Michelle+rhee+engaged&hl=en&gl=us

    [Note that in her (sworn) Senate testimony, after making the false claim about her new principals, she cherry-picks one new principal whose school did show achievement gains, conveniently not mentioning schools where new principals saw losses (e.g., Shaw).]


  5. Thank you for doing this analysis, GFBrandenburg. However, I’d like to make one major criticism of your post.

    Nowhere did you consider the fact that Chancellor Rhee chose to replace the principals at schools that were especially struggling with DC CAS scores (if we assume, as you and I are now, that test scores are a proxy for student achievement).

    Of all the reasons for which a principal would be replaced, I would imagine that, especially in this high-stakes environment, low student achievement, as measured by DC CAS scores, would be the most important. If this is so (I don’t know, because I don’t have the data and don’t have the time to analyze it; I’m a DCPS English teacher), your comparing same-principal to different-principal schools is just as misleading as you claim Rhee’s comments to be. That is, a struggling school may not be able to make the same ABSOLUTE growth as a school that is doing okay and has had the same principal.

    So, I suggest that you make a different comparison. Start by figuring out the “baseline” performance of a school before the principal was replaced. Then, examine the trend after a new principal is installed. With same-principal schools, calculate a baseline average performance. Compare the rates of change in DC CAS growth for these two groups of schools. Which one is outperforming the other?

    Your decision to compare apples to oranges–and only at one point in time–is deceptive.

    I will concede that it is difficult to measure the lag between when a new principal enters and when she “affects” her school’s achievement (this is the same problem that we have with determining whether the NAEP math gains can be attributed to Rhee).

    If you could kindly share with me your data, I would love to take a look. In fact, your decision to send me your data would be an indication of whether you truly cared about getting to the bottom of the “discrepancy” between Rhee’s comments and “reality” or just wanted to bash Rhee.


    • Dear IronBulldog,

      See my response to your second comment; I made it into an entire post/blog.

      In general, I don’t think you’ve been paying enough attention to the facts.

      Were you able to see the spreadsheets I put on google docs? I’m proud that I just learned how to do that!


  6. I would love to see these charts compared to the charts for the 06-07/07-08 change, before Rhee started her changes. That is where you could really see if there is or is not a change due to Michelle Rhee’s principal changes. Rhee decided to keep the returning principals in place as well, so these charts show that Michelle Rhee’s choices for new principals perform at close to the same level as her choices of principals to stay in place.

    If you create a chart from the 06-07/07-08 change in test scores, that is where you would truly have a good comparison to see if she’s had positive, negative or negligible impact with the principal changes.


    • That sounds like a good idea, Edfandc. It’ll take a bit of work, but I will do it. That is, for as far back as I can go with the DC-CAS (and perhaps with the SAT-9 as well), I will compare changes at schools with new principals against changes at schools that kept their old ones.

      Keep in mind that I don’t have enough information to tell if the new principals were named because the old one died, got pregnant, moved to another state, retired, was fired, was promoted to central office, or what.

      Without having done the actual numerical analysis, let me make a prediction: there won’t be a really serious difference one way or another.


