Details on those ‘Dozens and Dozens’ of Schools

As promised, I am posting the charts that I laboriously put together from data published by OSSE and from a spreadsheet leaked to me. They show how unconvincing the progress was at the only 13 schools that WaPo reporter Jay Mathews could find that even vaguely resembled the ones Michelle Rhee bragged about.

I am not going to look at the schools with large percentages of white students or at McKinley Tech.

First, here are my charts for Payne ES, Plummer ES, and Prospect LC. I color-coded the charts much the way that Erich Martel does: each diagonal sloping up to the right represents an individual cohort of students as they move from grade to grade, from year to year. Obviously I have no way of telling how many students transferred into or out of each cohort, but my experience in several DC public schools in all four quadrants indicates that students don't move around all that much.

Thus, at Payne, under “Reading”, in the column for 3rd grade, in the row for 2010, you see the number 27. That means that at Payne, in the 3rd grade, in school year 2009-2010, about 27% of the students were ‘proficient’ or ‘advanced’ in reading according to their scores on the DC-CAS. The next year, most of those kids returned as fourth graders in the same school, and about 23% of them ‘passed’ the DC-CAS in reading because their answer sheets had enough correct answers for them to score ‘proficient’ or ‘advanced’. Note, that cell is also blue. But the next year, 2011-12, the percentage of students ‘passing’ the DC-CAS doubled, to 46%. I find that jump worthy of a red flag. Either that teacher did something astounding, or the students are an entirely different group of kids, or else someone cheated (most likely not the students).

[Charts: Payne, Plummer, and Prospect]

 

Any time I saw a drop or rise of 10 points or more from the year before for a single cohort, I noted my "red flag" by writing the percentage of passing scores in bold red.
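For anyone who wants to apply the same rule to the raw OSSE numbers, here is a minimal sketch of that flagging logic in Python. It is only a sketch, not the workflow behind my charts: the table layout, the function names, and the plain-dictionary format are my own assumptions, and the sample figures are just the Payne reading cohort described above (27, 23, 46).

```python
# A minimal sketch (not the actual workflow behind the charts): follow each
# cohort along its diagonal through a {year: {grade: percent_proficient}}
# table and flag any year-to-year swing of 10 points or more.

def cohort_diagonal(table, start_year, start_grade, n_years):
    """Collect one cohort's percent-proficient values as it moves up a grade each year."""
    values = []
    for i in range(n_years):
        year, grade = start_year + i, start_grade + i
        pct = table.get(year, {}).get(grade)
        if pct is None:          # cohort ran past the data we have
            break
        values.append((year, grade, pct))
    return values

def red_flags(diagonal, threshold=10):
    """Return (year, grade, change) for every swing of `threshold` points or more."""
    flags = []
    for (y0, g0, p0), (y1, g1, p1) in zip(diagonal, diagonal[1:]):
        change = p1 - p0
        if abs(change) >= threshold:
            flags.append((y1, g1, change))
    return flags

# The Payne reading example from the post: 27% -> 23% -> 46% for one cohort.
payne_reading = {2010: {3: 27}, 2011: {4: 23}, 2012: {5: 46}}
diag = cohort_diagonal(payne_reading, start_year=2010, start_grade=3, n_years=3)
print(red_flags(diag))   # [(2012, 5, 23)] -- the 23-point jump earns a red flag
```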

Notice that at Plummer, the cohort in blue went, in reading, from 40% passing to 18% passing to 46% passing in the course of three years. The cohort in green, in math, went from 60% passing to 18% passing to 29% passing.

At Prospect, the cohort in yellow goes from 25% passing to 0% passing to 5% passing to 5% passing in reading. In math, the same group goes from 13% to 31% to 0% to 24% passing.

You see anything weird with that? I sure do.

Next come Thomson, Tubman, Hart, and Sousa:

[Charts: Thomson, Tubman, Hart, and Sousa]

 

The only one of these schools with a chart not covered with ‘red flags’ is Hart.

Your thoughts? (As usual, the “comment” button, below the end of this story, is tiny. Sorry about that.)

 



11 Comments

  1. I like Hart's principal. He has principles. Now, Dwan, he's another matter. I remember being stuck over there for a week-long staff development in the summer of 2009. The vibe was all wrong. The teachers looked scared. When my principal at the time asked him to share his testing strategies, he launched into a spiel about frequent classroom observations, walk-throughs, and his being "on top" of things.

    According to City Paper, there was more to it.

    http://www.washingtoncitypaper.com/blogs/citydesk/2010/08/09/dwan-jordans-image-takes-blow-with-new-dcps-test-scores/


    • Do you work (or did you work) at Sousa, or have your own children there, under one or both principals?


  2. […] D.C. teacher and blogger extraordinaire G.F. Brandenburg has started an investigation of the “dozens and dozens of schools” that allegedly saw “steady” or “even some dramatic gains” when Rhee was […]


  3. As always, good stuff. Got a question:

    Look at the proficiency rates in the columns, going from low to high.

    At Thompson, from 2009 to 2010, the green cohort made a 16-point leap, but that score is three points lower than the score of the previous year's (yellow) cohort.

    I don't know the specifics of the incentive bonuses: were they tied to cohort gains, or were they tied to straight cross-sectional gains? I had always assumed they were cross-sectional, but I never really thought about it until now.


    • You would get a bonus if (say) this year's 7th graders did much better than last year's 7th graders.


      • OK, so that’s cross-sectional. So here’s what’s interesting:

        There are large leaps for the cohorts, but not necessarily for the grade level year to year. Take Thompson's green cohort for math. In 2010 they leapt 20 points, but that grade dropped 19 points from the previous year. So the cohort made gains, yet it would still come across as a loss.

        DiCarlo has more on cohorts and growth today:

        http://www.shankerblog.org/?p=7455

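To make the cohort-versus-cross-sectional distinction in this exchange concrete, here is a tiny Python sketch. The numbers are invented to mimic the shape of the Thomson math example mentioned above (a 20-point cohort gain alongside a 19-point drop for the grade itself); they are not the actual chart values.

```python
# Illustration of cohort (longitudinal) gain vs. cross-sectional change,
# using made-up numbers in the spirit of the Thomson math example.
# scores[year][grade] = percent proficient
scores = {
    2009: {4: 30, 5: 69},   # hypothetical values, not the real chart
    2010: {4: 28, 5: 50},
}

# Cohort gain: the SAME kids, 4th grade in 2009 -> 5th grade in 2010
cohort_gain = scores[2010][5] - scores[2009][4]             # 50 - 30 = +20

# Cross-sectional change: THIS year's 5th graders vs. LAST year's 5th graders
cross_sectional_change = scores[2010][5] - scores[2009][5]  # 50 - 69 = -19

print(cohort_gain, cross_sectional_change)   # 20 -19
```

Both numbers come from the same table; which one a bonus system rewards changes the story completely, which is exactly the point raised in the thread above.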

  4. While I think your point is generally a valid one, large gains in passing can also be attributed to a large share of the student score distribution sitting just under the cut point. In some cases, an average increase of one additional correct answer can translate into a relatively large gain in percent proficient, depending on the distribution of scores around the cut point. You would really need scale scores to assess the gains in student test scores. Having said that, many of these gains would certainly be very large, unbelievably so, even in terms of scale scores.

    Ironically, I could find no scale score data, not even proficiency rates, on the DCPS website. Given that the district is data-driven, they sure don't make their data readily available. And there is only one reason for that . . . which brings us back to your original point.

    See Koretz, D. (2008), Measuring Up, as well as Matt DiCarlo's Shanker Blog and many other places, on how changes in proficiency rates cannot be used to accurately measure changes in student performance.


    • Right. Teachers themselves don't get any of that information on the DC-CAS. When we do look at the test items, we are often appalled by how poorly they are written and how little they relate to the very standards or learning objectives whose mastery they purport to measure. When we look at the other tests we are periodically forced to administer, we DO get a chance to compare the tests themselves with the answer sheets, and the mismatch between the test and any actual learning is even worse.


    • Ed, you make some very good points, but I’d add one thing:

      We can consider the possibility of cut score influences, and cohort changes, and cheating. But we should also be looking at the DC-CAS itself.

      Is it a valid and reliable measure of student learning year-to-year? Has it been properly vetted? Is the grading consistent?

      Or are the fluctuations reflections of poor year-to-year reliability of the test?


      • I have repeatedly and in detail shown that the DC-CAS sucks as an assessment of what students need to know. It also seems to have a lot of wild scoring swings from year to year. So it's neither valid nor reliable.

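Ed's point about the cut score is easy to see with a toy simulation. Everything below is invented for illustration: a score distribution bunched just under a hypothetical cut score of 40, nothing taken from the actual DC-CAS.

```python
# Toy illustration of the cut-score effect described above (made-up numbers,
# not DC-CAS data): if many students cluster just below the cut point,
# one extra correct answer can swing the "percent proficient" dramatically.
import random

random.seed(0)
CUT = 40                                        # hypothetical proficiency cut score
scores = [random.gauss(38, 4) for _ in range(500)]   # bunched just under the cut

def pct_proficient(xs, cut=CUT):
    return 100 * sum(x >= cut for x in xs) / len(xs)

before = pct_proficient(scores)
after = pct_proficient([x + 2 for x in scores])   # everyone gains ~one question

# Prints something like "31% -> 50% proficient from a 2-point average gain"
print(f"{before:.0f}% -> {after:.0f}% proficient from a 2-point average gain")
```

In a run like this, a two-point average gain (roughly one more correct answer) moves the "percent proficient" figure by on the order of twenty points, which by itself says very little about how much students actually learned.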

  5. Former Thomson (no p) parent here. I really don't think they were cheating. They never got hit by USA Today for erasure irregularities, and having been in the building during testing time, I can say there was a lot of diligence on the part of the admin there.

    The longtime principal had, over time, attracted a much larger middle-class and upper-middle-class parent base (Chinese, guitar, choir, pull-outs for gifted and talented students, etc.). Also, when she left, there was a big drop in scores under the Rhee-appointed principal. If you took out those scores, I think it would look pretty solid. It's also a school with a high mobility rate, being downtown, and half of the kids are out of boundary.

    Either way, if you look at the overall scores for the school, they show a slow (very slow) but steady rise, except for the year half the staff retired (when Rhee joined DCPS) and the year the rockstar principal ran the school.


