## Scooped by Gary Rubenstein

If you are very observant, take a look at the graph Gary Rubenstein posted on his blog on 2/12/12, and then at the graph I posted on 3/9/12, nearly a (short) month later.

Both show the lack of correlation between teachers’ scores on the exceedingly complex Teaching and Learning Framework (TLF) classroom observation rubric and their scores on the Individual Value-Added (IVA) measurement scheme in math, reading, or both, depending on the subject(s) and grade levels they taught.

Gary’s graph is, of course, populated by lots of bright red triangles; mine has little blue squares. His grid is missing vertical lines, so mine is clearly better. (Joke!) But look even more carefully: you can see that the individual triangles and squares are in identical places.

This shows that Excel, when given the same data, will produce much the same graph.

It’s really easy to do, by the way. You should try it. Here is the original data table.
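If you want to try it without Excel, here is a minimal sketch in Python of making such a scatter plot and computing the correlation. The score pairs below are made up for illustration; the real analysis used the leaked DCPS spreadsheet, which is not reproduced here.

```python
import math

# Hypothetical (observation score, value-added score) pairs --
# NOT the actual DCPS data, just placeholder numbers:
tlf = [2.1, 2.8, 3.0, 3.3, 3.5, 3.9]
iva = [1.2, 3.4, 2.0, 2.9, 1.8, 3.1]

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson_r(tlf, iva)
print(f"r = {r:.4f}, r^2 = {r * r:.4f}")

# To see the scatter plot itself (optional, requires matplotlib):
#   import matplotlib.pyplot as plt
#   plt.scatter(tlf, iva)
#   plt.xlabel("TLF observation score")
#   plt.ylabel("IVA score")
#   plt.show()
```

Excel's trendline r² is just the square of the Pearson r computed here, which is why two people plotting the same spreadsheet get the identical picture.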

Published on March 11, 2012 at 2:48 pm

## The Correlation Between ‘Value-Added’ Scores and Observation Scores in DCPS under IMPACT is, in fact, Exceedingly Weak

As I suspected, there is nearly no correlation between the scores obtained by DCPS teachers on two critical measures.

I know this because someone leaked me a copy of the entire summary spreadsheet, which I will post on the web at Google Docs shortly.

As usual, a scatter plot does an excellent job of showing how ridiculous the entire IMPACT evaluation system is. It doesn’t predict anything to speak of.

Here is the first graph.

Notice that the r^2 value is quite low: 0.1233, or about 12%. Not quite a random distribution, but fairly close. Certainly not something that should be used to decide whether someone gets to keep their job or earn a bonus.

The Aspen Institute study apparently reported R rather than r^2; their R values of about 0.35 are about what you get when you take the square root of 0.1233.
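That square-root relationship is easy to verify, since R is just the square root of r²:

```python
import math

# The r^2 reported above, and its square root, which should match
# the Aspen Institute's R of roughly 0.35.
r_squared = 0.1233
r = math.sqrt(r_squared)
print(round(r, 3))  # 0.351, i.e. about 0.35
```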

Here is the second graph, which plots teachers’ ranks on the classroom observations versus their ranks on the Value Added scores. Do you see any correlation?

Remember, this is the correlation that Jason Kamras said was quite strong.
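Plotting rank against rank, as in that second graph, amounts to computing a Spearman rank correlation: rank each teacher on each measure, then correlate the ranks. A minimal sketch, again with made-up scores rather than the actual DCPS data:

```python
import math

def ranks(values):
    """Rank each value, 1 = smallest (ties broken by position)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical scores for six teachers (placeholder numbers):
obs = [2.1, 3.8, 3.0, 2.5, 3.4, 2.9]
iva = [1.9, 2.2, 3.6, 2.8, 2.4, 3.1]

# Spearman rank correlation = Pearson correlation of the ranks.
spearman = pearson_r(ranks(obs), ranks(iva))
print(f"Spearman rank correlation = {spearman:.3f}")
```

If the two measures agreed, the points in a rank-vs-rank plot would hug the diagonal and this coefficient would be near 1; a cloud of points means it sits near 0.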

Published on March 9, 2012 at 10:55 pm

## DCPS consultant writes that there is little correlation between principal evaluation scores and VAM (or IVA) scores

I quote from an official DCPS report written by a consultant named Rachel Curtis in the employ of the Aspen Institute:

> “DCPS analyzed the relationship between TLF rubric scores and individual teacher value-added scores based on the DC-CAS.
>
> “At this early stage in the use of value-added analysis nationally, the hope is that there is a strong correlation between a teachers’ score on an instructional rubric and his or her value-added score. This would validate the instructional rubric by showing that doing well in instruction produces better student outcomes. DCPS analysis at the end of the first year of IMPACT suggests that there is a modest correlation between the two ratings (0.34).
>
> “DCPS’s correlations are similar to those of other districts that are using both an instructional rubric and value-added data. A moderate correlation suggests that while there is a correlation between the assessment of instruction and student learning as measured by standardized tests (for the most part), it is not strong. At this early stage of using value-added data this is an issue that needs to be further analyzed.”

Ya know, if the educational Deformers running the schools today were honest, they would admit that they’re still working the bugs and kinks out of this weird evaluation system. They would run a few pilot studies here or there, with no stakes for anyone, so nobody cheats, and see how it goes. Then they would either revise it or get rid of it entirely.

Instead, starting in Washington, DC just a few years ago, with Michelle Rhee and Adrian Fenty leading the way locally and obscenely rich financiers funding the entire campaign, they rushed through an elaborate system of secret formulas and rigid rubrics, known as IMPACT. It appears that their goal is to demoralize teachers and to convince the public that public schools need to be closed and turned over to the same hedge fund managers who brought us the current Great Depression, high unemployment rates, and foreclosures, all while the gap between the very wealthiest and the rest of the population, especially the bottom 50%, has become truly phenomenal.

### Here’s a little table from the report, same page:

(Just so you know, I’ve been giving r^2 in my previous columns, not r. I believe they are using r; to compare that to my previous analyses, take 0.34 and square it, and you get about 0.1156. That means that the IVA “explains” about 12% of the variation in the TLF scores, and vice versa. Pretty weak stuff.)
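The conversion from their r to my r² is one line of arithmetic:

```python
# Squaring the report's correlation of 0.34 gives the share of the
# variance in one measure "explained" by the other.
r = 0.34
print(round(r * r, 4))  # 0.1156, i.e. about 12%
```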

Would I be alone in suggesting that the “hope” of a strong correlation has not been fulfilled? In fact, I think that’s a pretty measly correlation, and it suggests to me that neither the formal TLF evaluation rubrics scored by administrators nor the Individual Value-Added magic secret formulas does an adequate, or even competent, job of measuring the output of teachers.

Published on March 8, 2012 at 9:01 pm