## What I actually had time to say …

Since I had to abbreviate my remarks, here is what I actually said:

I am Guy Brandenburg, retired DCPS mathematics teacher.

To depart from my text, I want to start by proposing a solution: look hard at the collaborative assessment model being used a few miles away in Montgomery County [MD] and follow the advice of Edwards Deming.

Even though I personally retired before [the establishment of the] IMPACT [teacher evaluation system], I want to use statistics and graphs to show that the Value-Added measurements used to evaluate teachers are unreliable and invalid, and do not help teachers improve instruction. To the contrary: these IVA [Individual Value-Added] measurements are driving a number of excellent, veteran teachers to resign from DCPS, or be fired, and go elsewhere.

Celebrated mathematician John Ewing says that VAM is “mathematical intimidation” and a “modern, mathematical version of the Emperor’s New Clothes.”

I agree.

One of my colleagues was able to pry the value-added formula [used in DC] from [DC data honcho] Jason Kamras after SIX MONTHS of back-and-forth emails. [Here it is:]

One problem with that formula is that nobody outside a small group of highly-paid consultants has any idea what the values of any of those variables are.

In not a single case has the [DCPS] Office of Data and Accountability sat down with a teacher and explained, in detail, exactly how a teacher’s score is calculated, student by student and class by class.

Nor has that office shared that data with the Washington Teachers’ Union.

I would ask you, Mr. Catania, to ask the Office of Data and Accountability to share with the WTU all IMPACT scores for every single teacher, including all the sub-scores, for every single class a teacher has.

Now let’s look at some statistics.

My first graph shows completely random data points that I had Excel make up for me [and plot as x-y pairs].

Notice that even though these points are completely random, Excel still found a small correlation: r-squared was about 0.08, so r was about 29%.

Now let’s look at a very strong case of negative correlation in the real world: poverty rates and student achievement in Nebraska:

The next graph is for the same sort of thing in Wisconsin:

Again, quite a strong correlation, just as we see here in Washington, DC:

Now, how about those Value-Added scores? Do they correlate with classroom observations?

Mostly, we don’t know, because the data is kept secret. However, someone leaked to me the IVA and classroom observation scores for [DCPS in] SY 2009-10, and I plotted them [as you can see below].

I would say this shows pretty much no correlation at all. It certainly gives teachers no guidance on what to improve in order to help their students learn better.

And how stable are Value-Added measurements [in DCPS] over time? Unfortunately, since DCPS keeps all the data hidden, we don’t know how stable these scores are here. However, the New York Times leaked the value-added data for NYC teachers for several years, and we can look at those scores to [find out]. Here is one such graph [showing how the same teachers, in the same schools, scored in 2008-9 versus 2009-10]:

That is very close to random.

One last point:

Mayor Gray and chancellors Henderson and Rhee all claim that education in DC only started improving after mayoral control of the schools, starting in 2007. Look for yourself [in the next two graphs].

Notice that gains began almost 20 years ago, long before mayoral control or chancellors Rhee and Henderson, long before IMPACT.

To repeat, I suggest that we throw out IMPACT and look hard at the ideas of Edwards Deming and the assessment models used in Montgomery County.

Published on December 15, 2013 at 12:47 pm

## More on the 2013 NAEP

I would like to present some more results from the latest batch of released scores from the 2013 National Assessment of Educational Progress, or NAEP, so you can judge for yourselves.

As usual, the charlatans and quacks who are guiding US educational policy today claim that the results are clear proof that their ill-considered policies are working miracles, especially in the District of Columbia, my home town.

I claim that there has been no miracle. Yes, NAEP reading and math scores in the 4th and 8th grades are gradually but unevenly increasing, as has been the case for the past twenty years or so. But there has been no Rhee/Kamras/Henderson miracle in DC, or at least not one we can see on these graphs: no huge, enormous jump that trumps all the growth prior to the mayoral takeover of the DC public schools.

Plus, we don’t yet know what weight the NAEP statisticians give to the scores of the kids in the regular public schools, those in the private or religious schools, or those in the charter schools. We do know that the proportion of white students counted in DC has increased substantially since the 1990s, and that the proportion of black kids has shrunk, but we can only guess what that means.

For each graph, I have drawn a thick, red, vertical line to distinguish the pre-“Rhee-form” era from the Era of Excellence and Data. See if you honestly see significant differences.

First, average NAEP math scores by state for 8th-grade kids, 1990-2013. Remember, please, this is public AND private schools. I chose these states because they were the highest- or lowest-scoring ones in the nation (MA and MS) or because they are located near DC.