Does this sound familiar?

A veteran DCPS math teacher at Hardy MS has been asking DCPS top administrator Jason Kamras for details, in writing, on exactly how the “Value-Added” portion of the IMPACT teacher evaluation system is calculated. To date, she has still not received an answer.

How the “Value-Added” portion of IMPACT actually works is rather important: for many teachers, it counts for about half of their annual score. The general outline of the VAM is explained in the IMPACT documents, but none of the details. Supposedly, all of a teacher’s students’ scores in April are compared with those same students’ scores from the previous April; then the socio-economic status and current achievement levels of those students are somehow taken into account, and the teacher is labeled with a single number that supposedly shows how much his or her students gained during that year relative to all other similar students.

But how those comparisons are made is totally unclear. So far I have heard that the algorithm, or mathematical procedure, is designed so that exactly half of all teachers are deemed, in non-technical terms, ‘below average’ in that regard — which of course will set them up to be fired sooner or later. Whether that’s an accurate description of the algorithm, I don’t know. Ms. Bax told me that she heard DCPS expects teachers with almost all Below-Basic students to achieve tremendous gains with those students. However, my own educational research indicates the opposite.

In any case, Kamras and his other staff haven’t put any details in writing. Yet.

At one point Kamras writes that “we use a regression equation to determine this score.” OK, Bax and Kamras and I all teach or taught math. We all understand a fair amount about regression equations. But there are lots of such equations! Just saying that a regression equation is involved is like saying Newton “used algebraic equations” in writing the “Principia Mathematica,” or that Tolstoy used “words and sentences” when he wrote “War and Peace.” And just about equally informative.
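
To show what even a partial answer might look like, here is a purely hypothetical sketch, in Python, of one generic linear value-added model. Every variable name and coefficient below is invented for illustration; nothing here is the actual DCPS/Mathematica specification, which is precisely what Ms. Bax was asking to see.

```python
# A hypothetical linear value-added sketch. All names and coefficients
# are invented for illustration; this is NOT the DCPS/Mathematica model.

def predicted_score(prior_score, is_male, is_esl, free_lunch,
                    b0=5.0, b_prior=0.95, b_male=-0.5,
                    b_esl=-2.0, b_lunch=-1.5):
    """One generic linear form: intercept + prior-score term + demographic terms."""
    return (b0 + b_prior * prior_score + b_male * is_male
            + b_esl * is_esl + b_lunch * free_lunch)

def teacher_value_added(students):
    """Average of (actual - predicted) across a teacher's roster."""
    residuals = [s["actual"] - predicted_score(s["prior"], s["male"],
                                               s["esl"], s["lunch"])
                 for s in students]
    return sum(residuals) / len(residuals)

# Two fictional students.
roster = [
    {"prior": 70, "actual": 75, "male": 1, "esl": 0, "lunch": 1},
    {"prior": 85, "actual": 88, "male": 0, "esl": 0, "lunch": 0},
]
print(round(teacher_value_added(roster), 3))  # → 3.875
```

Even this toy version shows what a real written answer would have to specify: which covariates enter the equation, what their coefficients are, and how student-level residuals are aggregated into a single teacher score.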

I attach a series of emails between Ms. Bax, an 8th grade math teacher, and Mr. Kamras and a few other people in DCPS Central Administration. The emails were supplied to me, in frustration, by Ms. Bax. I used color to try to make it clear who was writing what: green is Ms. Bax, and reds, browns, and pinks denote the various administrators. Note that this exchange of emails started in September of 2010.

Perhaps publicizing this exchange might prod Mr. Kamras to reveal details of a system that has already been shown by Mathematica (the same group that designed it) to be highly flawed and unreliable?

=========================================

From: “Bax, Sarah (MS)” <sarah.bax@dc.gov>
Date: Mon, 13 Sep 2010 17:12:41 -0400
To: Jason Kamras <jason.kamras@dc.gov>
Subject: Impact

Jason,

I hope the year is off to a great start for you.

I am writing concerning the IMPACT IVA score calculations. I am very frustrated with this process on a number of fronts. First, I would like to have an actual explanation of how the growth scores are calculated. As they have been explained, the process seems quite flawed in actually measuring teacher effectiveness. Further, I would like to know if teachers have any recourse in having their scores reexamined, etc.

Last year, 89% of the eighth graders at Hardy scored in the Proficient or Advanced range in Mathematics. As the sole eighth grade mathematics teacher last year, I taught almost all of the students except for a handful that were pulled for special education services. Beyond this accomplishment, I am extremely proud to report that 89% of our Black students were at that Proficient or Advanced level.

With statistics like these, I take issue with a report that scores my IVA at 3.4 (to add insult to this injury, even under your system if my students had earned just one-tenth more of a growth point, my IVA would be a 3.5 and I would be considered highly effective).

Frankly, I teach among the best of the best in DCPS– with very few of us rated highly effective. The IMPACT scoring system has had a terrific negative impact on morale at our school.

Kindly,

Sarah Bax

———————————————–

From: Kamras, Jason (MS)
Sent: Tue 9/14/2010 7:50 AM
To: Bax, Sarah (MS)
Subject: Re: Impact

Hi Sarah,

I’m disappointed to hear how frustrated you are. Can you give me a call at 202-321-1248 to discuss?

Thanks,

Jason

Jason Kamras

Director, Teacher Human Capital

——————————————————–

Jason,

I really do not have the time to call to discuss my concerns. If you would forward the requested information regarding specific explanation about the growth scores calculation process I would be most obliged.

I would like specifics about the equation. Please forward my inquiry to one of your technical experts so that he or she may email me with additional information about the mathematical model.

Kindly,

Sarah

—————————————————

From: Barber, Yolanda (OOC)
Sent: Mon 12/20/2010 2:12 PM
To: Bax, Sarah (MS)
Subject: FW: IMPACT Question

Ms. Bax,

Sorry for the barrage of emails, but I received a response concerning your question. Please read the response below. I hope this helps. Please let me know if you’d like to continue with our session on the 4th. Thanks again.

Best!

Yolanda Barber

Master Educator | Secondary Mathematics

District of Columbia Public Schools

Office of the Chancellor

——————————————————–

From: Rodberg, Simon (OOC)
Sent: Monday, December 20, 2010 2:05 PM
To: Barber, Yolanda (OOC); Lindy, Benjamin (DCPS); Gregory, Anna (OOC)
Subject: RE: IMPACT Question

Hi Yolanda,

We will be doing more training, including full information on Ms. Bax’s question, this spring. We’d like to give a coherent, full explanation at that time rather than give piecemeal answers to questions in the meantime.

Thanks, and I hope you enjoy your break.

Simon

Simon Rodberg

Manager, IMPACT Design, Office of Human Capital

———————–

Yolanda,

I got notice a couple of weeks ago that I have jury duty on your office hours day at Hardy so I won’t be able to make the appointment. I’m sorry to miss you, but appreciate your efforts to send my concerns to the appropriate office.

The response below is obviously no help at all as it clearly indicates the Office of Human Capital is unwilling to answer my specific question regarding the calculations involved in determining my rating. I believe my only request was to have an accurate description of how the expected growth score is calculated. My question has been left unanswered since last spring. Can you imagine if a student of mine asked how his or her grade was determined and I told them I couldn’t provide a coherent explanation right now, but see me in a year?

Thanks again for your help. I look forward to meeting you in person in the future!

Kindly,

Sarah

——————————-

From: Bax, Sarah (MS)
Sent: Tuesday, December 21, 2010 11:09 AM
To: Rodberg, Simon (OOC)
Cc: Henderson, Kaya (OOC)
Subject: FW: Appointment #80 (from DCPS Master Educator Office Hours Signup)

Mr. Rodberg,

I am requesting a response to my inquiry below: ‘explanation of actual algorithm to determine predicted growth score’.

Kindly,

S. Bax

—————————-

Ms. Bax,

What’s a good phone number to reach you on? I think it would be easiest to explain over the phone.

Thank you, and happy holidays.

Simon

Simon Rodberg

Manager, IMPACT Design, Office of Human Capital

—————————–

From: Kamras, Jason (DCPS) [mailto:jason.kamras@dc.gov]
Sent: Sun 12/26/2010 1:42 PM
To: Bax, Sarah (MS)
Cc: Henderson, Kaya (OOC)
Subject: Value-added calculation

Hi Sarah,

The Chancellor informed me that you’re looking for a more detailed explanation of how your “predicted” score is calculated. In short, we use a regression equation to determine this score. If you’d like to know more about the specifics of the equation, please let me know and I can set up a time for your to meet with our technical experts.

Happy New Year!

Jason

Jason Kamras

Chief, Office of Human Capital

—————————————

On 12/27/10 12:17 PM, “Bax, Sarah (DCPS-MS)” <sarah.bax@dc.gov> wrote:

Jason,

I have requested an explanation of the value-added calculation since September, with my initial request beginning with you (see email exchange pasted below). I would like specifics about the equation. Please forward my inquiry to one of your technical experts so that he or she may email me with additional information about the mathematical model.

Kindly,

Sarah

————————————

On 12/27/10 12:23 PM, “Kamras, Jason (DCPS)” <jason.kamras@dc.gov> wrote:

My deepest apologies, Sarah. I’ll set this up as soon as I get back.

Jason Kamras

Chief, Office of Human Capital

—–Original Message—–

From: Kamras, Jason (DCPS) [mailto:jason.kamras@dc.gov]
Sent: Tue 1/25/2011 11:02 PM
To: Bax, Sarah (MS)
Subject: FW: Value-added calculation

Hi Sarah,

I just wanted to follow up on this. When could we get together to go over the equation?

Hope you’re well,

Jason

Jason Kamras

Chief, Office of Human Capital

———————————-

From: Bax, Sarah (MS)
Sent: Fri 1/28/2011 1:15 PM
To: Kamras, Jason (DCPS)
Subject: RE: Value-added calculation

Jason,

I really would just like something in writing that I can go over– and then I could contact you if I have questions. It is difficult to carve out meeting time in my schedule.

Kindly,

Sarah

————————–

From: “Bax, Sarah (DCPS-MS)” <sarah.bax@dc.gov>
Date: Thu, 10 Feb 2011 14:05:43 -0500
To: Jason Kamras <jason.kamras@dc.gov>
Subject: FW: Value-added calculation

Jason,

I didn’t hear back from you after this last email.

Kindly,

Sarah

———-

From: Kamras, Jason (DCPS) [mailto:jason.kamras@dc.gov]
Sent: Thu 2/10/2011 6:00 PM
To: Bax, Sarah (MS)
Subject: Re: Value-added calculation

Ugh. So sorry, Sarah. The only thing we have in writing is the technical report, which is being finalized. It should be available on our website this spring. Of course, let me know if you’d like to meet before then.

Best,

Jason

Jason Kamras

Chief, Office of Human Capital

————————————-

On Feb 25, 2011, at 9:29 PM, “Bax, Sarah (MS)” <sarah.bax@dc.gov> wrote:

Jason,

How do you justify evaluating people by a measure [for] which you are unable to provide explanation?

-Sarah

————–

Sat, February 26, 2011 11:25:33 AM

Sarah,

To be clear, we can certainly explain how the value-added calculation works. However, you’ve asked for a level of detail that is best explained by our technical partner, Mathematica Policy Research. When I offered you the opportunity to sit down with them, you declined.

As I have also noted previously, the detail you seek will be available in the formal Technical Report, which is being finalized and will be posted to our website in May. I very much look forward to the release, as I think you’ll be pleased by the thoughtfulness and statistical rigor that have guided our work in this area.

Finally, let me add that our model has been vetted and approved by a Technical Advisory Board of leading academics from around the country. We take this work very seriously, which is why we have subjected it to such extensive technical scrutiny.

Best,

Jason

Jason Kamras

Chief, Office of Human Capital

——————————-

Jason,

To be clear, I did not decline the opportunity to speak with your technical partner. On December 27th I wrote to you, “I would like specifics about the equation. Please forward my inquiry to one of your technical experts so that he or she may email me with additional information about the mathematical model.” I never received a response to this request.

In addition, both you and Mr. Rodberg offered to provide information about the equation to me on the phone or in person, but have yet to agree to send any information in writing. You have stated, “I just wanted to follow up on this. When could we get together to go over the equation?” Mr. Rodberg wrote, “What’s a good phone number to reach you on? I think it would be easiest to explain over the phone.”

Why not transpose the explanation you would offer verbally to an email? Please send in writing the information that you do know about how the predicted growth score is calculated. For instance, I would expect you are familiar with what variables are considered and which data sources are used to determine their value. Let me know what you would tell me if I were to meet with you.

As a former teacher, you must realize the difficulty in arranging actual face-time meetings given my teaching duties. And as a former mathematics teacher, I would imagine you could identify with my desire to have an understanding of the quantitative components of my evaluation.

Sincerely,

Sarah

Jason Kamras is not suited to the work he is doing. I really wish he would go back into the classroom where his utility value is higher. He is incompetent and I hope the press begins to press him as much as he is trying to press us.


I wasn’t so terribly impressed by his accomplishments as a teacher — what little I could discover.


IMPACT is a completely ridiculous method for evaluating teachers. Upper management is unable to put into writing the methodology behind their madness, yet they justify its use to release large numbers of teachers from their positions. How are teachers supposed to know what standards they are held to if those standards are never put forward? The e-mail chain you show is completely unprofessional, yet it is also very typical of the bureaucratic chaos that exists in DCPS. I regret that you have to go through this and have your time wasted by people who clearly don’t respect the fact that, as a teacher, you are extremely pressed for time; however, I can’t say that I am entirely surprised by this account.



As you know, Mathematica Policy Research is the (highly paid) contractor that developed the VAM part of IMPACT for DCPS. See the following links:

Mathematica Designs Value-Added Model for the DC Public Schools

Using Value-Added Growth Models to Track Student Achievement

Mathematica says “DCPS sought an objective, fair, and transparent value-added model. We developed such a model in accord with these principles. It was reviewed by two independent value-added experts, Eric Hanushek of the Hoover Institution at Stanford University and Tim Sass of Florida State University.”

This must be the “Technical Advisory Board of leading academics from around the country” that Jason Kamras referred to in his email. In any event, one would also like to see their report that “vetted and approved” the VAM model.

Home pages for both are:

for Mr. Hanushek.

for Mr. Sass.

Mr. Hanushek has been in the news a great deal recently and appears to be the go-to person for a lot of media quotes. Just search for him on the washingtonpost.com site for quite a few. I would say it’s fair to call him “controversial”.

One questions the complete impartiality of these two experts, especially when contrasted to the EPI (Economic Policy Institute) report:

You’ll also note that Michelle Rhee thinks a great deal of Mr. Hanushek and that may reduce his impartiality w/r/t the DCPS contract. Listen to this Aspen Institute 07/03/10 Socrates Benefit Dinner — Video of Panel Discussion on Education Leadership and Education Reform:

At about 58 minutes in, Rhee says that she agrees with Mr. Hanushek 100%.

Also see this article:

“Proceed with Caution on Value-Added” Posted by Paul Teske Nov 29th, 2010.

and this one:

Norm’s Notes: Eric Hanushek, Politically Inspired “Research”

and there are many other cautionary articles that are well written and researched available for those that look for them. (Consider the LA and NYC controversies.)

Mathematica and DCPS have never produced the “Technical Support Documentation” to go along with their work on IMPACT’s VAM. They say “we worked with DCPS to construct a model appropriate for its district. In the course of making decisions about the value-added model, we presented DCPS with policy options that were informed by the best available research and empirical results using DCPS data. DCPS weighed the trade-offs associated with these options in the context of its goals and circumstances.”

So….. what were those “policy options” and “trade-offs” and “the best available research” that account for classroom “variations” due to student assignment, team teaching, and all the other factors? Maybe if we actually had the documentation we’d know a lot more, or be able to ask better-informed questions. This should be part of the new openness promised by Mr. Gray. Mayor Gray, Ms. Henderson and even Mathematica have promised “transparency” with both IMPACT and the VAM model. To live up to this promise, DCPS must publish the technical documentation.

One would also like to know how it is that Mathematica can use “scale scores” from OSSE’s DC-CAS tests and track individual students from year to year when no one else in DC or DCPS can do that or knows how to do that (or can begin to explain it). (Just look at the failures of SLED). Especially since “scale scores” are not designed to do that across grades.

DC and DCPS need to do appropriate and in-depth oversight of the whole IMPACT project (experiment). It was unrealistic to use it fresh out of the gate, as MR did, with total presumption of accuracy (and to pay lots of money to the winners). Due to its technical nature, such oversight will logically require a panel of unbiased experts to delve deeply (and not just make superficial comments). What cross-checks and balances are currently being done to verify results so far? …..and the many other questions.

One would think that the DCPS math teachers should also be able to understand the technical documentation, if clearly written in sufficient detail. Indeed, there needs to be a way to challenge the calculations and the formula used for “growth” for any given classroom. One would think that all the math teachers in the District would want to know the details and help their colleagues to understand it.

Finally, why is Ms. Henderson so sensitive to pressure and questions about IMPACT as reported in the Examiner? The article says “She does not like hearing her contract or evaluation tool challenged.” One would think that Ms. Henderson would welcome all opportunities to explain in depth both of those accomplishments especially in the context of Mr. Gray’s pointed emphasis on collaboration and community outreach. Any good contract or evaluation tool should stand up to scrutiny.

So…there are many open, important and unanswered questions still on the table. Such questions have come from multiple teachers across the District. Mr. Kamras has stonewalled them all in similar fashion. Even to the point of intimating that “it’s too complicated to understand” (or maybe that he just can’t explain it himself). This is a great failure of Mr. Kamras and Ms. Henderson in the “Human Capital” department to adequately explain 50% of the evaluation!

Now in its second year… How has the VAM model changed? What formula adjustments have been made? What was learned from the first year? How do we know the calculations actually matched what happened in each classroom? How has it been cross- and back-checked?

These questions are in addition to the straightforward ones: What formula was used to evaluate me last year? How has it changed for this year? How do I know my individual classroom factors were accounted for? And the larger question: is such a VAM calculation even appropriate for rating teachers and making merit-pay and employment decisions, given the reservations expressed in the above research reports?

…///…(anon)


Very revealing post GF. Thanks.


Boy, this is very telling about the skill these folks display in spreading bull.

This is typical of the Rhee/Fenty/Henderson administration. So the truth is that IMPACT is still in the planning stage, yet people are being evaluated and FIRED using this instrument.


Can we hope that Mayor Vincent Gray and his advisors on DCPS will read this exchange and immediately halt the use of the IMPACT instrument? And replace Jason Kamras with a competent administrator with expertise in statistics?


Wow! Jason cannot give in writing how LAST YEAR’S evaluations were calculated. So happy that he is excited about THIS year’s release this spring. A little late to be letting teachers know how the calculations are being done for this year. It should have been published BEFORE the school year began.


Presumably this is covered by the Freedom of Information Act?


Not surprising that they don’t want to put anything in writing and that they claim that it would be easier to explain over the phone. Pesky written documents have a way of coming back to bite you.


That’s what I kept thinking. “Let’s talk about it over the phone.” “Let’s schedule a meeting.” I mean, Kamras was very cordial and professional, but it was clear he wanted it done on his terms. Perhaps he felt like she would be less confrontational in person…?

There are some things that might be easier to explain over the phone. I can’t fathom how a mathematical algorithm would be one of them.


The other possible motive here (although a formula has finally been given):

If he does it over the phone or in a meeting, no paper trail is created. It would all come down to the teacher’s account of what she thinks she was told.


Truly, these e-mails are fascinating.

Does Harper’s still publish “found documents” of this type at the start of each issue? These e-mails would be perfect for that format. They deserve the widest possible viewing.


Can you contact Mathematica directly and start an investigation? Your last bit of digging went viral. I’d be particularly interested in how SPED and ELL students factor into the equation.

Keep up the good work!!!


I hope all teachers have seen the recent American Educational Research Journal article from February 2011 that notes that even with the calculations, without more information about the DC-CAS and its validity and reliability and how it stacks up against other tests, the value added scores are meaningless. The abstract is at http://aer.sagepub.com/content/48/1/163.abstract and a key conclusion from the article is

“these assessments produce substantially different answers about individual teacher performance and do not rank individual teachers consistently. Even using the same test but varying the timing of the baseline and outcome measure introduces a great deal of instability to teacher rankings. Therefore, if a school district were to reward teachers for their performance, it would identify a quite different set of teachers as the best performers depending simply on the specific reading assessment used.” (p. 187)

The science just does not support the use of these measures for teacher evaluation or rewarding teaching performance.


Many times the value-added calculations must be kept confidential by contract, as the vendor (MPR in this case) would like to sell them to other schools in the future. Trade-secret protections allow this as an exception to FOIA.


I think it is very sketchy that the DCPS officials refuse to put anything about IMPACT into writing and pretend to respond to your questions with offers to “talk on the phone.” This is typical conduct of administrators who don’t want anything to come back and bite them, as Mr. Davis noted above in his comment. I applaud Ms. Bax for her persistence in her efforts to force the DCPS officials to commit to something (anything!) in writing with regard to IMPACT. The fact that they can’t, or won’t, provide such an explanation is extremely suspicious.


Get this story out in the media, please; I think the public does not have a clue – obviously neither does anyone else!!! What is also shocking is the large bonuses that were paid to teachers based on faulty data, and the firings. You’d think that people could sue for this information in court.


While I give M. Rhee a C grade, unlike many blog commenters, this thread is truly spectacular. I have seen Mr. K roasted many a time and never knew why, other than hearsay. But this is a colossal, well-documented runaround and irresponsible evasion. If, indeed, the Mayor emerges from his Big SUV and appoints Ms. Henderson on a permanent basis, part of the deal should be to replace Mr. K. There is no possible excuse for hiding the calculation method, regardless of the state of the final report. The Mayor himself should be ordering its release, or perhaps we can start a mayoral recall campaign.


Ms. Bax doesn’t tell us what the test scores of her students were the previous year, when they were seventh graders. If they were high, however awarded, then she would be expected to raise them in order to be judged highly effective.

While Ms. Rhee has gone on and on about there being no differential in expectations, the equation makes explicit that they exist: boys and girls are expected to learn at different rates. Because few black students are ESL, if the ESL coefficient is negative, then black students, along with other non-ESL students, are expected to learn at a greater rate. (Odd, that ESL status is unchanging.) And students getting a free lunch are expected to learn at a different rate than those getting a subsidized lunch, who in turn are expected to learn at a different rate than those getting no subsidy. Again, that status appears to be fixed for both years, though I don’t know which would contradict the Rhee dogma more: that children relieved of poverty-caused hunger might be expected to learn more, or that a change in income level should not affect what is expected of the student, and therefore of the teacher.


With the coefficients, Ms. Bax could calculate how marginal changes in class composition would shift how much her students are expected to learn. This is straightforward for the gender coefficient: you could imagine, however improbable, an all-boy or all-girl class in which none of the other variables affecting expected performance change.

That, unfortunately, is not true of the other variables or characteristics, which are correlated. So the marginal effects, interpretable as “causal,” are not the same as the “impact” coefficients.
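
The point about marginal effects can be made concrete with a short hypothetical sketch. The coefficients below are invented for illustration; the actual DCPS/Mathematica coefficients had not been published at the time of this exchange.

```python
# Invented coefficients for illustration only; NOT the actual DCPS model.
B_MALE, B_ESL, B_LUNCH = -0.5, -2.0, -1.5

def expected_class_average(base, n_students, n_male, n_esl, n_lunch):
    """Expected mean score as a linear function of class composition."""
    return base + (B_MALE * n_male + B_ESL * n_esl + B_LUNCH * n_lunch) / n_students

# Swapping one girl for one boy in a class of 25, holding everything else
# fixed, shifts the expectation by exactly B_MALE / 25 -- the clean marginal
# effect that the gender coefficient permits.
before = expected_class_average(80.0, 25, 12, 3, 10)
after = expected_class_average(80.0, 25, 13, 3, 10)
print(round(after - before, 6))  # → -0.02, i.e. B_MALE / 25
```

As the comment notes, though, this one-variable-at-a-time reading breaks down for correlated characteristics such as ESL status and lunch subsidy, which cannot realistically be varied in isolation.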
