2013 DC-CAS Scores Released (Sort of)

The DC Office of the State Superintendent has released a spreadsheet with overall proficiency rates in reading and math, and a combination of the two, for the spring 2013 DC-CAS multiple-choice test. You can see the whole list here and look for your favorite school, charter or public.

Overall, percentages of students deemed ‘proficient’ or ‘advanced’ seem to be up a bit from last year. At some schools, the scores went up enormously — which we know is often a suspicious sign.

But there are several major problems with this data:

(1) We have no idea whether this year’s test was easier, harder, or the same as previous years’ tests, since the manufacturer is allowed to keep every single item secret. The vast majority of teachers are forbidden even to look at the tests they are administering, to see if the items make sense, match the curriculum, are ridiculously hard, or are ridiculously easy.

(2) We have no idea what the cutoff scores are for any of the categories: “basic”, “below basic”, “proficient”, or “advanced”. Remember the scams that the education authorities pulled in New York State a few years ago on their state-wide required student exams? If not, let me remind you: every year they kept reducing the cutoff scores for passing, and as a result, the percentages of students passing got better and better. However, those rising scores didn’t match the results from the NAEP. It was shown that on certain tests a student needed to answer only about 44% of the questions correctly to be rated ‘proficient’ — on a test where each question had four possible answers (A, B, C, D), so blind guessing alone would be expected to yield about 25%. (NYT article on this; see also the rough calculation sketched after this list.)

(3) In keeping with their usual tendency to make information hard to find, the OSSE data does not include any demographic data on the student bodies. We don’t know how many students took the tests, or what percentages belong to which ethnic group or race, or how many are on free or reduced-price lunch, or are in special education, or are immigrants with little or no English. Perhaps this information will come out on its own, perhaps not. It is certainly annoying that nearly every year they use a different format for reporting the data.
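Just to put numbers on why a cut score that low is meaningless, here is a rough back-of-the-envelope calculation. The test length and cut score below are my own hypothetical figures, not data from the New York exams; the point is only that on a four-choice test, blind guessing already gets you a quarter of the way there, so a cut score in the low-to-mid-40-percent range leaves very little daylight between ‘proficient’ and pure chance.

```python
# Rough illustration with HYPOTHETICAL numbers (not the actual NY test specs):
# how likely is a student who guesses blindly on every four-choice item
# to reach a low "proficient" cut score?
from math import comb

def p_at_least(k: int, n: int, p: float) -> float:
    """P(X >= k) for X ~ Binomial(n, p): chance of k or more lucky guesses."""
    return sum(comb(n, i) * (p ** i) * ((1 - p) ** (n - i)) for i in range(k, n + 1))

n_items = 40                         # assumed number of multiple-choice items
cut = round(0.44 * n_items)          # assumed cut score: 44% of items correct
p_guess = 0.25                       # four answer choices (A, B, C, D)

print(f"Expected score from guessing alone: {p_guess * n_items:.0f} of {n_items} items")
print(f"Cut score: {cut} of {n_items} items")
print(f"Chance a pure guesser clears the cut: {p_at_least(cut, n_items, p_guess):.3%}")
```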

I think it’s time for a Freedom of Information Act request to get this information.

Details on those ‘Dozens and Dozens’ of Schools

As promised, I am posting the charts that I laboriously put together from data published by OSSE and a spreadsheet leaked to me, showing how unconvincing the progress was at the only 13 schools that WaPo reporter Jay Mathews could find that even vaguely resembled the ones that Michelle Rhee bragged about.

I am not going to look at the schools with large percentages of white students or at McKinley Tech.

First, here are my charts for Payne ES, Plummer ES and Prospect LC. I color-coded the chart much the way that Erich Martel does. That is, each diagonal sloping up to the right represents an individual cohort of students as they move from grade to grade, from year to year. Obviously I have no way of telling how many students transferred into or out of each cohort, but my experience in several DC public schools in all four quadrants indicates that students don’t move around all that much.

Thus, at Payne, under “Reading”, in the column for 3rd grade, in the row for 2010, you see the number 27. That means that at Payne, in the 3rd grade, in school year 2009-2010, about 27% of the students were ‘proficient’ or ‘advanced’ in reading according to their scores on the DC-CAS. The next year, most of those kids returned as fourth graders in the same school, and about 23% of them ‘passed’ the DC-CAS in reading because their answer sheets had enough correct answers for them to score ‘proficient’ or ‘advanced’. Note, that cell is also blue. But the next year, 2011-12, the percentage of students ‘passing’ the DC-CAS doubled, to 46%. I find that jump worthy of a red flag. Either that teacher did something astounding, or the students are an entirely different group of kids, or else someone cheated (most likely not the students).

[Chart: DC-CAS pass rates by cohort at Payne ES, Plummer ES, and Prospect LC]

 

Any time I saw a drop or rise of 10 percentage points or more from the year before for a single cohort, I noted a “red flag” by writing the percentage of passing scores in bold red.
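For anyone who wants to replicate this, here is a minimal sketch in Python of the diagonal cohort-tracking and the ten-point red-flag rule described above. The grade-by-year table is illustrative only (the 27/23/46 reading figures echo the Payne example; the rest are made up), not a transcription of my charts.

```python
# Sketch of the cohort red-flag check described above (illustrative data only).
# pass_rate[year][grade] = percent scoring 'proficient' or 'advanced' on the DC-CAS.
pass_rate = {
    2010: {3: 27, 4: 30, 5: 35},
    2011: {3: 25, 4: 23, 5: 33},   # 2010's 3rd graders are (mostly) 2011's 4th graders
    2012: {3: 28, 4: 26, 5: 46},   # ...and 2012's 5th graders
}

FLAG = 10  # flag any cohort change of 10 percentage points or more

def follow_cohorts(table):
    """Walk each diagonal (same kids, next grade, next year) and flag big swings."""
    years = sorted(table)
    for start_grade in sorted(table[years[0]]):
        prev = table[years[0]][start_grade]
        for step, year in enumerate(years[1:], start=1):
            grade = start_grade + step
            if grade not in table.get(year, {}):
                break
            curr = table[year][grade]
            mark = "  <-- RED FLAG" if abs(curr - prev) >= FLAG else ""
            print(f"cohort entering grade {start_grade} in {years[0]}: "
                  f"{year} grade {grade}: {prev}% -> {curr}%{mark}")
            prev = curr

follow_cohorts(pass_rate)
```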

Notice that at Plummer, the cohort in blue, in reading, went from 40% passing to 18% passing to 46% passing in the course of three years. The cohort in green in math went from 60% passing to 18% passing to 29% passing.

At Prospect, the cohort in yellow goes from 25% passing to 0% passing to 5% passing to 5% passing in reading. In math, the same group goes from 13% to 31% to 0% to 24% passing.

You see anything weird with that? I sure do.

Next come Thomson, Tubman, Hart, and Sousa:

[Chart: DC-CAS pass rates by cohort at Thomson, Tubman, Hart, and Sousa]

 

The only one of these schools with a chart not covered with ‘red flags’ is Hart.

Your thoughts? (As usual, the “comment” button, below the end of this story, is tiny. Sorry about that.)

 

Double-Digit Increases and Decreases in NCLB Pass Rates: Real or Fraudulent?

Many DC public and charter schools have had numerous double-digit year-to-year changes in their published proficiency rates from 2008 through 2012.

Of course, that sort of change may be entirely innocent, and even praiseworthy if it’s in a positive direction and is the result of better teaching methods. However, we now know that such changes are sometimes not innocent at all and reflect changes in methods of tampering with students’ answer sheets. (And we also know that DC’s Inspector General and the Chancellors of DCPS are determined NOT to look for any wrong-doing that might make their pet theories look bad.)

Whether these are innocent changes, or not, is for others to decide – but these schools’ scores are worth looking at again, one way or another. If it’s fraud, it needs to be stopped. If double-digit increases in DC-CAS pass rates are due to better teaching, then those methods need to be shared widely!

What I did was take a spreadsheet published by OSSE and Mayor Gray’s office and examine how the percentages of “proficient” students in reading and math at each school changed from one year to the next, or from one year to two years later, for the period SY 2007-08 through SY 2011-12, five full years. I then counted how many times a school’s listed percentage of “proficient” students went up, or went down, by ten or more percentage points over those one- and two-year intervals.
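Here is a minimal sketch, in Python with invented numbers, of the counting rule as I have described it: compare each year’s rate with the following year’s and with the one after that, and tally every change of ten or more percentage points, up or down.

```python
# Sketch of the double-digit-change count described above (invented example data).
# rates[school] = 'proficient' percentages for SY 2007-08 through SY 2011-12.
rates = {
    "Example ES": [22, 35, 31, 48, 36],
}

def count_double_digit_changes(series, threshold=10):
    """Count changes of +/- `threshold` points, one year out and two years out."""
    ups = downs = 0
    for i, base in enumerate(series):
        for later in series[i + 1 : i + 3]:   # next year, and the year after that
            diff = later - base
            if diff >= threshold:
                ups += 1
            elif diff <= -threshold:
                downs += 1
    return ups, downs

for school, series in rates.items():
    up, down = count_double_digit_changes(series)
    print(f"{school}: {up} up, {down} down, {up + down} double-digit changes")
```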

One charter school, D.C. Preparatory Academy PCS – Edgewood Elementary Campus, had ELEVEN double-digit changes from year to year or from one year to two years later. All were upward changes. Perhaps these are really the results of educational improvements, perhaps not. I have no way of knowing. If it’s really the result of better teaching, great! Let their secrets be shared! If it’s not legitimate, then the fraud needs to end.

Two regular DC public elementary schools, Tyler and Hendley, both had TEN double-digit changes measured in the same way. Both had four increases of ten or more percentage points and six decreases of at least that size.

Six schools had NINE double-digit changes. After the names of each school, I will list how many of these were in the positive and negative directions (i.e., up or down). Here they are:

  1. Burroughs EC (3 up, 6 down)
  2. D.C. Bilingual PCS (8 up, 1 down)
  3. Kimball ES (2 up, 7 down)
  4. Meridian PCS (5 up, 4 down)
  5. Potomac Lighthouse PCS (6 up, 3 down)
  6. Wilson J.O. ES (2 up, 7 down)

Thirteen schools had EIGHT double-digit year-to-year changes in proficiency rates. I will list them similarly:

  1. Aiton ES (0 up, 8 down)
  2. Barnard ES (Lincoln Hill Cluster)  (2 up, 6 down)
  3. Cesar Chavez PCS – Capitol Hill Campus (6 up, 2 down)
  4. Coolidge SHS (3 up, 5 down)
  5. Hospitality PCS (4 up, 4 down)
  6. Houston ES (3 up, 5 down)
  7. Ludlow-Taylor ES (5 up, 3 down)
  8. Noyes ES (1 up, 7 down)
  9. Raymond ES (1 up, 7 down)
  10. Roots PCS – Kennedy Street Campus (5 up, 3 down)
  11. Septima Clark PCS (8 up, 0 down)
  12. Thomas ES (4 up, 4 down)
  13. Washington Math Science Technology (WMST) PCS (4 up, 4 down)

Eighteen schools had SEVEN double-digit year-to-year changes:

  1. Booker T. Washington PCS (4 up, 3 down)
  2. Brent ES (7 up, 0 down)
  3. Community Academy PCS – Butler Bilingual (7 up, 0 down)
  4. Garrison ES (2 up, 5 down)
  5. Hearst ES (0 up, 7 down)
  6. Imagine Southeast PCS (6 up, 1 down)
  7. LaSalle-Backus EC (1 up, 6 down)
  8. Leckie ES (2 up, 5 down)
  9. Marie Reed ES (2 up, 5 down)
  10. Martin Luther King ES (3 up, 4 down)
  11. McKinley Technology HS (7 up, 0 down)
  12. Payne ES (5 up, 2 down)
  13. Ross ES (6 up, 1 down)
  14. Sharpe Health School (4 up, 3 down)
  15. Takoma EC (0 up, 7 down)
  16. Tree of Life PCS (3 up, 4 down)
  17. Turner ES at Green (3 up, 4 down)
  18. Two Rivers Elementary PCS (7 up, 0 down)

 

Seventeen schools had SIX double-digit year-to-year changes in proficiency rates:

  1. Bruce-Monroe ES at Park View (2 up, 4 down)
  2. Burrville ES (1 up, 5 down)
  3. C.W. Harris ES (2 up, 4 down)
  4. Center City PCS – Capitol Hill Campus (6 up, 0 down)
  5. Center City PCS – Trinidad Campus (5 up, 1 down)
  6. Cesar Chavez PCS – Bruce Prep Campus (6 up, 0 down)
  7. D.C. Preparatory Academy PCS – Edgewood Middle Campus (6 up, 0 down)
  8. Ferebee Hope ES (1 up, 5 down)
  9. Friendship PCS – Blow-Pierce (2 up, 4 down)
  10. Friendship PCS – Collegiate (4 up, 2 down)
  11. Kenilworth ES (5 up, 1 down)
  12. Luke C. Moore Academy HS (4 up, 2 down)
  13. Mamie D. Lee School (4 up, 2 down)
  14. Roosevelt SHS (3 up, 3 down)
  15. Simon ES (3 up, 3 down)
  16. Stanton ES (3 up, 3 down)
  17. Winston EC (1 up, 5 down)

Let me caution my readers: Just because there are double-digit changes does not in itself mean there is fraud. Student populations can change in average socioeconomic status or composition for all sorts of reasons. Teaching staff and administrators can also change – and so can teaching methodologies, and sometimes entire schools move from one location to another, with somewhat unpredictable results for better or for worse.

However, documented news articles in USA Today and the Atlanta Journal-Constitution, which I have referenced in this blog, have shown convincingly that some of the large swings are definitely due to massive amounts of erasures of incorrect answers, or improper coaching of students during the test by administrators or teachers.

If the increases in pass rates are in fact legitimate, then the rest of the teachers in DC need to know what those secrets are!

In any case, there should be further scrutiny to figure out what is causing such large swings in scores at so many schools.

Note: I got my data here: http://osse.dc.gov/release/mayor-vincent-c-gray-announces-2012-dc-cas-results

Published on October 4, 2012 at 5:26 pm

An exercise in double-speak

Here is the text of the press release on investigating anomalies in the 2011 testing administration in DCPS. Notice what they say:

(1) It’s now 10 months after the test was given, and all they have done is finish the “Request for Proposal” process, asking vendors to make bids to do some investigation. [I spoke just now with Marc Caposino (listed in the text below), and he said that his office would make a recommendation on Monday, but that after that it goes to Contracts & Procurement.]

(2) Even though the publisher of the test itself, McGraw-Hill CTB, has other forensic statistical methods it is willing to provide to DCPS (for a fee), and has had these for years, only now is DCPS beginning to wonder what other methods to use. [I asked Caposino why they didn't use an analysis of identical wrong answers as well; he said he wasn't in any of the focus groups or advisory panels, so he didn't know. He did say he had read "Freakonomics", though, and agreed that the investigation is taking way too long.]

(3) They don’t reveal who was on the panel. I’d like to talk to them. [MC said he'd send me the link that lists who was on those panels or focus groups, but it's in one of their prior press releases.]

(4) They claim the number of classes with cheating issues is minuscule. [I have my doubts. Ex-principals I've talked with at some length have told me that the pressure to cheat was huge. It seemed to me that if you didn't cheat, you were sure of losing your job.]

(5) They don’t point out that this cheating has affected both students and teachers in very negative ways, while the unscrupulous administrators or teachers who cheated have earned nice bonuses… [Again, we need to put both Wayne Ryan and Michelle Rhee in the hot seat in that interrogation room. Let them take the 5th amendment if they like. Let them! They both became wealthy and famous by cheating, or so it appears. They need to pay the price, just like any other white-collar criminal or embezzler!]

==========================

Thursday, February 9, 2012

OSSE’s RFP Process for Test Integrity Vendor Comes to an End

Selection of independent vendor for test integrity investigations is underway

Contact: Marc Caposino, (202) 727-7207

Washington, DC – The OSSE request for proposal to investigate classrooms for test integrity closed Tuesday, February 7, and resulted in multiple bids. OSSE will make a recommendation for vendor selection by Monday, February 13, 2012 to the Office of Contracts and Procurement for final determination.

“We are committed to restoring and improving confidence in our standardized tests security and recognize that teachers and students are working hard on improving test scores. We believe wholeheartedly that the overwhelming majority of school leaders, teachers, and students are playing by the rules,” stated Hosanna Mahaley, State Superintendent of Education.

During the 2011 cycle, Phase One of OSSE’s enhanced test security protocols included, among others, adding seals to the test booklets, doubling the number of schools monitored by OSSE during test administration, and shortening the test booklet pick-up period.

Phase two of the enhanced security protocols was about strengthening and building community understanding and belief in the erasure analysis process, which has been broadly discussed in the local and national media. OSSE consulted with an independent advisory committee of national experts in the area of education assessment who recommended two new methods, bringing the number of analyses to 4 key measures used to test for anomalies in classrooms:

  • Unusual student-level gains in achievement from 2010 to 2011
  • Wrong-to-right erasure analysis
  • Within classroom variances (new)
  • Wrong-to-right erasure analysis for 2010 and 2011 (new)

The third and final phase of the enhanced process, as recommended by the national experts, is securing an independent third party to conduct follow-up investigations of the classrooms that were flagged for potential impropriety.

It is important to recognize that the subjects of all investigations are entitled to a fair and impartial process. The mere fact that a classroom has been flagged is not evidence of wrongdoing. At the end of the investigative process, schools with classrooms guilty of impropriety will be disclosed and scores will be invalidated.

This year’s analysis resulted in 35 (0.82%) classrooms being identified for further investigation out of 4,279 classrooms administering the DC CAS.

“The call for total transparency and accuracy demanded that we take the time to bring in an independent agency to put to rest any amount of suspicion regarding our student’s performance,” explained Hosanna Mahaley, State Superintendent of Education.

Published on February 10, 2012 at 11:06 am

A couple of graphs on DCPS and DC Charter School Populations

A few preliminary results from the DC-CAS data released yesterday:

(1) The very fast rise in the charter-school population in DC seems to have nearly stopped at about 36% of the entire publicly-funded school population that is counted by DC-CAS.

(2) The population of DC public schools seems to be going up a little bit, BUT

(3) Most of that rise seems to be in the 10th grade, and we know for a fact that administrators “game” the 10th grade tests, sometimes by holding students back in the 9th grade until they appear ready to pass the 10th-grade test, or by skipping them over the 10th grade completely so they don’t have to take the test at all.

Published on August 3, 2011 at 11:46 am

More Problems With DCPS Curriculum and DC-CAS

Upon taking a closer look at the DCPS standards and the DC-CAS, I submit that they should probably both be ignored by any teacher who actually wants to do right by students. If you are doing a good job teaching the things that students should actually know, it won’t make much difference on their DC-CAS scores. Conversely, if you teach to the DC-CAS, you are short-changing your students.

Case in point: the standards in Geometry and Algebra 1 ostensibly covered on the 10th grade DC-CAS. Recall that by this point, all DCPS 10th graders have supposedly finished and passed Algebra 1 and are enrolled in at least Geometry.

I have prepared a little chart of the standards (or learning objectives) for Geometry, as given in the DCPS list of learning standards, along with the number of questions I found among the released DC-CAS questions that supposedly address each one. There is almost no correlation at all. In fact, if you threw darts at the topics and chose them randomly, you would do a better job than the test-writing company did.
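For what it’s worth, the tally itself is trivial to make once you decide which standard each released question addresses. A minimal sketch follows; the standard codes and the question-to-standard assignments are placeholders I invented for illustration, not the actual DCPS geometry standards or the actual released items.

```python
# Sketch of the standards-coverage tally (placeholder codes and assignments only).
from collections import Counter

geometry_standards = ["G.1", "G.2", "G.3", "G.4", "G.5"]      # hypothetical codes
question_to_standard = {                                       # hypothetical mapping
    1: "G.2", 2: "G.2", 3: "G.5", 4: "G.2", 5: "G.5", 6: "G.2",
}

coverage = Counter(question_to_standard.values())

print("standard   released questions")
for std in geometry_standards:
    print(f"{std:<10} {coverage.get(std, 0)}")

untested = [s for s in geometry_standards if coverage.get(s, 0) == 0]
print("standards with no released questions at all:", untested)
```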

Published on March 23, 2011 at 12:37 pm

More Weird DC-CAS Questions

I began looking at the released 10th grade math questions today, and as usual I found some weird ones.

Here is one where the only difference between answers C and D is the color scheme (D fits the colors in the graph, C doesn’t). Both of them have the math correct. Is the color scheme all that significant? Is that what we are testing for now?

Here’s another one, which merely asks students to tell the difference between a mean, a median, and a mode. Wait a second – isn’t that one of the 6th, 7th, and 8th grade standards?

Here are the exact wordings for the various “standards” that involve mean, median and mode:

For 6th grade: “6.DASP.1. Describe and compare data sets using the concepts of median, mean, mode, maximum and minimum, and range.”

For 7th grade: “7.DASP.1. Find, describe, and interpret appropriate measures of central tendency (mean, median, and mode) and spread (range) that represent a set of data.”

For 8th grade: “8.DASP.1. Revisit measures of central tendency (mean, median, and mode) and spread (range) that represent a set of data and then observe the change in each when an “outlier” is adjoined to the data set or removed from it. Use these notions to compare different sets of data and explain how each can be useful in a different way to summarize social phenomena such as price levels, clothing sizes, and athletic performances.”

And for Algebra 1: “AI.D.1. Select, create, and interpret an appropriate graphical representation (e.g., scatter plot, table, stem-and-leaf plots, circle graph, line graph, and line plot) for a set of data, and use appropriate statistics (e.g., mean, median, range, and mode) to communicate information about the data. Use these notions to compare different sets of data.”

Why is DCPS testing such a low-level skill in Algebra 1? And why do we insist on loading the curriculum with the same eleventy-umpteen standards each year, only varying by an adjective or adverb or phrase or two? Is it because we assume that nothing at all gets learned in any year, so that teachers have to yet again re-teach EVERYTHING all over again, starting from nothing?

Published on March 23, 2011 at 12:28 pm

Strange Events at Dunbar SHS

Dunbar SHS has been in the news a lot recently. Erich Martel, a history teacher punitively transferred from Wilson, has dug deeply into the record, and has written the following:

[Attachment(s) from Erich Martel included below] 

Dunbar’s Decline Began Long Before Rhee and Continued Under Her

by Erich Martel
Dunbar HS has been in the news.  Supt Janey’s and Chancellor Rhee’s actions regarding Dunbar do not tell a simple hero-villain story.  Mostly, they reveal how tests and data can be adjusted to give the appearance of improvement, and that students continue to receive diplomas that do not represent mastery of DCPS subject standards. The data from Dunbar HS are typical of those in many of our high schools.
For several decades, students have been allowed to graduate from DCPS high schools like Dunbar without meeting mandatory requirements in core subject matter. Standardized tests occasionally reveal, if only approximately, how great the gulf between image and reality is.  This commentary grew out of an attempt to address the following questions, which Colbert King’s article on Dunbar HS brought to mind:
1) How could DCPS officials say that “70% of the class of 2007 were expected to attend college,” when two years earlier grade 10 SAT9 tests revealed that over 90% received Below Basic scores in math and over 65% received Below Basic scores in reading?
and
2) Miracle or Mirage: How was the following possible:
April 2005:  SAT9 Grade 10 Math:  no students scored advanced; 3 students scored proficient;
April 2006:  DC CAS Grade 10 Math: 5 students scored advanced; 58 students scored proficient.
Hiding or attempting to manipulate reality always has consequences:  a few benefit; most pay a price.
Erich Martel
Social Studies
Phelps ACE HS
DCPS
(In January 2008, I was the WTU representative on the Quality School Review Team that visited Dunbar HS, a step that was preliminary to the chancellor’s restructuring decision under NCLB)
Overview
1)  Posted data on the OSSE site are incomplete
2) What do truancy and graduation rate data show?
3)  Dunbar’s SAT9 and DCCAS results I:  Miracle or Mirage on NJ Ave.? (See attachment, sheet 1)
4)  Dunbar’s SAT9 and DCCAS results II:  Miracle or Mirage on NJ Ave.? (See attachment, sheet 1)
5) Graduation: How is it that “70% of [the Dunbar class of 2007] were expected to attend college”?
6)  Correlation of Graduation Numbers and Grade 10 SAT9 Results
7)  In-house Alteration of Students’ Records
8) Enrollment Decline at Dunbar HS (attachment, sheet 2)
9) Student Records Audit at Dunbar HS, 2002-03
COMMENTARY
1)  Posted data on the OSSE site are incomplete, probably due to the failure of DCPS – and charters – to provide them.
For example:
a) Truancy data are available for only four years:  2003-04 to 2006-07 (Attachment, sheet 1, columns J & K);
b) Average daily attendance data are available for only 2009-10 (Sheet 1, column L);
c) Graduation rate data are available for only 2008-09 and 2009-10 (Sheet 1, Column M) (Columns E & I show the senior completion rates: number of June graduates divided by the number of seniors on the October OSSE enrollment count, 8 months earlier)
2) What do truancy and graduation rate data show?
a) Truancy at Dunbar went from 18.39% in 2003-04 (the year before Janey arrived) to 16.68% (his first year); it then almost doubled to 29.99% in 2005-06 before climbing to 42.86% in his last year.
c) The graduation rate dropped from 81.7% in 2008-09 to 74.7% in 2009-10.
Was that Bedford’s fault? Those rates do not report that the majority of the students needed Credit Recovery and/or summer school to pass one or more courses needed for graduation.
The real test of graduation validity:  How many graduates who enrolled in colleges or professional schools:
i.  Needed to take non-credit remedial courses before moving on to credit-bearing courses?
ii. Lasted for at least one semester?    …. for two semesters?
3)  Dunbar’s SAT9 and DCCAS results I:  Miracle or Mirage? (See attachment, sheet 1)
a) Between April 1999 and April 2005, only 22 Dunbar 10th graders (out of 1,564 tested; not including the Pre-Engineering Academy) received scores of Advanced or Proficient on the SAT9 Math.
In April 2005, the last year of the SAT9, only 3 10th grade students scored Proficient (no one scored Advanced).
Yet, one year later, in April 2006, the first year of the DC CAS, 5 scored Advanced and 58 scored Proficient, a number almost 3 times greater than the total (22) of the previous seven years.
b)  That dramatic increase occurred over the same two years in which truancy jumped from 16.68% to 29.99%.
c)  The number of Adv & Prof scores then dropped during Janey’s last year (06-07), down in 07-08 (Rhee’s first year), up in 08-09, and down again in 09-10.
4)  Dunbar’s SAT9 and DCCAS results II:  Miracle or Mirage? (See attachment, sheet 1)
a)  In April 2005, 206 Dunbar 10th graders (out of 227 tested; not including the Pre-Engineering Academy) received scores of Below Basic on the SAT9 Math; one year later, April 2006, it dropped to 82, rose in 2007 to 98, fell to 72, then 61 and finally 38 in 2010 (under Bedford).
5) Graduation: How is it possible that “70% of [the Dunbar class of 2007] were expected to attend college”? (Colbert King, Washington Post, 12/18/10).
Consider:  According to the OSSE enrollment audit, the Dunbar HS class of 2007 (seniors, including 25 Pre-Engineering seniors) numbered 223 students.
Two years earlier, in April 2005, the tenth graders of the future Class of 2007 produced the following results on the SAT9:
Math (Total:  227 students):  attachment:  sheet 1
Advanced:  0 students
Proficient:  3 students (or 1.32%)
Basic:   18 students (7.93%)
Below Basic:  206 students (90.75%)
Reading (Total:  234 students); attachment:  sheet 3
Advanced:  0 students
Proficient:  4 students (or 1.71%)
Basic:   75 students (32.05%)
Below Basic:  155 students (66.24%)
Recall what the four performance levels are supposed to mean (source:  “Stanford 9 Special Report:  Performance Standard Scores,” Harcourt Brace Educational Measurement, 1997):
Advanced: “signifies superior academic performance”
Proficient: “denotes solid academic performance”
Basic: “denotes partial mastery” (emphasis added)
Below Basic:  “signifies less than partial mastery”
In June 2007, Dunbar HS had 171 graduates for a senior completion rate of 76.7% (171 graduates / 223 seniors).
Assuming that these were the top students who took the SAT9 two years earlier, the class of 2007 included only 3 students who had shown “solid academic achievement” in math and only 4 in reading.  Of the 171, only 18 had shown “partial mastery” in math, while 75 had shown “partial mastery” in reading.
Of that 171, 70% (120 students) were “expected to attend college.”  Thus, leaving aside the 51 students (30%) not expected to attend college, the 120 who [were] expected to attend college showed the following performance two years earlier as tenth graders:
In Reading:
- 0 showed “superior academic performance”
- 4 showed “solid academic achievement”;
- 75 showed “partial mastery”;
- 41 showed “less than partial mastery”
In Math:
- 0 showed “superior academic performance”
- 3 showed “solid academic achievement”;
- 18 showed “partial mastery”;
- 99 showed “less than partial mastery”
6.  Correlation of Graduation Numbers and Grade 10 SAT9 Results
By the end of the 3rd Advisory or marking period, when students take the SAT9 (since 2006, the DC CAS), they have completed or should have completed Algebra I and Geometry.  How likely is it that 99 out of 120 students who showed less than partial mastery in grade 10 math would overcome that deficit and master Algebra II/Trigonometry, or master the elements of four years of English and writing skills?
Undoubtedly, some students managed to improve over the remaining two years.  By the same token, however, the depressing anti-academic attitudes that hamper teaching and learning could very easily have led some students who did well in grade 10 to lose interest.  In fact, most of the students required easier summer school classes to meet graduation requirements.
7.  In-house Alteration of Students’ Records
As the review of student academic records at Dunbar in 2002-03 concluded ( http://www.dcpswatch.com/dcps/030922b.htm and below) “the opportunity for tampering was greatly enhanced and the reliability of the students’ records was questionable.” Despite recommendations of a Student Records Management Task Force (August 2003), no systemic steps were taken to ensure that student records were secure against internal tampering. That was revealed in Wilson HS’s Class of 2006, when approximately 200 of the 420 seniors listed on the June graduation day program had not completed their mandatory requirements.  It is not known whether the DC Inspector General’s report of the audit of Wilson HS graduation records led Supt. Janey to tighten procedures in other high schools (http://tinyurl.com/2nukmj ).
8.   Enrollment Decline at Dunbar HS (attachment, sheet 2)
In order to understand the decline in enrollment in Dunbar HS, one must factor out the grade 9 increases, beginning in 2006-07, when 9th grades in high schools began to increase as junior high schools were transformed into middle schools.
Dunbar’s total enrollment dropped from 1070 in 2002-03 to 913 in 2006-07 (adjusted to 887) in Supt Janey’s last year.  The October 2009 adjusted enrollment (factoring out the grade 9 increase) is 664, down by more than 200 since Chancellor Rhee took over.
9. Student Records Audit at Dunbar HS, 2002-03
Following the revelations of altered student records at Wilson HS in June 2002, DCPS contracted with Gardiner Kamya Inc. to conduct reviews of 59 students’ transcripts and supporting documentation in each DCPS high school.  The following is Dunbar HS’s report.  It is posted on the dcpswatch website, because DCPS officials did not post it.
GARDINER KAMYA & ASSOCIATES, PC
CERTIFIED PUBLIC ACCOUNTANTS / MANAGEMENT CONSULTANTS

DISTRICT OF COLUMBIA PUBLIC SCHOOLS

INDEPENDENT ACCOUNTANTS’ REPORT ON APPLYING AGREED-UPON PROCEDURES REGARDING STUDENT RECORDS AT SIXTEEN HIGH SCHOOLS/SITES http://www.dcpswatch.com/dcps/030922b.htm

MARCH 30, 2003 (End of Field Work)
July 17, 2003 (DC PS Response)
September 22, 2003 (Report Submitted)

Submitted by:
Gardiner, Kamya & Associates, P.C.
1717 K Street, NW Suite 601
Washington, DC 20036

7. DUNBAR SENIOR HIGH SCHOOL

  1. Internal controls (Procedure #1, Page 7)
    The school did implement the grade verification process mandated by the DCPS. However, due to the state of the student records and the results of the procedures detailed below, we conclude that internal controls with respect to student grades were ineffective and there was no assurance that such grades were accurately reflected in the student records.
  2. Confidentiality maintained (Procedure #2, Page 9)
    The procedure was completed without exception. [Comment:  This means that all alterations were done by those with legal access to student records]
  3. Completeness of Cumulative and Electronic files (Procedure #3, Page 9)
    1. Cumulative Files
      Eighteen files in our sample of 59 were incomplete. Two files were missing. Some of the incomplete files were missing more than one item. The missing items were as follows: 

      1. It is the school’s policy to create a Letter of Understanding for all students in grades 9 -12. However, the school could not provide a Letter of Understanding for 14 students in our sample;
      2. Two students’ files did not contain a transcript;
      3. Three students’ files did not contain a 9th grade report card;
      4. Six students’ files did not contain a report card;
      5. One student’s file did not contain either a 9th or 11th grade report card;
      6. One student did not have a 10th grade report card in his/her file.
    2. Electronic Files
      The school could not provide electronic data (COHI and SIS-HIST) for six students in our sample. Consequently, we could not determine the completeness of those files.
  4. Consistency (Procedure #4, Page 9)
    The school could not provide the teachers’ scan sheets for 31 of the 59 student files in our sample. In addition, 19 students had transferred in. Scan sheets were not available for these students. Also, the school did not provide report cards for 11 students, and two students’ files did not include a transcript. Of the records available for our review, we noted the following: 

    1. Two (2.0) credits were reported on the transcript of one student (for Army Jr. ROTC). The report card reported 1.0 credit.
  5. Accuracy (Procedure #5, Page 10)
    1. Carnegie Units and Letters of Understanding
      1. The transcripts of three students were not consistent with their Letters of Understanding as follows:
        1. One student’s transcript reported a credit of 1.0 for “Art 1 “. However, the Letter of Understanding reported a credit of 0.5 for the same course. Also, the transcript reported zero credits for electives. The Letter of Understanding reported 0.5 credits;
        2. One student’s transcript reported a credit of 1.0 for “Adapt Health and PE”. However, the Letter of Understanding did not report this credit; and
        3. One student’s transcript reported a credit of 2.0 for Army Jr. ROTC. However, the Letter of Understanding awarded a credit of 1.0.
      2. We could not determine whether classes taken and credits earned by 15 students were in accordance with DCPS Carnegie Unit requirements for the following reasons:
        1. Two students’ files did not contain a transcript;
        2. One student’s file contained a Letter of Understanding that was not completed by a counselor. In addition, the credits had not been properly calculated;
        3. The school could not provide a Letter of Understanding for 14 students;
      3. The Letters of Understanding in our sample did not report hours earned for community service.
    2. Mathematical accuracy of credits
      The school could not provide report cards for 11 students. In addition, the transcript of one student reported 2.0 credits awarded for Army Jr. ROTC, a one-year course while the report card showed 1.0 credit for this course.
    3. Grade changes
      The school could not provide electronic records for 6 students. The procedure was completed with respect to the remaining students without exception.
    4. Missing grades
      The procedure was completed without exception.
    5. Independent studies
      This school does not offer independent studies.
    6. Transfer credit
      The procedure was completed without exception.
  6. Tampering (Procedure #6, Page 11)
    Of the 59 student records included in our sample: 

    1. Twenty files were incomplete;
    2. The school could not provide scan sheets for 31 students;
    3. Scan sheets were not available for an additional 19 students who had transferred in;
    4. The school could not provide the electronic files for 6 students.
      We also noted that all administrative staff (i.e., principal, assistant principals, registrar and counselors) used the same password to gain read/write access to students’ electronic record. Because of these factors, the opportunity for tampering was greatly enhanced and the reliability of the students’ records was questionable.

Conclusion:

Based on the procedures performed, we conclude that:

  1. Internal controls were inadequate;
  2. Student records were incomplete, inconsistent, inaccurate, and unreliable;
  3. We could not conclude with respect to tampering because a significant number of files selected for review were not made available to us.



Major Failure of “Capital Gains” Program

The ‘Capital Gains’ program has failed.

When will its major backers – Michelle Rhee, Roland Fryer, and some billionaires – admit it?

Last year, there was nearly no difference in the changes in performance of the ‘experimental’ and ‘control’ groups in the ‘Capital Gains’ program. But this year, the second full year of the program, the ‘treatment’ group – the schools where students received payments for doing various things – saw their scores drop significantly, whereas the ‘control’ group of schools saw their scores rise by a similar amount.

Here are the graphs and tables I prepared. First, reading:

As you can see, from 2009 to 2010, the fraction of students in all of the control group schools who scored ‘proficient’ or ‘advanced’ on the DC-CAS in reading ROSE by a little over five percentage points. That’s the blue line, the middle column above. However, in the experimental group of schools, the fraction of students who ‘passed’ the DC-CAS FELL by about four percentage points. That’s the red line, the last column in the table.

How about math?

In this graph and table, you can see that the fraction of control-group students passing the DC-CAS in math ROSE by about four and a half percentage points from 2009 to 2010. But in the treatment group, where students were paid to do the right thing, the percentage of students passing the DC-CAS in math FELL by over two percentage points.

Another Credibility Gap? Looks Like Rhee, Fenty, and Reinoso are Gaming the Tests to Try to Look Good

Unfortunately, in the field of education, crime does sometimes pay – at least for some people.

In a number of states, local politicians have tried to make themselves and their pet educational De-Forms look good by gaming the testing system. For example, Rod Paige’s educational “miracle” in Houston, TX – the very model for NCLB – was later shown to be based on pushing out low-scoring students, which makes the scores for everybody else in the system look much better. (And it helped get G.W. Bush elected President and Rod Paige appointed Secretary of Education.) Other states have made the cut-off scores and tests for NCLB a whole lot easier. That way, even if students aren’t really doing better, they will APPEAR to the public to be doing better.

An alert reader has sent in some data that suggests that sort of “gaming the system” might be happening here in DC, but not in Massachusetts. (It’s in a comment on a previous blog post.) I wanted to make sure the data was correct before I posted it here as a somewhat more legible table, so I double-checked the sources and found that it was all correct.

Then, on a hunch, I went back a few years to see if the problem is getting worse here in DC, and how it compares to Massachusetts.

Result: Yes. There is a greater and greater gap between the NAEP and local DC-CAS scores, but that is NOT happening in the Bay State.

Here are the tables I made:

Notice a few things:

In 2007, in DC, the locally commissioned DC-CAS test results, in both reading and math and at both the 4th and 8th grade levels, showed about 15 to 25 percentage points more students as proficient than the national NAEP results did.

In 2009, the discrepancies between the local DC test and the national NAEP tests were much larger than in 2007: our local test says that roughly 30 percentage points more of our students are proficient than the national test does, in all 4 sub-groups (reading, math, 4th grade, and 8th grade). In other words, the DC-CAS seems to be giving a more and more inflated view of how our students are performing.

In Massachusetts, the trend seems to be in the other direction. In 2007, the gap between the state MCAS test and the NAEP ranged anywhere from positive 1% to negative 32%. However, two years later, the gap ranged only from positive 9% to negative 5%. So, in other words, the MCAS appears to be improving its match to the NAEP.

And let me emphasize something: in Massachusetts, the MCAS last year gave LOWER scores than the NAEP in half of the cases I sampled.
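The comparison itself is nothing fancy; here is a minimal sketch of it in Python. The percentages below are placeholders standing in for the table entries, not the actual DC-CAS, MCAS, or NAEP figures; the point is simply that the gap is the state test’s percent proficient minus NAEP’s, and a big positive gap means the state test is painting a rosier picture.

```python
# Sketch of the state-test-vs-NAEP gap calculation (placeholder percentages only).
# Each row: (jurisdiction, year, subject, grade, state test % proficient, NAEP % proficient)
rows = [
    ("DC", 2007, "math", 4, 40, 22),
    ("DC", 2009, "math", 4, 49, 19),
    ("MA", 2007, "math", 4, 48, 49),
    ("MA", 2009, "math", 4, 52, 57),
]

print(f"{'where':<7}{'year':<6}{'subject':<9}{'grade':<7}gap (state - NAEP)")
for where, year, subject, grade, state_pct, naep_pct in rows:
    gap = state_pct - naep_pct
    print(f"{where:<7}{year:<6}{subject:<9}{grade:<7}{gap:+d} points")
```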

You can find the Massachusetts results at http://www.doe.mass.edu/mcas/results.html . (Notice that they make up new tests every year.)

The NAEP results can be found at http://nces.ed.gov/nationsreportcard/states/ (but be prepared to dig a bit).

And Fenty, Rhee, and Reinoso bragged about the DC-CAS at http://dcps.dc.gov/DCPS/About+DCPS/Press+Releases+and+Announcements/Press+Releases/Fenty,+Rhee+and+Reinoso+Announce+DCPS+2009+DC+CAS+Scores or here.

——————————————————————

There are all sorts of easy ways of doing that sort of score inflation. One method, which works quite well if you use the same test over and over, is simply to have administrators or teachers cheat.  Blatantly.  And get more proficient at it. (We have all heard allegations of this.)

Another, more subtle, way to inflate the scores is to slightly modify the test from year to year so that it uses slightly easier questions.

Another way is to change the cut-off scores, so that whereas it might have required, say, 60% correct to be “proficient” one year, the next year you only need to get 45% of the questions right. (It has been shown that you can get a passing score on certain high-stakes tests in New York by guessing randomly.)
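To see how much mileage that buys, here is a quick illustration with made-up raw scores: same students, same answer sheets, and the ‘proficient’ rate jumps dramatically just because the cut score moved from 60% correct to 45% correct.

```python
# Illustration of cut-score manipulation (made-up raw scores, not real DC-CAS data).
import random

random.seed(1)
raw_scores = [random.gauss(50, 12) for _ in range(500)]   # percent-correct for 500 students

def pass_rate(scores, cut_pct):
    """Percent of students at or above the cut score."""
    return 100 * sum(s >= cut_pct for s in scores) / len(scores)

for cut in (60, 45):
    print(f"cut score = {cut}% correct  ->  {pass_rate(raw_scores, cut):.1f}% rated 'proficient'")
```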

Or, you simply make it so that the entire curriculum IS teaching to the test – which is what Chancellor Rhee apparently thinks is the proper way to teach.

Published on April 15, 2010 at 7:29 pm