A comparison of the old and corrected reports of incoming Claremont McKenna students’ SAT scores shows that the vast majority of the data points CMC had reported were systematically manipulated.

This manipulation, which was announced Monday by President Pamela Gann in a schoolwide email, led to Vice President and Dean of Admission and Financial Aid Richard Vos’s resignation.

The new, corrected numbers demonstrate a roughly flat trajectory of SAT scores in both the Critical Reading and Math sections. The old, manipulated numbers showed a clear upward trajectory in both mean and median scores.

Critical Reading scores were artificially inflated by an average of more than 17 points, compared with an average of only 10.5 points for Math scores.
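The averages above are just the mean of the per-year differences between the reported and corrected figures. A minimal sketch of that calculation, using hypothetical values rather than CMC’s actual factbook numbers:

```python
# Hypothetical manipulated vs. corrected mean Critical Reading scores
# by year -- illustrative values only, not CMC's actual figures.
reported = [700, 705, 710, 712]
corrected = [680, 690, 695, 692]

# Average inflation is the mean of the per-year differences.
inflation = [r - c for r, c in zip(reported, corrected)]
average_inflation = sum(inflation) / len(inflation)
print(average_inflation)  # 17.5 with these sample values
```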

The manipulation hid year-to-year drops in SAT scores as well. While the manipulated numbers showed CMC’s median Math SAT score holding constant at 700 from 2004 to 2007, the true median had in fact dropped from 700 to 680 in 2006 and 2007.

The score manipulations hid the fact that the freshmen admitted in 2011 — the class of 2015 — had mean Critical Reading scores that were the lowest since the class admitted in 2007. Their mean had been boosted by 23 points. The mean Math score of the freshmen admitted in 2007 — the class of 2011 — had been boosted 28 points.

Gann’s email said that reported scores were “generally inflated by an average of 10-20 points each,” without distinguishing median and mean scores. No median scores were artificially increased by more than 20 points, though median scores are necessarily multiples of 10.

But those current freshmen whose Critical Reading scores were so low may be heartened to learn that their Math scores had been manipulated only very slightly: one Math score between 500 and 540 was reported as being between 550 and 590. This manipulation did not change the mean or median.

The data manipulation also had the effect of hiding extremely low student scores. Since 2005, six students had Math scores below 500, but only one had been reported. Three of these six were admitted in 2008 or later. Five students had Critical Reading scores below 500, and only one was reported, though no students with Critical Reading scores below 500 began attending CMC after 2006.

CMC’s Office of Institutional Research publishes a “factbook,” which is not publicly available, that shows SAT scores by each incoming freshman class. The factbook document attributes the SAT data to the CMC Admissions Office. The document was updated Monday to replace the old, manipulated numbers with corrected numbers. The *Port Side* used this data for its analysis.

That factbook lists the number of incoming students with SAT scores within 50 point ranges on the Critical Reading and Math scores, as well as mean and median results.

In order to achieve median scores that appeared higher than they actually were, students’ scores were moved from lower ranges to higher ranges. Some scores were also fabricated out of thin air, as in 2009’s numbers, by lowering the total number of “missing” scores. **Update 2/1/12: [Corrected date]** At CMC, applicants are required to submit either SAT or ACT scores. If a student’s SAT scores are “missing,” then, according to CMC spokesman Max Benavidez, the school does not have their SAT scores in any format. Presumably, these students submitted ACT scores. Though 76 of the 282 students who matriculated in 2009 did not report SAT scores, only 48 missing scores were reported in the original manipulated numbers. In other years, such as 2005, low scores were hidden by increasing the number of missing scores instead.
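The effect of moving students between the factbook’s 50-point ranges can be sketched with hypothetical bin counts (these are illustrative numbers, not the actual factbook data): shifting a handful of counts from lower bins to higher ones is enough to move the apparent median up a full range.

```python
from statistics import median

# Hypothetical 50-point bins (lower bound of each range) and counts of
# matriculants per bin -- illustrative only, not the actual factbook data.
bins = [500, 550, 600, 650, 700, 750]
true_counts = [3, 10, 40, 80, 70, 30]  # sums to 233
shifted = [1, 6, 30, 70, 90, 36]       # same total, scores moved upward

def binned_median(counts):
    # Approximate the median by expanding each bin's count at its lower bound.
    scores = [b for b, n in zip(bins, counts) for _ in range(n)]
    return median(scores)

print(binned_median(true_counts))  # 650
print(binned_median(shifted))      # 700
```

Note that the total number of students is conserved here; fabricating scores outright, as in 2009, additionally requires shrinking the “missing” count.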

More than three quarters of non-zero data points that CMC had reported were manipulated.

CMC also publishes its statistics in a Common Data Set format; the common data sets for the 2004-2005 school year through the current year were updated with corrected statistics on the afternoon of Friday, January 27, 2012, as shown by file metadata.

This article will be updated with more analysis, graphics and links to raw data over the course of the day.

Here is the data comparing results. Blue cells were manipulated downwards; yellow cells upwards.

**Critical Reading, 2004-2011**

**Math, 2004-2011**

**Tables split into Math/Critical Reading, updated 3:30pm 1/31/12. Google Docs by Jonathan O. Hirsch.**

The viewers below display the PDF versions of the factbook.

**Original, manipulated data**

**New, corrected data**

Nice job putting together such a thorough article so quickly; this is the only article I’ve read that actually explains the exact score manipulations.

Please repost it.

I don’t quite understand what you’re asking for. Do you mean that you want me to repost PDFs of the old and new tables?

I figured this table, which is just a digitized version, would be easier to use and contain more useful information. But, if it is the PDFs you want, I’m happy to put them back or link to them or something. Let me know.

Yes, that would be good. You had a table up before the one that is currently there. Please post it, too. It was a bit easier on those of us who are color challenged!

My apologies for the inconvenience. The PDF scans are back up.

You should list the data source up front at the beginning of the article; it was confusing to figure out where this was coming from.

This would be much better with some charts rather than some tables and a qualitative description of that data. Show them we earned those math scores.

The charts are on their way, I promise.

Your data analysis seems to display a rather willful bias toward the suggestion that data manipulation misrepresented the “real” quality of the ’11 class. Quite the contrary.

Case in point: your assertions that “[t]he score manipulations hid the fact that the freshmen admitted in 2011 — the class of 2015 — had mean Critical Reading scores that were the lowest since the class admitted in 2007. Their mean had been boosted by 23 points. The mean Math score of the freshmen admitted in 2007 — the class of 2011 — had been boosted 28 points.”

All are true statements, technically speaking. Consider, however:

1. While the “real” CR mean for ’11 matriculants was (at 679) lower than for any cohort since ’07 matriculants (at 672), is the ’11 mean of 679 materially different than the ’09 mean (680)–or even the ’08 mean (684)? In other words, is a 1-point difference in CR means (’11 vs. ’09) anything but a point of trivia? Surely, for example, there’s nothing in a 1-point variation that has any utility in point estimation of classroom performance. Indeed, a 1-point variation is not even helpful in predicting future differences in cohort performance on the SAT itself.

2. Your analysis flagrantly ignores the fact that both the “real” mean Math score (707) and the “real” mean combined score (1386) for ’11 matriculants were the highest of any sampled year–and thus, presumably, the highest in CMC’s history. Indeed, the “real” mean M score of 707 was 10 points higher than the “real” mean M scores of the next highest cohorts (’10 and ’09). (Though I hasten to add, of course, that the 1-point increase over the ’10 “real” mean combined score of 1385 is not, in any rational view, a useful predictor of anything.)

In any event, ALL of the “real” mean combined scores for all reported years, ranging from 1342 for ’07 matriculants (the class of 2011) to 1386 for ’11 matriculants (the class of 2015), fall within a single band (the 93rd-97th percentile) in the University of California’s scholarship requirement protocol.

None of this excuses the manipulation, or the small advantage that apparently was its aim. Any suggestion that the manipulation masked some deterioration in the matriculant population, however, lacks merit.

Thanks for your feedback and your clearly very careful reading of the article. As I’m sure you’re well aware, there are lots of stories to be told from a given set of statistics; my goal today was to explore the details of the data manipulation and what we had and hadn’t been told by CMC.

My goal certainly was not to make one class look better or worse than others. I noted, a few paragraphs after what you quoted, that the current freshmen (the 2011 matriculants) had almost no manipulation of their Math section scores; that is, their scores were good enough not to need much fudging!

The facts you cite are certainly correct: rather than showing a clear increase in the “quality” of CMC’s matriculants, they show scores remaining about even over the longer term, but showing some significant year-by-year variation. But that’s exactly what I said in the article. I’ll be adding some graphs — as soon as I make them — to demonstrate this.

You’re also totally correct that minuscule differences in SAT scores do not say much about a student. This is especially true at CMC: one of our strengths, one that hasn’t changed, is an emphasis on practicality and actions along with book learning. Already, major newspapers are using this incident as an impetus to call for us as a society to place less weight on standardized testing. But, given the weight that our society does in fact place on test scores, the precise details of the manipulations of CMC’s numbers are, I think, relevant.

Where do we make the donations to the Claremont Portside?

Send ‘em to ASCMC with Portside in the memo line. Include a letter indicating that the donation is to the CPS and copy Alyssa on it so she can follow up with ASCMC and make sure they handle the money properly.

“The score manipulations hid the fact that the freshmen admitted in 2011 — the class of 2015 — had mean Critical Reading scores that were the lowest since the class admitted in 2007.” I don’t doubt the manipulations, but just a caution to be clear about what is being measured. You refer to admitted students here, but the charts show scores for matriculated students. As someone noted on the CHE website this week, there are three pools: admitted students, students who accept, and students who actually matriculate, and different institutions aren’t necessarily reporting on the same thing.

I took your data (nice work in re-creating that by the way) and made it into an online visualization. It’s at

http://public.tableausoftware.com/views/ClaremontMcKennaTestScores/Dashboard1?:embed=y

And if you wish, you can get an embed code there to put it in your story.

Nice! Great visualization.

Thanks for the interactive graphs. Could you adjust the scale of the SAT score to something smaller? At the very least it should have a minimum of 200, not zero, right?

I don’t understand why the admissions department just didn’t admit students with higher SAT scores if that was so important to the school.

BTW this is some of the best reporting I have seen in a college newspaper. This is more in-depth and thought out than the Ivy League newspapers I read on a regular basis. Very nice job.