A review of
Standards-Based School Mathematics Curricula
What Are They? What Do Students Learn?
Edited by Sharon L. Senk, Michigan State University, and
Denisse R. Thompson, University of South Florida
Lawrence Erlbaum Associates, Publishers, 2003
The 515-page book is divided into five sections.
The editors present an introduction to each educational-level subdivision, and each is followed by an independently written summary. The brand of curriculum reform they have in mind is identified throughout as "Standards-based," with "Standards" appropriately italicized because the alignment is with the 1989 NCTM document, not with any set of standards explicit enough to be objectively measured against. The chapters within each of Parts II-IV are essays written by people closely aligned with the projects themselves, in some cases their primary authors. The Final Commentary is by Jeremy Kilpatrick.
The elementary-grades curricula are Trailblazers, Everyday Mathematics (EM), TERC, and Number Power. The middle school curricula are Connected Mathematics (CM, aka CMP), Mathematics in Context (MiC), and MATH Thematics (the STEM Project). The high school curricula are Core-Plus, MATH Connections, Interactive Mathematics Program (IMP), SIMMS Integrated Mathematics, and UCSMP.
Following education-industry protocol, the editors feign impartiality by encouraging "civil discourse," not "name-calling" such as "fuzzy math" on one side versus "parrot math" on the other. The tilt immediately becomes obvious, however, with a 1999 quote from then-NCTM president Glenda Lappan: "We've had the longest running experiment in human history about whether rote memorization of facts and skills works. It doesn't." Not mentioned there, and only tangentially in the appropriate chapter later on, is that she was the lead author of CMP. It is unfortunate that recusing oneself when appropriate is not an industry standard. The usual put-down of more traditional mathematics presentation and competence expectation as "rote memorization of facts and skills" is all too evident throughout.
Data-based research that objective observers and parents would consider meaningful is nonexistent in this book. Schools are not identified in a way that would allow for independent follow-up; they are given pseudonyms, another industry standard. What data there are seldom reflect all-school or all-district averages, a fact that makes Kilpatrick's summary very misleading when he asserts, "The most common measures used in the studies were standardized tests of mathematics achievement." The students chosen - and the items, if from commercial tests - are highly selective. This kind of misinformation is yet another industry standard.
The greatly reduced emphasis on calculator use - among curricula for which calculator use is bread and butter - struck me as so incongruous that I checked the index for occurrences of the word. That changed nothing; only minor and oblique references are made. Things are presented quite differently than they would have been a few years ago, even though the curricula are essentially - if not exactly - the same. It was Ralph T. Putnam's summary of the elementary curricula, "Is Technology Important?", that brought reality home to me. He writes, "I was struck with the limited attention in these chapters to the role of calculators, computers, and other new technologies in the teaching and learning of mathematics. Three of the chapters mentioned in their curricular goals the importance of being able to 'choose appropriate tools,' including calculators, for computation." He then goes on for a page and a half, ranting about the importance of these "powerful new tools". My take is that he hasn't noticed - and the curriculum developers have - that this issue is a non-starter with the public, so they're lying low. Better to let students use calculators during assessments and tell the public that computational skills are just as strong as ever than to jump up on a stump and preach as Dr. Putnam does at that point.
In this book, familiar words and ideas take on almost laughable characterizations. The primary TERC study contains only 96 students, with half in each group: 56 third graders and 40 fourth graders. That is, the four reported subcategories (TERC or not, third or fourth grade) had roughly 25 students each, yet the data are reported to three significant digits. Truly preposterous, however, is that the study was conducted a decade ago by TERC itself and, together with two education doctoral dissertations, these "were the only rigorous studies of the Investigations curriculum that had been conducted at the time this chapter was written." CMP includes a "large scale" study, but with only 162 to at most 820 students in each grade-level CMP subgroup or control - still far from the district-wide progress that has been seen several times here in California, though under an entirely different philosophy of what it means to be "standards-based." The strongest criticism, however, is that the curriculum developers, or people in close association with them, designed the assessment devices (the BA for CMP, for example) and then wrote these curricular assessments. The fox is guarding the henhouse.
For definiteness, the chapter on IMP was written by Norman L. Webb of the Wisconsin Center for Education Research (WCER), who has been producing extremely misleading "independent" evaluations of IMP for a decade. That being the case, he is well aware of California's disastrous experience with this "Exemplary" curriculum, yet he reports on the steady growth of school adoptions from a couple dozen in the early-to-mid nineties to over 500 today. Missing, however, is the acknowledgment that this growth is not in its state of creation, California. Here the 80 to 100 schools involved (now even fewer?) have always been a moving target. In district after district, IMP comes in with great fanfare, touted as educational salvation for ALL (caps original), with the opinions of the school's most knowledgeable math and science faculty ignored if not overtly squelched. After a year, or two, or three, the pressure from students with college intentions toward a math-based discipline is quietly accommodated (sometimes the traditional track was quietly never abandoned!), and the pressure to offer everyone a choice becomes too much for the administration to bear. That is its death knell. Free choice is IMP death. (And Core-Plus death, and CMP death, and so on.)
A large part of the problem is the "truth" about IMP and SAT results for college entry. Table 16.5 indicates the problem clearly: an amazing 545 SAT Math score for IMP versus 531 for the Traditional group. A few things belie the meaningfulness of these data. Although both groups consisted of unusually strong students, on average the IMP students were stronger going clear back to their 7th grade CTBS scores. In fact, their *verbal* SAT discrepancy (interestingly, not mentioned in this book) exceeded their math discrepancy, 551 versus 529. More importantly, however, this was a ridiculously small study: only 58 students in each group to start, of whom an unusually high percentage in both groups took the SAT (in 1995, WCER reported it as 69 IMP students and 72 non-IMP with the same average scores, not 58 and 58, but who remembers?). But the most damning aspect is that Prof. Webb had to go clear back to this study of a few students in the class of 1993 (so the SAT was taken more than a decade ago!) to find data that supports his idea that IMP enhances SAT Math scores. In fact, full-blown IMP negatively impacts SAT Mathematics performance, as school after school across California has demonstrated. The first two LA Unified pilot schools were Roosevelt HS and Marshall HS; both started IMP in 1992, and, as the freshmen of 1992 hit the SAT a few years later, it was a mild disaster. Although neither school's scores had been all that stunning to begin with, from 1992 to 1996 Roosevelt dropped an average of 25 points while having 39 fewer seniors electing to take the SAT, and Marshall dropped 41 points with 64 fewer students taking the SAT. Within a couple of years, IMP was dropped even at those two schools and, within LAUSD, it has been a dead issue ever since.
I have this information because in 1997 I did a real "large scale" study: statewide SAT Math results for all of the California schools that claimed to be IMP schools in 1996. Across the state, these schools did improve, but at a rate roughly half that of the state as a whole (on a raw mathematics-score basis, up 6.9% versus 13.6%). Even that is not an accurate figure, however, because many of these schools were already offering a more traditional program for their more math-based, college-intending students. A quote from IMP's second-in-command, Diane Resek, in a letter to me dated September 25, 1996: "To our knowledge, there are no schools that offer IMP as the only college intending mathematics program. We encourage everyone to offer students, parents, and teachers a choice of programs, including both IMP and a traditional mathematics program."
One LA-area school that still does use IMP has strong, objectively assessed performance: the California Academy of Math and Science, located on the Cal State Dominguez Hills campus. This success is *in spite of* IMP, not because of it. These parents are so pleased that their children have been given access to such a unique opportunity - essentially a tuition-free, competitive, private school - that they compensate extensively with mathematics outside of school on behalf of their children. No mouse is going to bell this cat. For this school to drop IMP, it will have to happen in some other way; there will be no parents-with-pitchforks revolt here.
Looking over my aging notes to write this, I see that not only was IMP using Dr. Webb's generous SAT assessment in its newsletter back in 1996, it also described in some detail (four glowing paragraphs) the positive IMP experience underway at Philadelphia's Central High. This is a competitive-enrollment, academically very successful school, and IMP was happy to take credit for its success. How much IMP involvement actually occurred, I have no idea. However, Joseph Merlino's Greater Philly site seems to imply that the IMP hype in 1996 at Central in Philadelphia was just as believable as that coming out of Marshall and Roosevelt in LA a couple of years earlier. There was clear, overt pressure among LAUSD schools to "choose" to become an IMP school, and some did succumb to the pressure.
Is the California "Circle of Life" experience with IMP already finding its way to the eastern seaboard? (Take Prof. Merlino's "through Staff Attrition and Administrative Neglect Despite Five Years Worth of Positive Student Achievement Data" for what it's worth, of course.) Why is this history of "success" not mandated by the NSF to be part of the information that school curriculum decision makers automatically see? Why are snake-oil sales a federally supported enterprise?
"Reform Discontinued through Staff Attrition and Administrative Neglect Despite Five Years Worth of Positive Student Achievement Data": Interactive Mathematics Program (IMP) reform discontinued at Central HS, a Philadelphia public high school.
Wayne Bishop, PhD
Professor of Mathematics
California State University, LA