To Chancellor Klein re NAEP and Mathematics in New York City


From: Elizabeth Carson
Sent: Saturday, November 15, 2003 1:30 AM
To: 'Klein Joel I.'
Subject: RE: NAEP - (was another parent letter : using our children as
guinea pigs -or- political footballs)


Joel wrote:
Look at the Naep scores and compare NY, Mass and Conn with Cal and Texas. Facts are stubborn things.

Joel,

[...]

First of all, before I get into the NAEP, I hope you're not assuming the NAEP results can serve as much of an indication of what works in mathematics education locally or nationally. That would be ludicrous, since we do not have (thank God) uniformity in educational delivery within local districts, much less statewide, of the nature implied in your reference above, i.e., all NCTM-based programs (or not) and an all-constructivist teaching approach (or not). You don't have it here in NYC even now. What did I hear this week, that the list of waivered schools is up to 300?

I have heard laments about the lack of system-wide coherence in instructional delivery, the implications for a transient student population, etc., expressed many, many times in explanations by "reformers" (local luminaries such as Diana Lam, Lucy West, and Catherine Fosnot). Their justification for universal adoption of NCTM-based programs, so the story goes, rests on the recognition that there is counterproductively high variability across districts in mathematics programs and curricula. This was in fact the explanation given to parents and the public for the District 2 math initiative, and it was the explanation given to parents and the public for the need for the Children First universal curriculum. Bogus reasons for fuzzy math adoptions in both cases. But that's another conversation.

The fact is, the NAEP exam used in the latest assessment (not to be confused with the long-term trend NAEP) is mediocre, not nearly up to a standard fit for a national exam, and certainly not fit for the use NCLB intends. It is also important to realize that the scores quoted in the press are the percentages scoring at the "basic" level. That's failing. NAGB's standard for satisfactory performance is the next level up, "proficient," not "basic."

Let's stick with the facts, sir, as Sergeant Friday would say.

We have only to look at recent international comparisons to get a much needed, and yes, rather painful reality check on the current state of US math achievement. There remains no cause for celebration, Joel, and these NAEP results are of no use whatsoever in validating your curricular choices. Of course, I personally don't need any studies, national or international, to tell me what any parent knows: the educational system is an utter mess, and our children are being denied a decent mathematics education deliberately, by design, in "reform" initiatives such as the one you've just launched citywide.

Usiskin might see reason to rejoice (no doubt Mr. Everyday Math himself is quite pleased to hear of the "NAEP miracle," especially given that the NAEP has been so thoroughly corrupted by the NCTM "vision" of equal opportunity to mediocrity for all). But universities aren't reporting incoming freshmen with a greater skill set; quite the contrary, remediation is up. US employers aren't rejoicing over the skills of the populations they survey for potential employment; quite the contrary, they're screaming for better qualified applicants, and the number of Congressionally approved H-1B visas rises each year.


There are in fact (facts - those stubborn things) three NAEP programs, using two very different tests (four programs, counting the trial urban assessment we discussed a few months back):

Here are some facts that distinguish the two tests and the three programs in which they're used.

(The information that immediately follows is from an old email I sent out for others' edification, as we were all initially confused by the several NAEPs.)

[I] Long-term trend NAEP: Objective: to measure student progress over time, with the SAME (my emphasis) assessments, in math, science, reading and writing

Students tested: 9, 13 and 17 year olds

History of years administered: (start and frequency differs for each subject)

Start years: Science: 1969; Reading: 1971; Writing: 1984; Mathematics: 1973 (mathematics administered in 1973, 1978, 1982, 1986, 1990, 1992, 1994, 1996, and 1999)

Reporting: scores are reported as percentages at or above a series of performance levels (150 through 350), and also in quartiles and in scale scores of 0-500 (used to report subgroups, e.g., by race, gender, school, and parents' education level).

Performance Levels:

150 (lowest): simple arithmetic facts, addition and subtraction without regrouping, rudimentary classification skills

200: add two-digit numbers, some basic multiplication and division facts, relations among coins, read charts and graphs, some reasoning skills

250: initial understanding of the four basic operations, multiply two-digit numbers, compare information on charts and graphs, developing ability to analyze simple logical relations

350 (highest): apply a range of reasoning skills to solve multi-step problems; facility with fractions, percents, exponents, and square roots; solve linear equations

Performance Trends 1973 - 1999

The population as a whole showed slight increases (up to a 5-percentage-point rise) at some performance levels in some years: 64% of 9-year-olds at level 200 in 1999 (59% in 1959); 15% of 13-year-olds at level 300 in 1999 (10% in 1971). Black and Hispanic students showed roughly a 15-point rise in average scale scores in all three age groups; 13-year-old black students' scores rose 23 points. White students' average scale scores rose by 9 points among 9- and 13-year-olds, and by 5 points among 17-year-olds.

Performance:

1999:
17-year-olds: Level 350: 6%; Level 300: 40%; Level 250: 82%
13-year-olds: Level 300: 15%; Level 250: 61%; Level 200: 93%
9-year-olds: Level 250: 31%; Level 200: 83%; Level 150: 99%

Keep in mind Tom Loveless' statement that at the current rate of "improvement" it would take US 8th graders 125 years to catch up with Singapore and 83 years to catch up with Japan!
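(For perspective on how a projection like that is made: it is a simple linear extrapolation of the score gap against the net annual rate of gain. The numbers below are placeholders of my own, for illustration only, not Loveless's actual figures:

\[ \text{years to catch up} \approx \frac{\text{Singapore average} - \text{US average}}{\text{annual US gain} - \text{annual Singapore gain}} \]

So, purely as an illustration, a 50-point gap closing at a net rate of 0.4 scale points per year would take 125 years.)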

[II] State and Main NAEP: objectives in both cases: "to measure student progress over time with assessments THAT REFLECT CURRENT EDUCATIONAL CONTENT AND ASSESSMENT METHODOLOGY (my emphasis) (periodically revised)"

Translation: fuzzy NCTM Standards-based NAEP and therefore totally unacceptable to many experts (and parents) as a measure of what students truly need to know and be able to do in K-12 mathematics

[A] STATE NAEP: Authorized by Congress in 1988. Objective: to measure student progress over time, with assessments that reflect current educational content and assessment methodology (periodically revised), in math, science, reading and writing

Students tested: in 2000, 102,000 4th graders in 46 states and 98,000 8th graders in 44 states

History of years administered: grade 4: 1992, 1996, 2000; grade 8: 1990, 1992, 1996, 2000

Reporting: scores reported in percentages at or above 3 achievement levels, and in average scale scores 0-500

Achievement Levels *

Advanced: should consistently demonstrate the integration of procedural and conceptual knowledge and the synthesis of ideas in the five NAEP content strands

Proficient: (identified by the NAGB as the level that all students should reach) should consistently apply mathematical concepts and procedures to problems in the five content strands

Basic: should exhibit evidence of conceptual and procedural understanding in the five NAEP content strands

* Based on the NAEP Five Content Strands detailed in the Mathematics Framework for the 1996 and 2000 National Assessment of Educational Progress: number sense, properties, and operations; measurement; geometry and spatial sense; data analysis, statistics, and probability; and algebra and functions.

Achievement Trends

4th grade: Of the 36 states that participated in both 1992 and 2000, 26 showed higher average scores; in the 7 top-scoring states, the percentage at Proficient ranged from 29% to 34%.
8th grade: Of the 32 states that participated in both 1990 and 2000, 27 showed higher average scores; in the 2 top-scoring states, the percentages at Proficient were 37% and 40%.

Achievement

Percentages of public school students, for the sample in which accommodations were not permitted:

Grade 4 - At or Above Proficient

2000  25%
1996  20%
1992  17%

Grade 8 - At or Above Proficient

2000  26%
1996  23%
1992  20%
1990  15%

[B] MAIN NAEP. Objective: to measure student progress over time with assessments that reflect current educational content and assessment methodology (periodically revised)

Students tested: 4th, 8th and 12th graders. In 2000, a "representative national sample" of 14,000 4th graders (742 schools), 16,000 8th graders (744 schools), and 13,000 12th graders (558 schools)

History of years administered: 1990, 1992, 1996, 2000

Reporting: scores reported in percentages at or above 3 achievement levels; and in average scale scores 0-500

Achievement Levels*

Advanced: should consistently demonstrate the integration of procedural and conceptual knowledge and the synthesis of ideas in the five NAEP content strands

Proficient: (identified by the NAGB as the level that all students should reach) should consistently apply mathematical concepts and procedures to problems in the five content strands

Basic: should exhibit evidence of conceptual and procedural understanding in the five NAEP content strands

* Based on the NAEP Five Content Strands detailed in the Mathematics Framework for the 1996 and 2000 National Assessment of Educational Progress: number sense, properties, and operations; measurement; geometry and spatial sense; data analysis, statistics, and probability; and algebra and functions.

Achievement Trends 1990-2000

Slight, steady gains were shown at 4th and 8th grades; 12th graders' scores rose through 1996, then declined by 2000. At all three grades the average score of white students was higher, and the gaps between white students and black and Hispanic students remained relatively unchanged.

Achievement

2000 At or Above Proficient

12th grade: 17%
8th grade:  27%
4th grade:  26%


Below is the NCES summary description of the NAEP administered in all 50 states in 2003, the scores from which were released this week and reported in today's press:

What Does the NAEP Mathematics Assessment Measure?

The NAEP Mathematics Framework is the set of specifications used to develop the 1990, 1992, 1996, 2000, and 2003 assessments. It describes the skills and content measured in the assessment. The framework is developed by the National Assessment Governing Board (NAGB). The NAEP mathematics assessment uses a framework influenced by the National Council of Teachers of Mathematics (NCTM) Curriculum and Evaluation Standards for School Mathematics. In 1996 the framework was updated to reflect recent curricular emphases and objectives and maintains a connection with the 1990 and 1992 assessments. [here]

The NAEP Framework and the assessment used to evaluate state progress are both highly controversial. The State NAEP is certainly not universally regarded as coming near the sort of assessment we should be using as a national exam. You may (although I doubt it) find illuminating some of the commentary delivered in the form of testimony to NAGB regarding the proposed 2004 NAEP Framework. See the testimonies by John Hoven, Tom Loveless, and Alan Siegel.

I hope you will look at the writings of Tom Loveless of the Brookings Institution in this area (as well as others related to mathematics education). Better still, fly him in to give you and the mayor a presentation.

Elizabeth

