Testimony to the New York City Council Committee on Education
By Dr. Lori Mei
Hearing on Test Results, June 27, 2005
Good morning, Councilwoman Moskowitz and members of the Education Committee. I am Dr. Lori Mei, Senior Instructional Manager of the Department of Education's Division of Assessment and Accountability, and I am here to discuss our testing program and the recently released City test results. With me is Jennifer Bell-Ellwanger, Deputy Senior Instructional Manager for Assessment and Accountability. Also here today are senior staff from the testing companies that develop and score our City tests at grades 3, 5, 6, and 7. These companies, CTB/McGraw-Hill and Harcourt Assessment, Inc., are two of the largest and most respected testing companies in the country. They develop and independently score NYC's mathematics and English Language Arts (ELA) tests. Later this morning, Dr. Daniel Lewis from CTB/McGraw-Hill and Dr. John Olson from Harcourt Assessment will discuss the development and psychometrics of the City's mathematics and ELA assessments, respectively.
As in previous years, the 2005 City ELA and Mathematics tests were developed to assess students' mastery of State and City learning standards. As the testing companies will explain in greater detail, these tests are valid and reliable measures of our students' attainment of these standards. Our City tests are developed and scored according to the highest industry standards by CTB/McGraw-Hill and Harcourt Assessment. We hire the testing companies specifically for their expertise in constructing standardized tests, ensuring that the tests are similar from year to year, and scoring the tests. As a result of the processes that the testing companies have in place, we can be sure that the tests are comparable from one year to the next and that improvement can be attributed to students' increased knowledge and skills rather than to differences in how the tests are developed. I am confident that they have done their work with care, with integrity, and according to the Standards for Educational and Psychological Testing.
I have been with the Department for over 20 years, and I have served many Chancellors. I was appointed head of the Division of Assessment and Accountability by Chancellor Levy in 2001, having served as Deputy for a number of years before becoming Executive Director. I hold a B.A. from Smith College and a Ph.D. in Social Psychology from Teachers College, Columbia University, where I focused on educational research and multivariate statistics.
First as Deputy and, for the last several years, as head of the office, I have overseen the testing process, applying the same high standards and careful review year after year. This year, as in previous years, I conducted a thorough review of the content and format of the City tests developed for us by our test publishers, and I examined the technical material supplied by the testing companies' psychometricians.
As the testing companies will explain to you in detail and as I can confirm, the 2005 tests are valid and reliable. The increased scores mean that our children simply performed better on these tests this year.
Further, I would like to dispel a number of inaccuracies and misconceptions that I have heard raised about students' performance on the City tests and on the State grade 4 ELA assessment this year.
In fact, there were no more grade 3 students repeating 3rd grade this year than last year. In the two most recent school years, 5.9 percent of all 3rd graders repeated 3rd grade: a total of 4,817 students in 2002-2003 and 4,507 students in 2003-2004. If we look only at 3rd graders held to the new promotion policy, the number of holdovers is virtually identical: 3,105 students held over in 2002-2003 as compared to 3,012 students in 2003-2004.
The eligibility requirements for the ELA test have not changed. They do not pertain to mathematics, as all English language learners (ELLs) must take the mathematics assessments. DOE has followed the same State criteria since they were established three years ago. The only difference in 2005 was that DOE, in consultation with NYSED, did not need State approval to exempt from the ELA those English language learners who met specific criteria determined by SED. In all cases, the determination of which English language learners are exempted is made at the school level, and always has been.
As is the case every year, there is some variability in the number and percentage of students in our special populations (e.g., English language learners and special education students) who are tested. In some years, there may be more English language learners or special education students enrolled in some grades, and in other years there may be fewer students enrolled or tested.
Fluctuations in the number of students taking these tests are due to the level of English language proficiency and/or the special education disability of particular students in a particular year. Determinations of which students are tested are made by school staff for English language learners and, for special education students, according to their IEPs.
The tests were not easier this year. Dr. Lewis and Dr. Olson will describe to you the steps that they took to ensure that the 2005 City tests were similar in content to previous tests and that any variation in overall difficulty of different test forms was accounted for in their equating processes.
In addition, our review of the information provided to us by the testing companies confirms that the 2005 tests in both ELA and math, at each grade, were comparable to those administered in 2004. Thus, we are confident that the increases in the percentage of our students meeting and exceeding ELA and math standards this year are the result of increases in the number of students attaining the learning standards measured by our tests.
At this point, let me describe in more detail the review processes that the Department has in place regarding our city testing program.
1. As previously indicated, both the City ELA and Mathematics assessments are designed to measure students' attainment of New York State Learning Standards and New York City Performance Standards. As the testing companies will explain in greater detail, each item on the test for each subject and grade is designed to measure a specific skill. The tests are constructed by the testing companies from the respective company's pool of items aligned to the standards. The test publishers provide the content for the tests, including both the passages and the items.
2. Research staff from Harcourt and CTB/McGraw-Hill provide NYC Assessment staff with a variety of statistical data for each item under consideration. DOE Assessment staff review the passages and items selected by the test publishers, along with the item statistics, and approve the final form of the tests.
3. As is the process for all standardized test development, the NYC ELA and Mathematics test items are piloted by the testing companies on stratified random national or New York samples of students. We have made sure that our City tests have been piloted on samples of students that are representative of our New York City population. The testing companies construct the tests so that they represent a range of items from easy to difficult.
4. Once the tests are administered, the test answer documents are scanned at DOE's Scan Center and electronic files of each student's answer to each test item are sent to the respective testing company for scoring.
5. The Research staff at Harcourt and CTB/McGraw-Hill apply their equating and calibration programs and score the tests.
Both companies conduct extensive quality control checks on the input (e.g., raw score) files and output (e.g., scaled score) files, and both companies have certified that the raw-score-to-scaled-score conversion tables that they provided to NYC are valid and reliable.
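To make the scoring and quality-control steps above concrete, here is a minimal illustrative sketch in Python. The table values, score range, and function names are hypothetical and are not the publishers' actual conversion tables or programs; the point is only that scoring amounts to applying a certified raw-score-to-scaled-score table and then checking the output file against the reporting range.

```python
# Illustrative sketch only: the conversion values and score range below are
# invented, not the publishers' actual tables.
RAW_TO_SCALED = {0: 500, 1: 545, 2: 580, 3: 610, 4: 636, 5: 660}  # toy table

def scale_scores(raw_scores):
    """Apply the raw-score-to-scaled-score conversion table to a score file."""
    scaled = []
    for raw in raw_scores:
        if raw not in RAW_TO_SCALED:
            raise ValueError(f"raw score {raw} is outside the range for this form")
        scaled.append(RAW_TO_SCALED[raw])
    return scaled

def qc_check(scaled_scores, lo=500, hi=660):
    """Quality-control pass: every scaled score must fall in the reporting range."""
    return all(lo <= s <= hi for s in scaled_scores)

if __name__ == "__main__":
    scores = scale_scores([0, 3, 5])
    print(scores, qc_check(scores))  # [500, 610, 660] True
```

The substantive work, of course, lies in producing the certified conversion table in the first place, which is where the equating processes described in the next paragraph come in.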
To reiterate, our City tests are developed and scored by independent, respected test publishers according to the highest industry standards. We hire the testing companies specifically for their expertise in constructing standardized tests, equating different test forms, and scoring the tests. Through their equating processes, the overall difficulty of test forms is statistically adjusted so that we can be sure that the tests are comparable from one year to the next and that improvement can be attributed to students' increased knowledge and skills rather than to differences in how the tests are constructed.
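Equating can be carried out in several ways; the sketch below illustrates one common textbook approach (mean-sigma linear equating) and is not necessarily the specific procedure that CTB/McGraw-Hill or Harcourt uses. All numbers in the example are invented for illustration.

```python
import statistics

def linear_equate(new_form_scores, ref_mean, ref_sd):
    """Mean-sigma linear equating (illustration only): place scores from a new
    test form onto the scale of a reference form by matching the mean and
    standard deviation of the score distributions."""
    new_mean = statistics.mean(new_form_scores)
    new_sd = statistics.pstdev(new_form_scores)
    return [ref_mean + ref_sd * (x - new_mean) / new_sd for x in new_form_scores]

# Toy example: if a new form runs slightly harder (lower raw scores overall),
# equating shifts its scores up onto the reference scale before any
# year-to-year comparison is made.
scores_new_form = [18, 22, 25, 30, 35]
print([round(s, 1) for s in linear_equate(scores_new_form, ref_mean=28.0, ref_sd=6.0)])
```

In practice, operational equating designs are more elaborate (for example, using anchor items and item response theory calibration), as the publishers' psychometricians can describe.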
I hope that this clarifies matters for you. I will be happy to take any additional questions that you may have.