February 16, 2009

David Anderson: Do the NECAP Test Results Mislead?

Engaged Citizen

Secretary of Education Arne Duncan recently lamented, "I think we are lying to children and families when we tell children that they are meeting standards and, in fact, they are woefully unprepared to be successful in high school and have almost no chance of going to a good university and being successful." Do his remarks pertain to Rhode Island's 8th grade public school children? Is it the Rhode Island Department of Education (RIDE) that is misrepresenting their skills through its use of the New England Common Assessment Program (NECAP) tests?

Recently released public school achievement results from the NECAP test seem to confirm these worries. The proficiencies reported not only exaggerate the performance of Rhode Island students in mathematics and reading but also claim improving 8th grade reading scores when, in fact, they are declining. This seems inconsistent with the transparency that was promised.

Stakeholders, including Governor Carcieri, have been pushing Rhode Island educators to improve student performance. In response, the NECAP 8th grade reading scores are giving them only the illusion of improvement. Illusion is a poor substitute for truth. These advocates of better education have been blindsided by RIDE and the NECAP testing regime.

To see the problem, please consider the 8th grade NECAP results in recent years. They are significantly inconsistent with those of the well-respected Nation's Report Card, also known as the National Assessment of Educational Progress (NAEP).

The NECAP measure of skills should roughly reflect that of the NAEP. It fails to do so in two regards:

Firstly, the NECAP exaggerates. It reports that roughly twice as many Rhode Island 8th grade children are proficient (at or above grade level) as are reported by the NAEP. We are told that last year 53% of Rhode Island pupils were proficient in math and reading when, in fact, the NAEP trend says it's only 26%.

Perhaps worse, and new to our analysis, is that NECAP reading proficiencies have been increasing over time while the NAEP shows declines. We call this the up-down problem.

You can see this in the graph, which shows the 8th grade reading proficiencies reported by the NECAP and those interpolated or reported for the NAEP. The vertical separation of the plots shows the exaggeration effect while the up-slope versus down-slope represents the up-down problem.
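A word about "interpolated": the NAEP reading and math assessments are given only every other year, so the off-year points on the graph must be estimated from the neighboring reporting years. The piece does not say exactly how that was done; the sketch below shows only the simplest possibility, straight-line interpolation between the nearest reported years, and the percentages in it are invented for illustration rather than actual NAEP figures.

```python
# Illustrative sketch of how off-year NAEP values might be interpolated.
# The numbers below are made up for demonstration; they are NOT actual NAEP results.

naep_reported = {2005: 27.0, 2007: 26.0}  # percent proficient in NAEP reporting years


def interpolate(year, reported):
    """Estimate a value for an off year by drawing a straight line
    between the nearest earlier and later reported years."""
    if year in reported:
        return reported[year]
    years = sorted(reported)
    lo = max(y for y in years if y < year)
    hi = min(y for y in years if y > year)
    frac = (year - lo) / (hi - lo)
    return reported[lo] + frac * (reported[hi] - reported[lo])


print(interpolate(2006, naep_reported))  # -> 26.5
```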

We also reviewed the performance of 8th grade pupils in the best school district, Barrington, and in one of the worst districts, Providence. Additionally, using a mapping technique we developed, we have estimated NAEP proficiencies for them. By considering these cases, parents and other stakeholders can get some sense of what's wrong with our public schools.

All must seem fine to Barrington 8th grade parents when the NECAP is claiming student proficiency percentages exceeding 90%. But not when they see our NAEP estimates showing one-third of their children below grade level.

Parents in Providence, while not pleased by the NECAP results showing only 28% of their 8th grade children proficient in both subjects, would be justifiably outraged if they knew that the more trustworthy NAEP estimates suggest that less than 10% are at or above grade level. If these numbers are correct, Providence schools are extremely dysfunctional.

How could these inconsistent results arise? It's fairly clear that the exaggeration problem is due to the policies of the NECAP authorities and, indirectly, of the participating states' departments of education.

As to the up-down problem of erroneously rising NECAP reading proficiencies, that uptrend could be due to a number of causes. Other states have seen officials gaming the tests to produce artificial gains, as happened in California some years ago. While we can't entirely rule out this possibility, we think the explanation lies more in the realm of good intentions gone awry.

Maybe the NECAP curriculum content is narrower than that of the NAEP and thus more easily taught. It's like learning a booklet instead of a book: the booklet is mastered while the book is not, so scores go up on the one and down on the other.

But that is just a hypothesis. Further study of this is needed, probably by outside independent experts. Consideration should also be given to conducting the assessment function through an independent agency to remove concerns about conflicts of interest. That is the practice with Massachusetts's MCAS test.

Do the authorities of the Rhode Island Department of Education (RIDE) take these inconsistencies seriously?

As to the exaggeration problem, consider that at least one RIDE official apparently thinks that the NAEP standard is too high and should be lowered. She says, "NAEP 'proficient' is a very high aspirational standard."

This suggests that these officials would prefer to lower the NAEP standards rather than elevate those of the NECAP. Lowering expectations of students is no way to build tomorrow's work force. We should be maintaining or raising standards. Senator Kennedy has introduced legislation to encourage aligning state standards with the NAEP. Instead, the gap between the NAEP and the NECAP seems to be widening, at least with respect to 8th grade reading proficiencies.

When asked about the up-down problem a RIDE official said that the 8th grade reading test "has been administered ONLY ONCE" in recent years. But how could that be when the RIDE website reports 8th grade NECAP proficiencies for four consecutive years?

It seems that obfuscation is the primary response from RIDE officials while nary a word is said about studying these discrepancies.

We believe that when parents and other stakeholders have a more transparent picture of our public schools, then the needed political pressure can be developed to begin serious reforms. Our preliminary analysis of test results suggests that social promotion is a fundamental problem that exists in every public school in the state. But most interested parties don't yet see it that way, certainly not with the NECAP wool pulled over their eyes.


David V. Anderson, Ph.D., is CEO of Asora Education Enterprises, and his NAEP estimates were generated under a contract with the Ocean State Policy Research Institute.

Comments, although monitored, are not necessarily representative of the views of Anchor Rising's contributors or approved by them. We reserve the right to delete or modify comments for any reason.

In Cranston, the reading series, Reading Street, is made by the same company that makes the NECAP test. A quick look at the tests given in the classroom and the NECAP materials online will show you that rather than teaching reading, comprehension, and writing, the reading series teaches you how to take the test. Teaching methods in the classroom often mimic the how-to-take-the-test mentality rather than real teaching. What NCLB and its standardized tests have done is create an entire industry of consultants and so-called experts who receive a whole lot of taxpayer money to teach our children how to take a test that does not demonstrate that they have actually learned anything.

Posted by: Chris at February 16, 2009 8:24 AM

I don't understand why people get upset about "teaching to the test" and test taking abilities.

For teaching to the test, I'm going on the assumption that some group of experts picked out the concepts and topics that are important for students to learn. If teachers are focusing on these concepts, isn't that what the test creators wanted the students to learn? If the point is for students to know how to comprehend a two-paragraph story, but spelling is omitted from the test, then why focus on spelling? The experts deemed that not important. Focus on the reading comprehension of a two-paragraph story. That is what was deemed important. If the concepts and topics of the test are wrong or incomplete, I don't blame the teachers for that; I blame the creators of the test and the administrators who decided that this is the test they want to use.

As for teaching test taking abilities, guess what, every measure of success in education is culled from test taking. What good is it to be brilliant, but you freeze up at test time? What grades would that person receive? Want to say that grades should not be that important? Tell that to the college admissions officers. Test taking ability isn't important? Tell that to people who need to take the SAT, MCAT or GRE. How about medical boards? Do you want to see a doctor who failed his medical board exams? But maybe he's brilliant, focused on learning and isn't good at test taking.

The author makes a point about the possible lowering of the NAEP standards for proficiency. While lowering standards may not be a good thing, I don't know what the NAEP or NECAP standards of "proficiency" are. Maybe they are too high, maybe they're too low. I think before commenting on that option, we should know what is considered "proficient" at each level. If it is being able to perform calculus in the third grade, that might be a little high. If it is being able to do single-digit multiplication tables in the 11th grade, that might be a little low. I think the key is to know where the "proficiency level" is.

Posted by: Patrick at February 16, 2009 9:13 AM

There are two federal definitions for "proficient," one for NAEP and one for NCLB. The NAEP vs. state "proficiency" analysis reported here depends on a confusion of the two definitions.

States must use the 2001 NCLB definition (meets grade level expectations) in order to receive federal funding. According to the National Assessment Governing Board, however, "It is important to understand clearly that the [NAEP] Proficient achievement level does not refer to “at grade” performance. Nor is performance at the Proficient level synonymous with “proficiency” in the subject. That is, students who may be considered proficient in a subject, given the common usage of the term, might not satisfy the requirements for performance at the NAEP achievement level. [. . .] Finally, even the best students you know may not meet the requirements for Advanced performance on NAEP." This means that even some of the best students you know score only NAEP Proficient, even though they would likely score "advanced" on the state test.

Posted by: Bert at February 16, 2009 5:57 PM

As the author of the op-ed I delayed any responses until today when the Providence Journal actually published the piece (somewhat different from the version shown above). You can see it at http://www.projo.com/opinion/contributors/content/CT_anderson18_02-18-09_E4D98O1_v15.4002fd4.html

Chris and Patrick, above, commented on the "teaching to the test" aspect of the reading instruction here in Rhode Island. Both are correct. If a curriculum is sufficiently broad and the examination database of questions is equally broad, then teaching to the test (all possible tests) is a good strategy and a proper one. If the curriculum is broad but the test database only covers a subset of it, then teaching to the test becomes an end run around learning the full content of the curriculum, and it belies the notion that the test measures the child's apprehension of that content. This is a poor strategy, and while one can question the ethics of teaching to such a test, it will be difficult to stop. Finally, one can have a narrow curriculum with a corresponding narrow test database. Teaching to that test will help children master the narrow content but will not help them learn the wider content of a more robust curriculum. My guess is that this last scenario is most likely involved in the inconsistent proficiency levels and trends reported by NECAP (testing the narrow content) and NAEP (testing the wider curriculum).

Bert's comment acknowledges the confusion that arises when two different measures of proficiency are used. Education officials should qualify their uses of the word "proficient" with a footnote or other annotation to show that one is a weak standard and the other a strong one. Most educators in the United States consider the NAEP standard of proficiency to correspond to being at grade level. The very definition that the National Assessment Governing Board (NAEP's parent organization) uses for "proficient" is essentially a grade level criterion. If folks want a weaker standard, then they should be explicit about it and tell us how it relates to the NAEP standard. They could announce that the new 8th grade standard is what was formerly the 7th or 6th grade one. That's roughly what the NECAP 8th grade reading standard is measuring. It would be an example of dumbing down the curriculum.

Posted by: David Anderson at February 18, 2009 11:27 AM

As a working member of the measurement community I am obliged to adhere to the professional standards of the community (as promoted by, for example, the American Educational Research Association and the National Council on Measurement in Education). These standards clearly make it the test publisher's responsibility to specify the appropriate use and interpretation of the data. To me, when NAEP's publisher says "It is important to understand clearly that the [NAEP] Proficient achievement level does not refer to “at grade” performance," I accept that interpretation over non-publisher interpretations of the data, such as the claim advanced here that NAEP proficient is "essentially a grade level criterion." Sorry.

Posted by: Bert at February 18, 2009 9:33 PM

One might want to look at the Fordham Foundation's critique of NH's math standards. Since the NECAP is an assessment of how well students live up to these standards, one might want to know what experts think of our standards. For instance, the mathematicians gave the NH standards an "F"...which means that if your child passes the NECAP in NH, they are living up to F-rated standards. The standards are constructivist in their philosophy of education, which is problematic in itself. See: http://www.cogtech.usc.edu/publications/kirschner_Sweller_Clark.pdf
We've been swindled.

Posted by: MomWithAbrain at March 1, 2009 5:41 PM