Scrap the MAP

The Northwest Evaluation Association (NWEA), maker of the Measures of Academic Progress (MAP®) student assessment test, sent a memo to the Charleston County School Board in October 2010 warning that the district should NOT use the test to evaluate its teachers.

The Seattle School Board needs to get this memo too.

Center warned Charleston school board about using test score data

By Diette Courrégé
Wednesday, October 27, 2010

A research center warned Charleston school officials that it might be a mistake to gauge teachers’ effectiveness with a test that was designed to measure students’ progress.

Before the school board abandoned a proposal that would have evaluated teachers on their students’ performance, it received a two-page memo from the Kingsbury Center cautioning it against moving forward with the draft policy because it could “expose the district to potential legal liability.” (Article continues here.)

Here are some excerpts from the memo, by John Cronin of the Kingsbury Center, an affiliate of NWEA (bold emphasis mine):

Recently, it came to our attention that a database was published by the Charleston Post and Courier that reported on the academic growth of students taught by Charleston County School District teachers.
In this database, CCSD teachers were ranked on the basis of the percentages of their students whose growth matched or exceeded the growth of Virtual Comparison Groups of students that were created by the Kingsbury Center for the district. As a result of that publication, the CCSD school board is now considering implementation of a teacher accountability policy in which 60% of a teacher’s evaluation would be dependent on student growth and achievement data of this type. When we learned of this, we asked the district administration for permission to send our comments on the proposed policy to the board, and we were encouraged to do so.

The Kingsbury Center supports efforts to implement accountability for both schools and teachers. We believe that Charleston students deserve no less. We also believe that student achievement data can inform the teacher evaluation process. But nearly all experts in our field agree that test results should not supersede the principal’s role in teacher evaluation.

The memo goes on to list seven reasons why the school board should not use the MAP® test in this manner, and then offers the following recommendations:

Recommendations

We would recommend that the board refrain from adopting a policy in this area until the district is able to study and propose concrete strategies for addressing these issues which, unaddressed, may expose the district to potential legal liability:

a. To address the question of exactly what data would be used for evaluation of teachers who are not teaching in subjects in which value‐added measurements are currently used, particularly social studies, history, science, art, vocational education, and music.

b. To establish explicit criteria for performance on the tested measures that are validated as reasonable by using data from test publisher norms, the district or state’s past performance, or other legally defensible standards.

c. The policy should also require that statistical error be considered when applying these criteria.

d. We would recommend that student test data not receive more weight than the principal’s evaluation in importance.

e. The impact of measurement error, while relatively large when measuring individual student growth, decreases dramatically when that growth is aggregated to large groups. When the groups under consideration are several hundred, such as school level aggregations, measurement error has a much more negligible impact. Consequently, we can support using value‐added metrics as one factor among others in identifying under‐performing schools.

Subsequently, the Charleston, S.C., school board shelved the proposal to base 60 percent of its teachers’ evaluations on their student MAP® test scores.

How is this relevant to us in Seattle?

Charleston is the city where our recently sacked Superintendent Maria Goodloe-Johnson worked as superintendent before she came here. The MAP® test product was introduced into Charleston’s schools during her term there.

The MAP® test was also introduced to Seattle’s schools by Goodloe-Johnson beginning in the fall of 2009 (in a no-bid contract, while Goodloe-Johnson was on NWEA’s board of directors), and is now administered to most Seattle Public School children three times a year.

Here in Seattle, the MAP® test is currently being used to evaluate teachers.

The test is not designed for this purpose and should not be used in that manner. Even NWEA says so.

So why are the Seattle School Board and superintendent allowing the MAP® test to be misused this way?

This is one of the issues that Seattle’s new Interim Superintendent Susan Enfield and the board need to address and correct.

— Sue p.

An additional note:

Brad Bernatek, a Broad resident and the person in charge of implementing the MAP test in our school district, met last year with Jessica DeBarros, myself, and several other parents. When we told him our concerns about how the MAP test would be used, he stated directly to us that the MAP test was not designed to be used to evaluate a teacher’s performance.

Did Brad not make this clear to his boss, the superintendent, at that time? Or had Dr. Goodloe-Johnson chosen to ignore that detail when negotiating with the teachers’ union last year?

Inquiring minds want to know.

Dora