Dropping of MEA test results raises questions

What was for many the first full week back to school got off to a rocky start with a report in yesterday’s Maine Sunday Telegram revealing that the Maine Department of Education did not report the results of last spring’s 8th grade MEA writing test after deciding that the low scores resulted from a poor writing prompt.

Of even more concern is that the newspaper had to file a Freedom of Information Act request to get the test and related data. The DOE made no mention of dropping the test’s findings when scores were released in July.

This story raises a whole host of questions, and one hopes the Department will do a better job of answering them than the students did at answering the MEA writing prompt.

First, why didn’t the Department at least explain why it was invalidating the test results when it released the MEA scores in July? That the paper had to file a FOIA request to get this data months afterward is of some concern. I can think of no valid reason for the Department to withhold this information, and yet it did.

Second, what evidence, other than last year’s results from an entirely different class, does the Department have to suggest that these scores are not accurate for this class? Yes, a 50% decline in the number of students with a satisfactory response is steep, but what other assessment data is there to indicate the extent to which these findings are inaccurate?

For that matter, if this test is invalid, how will the writing skill of this class be assessed? The next standardized assessment for them under ordinary circumstances would be the 11th grade SAT, which does include a writing section. Until then, are we simply to go without data on where these students stand with regard to their writing ability? Or will they get some kind of revised MEA test this spring as ninth graders? The MEA data is used by teachers and administrators to plan instruction and curriculum; how are they to move forward without it?

Moving forward on improving writing instruction would be a good idea. That “only 23 percent of eighth-graders who took the test last spring met or exceeded state writing standards” is indeed shocking, but the fact that only “48 percent” did so a year ago is nothing to rave about. Less than half of Maine 8th graders met or exceeded the state standards for writing last year. How come that isn’t news? Looking at the test itself, which requires students to develop a well-reasoned and well-written argument using supporting data provided to them, it is hard to imagine any more important skill for these students to have. Half of them can’t do it in a good year?

What to make of the Department’s explanation that students either misunderstood the writing prompt or somehow took it too personally? I’m not buying it. The test could not be more straightforward. The directions, which are read aloud to students, are clear. The prompt is far simpler than some I’ve seen in the past. The supporting evidence for both sides, provided to the students for use in their essays, is likewise clear and fully usable to support whichever side the students decided to argue. Is it really possible that students feel so passionately about TV that this question made them take leave of their senses and left them unable to construct an acceptable persuasive essay?

Could a better explanation be that students simply don’t care whether they do well or not? The test, in the immortal words of kids everywhere, “does not count.” It does not affect GPA, is not used for course placement in high school, and does not make its way onto college transcripts. It has no meaning for them, and while they generally try to do well, no doubt there are many who just blew it off.

Perhaps the time has come to look at 8th grade exit exams in place of the MEA. A number of states and districts have ended “social promotion” by using standards-based assessments to determine whether a student is ready to move on to the next grade. When Maine’s learning standards were first implemented in the mid-1990s, it was promised that students would at least have to demonstrate mastery of the standards to get a high school diploma, but we’re not even doing that, much less ensuring that 4th or 8th or 10th graders have the skills and knowledge to move on to the next grade level. How about less testing for testing’s sake and more accountability? At least then we’d know whether students were taking it seriously or not.

Clearly a number of additional issues could be raised here as well. (How did the state’s private schools do on the test compared to its public schools?) One hopes, then, that this story does not simply fade away, but generates a far more wide-ranging discussion about testing and its place in our educational system, and about what these test results are telling us about how Maine students are doing.

With everyone focused on consolidation and reorganization, this might be a good time, with school underway and all, for more of a discussion about things that really matter: teaching and learning.