Tuesday, April 29, 2008

State Alternative Assessment

I have issues with standardized tests. Not a shocking fact, but true. So, my state's alternative assessment should fill me with joy. It doesn't have the issues that bother me about multiple choice tests:
  • A multiple choice test shows a student's ability in one brief period. If a student's best friend is mad at them, a parent has lost a job, they didn't get enough sleep last night, or they haven't had a good meal in days, they will likely not perform well, regardless of how much they know and are able to do.
  • Some people naturally do well on multiple choice tests and some struggle. I've watched brilliant students have trouble with multiple choice because they are able to justify more than one answer. They can talk themselves out of correct answers.
  • It's very stressful to be tested on an entire year (or multiple years) of learning at once. Some students get sick because they are so concerned about the test.
So, a portfolio assessment should be a great improvement. In theory, it is. I create portfolios for my students throughout the year to help illustrate all they have learned. (I have to admit that mine don't address every single standard, but they do show progress.)

However, the reality of these portfolios has been absurd. We use this alternative assessment for our limited English speakers in place of the reading test. We have to document each and every part of every reading standard. For example, standard 5.4 reads, "The student will read fiction and nonfiction with fluency and accuracy. Use context to clarify meaning of unfamiliar words. Use knowledge of root words, prefixes, and suffixes. Use dictionary, glossary, thesaurus, and other word-reference materials."

I think it's a good standard. I teach my students all of those things. To prove it for the portfolio, we must have documentation for each piece:
  1. "Use context to clarify meaning of unfamiliar words."
  2. "Use knowledge of root words,
  3. prefixes,
  4. and suffixes."
  5. "Use dictionary,
  6. glossary,
  7. thesaurus,
  8. and other word-reference materials."
For that one standard alone we need eight pieces of documentation. Fifth grade has four different reading standards, each of which requires about this many pieces, so we're looking at roughly thirty pieces of documentation per student. (Third grade has to document both third-grade and second-grade standards.)

The sheer amount of documentation is daunting. That's my first frustration. It seems reasonable that we could prove a student's mastery of that standard with only three or four pieces of documentation (a multiple choice test certainly wouldn't hit every single one of those eight aspects).

My second frustration is with us. Instead of documenting students' learning in authentic ways (anecdotal records from reading groups and conferences, response letters students have written about their reading, literature discussion notes, etc.), we're documenting mostly with worksheets. We rarely, rarely use worksheets at this school. I know this sort of documentation is a result of our inexperience with the process. As we continue to use these portfolios, we'll get better at drawing our documentation from what we are naturally doing in the classroom. Of course, that also assumes the state will accept more authentic documentation.

Maybe I'm overreacting. Maybe I should be grateful the state allows us the option of an alternative assessment that is a portfolio. Maybe...but I think I'll still strive for an assessment that is authentic to the teaching and learning going on in our school.

2 comments:

Anonymous said...

"Some people naturally do well on multiple choice tests and some struggle. I've watched brilliant students have trouble with multiple choice because they are able to justify more than one answer. They can talk themselves out of correct answers."

I love this point. I test very well and always aced standardized tests, but I always thought that multiple choice reading comprehension questions asking about the author's goals were ridiculous and an offense to writers. To connect it to your other point: a student's test answers, like an author's works, may be influenced by other stressful things in his life, but it isn't the student's or the writer's goal to convey that, nor are they necessarily aware of the stress's impact.

gever said...

It seems to me that the problem is one of dimensionality. Your alternative assessment comes closer to measuring multiple dimensions of comprehension, but the rote application of documentation requirements once again reduces that dimensionality.

Marvin Minsky once said to me, "Once you have got a number, then thinking stops." Amusingly, he blames this on Noam Chomsky, but the point is valid. We use tests to quantify a child's level of understanding, then we use that numeric quantity to represent the child in a slew of statistical analyses. What we end up with is some believable data about the level of education in a population, and it feels believable because it is based on so many samples, but it fails to capture any of the connectedness (if you will) of that tested data inside the kid's head.

The "alternative" testing system at least acknowledges that knowing where and how to use a given verb or noun, and in what contexts, is more important than being able to recognize (in a multiple-choice question) when it is used improperly.

That being said, I would like to see another alternative, one based on something akin to the Turing Test. If you can pass for a seventh-grader in any context a seventh-grader might find themselves in, are you not ready for the seventh grade?