Thursday, March 27, 2014

I Took A Practice "Smarter Balanced" Test Today

Teachers at my school today took the practice Smarter Balanced test so that we'd know what it was like, what problems our students might encounter, and what type of questions would be asked.

I will say this:  some of the 11th grade math questions were worded in an obtuse way.  We have highly qualified, very competent math teachers at my school, and some of the problems had a few of us gathered around trying to figure out exactly what a problem was asking for.  If it takes 3 good math teachers, two with master's degrees and one working on one, to figure out what an Algebra 1 question is asking, then the question isn't a good one.

There's a difference between rigor, which requires depth of knowledge and skill, and confusion, which makes a problem unnecessarily hard while obscuring the actual math.

One question in particular bothered me.  It was about a right triangle, blah blah blah, and had 3 boxes for answers.  In one box was cos A < sin A, in the second was cos A = sin A, and in the third was cos A > sin A.  Below those boxes was a fourth box with a bunch of integer angle values.  I can't remember exactly how the question was worded, but I couldn't tell if clicking and dragging one angle value into each box was sufficient to answer the question or if I had to put all 10 or so angle values in one of the boxes in order to get credit.  That kind of confusion shouldn't be acceptable, especially for a test that will be given to half the high school juniors in the country.
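The math behind the three columns is itself simple, which is part of what makes the ambiguous directions so frustrating.  As an illustration only (the actual test item was click-and-drag, not code), here's a quick sketch of which column each acute angle belongs in:

```python
import math

def column(angle_deg):
    """Classify an acute angle A of a right triangle by comparing cos A and sin A."""
    a = math.radians(angle_deg)
    if math.isclose(math.cos(a), math.sin(a)):
        return "cos A = sin A"
    return "cos A < sin A" if math.cos(a) < math.sin(a) else "cos A > sin A"

for angle in (30, 45, 60):
    print(angle, column(angle))
# 30 lands in "cos A > sin A", 45 in "cos A = sin A", 60 in "cos A < sin A"
```

Since sin A = cos (90° − A), the split happens at 45°: below it cos A is larger, above it sin A is larger.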

Update, 3/29/14:  Thanks to Mr. W in the comments, here are all the questions and the solutions.  Scroll down to page 14/24 of the pdf file, or page 13 as shown on the picture (problem 682), to see the trig question I mentioned above.  Note the ambiguity in the directions:  "Drag possible measures of angle A into the correct column."  Thinking like a high school student, I have satisfied the requirements of this problem if I have put one value in each column.  If it said "drag all possible measures of angle A" the instructions would be clear.

This is part of what's wrong with the way math is sometimes covered.  It should be excessively clear what is being asked; when it's not, math becomes some mysterious task of trying to divine what the teacher (or test) actually wants you to perform.

With this pdf file you can see the questions that were asked.  Do you agree with me about the wording of some of those questions?

If you're so inclined, take a look at the performance task.  Don't think like a drone here, but really explore the problem.  Is "fair" defined?  Will everyone define "fair" the same way?  Are you comfortable with a performance task for which you're only given credit if you agree with the problem-writer's (unexpressed) view of "fair"?
For this item, a full-credit response (2 points) includes:
agreeing with the claim
justifying the response by citing at least one comparison between values used in the two systems.
Why must two values be given?  Where is that requirement stated?  There are other ways, even ways that match the author's (unstated) definition of "fair", that don't require listing specific points.

I know the fuzzy math proponents like subjectivity, but this performance task is fundamentally flawed.


Elaine said...

... I vaguely remember that problem when we took it last fall... Honestly, I didn't feel the test was that bad, and none of the district teachers there at the time seemed to struggle with the wordings much.

In fact, they seemed very similar to the way the MARS tasks are written.

Mr. W said...

We have had to take that test 4 times now. I remember that question, but can't remember the answer.

One of our teachers found the solutions to the whole test, including the performance task. And all this "the answer isn't as important as the journey" isn't true. Each question was 1 point except for the ones that had the "choose all that apply"; with those, each one was worth one point. So tell me what has changed?

Also, this test isn't even going to be the one they take next year. It isn't the "self-correcting" test; everyone gets the same questions. And who knows? Whatever next year's test is, it might not be the same as the actual one that counts in 2016.

Mr. W said...

Here is the site for the scoring of the SBAC. Grade 11 math is towards the bottom, with the performance task. Enjoy and share; if your department is anything like ours, you've all wanted to know how this would be scored. For over a year we tried to get someone in the district to get us the scored version. They never got it to us; another teacher found them.

maxutils said...

Without additional info, that question could be any of the three.

Mr. W said...

You're welcome for the answers. Wait til you start looking at the performance tasks. Our department started giving the middle school performance tasks to get them ready for that style of problems. One of the tasks is to design a garden that is "like a square." How is something like a square? Isn't it a square or not? Is 5.1x5 like a square, or does it go to 5.5x5?

And it's not just our school that has issues with the wording of the "official released questions"; our other high school and our middle schools also don't like them.

I am tired of reading people defend bad Common Core questions by saying "it's just a bad question from a publisher that doesn't understand Common Core." Well, how can they defend this when these are the official questions?

maxutils said...

I took all of the test, and I got them all right (as one would hope I should). As to the way the math was tested? I thought they did a very good job of testing the actual mathematical concepts they wanted, in a way that MC could not do. Some of the wording was iffy ... I read the questions before Mr. W's followup; when I first did it, I had no problem at all with the wording ... I just felt it was redundant. Obviously you wouldn't do 45 degrees, but if you can answer for 10, you can answer for everything else ... you could have tried to clarify it, but I honestly think that would make it clunkier. Perhaps a general rule of "Any correct answer should be placed in the appropriate clump/box/blank"? My other complaint ... and this comes from the large part of me who hates standardized tests: we're going to need to spend valuable time teaching testing strategy rather than math ... and my god will the scores suck anyway.

Anonymous said...

Your remarks about stating problems clearly reminds me of a joke I heard from a teacher once when I was studying for an actuarial exam. He said that often the hard thing on the exams is not figuring out what the answer is but instead figuring out what the question is.

But in real life this too is often the case.

maxutils said...

Anonymous ... totally true. Especially in subjective disciplines. You know what you want when you craft the question, but they don't. My favorite example is a notoriously hard accounting teacher at UCD who delighted in crafting brutal test problems. One day he comes in for a two hour test and says ... okay ... three questions on the test. 2 and 3 shouldn't be too bad. But 2 relies on your answers to 1, and most of you aren't going to solve 1. It might be the hardest question I've ever written. So if you can't get 1? Here are some fake numbers to use for question 2. And he wrote down fake numbers. But that's deliberate ... it's really easy to create a bad or imprecise test question.