Today we were introduced to the new software that's going to track every student's performance, on any number of tests (not just state standardized tests), for the 13 years they're in school. I can see how my last year's students did, or how my current students did on last spring's tests.
What I'd really like is for people who've done this to train us on how to use that data effectively to improve instruction and student achievement.
After learning about that we met in departments. I hate doing stuff like this: we had to determine if our department goals (created/modified last year) align to the school site goals, the WASC (accreditation) key areas, and the district's newly released "strategies". That kind of activity is mind-numbing to me. I guess there might be some utility in it, but when I do a cost/benefit analysis it comes out very negative. Yet, that's the new thing for this year....
If there is some way to figure out how the students who are one grade above what you currently teach did on some of the math SUBtests (such as multiplying fractions, or dividing decimals or whatever), you may find a potential weak spot in your teaching of *this* year's students and head that off at the pass. Is the information that detailed?
Information is rarely that detailed. They'll say "number operations" but you really don't know exactly what the question was. You're always left wondering how accurate it all was:
- was the kid tired, lazy, trying?
- did he care? (Scores don't count for the student up here - they don't have to pass or even take the test. They can just bubble in or fiddle.)
- was the question worded strangely or differently from what we do? Up here in the frozen north, they don't release the whole test like the NY Regents do, and we often don't see the results until 5-6 months later (test in November, scores by April). It's difficult to tailor next year's teaching when you can't get a sense of the target.
If RoTLC can get more data from his program, it'd be neat. Ultimately, though, you can lose yourself in the data aggregation and disaggregation and forget that kids change and the data collection was suspect. The summer transitions often make those new 10th graders unrecognizable. Getting a job often turns 11th grade slackers into 12th grade students. Girlfriend issues make far more changes occur than Education Commissioners.
Data is wonderful, but students aren't data. All the data in the world might help you make some remedial work available, but four weeks later he 'gets it', and now you don't need any of it.
RoTLC is going to hear a LOT about "Tracking" and "Don't let his past performance dictate his future." I suspect that he will also be up to his MilSpec eyeballs in outside concerns that skin color is affecting scores and placement, not ability. (With all the yellers forgetting that class is a much better indicator than skin color, but I digress)
Finally, I'd say to everyone: There are good teachers and bad teachers, teachers who connected with Johnny and those who didn't. Don't assume anything from his old scores until you know Johnny, and really not even then. The only truly useful assessments are the ones you give out yourself.
It's kind of funny that we are assuming that past teaching was not good enough and we need to improve our teaching by poring over those same past results. We tend to dwell on those past scores as if they were generated by the best teachers in the world using the best teaching methods known to man. Seems like a disconnect to me.
I'm going to stick to teaching algebra, helping the weaker ones remediate, and pushing the group to do their best. I'll test them for my purposes, and when I'm done, I'll send them to the next teacher with a good background. Any difficulties the kids had last year - well, that's what these inservice days are for, in my opinion - talking about students and what they need.
Be careful what you ask for. We use Eduphoria, which has gradebook capability, but for some reason we aren't using that feature. It does track kids and provide contact information and history. Instead we use a program called Esembler for the gradebook and Zangle for attendance... too many different programs if you ask me. Pick one and fly with it.
We're using Zangle (what a stupid name) for all our administrative tasks (attendance, grade book, contact info, etc) and Data Director for all the testing data. It's conceivable that I won't look up testing data but once or twice a year, but Zangle will be running constantly. I agree, though--having too many applications is at least as bad as doing it all manually.