In this post I identified a problem at our school: a decision that was made without supporting data. I had some students run the numbers yesterday and today, and I've checked their work; it's correct.
And it doesn't look good. Our "Algebra 1.5" class isn't turning marginal students into excellent students. It didn't take students unprepared for Algebra 2 and, in a year, turn them into A or B Algebra 2 students in a proportion similar to that of the excelling students who didn't need or take the intermediate course.
Knowing I'd set the standard high, I lowered it a bit. We then tested to see if the overall proportions of A's, B's, and C's matched those of Algebra 2 students who hadn't taken the course. They did not.
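For anyone curious what that proportions check looks like in practice, here is a minimal sketch of a chi-square goodness-of-fit test. All of the grade counts and proportions below are hypothetical, invented for illustration; they are not our actual data:

```python
# Hypothetical example: do course-takers' grades match the comparison
# group's grade distribution? (Every number here is made up.)

# Grade proportions among Algebra 2 students who skipped the course.
expected_props = {"A": 0.30, "B": 0.40, "C": 0.30}

# Grade counts among 50 students who took the intermediate course.
observed = {"A": 9, "B": 16, "C": 25}
n = sum(observed.values())  # total course-takers

# Chi-square goodness-of-fit statistic: sum of (obs - exp)^2 / exp,
# where exp is the count we'd see if takers matched the comparison group.
chi_sq = sum(
    (observed[g] - expected_props[g] * n) ** 2 / (expected_props[g] * n)
    for g in observed
)

# Critical value for df = 2 (three grade categories) at alpha = 0.05.
CRITICAL_VALUE = 5.991
print(round(chi_sq, 2))          # the test statistic
print(chi_sq > CRITICAL_VALUE)   # True means the distributions differ
```

With these invented numbers the statistic clears the critical value, which is the shape of result described above: the course-takers' grade distribution doesn't match the comparison group's.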
This tells me that the course, as currently designed, isn't doing nearly as well as we'd like it to. This could be a call to revamp the course, not just eliminate it, as our administration has done.
Ideally we'd evaluate the effectiveness of the course by comparing the Algebra 2 grades of students who took the intermediate course with the Algebra 2 grades they would have gotten had they not taken it; of course, that is impossible to do. We're going to try one more analysis, though: comparing the Algebra 2 grades of students who needed the intermediate course but didn't take it to the grades of those who did. We'll search through our records and see if we can come up with that data.
As it stands right now, the course we have isn't what I'd call very effective. Here's how we can spin that: we created the course in response to what we saw as a student need, we tried the course for a few years, and now data tells us the course isn't effective and it's gone. The WASC people should love that kind of data analysis and introspection!