Let's begin with predictive validity. Among the countless studies done on this subject over the years, not a single one has failed to find a high correlation between SAT scores and academic performance in college, as measured by grades or persistence. On a personal note, during my ten years as Provost of SUNY, I had my institutional research staff repeatedly review the relationship between SAT scores and academic success among our 33 baccalaureate campuses and their 200,000+ students, and found - as all the national research has confirmed - a near-perfect correlation. SUNY schools and students with higher SAT profiles had higher grade point averages and markedly higher graduation rates. (boldface is mine--Darren)
You should read the whole thing.
6 comments:
It took us three years to get permission from the university, but we finally were allowed to do a similar study. To control for "cultural variables," we looked only at students who took the first-semester course in the first semester of their freshman year, which came to a little over 3,000 students across the three years we collected data. Recalculating high school GPAs was a nightmare (why can't high schools just use a normal 4-point scale like everybody else?), and we looked at three variables: high school GPA, SAT math section score, and total points earned in the course (the course grade). Even I was surprised by how high the correlation between SAT math and course points earned was: 0.87, if I remember correctly. The correlation between high school GPA and total points earned, on the other hand, was very low, 0.2-something, and the correlation between GPA and SAT was likewise very low.
Had we had some way to measure other variables, like time spent studying or working on assignments, I would have been very interested to analyze those as well, but the data just weren't there. Even if they had been, they would have been self-reported and therefore unreliable.
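For concreteness, an analysis like the one described above can be run in a few lines. The sketch below is only an illustration of the technique, not the commenter's actual code; the file name and column names (hs_gpa, sat_math, course_points) are placeholders I'm assuming, not anything from the real study.

```python
import pandas as pd

# Hypothetical data file: one row per first-semester freshman, with the
# recalculated high school GPA, SAT math score, and total course points.
df = pd.read_csv("first_semester_students.csv")

cols = ["hs_gpa", "sat_math", "course_points"]  # placeholder column names
corr = df[cols].corr(method="pearson")          # pairwise Pearson correlations

print(corr.round(2))
# The sat_math / course_points cell is where a figure like 0.87 would appear,
# and the hs_gpa row is where the 0.2-ish correlations would show up.
```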
If the SAT is such a good predictor of success in college, how tough would it be to develop an eleventh-grade SAT to predict success in twelfth grade?
And in case there's any doubt, I'm serious.
What would be the point? The kid's going to 12th grade anyway, but doesn't have to go to college.
Because a good test will not only tell you how the kid will do, it'll also tell you how the kid did. Aggregate the results and they'll tell you how well a teacher, a principal, a book, or a curriculum did.
Of course, that sort of inquisitiveness depends on a desire to know the facts a good test reveals. Parents want to know that, kids to a lesser extent. But as I've maintained, there's no institutional impetus for the professionals to pursue such information. Your pride may demand that you pursue information about how good a teacher you are, but the institution of public education is largely indifferent to that sort of feedback at any level.
What you describe sounds very much like a "value-added" approach that allows policy makers to see how districts, schools, and individual teachers do over time by seeing not where their students are, but how much progress students made over the course of a school year. I support such approaches.
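A bare-bones version of that idea, a simple gain-score average by teacher rather than a full statistical value-added model, might look something like the sketch below. The data layout and column names are assumptions for illustration only; real value-added models are regression-based and control for much more.

```python
import pandas as pd

# Hypothetical layout: one row per student with a fall (start-of-year) score,
# a spring (end-of-year) score, and a teacher identifier.
scores = pd.read_csv("student_scores.csv")

# Progress, not position: how much each student gained over the school year.
scores["gain"] = scores["spring_score"] - scores["fall_score"]

# Average gain per teacher; a genuine value-added model would add controls
# for prior achievement, demographics, and measurement error.
by_teacher = (
    scores.groupby("teacher_id")["gain"]
          .agg(["mean", "count"])
          .rename(columns={"mean": "avg_gain", "count": "n_students"})
          .sort_values("avg_gain", ascending=False)
)
print(by_teacher)
```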
Additionally, I give an end-of-year test to my students, a test developed by one or more of the University of California campuses. What's nifty is that we get the results back in less than two weeks. I'm told how my classes did as a whole, how many students got which questions correct, what topic areas my students did well/poorly in, etc. My students get individual feedback in the form of "you did great in", "you should review", and "you need significant review in".
You may support what you refer to as the "value-added" approach, but the public education system, structurally, does not.
What I mean by "structurally" is this: if *all* testing were to disappear tomorrow, would that impose any hardship on the public education system? Would whatever is assumed to be the mission of public education be made any more difficult by an absolute end to all testing?
Of course not.