Wednesday, April 16, 2014

The SAT and Intelligence

Having a high enough SAT score from "back in the day" can get you into Mensa, so the smart people have some faith in the SAT as an intelligence test.  There's plenty of evidence that such faith is merited:
The College Board—the standardized testing behemoth that develops and administers the SAT and other tests—has redesigned its flagship product again. Beginning in spring 2016, the writing section will be optional, the reading section will no longer test “obscure” vocabulary words, and the math section will put more emphasis on solving problems with real-world relevance. Overall, as the College Board explains on its website, “The redesigned SAT will more closely reflect the real work of college and career, where a flexible command of evidence—whether found in text or graphic [sic]—is more important than ever.”

A number of pressures may be behind this redesign. Perhaps it’s competition from the ACT, or fear that unless the SAT is made to seem more relevant, more colleges will go the way of Wake Forest, Brandeis, and Sarah Lawrence and join the “test optional admissions movement,” which already boasts several hundred members. Or maybe it’s the wave of bad press that standardized testing, in general, has received over the past few years.

Critics of standardized testing are grabbing this opportunity to take their best shot at the SAT. They make two main arguments. The first is simply that a person’s SAT score is essentially meaningless—that it says nothing about whether that person will go on to succeed in college. Leon Botstein, president of Bard College and longtime standardized testing critic, wrote in Time that the SAT “needs to be abandoned and replaced”...

Along the same lines, Elizabeth Kolbert wrote in The New Yorker that “the SAT measures those skills—and really only those skills—necessary for the SATs.”

But this argument is wrong. The SAT does predict success in college—not perfectly, but relatively well, especially given that it takes just a few hours to administer. And, unlike a “complex portrait” of a student’s life, it can be scored in an objective way. (In a recent New York Times op-ed, the University of New Hampshire psychologist John D. Mayer aptly described the SAT’s validity as an “astonishing achievement.”) In a study published in Psychological Science, University of Minnesota researchers Paul Sackett, Nathan Kuncel, and their colleagues investigated the relationship between SAT scores and college grades in a very large sample: nearly 150,000 students from 110 colleges and universities. SAT scores predicted first-year college GPA about as well as high school grades did, and the best prediction was achieved by considering both factors. Botstein, Boylan, and Kolbert are either unaware of this directly relevant, easily accessible, and widely disseminated empirical evidence, or they have decided to ignore it and base their claims on intuition and anecdote—or perhaps on their beliefs about the way the world should be rather than the way it is.

Furthermore, contrary to popular belief, it’s not just first-year college GPA that SAT scores predict. In a four-year study that started with nearly 3,000 college students, a team of Michigan State University researchers led by Neal Schmitt found that test score (SAT or ACT—whichever the student took) correlated strongly with cumulative GPA at the end of the fourth year. If the students were ranked on both their test scores and cumulative GPAs, those who had test scores in the top half (above the 50th percentile, or median) would have had a roughly two-thirds chance of having a cumulative GPA in the top half. By contrast, students with bottom-half SAT scores would have had only a one-third chance of making it to the top half in GPA.
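The article doesn't state the underlying test–GPA correlation, but under a bivariate-normal assumption the two-thirds figure corresponds to a correlation of about 0.5. A quick simulation (hypothetical illustration, not the study's actual data) shows how the conditional probability falls out of the correlation:

```python
import numpy as np

rng = np.random.default_rng(0)
r = 0.5          # assumed test-GPA correlation (illustrative, not from the study)
n = 1_000_000

# Draw correlated (test score, GPA) pairs from a standard bivariate normal.
cov = [[1, r], [r, 1]]
test, gpa = rng.multivariate_normal([0, 0], cov, size=n).T

top_test = test > np.median(test)
top_gpa = gpa > np.median(gpa)

# P(top-half GPA | top-half test score)
p = (top_test & top_gpa).sum() / top_test.sum()
print(round(p, 3))  # ~0.667, i.e. roughly two-thirds
```

(Analytically, P(top-half GPA | top-half score) = 1/2 + arcsin(r)/π, which equals exactly 2/3 at r = 0.5.)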

Test scores also predicted whether the students graduated: A student who scored in the 95th percentile on the SAT or ACT was about 60 percent more likely to graduate than a student who scored in the 50th percentile. Similarly impressive evidence supports the validity of the SAT’s graduate school counterparts: the Graduate Record Examinations, the Law School Admissions Test, and the Graduate Management Admission Test. A 2007 Science article summed up the evidence succinctly: “Standardized admissions tests have positive and useful relationships with subsequent student accomplishments.”

SAT scores even predict success beyond the college years...

The second popular anti-SAT argument is that, if the test measures anything at all, it’s not cognitive skill but socioeconomic status...It’s true that economic background correlates with SAT scores. Kids from well-off families tend to do better on the SAT. However, the correlation is far from perfect. In the University of Minnesota study of nearly 150,000 students, the correlation between socioeconomic status, or SES, and SAT was not trivial but not huge. (A perfect correlation has a value of 1; this one was .25.) What this means is that there are plenty of low-income students who get good scores on the SAT; there are even likely to be low-income students among those who achieve a perfect score on the SAT.
A correlation of .25 is very small. Squaring it, SES "explains" only about 6% of the variance in SAT scores.
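To get a feel for how weak a .25 correlation is in practice, here is a hypothetical simulation (assuming bivariate-normal SES and SAT scores, which is an idealization, not the Minnesota study's data). It checks what share of top scorers come from below-median SES:

```python
import numpy as np

rng = np.random.default_rng(0)
r = 0.25         # SES-SAT correlation reported in the Minnesota study
n = 1_000_000

ses, sat = rng.multivariate_normal([0, 0], [[1, r], [r, 1]], size=n).T

# SES "explains" only r^2 of the variance in scores.
print(round(r**2, 4))  # 0.0625, about 6%

# Among the top 1% of scorers, what share came from below-median SES?
top_scorers = sat > np.quantile(sat, 0.99)
low_ses_share = (ses[top_scorers] < np.median(ses)).mean()
print(round(low_ses_share, 2))  # well over a fifth in this idealized model
```

Even among the very highest scorers, a substantial fraction comes from the lower half of the SES distribution—consistent with the article's point that good scores are far from reserved for well-off kids.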
What this all means is that the SAT measures something—some stable characteristic of high school students other than their parents’ income—that translates into success in college. And what could that characteristic be? General intelligence.
As the SAT is changed from an intelligence test to more of an achievement test, its primary usefulness is diluted.

There's plenty more in the article, including much about IQ and intelligence, and I encourage you to go read the whole thing.  Very interesting.

4 comments:

  1. I was told that part of the reason the SAT was changing was to bring it more in line with common core. Isn't the guy in charge of the SAT one of the primary people behind CCSS?

  2. They seriously tried to argue that a correlation of .25 was 'not trivial'? Seriously? That's why economists mock sociologists mercilessly.

  3. PeggyU, 9:43 PM

    I don't even remember what I got on the SAT. A coffee cup stain, probably. I also took the ACT, and don't remember that score either.

    I do remember not being that nervous about taking either of them. It was just something you did and life didn't revolve around test taking. Nowadays, a whole industry has sprung up to improve standardized test scores. And the amount the test prep services charge is appalling. Apparently, they must be getting clients, even at those prices!

  4. It's all part of the arms race.
