Sunday, February 12, 2006

"The Research"

Every educator has heard the phrase "The research shows..." or "The research proves...." It's a phrase designed to shut down challenges to whatever the speaker is saying and to bully the listener into accepting it. After all, who could possibly disagree with what "the research" proves?

The problem is twofold. First, it's very likely the speaker is blowing smoke and has no research at all to back up his/her position, especially if he/she cannot point to a specific study to justify the claim. Second, even when there is research to support the speaker's position, very often (especially in education) that research is not based on valid methodologies.

I'll give one lengthy specific example, and then get back to the general topic.

My research-busting hero is Christine Rossell of Boston University. Ten years ago she and her colleague Keith Baker examined studies of bilingual education to determine the best way to teach non-native English speakers. At the time, "everyone" knew that transitional bilingual education was the best way to go--put the kids in a class in which the home language is used most of the time to teach content (math, science, history, etc.), and over the course of several years transition to having most of the instruction done in English. Study after study backed this process up.

Except they didn't.

Their report, published in Research in the Teaching of English, Vol. 30, No. 1, February 1996, states that methodologically sound studies had the following characteristics:

1. They were true experiments in which students were randomly assigned to treatment and control groups; or
2. They had non-random assignment that either matched students in the treatment and comparison groups on factors that influence achievement or statistically controlled for them;
3. They included a comparison group of LEP (limited English proficient--Darren) students of the same ethnicity and similar language background;
4. Outcome measures were in English using normal curve equivalents (NCEs), raw scores, scale scores, or percentiles, but not grade equivalents (see the sketch after this list);
5. There were no additional educational treatments, or the studies controlled for additional treatments if they existed.
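
Criterion 4 is worth a pause: normal curve equivalents are just an equal-interval rescaling of percentile ranks, which is why they (unlike grade-equivalent scores) can be meaningfully averaged and compared across a year's growth. Here's a minimal sketch of the conversion in Python--my own illustration, not anything from Rossell and Baker's paper:

```python
# Percentile rank -> normal curve equivalent (NCE).
# NCEs put percentile ranks on an equal-interval scale (mean 50, SD 21.06),
# so gains can be averaged; grade-equivalent scores have no such property.
from statistics import NormalDist

def percentile_to_nce(percentile: float) -> float:
    """Convert a percentile rank (1-99) to a normal curve equivalent."""
    z = NormalDist().inv_cdf(percentile / 100)  # z-score at that percentile
    return 50 + 21.06 * z                       # NCE scale: mean 50, SD 21.06

for p in (1, 25, 50, 75, 99):
    print(f"percentile {p:2d} -> NCE {percentile_to_nce(p):5.1f}")
```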


Rossell and Baker found that unacceptable studies usually exhibited the following characteristics:

1. The study did not compare program alternatives or assess educational outcomes.
2. The study did not use randomly assigned students and made no effort to control for possible initial differences between students in different programs.
3. The study did not apply appropriate statistical tests (probably because of the heavy math involved--Darren).
4. The study used a norm-referenced design, comparing LEP students only to the test's (English-speaking) norming sample rather than to a true control group.
5. The study examined gains over the school year without a control group (a toy simulation of this flaw follows the list).
6. The study used grade-equivalent scores.
7. The study compared test results in different languages for students in different programs.
8. The study did not control for the confounding effect of other important educational treatments that were administered to at least one of the groups, but not all of them.
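
Flaw 5 deserves a concrete illustration. Students gain over a school year whether or not a particular program works, so a fall-to-spring gain with no control group tells you nothing about the program itself. Here's a toy simulation--my own sketch with made-up numbers, not anything from the paper:

```python
# Toy simulation of flaw 5: evaluating a program by its fall-to-spring gain
# alone. Both groups grow over the year, so the raw gain looks impressive
# even when the program adds nothing.
import random
from statistics import mean

random.seed(0)
N = 200
NATURAL_GROWTH = 8.0   # assumed typical year-of-schooling growth (made-up units)
PROGRAM_EFFECT = 0.0   # the program adds nothing in this scenario

def year_of_scores(extra):
    fall = [random.gauss(40, 10) for _ in range(N)]
    spring = [f + NATURAL_GROWTH + extra + random.gauss(0, 3) for f in fall]
    return fall, spring

prog_fall, prog_spring = year_of_scores(PROGRAM_EFFECT)  # students in the program
ctrl_fall, ctrl_spring = year_of_scores(0.0)             # comparable students outside it

prog_gain = mean(prog_spring) - mean(prog_fall)
ctrl_gain = mean(ctrl_spring) - mean(ctrl_fall)

# Flawed evaluation: the program group's gain alone (~8 points -- looks great).
print("program gain:", round(prog_gain, 1))
# Sound evaluation: gain relative to a control group (~0 -- no program effect).
print("gain over control:", round(prog_gain - ctrl_gain, 1))
```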


They then went into detail explaining the issues surrounding each of these flaws.

So what did they find? Of the 300 prominent studies they examined, only 72 were methodologically sound. Yes, that means 228 studies, an overwhelming 76 percent of "the research", were statistical crap. And they identified the studies in both groups in an appendix. Later work by Rossell increased the number of studies she evaluated to over 500.

Three teaching methodologies dominated those studies: submersion, transitional bilingual education, and structured English immersion. Here's my quick summary of each, plus a fourth that barely appears:
  • submersion--sink or swim
  • transitional bilingual education--described above
  • structured immersion--instruction is in English at a level the students can understand, in a self-contained classroom consisting entirely of LEP students
  • ESL--pull-out English instruction (only 3 of the 300 studies used this, so I'll ignore it here)

So what did the 72 sound studies show? Of the three primary methods of teaching LEP students, structured English immersion clearly won out. Rossell summarizes her work here. Of those 300 studies, do you have any guesses which group, the 72 or the 228, the bilingual education lobby means when they talk about "the research"?

I have a personal story about "the research". I wrote about it here.

So now back to the general topic. This week's Education Gadfly, published online by the Fordham Foundation, has an editorial about the "mad, mad world of education research". The gist is contained in this paragraph:

Why are randomized experiments being dropped faster than a tainted control group? Hsieh put that question to a number of folks. One “speculated that with the increasing popularity of qualitative methods (i.e., not relying on quantitative data), some researchers may have rejected the underlying assumptions of experimental research in favor of a post-modern, relativist view.” A more cynical interpretation holds that because empirical research is difficult to conduct and yields unpopular results, many authors simply take their studies down an easier path. Why risk tenure by studying the effectiveness of phonics, for example, if a university promotion committee member worships at the altar of whole language? Why bother with multivariate analysis when a feminist critique of patriarchal statistical methods will do?

This is why and how we get whole language, fuzzy math, ethnomath, ESL, "education for social justice" (as opposed to learning how to read, write, compute, and think), etc. If something fe-e-e-e-e-ls good, it must be good. We don't need no education...and we don't need no facts, either.

It's ok, though. They're not my kids, right? Why should I care?

Have some fun. The next time you hear the phrase "the research shows", be skeptical. Ask for some backup information. Make the speaker prove that his/her statements are backed up by scholarship and not emotion. Watch how quickly you get shut down, get told "I'll have to get back to you", or have the subject changed.

1 comment:

TangoMan said...

This is no surprise to you and me, now is it?

I've been beating this drum for years.

My prediction - this report will go down the memory hole tout suite.