Friday, February 23, 2007

Grade Inflation?

I don't think my students will ever accuse me of contributing to grade inflation.

WASHINGTON (AP) -- It doesn't add up.

Two federal reports out Thursday offer conflicting messages about how well high-schoolers are doing academically.

One showed that seniors did poorly on national math and reading tests.

The other -- a review of high school transcripts from 2005 graduates -- showed students earning more credits, taking more challenging courses and getting better grades...

The transcript study showed that 2005 high school graduates had an overall grade-point average just shy of 3.0 -- or about a B. That has gone up from a grade-point average of about 2.7 in 1990.

It is unclear whether student performance has improved or whether grade inflation or something else might be responsible for the higher grades, the report said.


And what about all the kids who take tougher classes?

"I'm guessing that those levels don't connote the level of rigor that we think they do. Otherwise kids would be scoring higher on the NAEP test," said David Gordon, a governing board member and the superintendent of schools in Sacramento, California.

Mark Schneider, commissioner of the federal National Center for Education Statistics, said the government would conduct a study examining the rigor of high school courses.


Such a study could be crap. What worthwhile information could you draw from a study of high school courses nationwide? Unless there's something I'm not thinking of, the only value you'd get would be at the school or teacher level; anything higher than that would be too general to be useful.

So what's being tested? What's causing this concern about course rigor?

On the math test, about 60 percent of high school seniors performed at or above the basic level. At that level, a student should be able to convert a decimal to a fraction, for example.

Just one-fourth of 12th-graders were proficient or better in math, meaning they demonstrated solid academic performance. To qualify as "proficient," students might have to determine what type of graph should be used to display particular types of data...

On the math test, 29 percent of white students reached the proficient level, compared with 8 percent of Hispanics and 6 percent of blacks.


Obviously the standards aren't very high, so what causes those results? White racist teachers, a culture that doesn't value education, the soft bigotry of low expectations? And what can be done about it?

One of the stated goals of the federal No Child Left Behind law is to reduce the gaps in achievement between whites and minorities.

The law is up for review this year. It currently requires reading and math tests annually in grades three through eight and once in high school. The Bush administration wants to add more testing in high school.


Here's where the lefties will squeal: testing isn't teaching! Of course it's not, and it's not designed to teach any more than checking your cholesterol is designed to lower your cholesterol. We need testing data, however, to better target any reforms and/or improvements. If you don't know where the problem is--and I hope we can all agree there's a problem--you can't fix it.

Don't you think that, after 12 years of schooling, students deserve to be able to do more than convert a decimal to a fraction? And shouldn't they, at a minimum, actually be able to do that much?
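
Since that decimal-to-fraction benchmark keeps coming up, here's how little it amounts to; a minimal sketch in Python (my own illustration, nothing from the report, and the function name is mine) does the entire "basic level" task with the standard library's fractions module:

```python
from fractions import Fraction

def decimal_to_fraction(x: float, max_denominator: int = 1000) -> Fraction:
    """Convert a decimal like 0.75 to a reduced fraction like 3/4."""
    # Fraction(x) captures the float exactly; limit_denominator()
    # then finds the simplest nearby fraction, e.g. 0.2 -> 1/5.
    return Fraction(x).limit_denominator(max_denominator)

for value in (0.75, 0.2, 0.125):
    print(value, "=", decimal_to_fraction(value))
    # 0.75 = 3/4
    # 0.2 = 1/5
    # 0.125 = 1/8
```

The point isn't the code; it's that this one-step conversion is the entire skill the "basic" level asks of a 12th-grader.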

8 comments:

  1. Anonymous 5:29 PM

    And the disparity of results in these two studies is a surprise? Good grief.

    Thought exercise: Which of these two data sets will be most accurate and most reflective of actual student progress, learning, and knowledge?

    Data Set 1: Students will be tested once during the year, using one generic testing instrument. The sole determinant of progress will be that test grade.

    Data Set 2: Students will be continually tested and closely observed, daily, over an entire school year and will complete and be graded on more than 100 separate assignments. All of the grades on these assignments will be considered in determining progress.

    If you believe that data set #1 would produce the best results, please return to your usual occupation: repeatedly striking yourself on the head with a large rock while drooling.

    Even if the score on a single, mandatory high-stakes test were added to the data from data set 2, it would be of so little additional value as to make clear that the bother and expense were simply not worth it. Yet in many states, and if some have their way, all states, data set 1 will prevail.
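
    To see the point in miniature, here's a quick simulation; a rough sketch (every number below is invented purely for illustration) of why the average of 100 noisy grades tracks a student's true level far better than any single test sitting:

    ```python
    import random

    random.seed(0)
    TRUE_ABILITY = 80.0  # hypothetical student's "true" level, out of 100
    NOISE = 10.0         # per-measurement error (standard deviation)

    def noisy_measurement() -> float:
        """One graded assignment or one test sitting, with random error."""
        return random.gauss(TRUE_ABILITY, NOISE)

    # Data set 1: a single high-stakes test score.
    single_test = noisy_measurement()

    # Data set 2: the mean of 100 separately graded assignments.
    assignment_avg = sum(noisy_measurement() for _ in range(100)) / 100

    print(f"true ability:       {TRUE_ABILITY:.1f}")
    print(f"single test:        {single_test:.1f}")
    print(f"100-assignment avg: {assignment_avg:.1f}")
    ```

    Assuming independent errors, averaging 100 measurements shrinks the expected error by a factor of ten (1/sqrt(100)) compared with a single sitting.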

  2. I certainly don't think one single test tells you everything you need to know about a student, but it helps to have *some* baseline comparison considering you have millions of teachers giving millions of assignments and assigning millions of grades--holding students to millions of different standards.

  3. Anonymous 7:46 PM

    Dear Darren:

    While I'm sure we agree on much more than our occasional exchanges in the blogosphere might suggest, I do disagree on the ultimate utility of such tests, particularly when one considers the time, effort and expense incurred compared with what would and should be taught if such tests were not mandatory.

    Interestingly, I've taught in three states, two in the Midwest and Texas, and the standards and methods of teaching high school English, and every other discipline with which I am familiar, are remarkably similar, far more similar in fact than different. This shouldn't be remarkable: people are people wherever they live, and given the nature of the disciplines and of humanity, we should expect far more similarities than differences.

    Given the choice between spending a month or more each year drilling for a single test or spending that time on analysis, writing, revising, reading, and more writing, I suspect I'll choose the latter every time. Besides, I can show exactly how and how much each of my students has improved over time, and can explain exactly how they can improve, short or long term. I can also explain exactly why a given student, in general or on a given assignment, has failed to live up to their potential. What can an educrat in my state capital say about the same student? Their test score. On a single test. Taken on a single day.

    Thanks for establishing and maintaining a high wire for intellectual discourse.

  4. Anonymous 6:18 AM

    That's a false dichotomy. It isn't either/or. But if you have been doing your job, and your assessments have, indeed, been accurate, then the institutional test will reflect that.

    You must be the same Mike who demanded that Ed produce data, yet offered no data in response, only opinions presented as facts.

  5. Mike, spending a month prepping for tests isn't a fault of the testing regimen. It's the fault of teachers and administrators who think *that* is the way to improve test scores, rather than teaching actual content.

    Additionally, I've heard from others that the TAKS tests aren't good tests. Blame the legislature for choosing a test that isn't good.

    Here in California, we don't take off-the-shelf tests anymore. All of our tests are created to align with our state standards. I find them to be rigorous--and good tests overall.

  6. Anonymous 2:35 PM

    Darren:

    Please understand, I use tests--properly--on a regular basis. I see the cost/benefit calculus a bit differently, I suspect.

    When high-stakes tests are mandated, the law of unintended consequences comes along for the ride. And in Texas, and I suspect most other states, one cannot merely teach the curriculum and/or standards if one wants students to pass the test. Of course, the state educrats maintain that teaching to the test is absolutely unnecessary, but they are engaging in what we in English call "lying." To be fair, perhaps they're really so far removed that they don't get it.

    We do not "think" that is the only way to pass the tests; we "know" it, through years of experience, test results, and decoding the information the state puts out about what students must do to pass, including the examples it has provided.

    And our tests are created specifically for Texas, aligned to standards, utterly valid; they're the most magnificent tests known to man. If you don't believe me, just ask the educrats; they'll be glad to tell you how swell their efforts are. In fact, I recently attended a question-vetting session involving teachers from all over the state, wherein we were supposed to examine and approve test questions for future versions of the test. I have many disheartening anecdotes from that session, but I'll share one.

    Several of the TEA (Texas Education Agency--the state bureaucrats who administer education; our education bureaucracy is massive, overbearing, inefficient, and the absolute envy of other state educrats who would love to do the same) folks present said that they were getting many tear-jerking stories on the written portions of the TAKS test and couldn't understand why. I explained that when we were "taught" by our regional service center (they're all over the place, staffed by people who, judging by their levels of knowledge and performance, were fired by local schools), they didn't really have a clue what we should do to ensure that students passed, but they did have state-provided examples of essays at the various score levels. Each and every essay that passed included a tear-jerking story to illustrate the writer's points. We've taught that ever since, and our students pass in the mid-to-high 90s. Do I need to say that the TEA folks just couldn't see the connection? Apparently their own examples didn't make sense even to them.

    Yes, the TAKS tests are far from competent. But in the real world, tests of this sort can easily cause more harm than good, and to little end.

  7. Here's a post about testing that I agree with completely:
    http://rightwingnation.com/index.php/2007/02/24/2986/

  8. Are you the same Mike to whom he refers? That didn't occur to me until after I hit the "publish" button.
