Gas station without pumps

2014 December 27

We create a problem when we pass the incompetent

Filed under: Uncategorized — gasstationwithoutpumps @ 22:55

I finished my grading earlier this week, and I was a little distressed at how many students did not pass my graduate bioinformatics class (19% of the students in the class did not pass this fall, about equally divided between the seniors and the first-year grads—note that “passing” for a grad student is B– or better, while for an undergrad it is C or better). Some students were simply unprepared for the level of computer programming the course requires and were not able to get up to speed quickly enough.  They made substantial improvement during the quarter and should do fine next time around, particularly if they continue to practice their programming skills. Others have a history of failing courses and may or may not make the effort needed to develop their programming skills before their next attempt.

I don’t like to have students fail my courses (particularly not repeatedly, as some have done), but I can’t bring myself to pass students who have not come close to doing the required work. When I pass a student in a course, it means that I’m certifying that they are at least marginally competent in the skills that the course covers (most of my courses are about developing skills, not learning information).  I’ll give the students all the help and feedback I can to develop those skills, but I grade them on what they achieve, not on how much work they put in, what excuses they have, nor how many times they’ve attempted the course.

I often feel alone in holding the line on quality—I’m afraid that there are not enough faculty willing to fail students who don’t meet the requirements of the courses they are teaching.  Those teachers are just kicking the problem of inadequately prepared students on to the next teacher, or to the employer of the student who graduates without the skills a college graduate should have.

In The Academe Blog, in the click-bait-titled post “Nude Adult Models, William Bennett, Common Core, Rotten Teachers, Apples, Robert Frost,” Ulf Kirchdorfer wrote:

The reality is that many teachers, whether prompted by supervisors or of their own volition, continue to pass students so that we have many that reach college with the most basic of literacy skills, in English, math, science, the foreign languages.

Tired of listening to some of my colleagues complain of college students being unable to write, I went to look at learning outcomes designed for students in secondary education, and sure enough, as I had suspected, even a junior high, or middle-school, student should be able to write a formulaic, basic five-paragraph theme.

Guess what. Many college students, even graduating ones, are unable to do so.

While I don’t often agree with Ulf (who often takes extreme positions just for the fun of argument), I have to agree with him that many of my students are not writing at what I would consider a college level for senior thesis proposals, even though they have had three writing courses (including a tech writing course) as prerequisites to the senior thesis.  And it isn’t just writing coherent papers in English that is a problem, as evidenced by the failure rate in my bioinformatics course due to inadequate programming skills (despite several prerequisite programming courses).

In an article about Linda B. Nilson’s “spec” grading system, which attempts to fix some of the problems with current grading practices, she is quoted:

“Most students (today) have never failed at anything,” Nilson noted, since their generation grew up receiving inflated grades and trophies for mere participation in sports. “If they don’t fail now, they’re going to have a really hard life.”

It doesn’t do anyone any favors to pass students who do not meet the minimum competency expected—the students are deluded into thinking they are much more competent than they are (so that they don’t take the necessary actions to remediate their problems); future teachers are forced to either reteach what the students should already have learned (which means that the students who had the prerequisites get shortchanged) or lose a big chunk of the class; the university degree loses its value as a marker of competence; and employers ratchet up credentials needed for employment (as the degrees mean less, higher degrees are asked for).

There is pressure on faculty to raise pass rates and pass students who don’t have adequate preparation.  The University administration wants to increase the 4-year graduation rate while taking in more students from much weaker high schools. I worry that the administration is pushing for higher graduation rates without considering the problems caused by pressuring faculty to pass students who are not competent. The reputation of the university is based on the competence of its alumni—pumping out unqualified students would fairly quickly dissipate the university’s good name.

Four-year graduation is not very common in engineering fields—even good students who start with every advantage (like several AP courses in high school with good AP scores) have a hard time packing everything into 4 years. Minor changes to course schedules can throw off even the best-laid plans, so an extra quarter or two are completely routine occurrences. And that’s for the top students.  Students coming in with weak math preparation find it almost impossible to finish in 4 years, because they have to redo high school math (precalculus), causing delays in their starting physics and engineering classes. If they ever fail a course, they may end up a full year behind, because the tightening of instructional funding has resulted in many courses being offered only once a year.  There is a lot of pressure on faculty to pass kids who clearly are not meeting standards, so that their graduation is not delayed—as if the diploma were all that mattered, not the education it is supposed to represent.

There are things that administrators can do to reduce the pressure on faculty.  For example, they could stop pushing 4-year graduation rates, and pay more attention to the 5-year rates. The extra time would allow students with a weaker high school background to catch up.  (But our governor wants to reduce college to 3 years, which can only work if we either fail a lot of students or lower standards enormously—guess which he wants. Hint: he favors online education.) Students who need remedial work should be given extra support and extra time to get up to the level needed for college, not passed through college with only a high school education.

Or they could stop admitting students to engineering programs who haven’t mastered high school math and high school English.  This could be difficult to do, as high school grades are so inflated that “A” really does mean “Average” now, and the standardized tests cover only the first two years of high school math, and that superficially (my son, as a sixth grader, with no education in high school math, got a 720 on the SAT math section).  It is hard for admissions officers to tell whether a student is capable of college-level writing or college-level math when all the information they get checks only 8th-grade-level performance.

Or administrators could encourage more transfer students from community colleges, where they may have taken several years to recover from inadequate high school education and get to the point where they can handle the proper expectations of college courses.  (That would help with the attrition due to freshman partying also.)

Or administrators could pay for enough tenured faculty to teach courses with high standards, without the pressure that untenured and contingent faculty feel to keep a high pass rate in order to get “good” teaching evaluations and retain their jobs.

Realistically, I don’t expect administrators to do any of those reasonable things, so it is up to the faculty to hold onto academic standards, despite pressure from administrators to raise the 4-year graduation rate.

2011 July 27

A is average

Filed under: Uncategorized — gasstationwithoutpumps @ 00:01

Everyone has heard of schools and colleges that use the grading scale “A is average, B is bad, and C is catastrophic”.  But is that just a joke, or is there some truth to it?

A couple of weeks ago, Catherine Rampell published an article in the NY Times: A History of College Grade Inflation.  The article was based on “Where A Is Ordinary: The Evolution of American College and University Grading, 1940–2009” by Stuart Rojstaczer & Christopher Healy.  Unfortunately, the University library does not seem to have an electronic subscription to Teachers College Record (I don’t know how to parse that title), and I’m not willing to pay $7 to read the original paper, so you’ll be getting a rehash of a rehash here.

Catherine Rampell included the following figure, which she credits to the original authors:

Figure by Stuart Rojstaczer and Christopher Healy, via the NY Times article. Presumably original caption: "Note: 1940 and 1950 (nonconnected data points in figure) represent averages from 1935 to 1944 and 1945 to 1954, respectively. Data from 1960 onward represent annual averages in their database, smoothed with a three-year centered moving average."

This is the guts of the article—grade inflation in college really ramped up in the late 60s, but has been getting steadily stronger from 1984 on. (Strange—I would have expected the doublespeak of grade inflation to have peaked in Orwell’s 1984.)

The increasing number of A’s is not due to an increase in the quality of college students.  If anything, the average intelligence and preparedness of college students has dropped somewhat as a greater fraction of high school students enter college.  What we’re seeing here is pure grade inflation.

I have no idea how the data were collected, but the abstract claims that data were gathered from over 200 colleges and universities.

There is a lot missing from this analysis (which may be in the original article).  For example, the abstract says that “science and engineering-focused schools grade more stringently than those emphasizing the liberal arts.”  How much of the grade inflation, then, is due to shifts in what majors students are in (is a greater fraction of students now in easy-A majors?) and how much is due to grade inflation within majors?  It would be interesting to see historical curves like this that were not averages across all majors in all colleges, but broken up by major (or groups of majors, to avoid problems with small numbers). I think that there has been grade inflation even in engineering, with B- replacing C and C replacing D, but not nearly as bad as in the humanities, where A- has replaced B, and B+ has replaced C.

In one sense, grade inflation does not matter much, because no one really expects grades to be comparable between different generations any more. But there is a problem with loss of information in the grade.  The information content of a distribution with probabilities [0.35, 0.34, 0.14, 0.12, 0.05] is 2.04 bits (using H= -\sum p_i \log_2 p_i), but with probabilities [0.43, 0.34, 0.15, 0.04, 0.04] there are only 1.83 bits of information.  We get 10% less information from a grade than we used to, so we need 10% more courses to get the same amount of information.  Perhaps this explains the push for more and more education—as each grade gets less informative, we need more and more of them to distinguish the marginally competent from the truly incompetent.
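For readers who want to check my arithmetic, here is a minimal Python sketch of the entropy calculation (the function and variable names are mine; the two probability vectors are just the ones quoted above):

    from math import log2

    def entropy(probs):
        """Shannon entropy H = -sum(p * log2(p)) in bits, skipping zero-probability grades."""
        return -sum(p * log2(p) for p in probs if p > 0)

    # The two grade distributions quoted above; the second (more A's, fewer D's and F's)
    # is the more recent one.
    dist_then = [0.35, 0.34, 0.14, 0.12, 0.05]
    dist_now = [0.43, 0.34, 0.15, 0.04, 0.04]

    print(f"then: {entropy(dist_then):.2f} bits")   # about 2.04 bits
    print(f"now:  {entropy(dist_now):.2f} bits")    # about 1.83 bits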

A grading scale set up to inform us maximally about the students would have roughly equal numbers in each category, but we’ve never seen that in the US and are not likely to any time soon.


2011 May 16

High school course title inflation

Filed under: Uncategorized — gasstationwithoutpumps @ 06:12

The New York Times published an article by Sam Dillon about three weeks ago: As H.S. Course Titles Become Inflated, Test Scores Fail to Follow.  The basic observation is a simple one: the number of high-school students taking courses with impressive titles (like Advanced Placement) has gone up greatly over the last 20 years, but performance on standardized tests has remained flat.

There are many possible explanations:

  • The number taking rigorous courses is too small to have any effect on the average.  This was probably the case in 1990, when only about 5% of high school graduates had completed a “rigorous” curriculum.  But by 2009, when 13% had done so, there should have been some movement of the mean.
  • Courses have stayed about the same, but grade inflation has caused students who previously would have failed to now pass. There has undoubtedly been a lot of grade inflation, but the failure rate for AP courses has never been high, so there wasn’t much room for changing the number of students passing the course just by tweaking the grading.
  • Administrators and teachers have arbitrarily decided to put fancier names on their courses, as a marketing gesture to improve the chances of their students getting into college, without actually changing the courses.  This is the rather cynical view that Sam Dillon seems to favor.  There is certainly some truth to it, but I’m not convinced that many teachers would have gone along with it (I have less faith in the integrity of administrators and textbook publishers).
  • Advanced courses have been watered down to match the larger fraction of students attempting them.  This is consistent with my description of memes colliding—teachers need to teach the students they have, not the students they ought to have, so attempts to push more students into advanced classes result in the advanced classes becoming less advanced.

I favor the last explanation: that teachers and administrators, seeing that students taking advanced classes do better in college admissions and in college, have pushed more students to take advanced courses, with the inevitable result that those courses became less advanced.  It is a classic case of confusing correlation with causality: smarter, better-prepared, harder-working students are more likely to take advanced courses and are more likely to do well in college admissions and college.  This doesn’t mean that having weaker students take the advanced courses will make them do better.

What has happened is that the intervention that used to be effective for educating the top 5% of high-school students is no longer available, since it has been replaced by something supposedly suitable for the top 15%.  The growth of enrollment in advanced courses has paradoxically reduced the educational options for the top students.

Some of the changes that the enrollment growth in advanced classes entails include a greater emphasis on rote memory and drill (generally not needed by the top 5%), more review of forgotten material from previous courses (hence less time on new material), and a generally slower pace of learning with a higher busy work load.

AP exams have grown even faster than courses, with 1.2 million exams taken in 2000 and 3.1 million in 2010.  The failure rate (scores of 1 or 2) has gone up from 36.4% in 2000 to 42.5% in 2010, but the number passing has risen (760 thousand to 1.78 million), so the growth has not all been at the bottom.
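Those passing numbers follow directly from the totals and the failure rates; here is a quick Python sanity check (the variable names are mine):

    # Consistency check of the AP exam figures quoted above.
    exams_2000, fail_rate_2000 = 1.2e6, 0.364
    exams_2010, fail_rate_2010 = 3.1e6, 0.425

    passing_2000 = exams_2000 * (1 - fail_rate_2000)   # about 763 thousand
    passing_2010 = exams_2010 * (1 - fail_rate_2010)   # about 1.78 million
    print(f"{passing_2000:,.0f} passed in 2000; {passing_2010:,.0f} passed in 2010")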

Bruce Orr, the principal of a high school that is sending large numbers of students to take (and fail) the AP tests, is quoted as saying “Just being in that rigorous course environment does these kids a world of good.”  Frankly, I’m dubious that setting kids up to fail the AP tests is doing them much good, and it is certainly harming the somewhat smaller number who could have passed the tests if they had been given a course that was really at the right level, rather than one watered down to match the level of students not yet ready for college-level work.

2010 July 28

Grade inflation

Filed under: Uncategorized — gasstationwithoutpumps @ 13:54

The site www.gradeinflation.com has some fascinating statistics on grade inflation at US universities, spanning several decades.  One plot starts around 1920, showing GPAs of around 2.3 (C+), gradual growth in the 30s and 40s to 2.5, which held fairly steady through the 50s.  In the 60s there was explosive grade inflation, bringing public-college GPAs to around 2.8 and private-college GPAs to around 3.1.  Grades held fairly steady through the 70s and 80s, then started creeping up again.  By the 2006–07 school year, private-college GPAs averaged 3.30 (B+) and public-college GPAs 3.01 (B).  The variance is now quite high, with some schools having a GPA over 3.5 (A-)  and others having a GPA of 2.7 (B-).  It seems that there really are schools using the “A is average, B is bad, C is catastrophic” grading rubric.

As I have long believed, there is a major difference between humanities and sciences grading standards.  What I didn’t realize is that the gap is growing.  In the 40s, the average GPA in the humanities was about 0.17 points higher than in the sciences.  In the 00s it was around 0.3 points.  In some schools, the difference was more than 0.5 points.  I think that there is a gap between engineering and science also (with engineering faculty being stricter graders than science faculty), but I have no data to back up this belief.

The site also gives a rule of thumb for guessing the GPA of a college:

\mbox{GPA} = 2.8 + \mbox{Rejection Percentage}/200 + \begin{cases} 0.2 & \mbox{if private} \\ 0 & \mbox{if public} \end{cases}
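Translated into code, the rule of thumb looks like this (a toy sketch; the function name and the sample schools are hypothetical, not from the site):

    def estimated_gpa(rejection_percentage, private):
        """gradeinflation.com rule of thumb: 2.8 + (rejection %)/200, plus 0.2 if private."""
        return 2.8 + rejection_percentage / 200 + (0.2 if private else 0.0)

    # A hypothetical private school rejecting 90% of applicants:
    print(f"{estimated_gpa(90, private=True):.2f}")    # 3.45
    # A hypothetical public school rejecting 40% of applicants:
    print(f"{estimated_gpa(40, private=False):.2f}")   # 3.00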

The site examines several conjectures about why there has been so much grade inflation. It manages to reject some of the hypotheses as inconsistent with the data, but does not claim to have a convincing explanation of the phenomenon.
