I just took over as undergrad director of a couple of our majors this summer, and I’ve been thinking about how to revamp the curriculum. Because the bioengineering major relies on courses from 11 different departments, it is very difficult to maintain. Even if each department were very stable and only changed their courses once a decade, our program would still be facing changes every year. We generally handle the changes, which we only find out about after it is too late to change our catalog copy, with exceptions and clunky workarounds, but that gets unwieldy after a while, and we have to do a clean sweep and start over. This year I want to do that clean sweep. Of course, I can’t do it by fiat—I have to get a curriculum committee together representing most of the departments and get them all to agree on a new curriculum. That is likely to be a difficult undertaking.
This fall, I got another task dumped on me by the administration: coming up with “Program Learning Outcomes” for our majors, showing how each required course supports one or more of the PLOs, and writing a plan for measuring 2 or more PLOs each year and for how we will use the measurements to improve the program. The description of what they want is presented in a 46-page PDF file, which reads like it was generated by an eduspeak jargon generator, like the one at http://www.sciencegeek.net/lingo.html. Oh, and they want it all done by the end of Fall quarter.
There was an assumption built into the message from the Vice Provost of Academic Affairs that everyone already knew about this and had been working on it for months. (I didn’t get the message directly—it was forwarded by the department chair.) Of course, this was the first I had heard of the PLOs, so I searched my e-mail to see if there was any earlier hint. As it turns out, there was.
There was a message sent to the faculty in June, during final exam week, that had six paragraphs listing what new degree programs had been created and which ones had been removed. At the very end of the email, there was one paragraph that said
There is a campus-wide effort underway to prepare for our next Western Association of Schools and Colleges (WASC) accreditation, including further defining student based learning outcomes for each undergraduate and graduate major. Program learning outcomes will be posted on department websites for easy communication to all students. Assessment will be incorporated into department external reviews. More information is available on my VPAA website: http://academicaffairs.ucsc.edu/accreditation/docs/Call_Degree-PLOs_Apr2013.pdf
I certainly did not notice that paragraph at the time (I only found it now by searching through all my archived e-mail). Even if I had read it, it would not have told me that a major effort was needed to produce PLOs—that only became apparent on clicking through to the link, which led to a letter to deans, department chairs, and program directors explaining the new mandate from the administration to the faculty. That letter was probably never read by our over-worked department chair, and our faculty has certainly never discussed or agreed to this way of doing curricular planning and maintenance.
It seems like some of the worst practices of K–12 education administration are being pushed on the university—administrative mandates for creating bullshit objectives and reams of paper wasted on “rubrics” for measuring the unmeasurable.
The underlying idea—designing a curriculum by first figuring out what you want students to learn, then making sure that students do learn it—is sound, but the implementation by administrative fiat with no faculty buy-in whatsoever is doomed to be a time-wasting, bureaucratic exercise in producing piles of paper reports that are never read or used by the faculty or anyone else. It looks like the main purpose of this whole exercise was to hire an administrator to manage it—no one else is going to benefit from the half-hearted (at best), pro forma exercise.
When I was in the computer engineering department, we went through an extensive self-study for ABET accreditation that took about 2 faculty-years of effort to complete (one faculty member full time, and the rest of us contributing part time). We analyzed all our courses, both the expectations of what students would know going into each course and what they would be expected to know coming out of it. We looked at every path through the major, collected examples of student work to check the calibration of grading in all courses, and generally did the curriculum analysis thoroughly. It was a valuable exercise that resulted in several improvements to individual courses and some improvements to the curriculum as a whole, but it was exhausting. It worked because all the faculty believed it was worth the effort and put in the time to make it work.
On the other hand, when EE went through their first self-study, they out-sourced a lot of the work to someone in another department and did the minimum necessary to satisfy the bureaucrats. There was no attempt to fix any problems that were uncovered—just to paper them over so that no one would notice. It had no lasting effect on their courses and was a complete waste of everyone’s time. I fear that this new initiative from the administration will be more like the EE lack of effort than the computer engineering effort.
I care a lot about the design of curricula, having now been heavily involved in the design of curricula for 3 undergraduate majors and 2 grad programs. I was planning to put a lot of time into overhauling the bioengineering and bioinformatics undergrad curricula this Fall, and this PLO initiative will not help at all with that. Sure, I can come up with a list of some of the main things students should learn (though phrasing them in the eduspeak manner required by the 46 pages of instructions will be neither easy nor useful). But all the trash that they wrap around that assumes that the curriculum is a fixed, unchanging entity, that all required courses are taught by multiple instructors, that the department has full control over all instructors for all required courses, and that improvement comes only in the form of minor tweaks to existing courses. None of those things are true of our program.
Did I mention that the bioengineering degree has required courses from 11 different departments? And that all but a few of these courses are being taught primarily for majors in other fields? Most of the annual curriculum maintenance is in the form of finding workarounds for changes made by other departments, not tweaking courses within the department—so all the verbiage about how assessment will “improve the curriculum, pedagogy, and/or advising” is just bureaucratic bullshit.
We already do a lot of outcomes assessment (for example, as undergrad director I review senior portfolios for evidence of research and writing skills, and we do exit interviews with all graduating seniors to get feedback on what parts of the curriculum are working and which need improvement). Writing up a brief description of what we already do, however, is unlikely to satisfy the administrators—given that they consider 46 pages of “guidelines” a suitable introduction to their pet project.
Our faculty are busy teaching and doing research, neither of which is aided by producing paper doorstops for administrative office doors. I doubt that I can get their attention for this administrative time waster. I’ll probably throw together something to try to satisfy the administrators, but I’d much rather spend my time on improving the curriculum than on placating bureaucrats.