Gas station without pumps

2013 September 28

Program Learning Outcomes

Filed under: Uncategorized — gasstationwithoutpumps @ 19:52

I just took over as undergrad director of a couple of our majors this summer, and I’ve been thinking about how to revamp the curriculum.  Because the bioengineering major relies on courses from 11 different departments, it is very difficult to maintain.  Even if each department were very stable and only changed their courses once a decade, our program would still be facing changes every year.  We generally handle the changes, which we only find out about after it is too late to change our catalog copy, with exceptions and clunky workarounds, but that gets unwieldy after a while, and we have to do a clean sweep and start over.  This year I want to do that clean sweep.  Of course, I can’t do it by fiat—I have to get a curriculum committee together representing most of the departments and get them all to agree on a new curriculum.  That is likely to be a difficult undertaking.

This fall, I got another task dumped on me by the administration: coming up with “Program Learning Outcomes” for our majors, showing how each required course supports one or more of the PLOs, and writing a plan for measuring 2 or more PLOs each year and for how we will use the measurements to improve the program.  The description of what they want is presented in a 46-page PDF file, which reads like it was generated by an eduspeak jargon generator.  Oh, and they want it all done by the end of Fall quarter.

There was an assumption built into the message from the Vice Provost of Academic Affairs that everyone already knew about this and had been working on it for months.  (I didn’t get the message directly—it was forwarded from the department chair.)  Of course, this was the first I had heard of the PLOs, so I searched my e-mail to see if there was any hint earlier.  As it turns out there was.

There was a message sent to the faculty in June, during final exam week, that had six paragraphs listing what new degree programs had been created and which ones had been removed.  At the very end of the email, there was one paragraph that said

There is a campus-wide effort underway to prepare for our next Western Association of Schools and Colleges (WASC) accreditation, including further defining student based learning outcomes for each undergraduate and graduate major.  Program learning outcomes will be posted on department websites for easy communication to all students. Assessment will be incorporated into department external reviews. More information is available on my VPAA website:

I certainly did not notice that paragraph at the time (I only found it now by searching through all my archived e-mail).  Even if I had read it, it would not have told me that a major effort was needed to produce PLOs—that only became apparent on clicking through to the link, which led to a letter to deans, department chairs, and program directors explaining the new mandate from the administration to the faculty.  That letter was probably never read by our over-worked department chair, and our faculty has certainly never discussed or agreed to this way of doing curricular planning and maintenance.

It seems like some of the worst practices of K–12 education administration are being pushed on the university—administrative mandates for creating bullshit objectives and reams of paper wasted on “rubrics” for measuring the unmeasurable.

The underlying idea—designing a curriculum by first figuring out what you want students to learn, then making sure that students do learn that—is sound, but the implementation by administrative fiat with no faculty buy-in whatsoever is doomed to being a time-wasting, bureaucratic exercise in producing piles of paper reports that are never read or used by the faculty or anyone else. It looks like the main purpose of this whole exercise was to hire an administrator to manage it—no one else is going to benefit from the half-hearted (at best), pro forma exercise.

When I was in the computer engineering department, we went through an extensive self study for ABET accreditation which took about 2 faculty-years of effort to complete (one faculty member full time, and the rest of us contributing part time).  We analyzed all our courses, both expectations of what students would know going into the course and what they would be expected to know coming out of the course.  We looked at every path through the major, collected examples of student work to check calibration of grading in all courses, and generally did the curriculum analysis thoroughly.  It was a valuable exercise that resulted in several improvements to individual courses and some improvements to the curriculum as a whole, but it was exhausting.  It worked because all the faculty believed it was worth the effort and put in the time to make it work.

On the other hand, when EE went through their first self study, they out-sourced a lot of the work to someone in another department and did the minimum necessary to satisfy the bureaucrats. There was no attempt to fix any problems that were uncovered—just to paper them over so that no one would notice. It had no lasting effect on their courses and was a complete waste of everyone’s time.  I fear that this new initiative from the administration will be more like the EE lack of effort than the computer engineering effort.

I care a lot about the design of curricula, having been heavily involved now in the design of curricula for 3 undergraduate majors and 2 grad programs.  I was planning to put a lot of time into overhauling the bioengineering and bioinformatics undergrad curricula this Fall.  And this PLO initiative will not help at all with that.  Sure, I can come up with a list of some of the main things students should learn (though phrasing them in the eduspeak manner required by the 46 pages of instructions will be neither easy nor useful).  But all the trash that they wrap around that list assumes that the curriculum is a fixed, unchanging entity, with all required courses taught by multiple instructors; that the department has full control over all instructors for all required courses; and that improvement comes only in the form of minor tweaks to existing courses.  None of those things are true of our program.

Did I mention that the bioengineering degree has required courses from 11 different departments?  And that all but a few of these courses are being taught primarily for majors in other fields? Most of the annual curriculum maintenance is in the form of finding workarounds for changes made by other departments, not tweaking courses within the department—so all the verbiage about how assessment will “improve the curriculum, pedagogy, and/or advising” is just bureaucratic bullshit.

We already do a lot of outcomes assessment (for example, as undergrad director I review senior portfolios for evidence of research and writing skills, and we do exit interviews with all graduating seniors to get feedback on what parts of the curriculum are working and which need improvement).  Writing up a brief description of what we already do, however, is unlikely to satisfy the administrators—given that they consider 46 pages of “guidelines” a suitable introduction to their pet project.

Our faculty are busy teaching and doing research, neither of which are aided by producing paper doorstops for administrative office doors.  I doubt that I can get their attention for this administrative time waster. I’ll probably throw together something to try to satisfy the administrators, but I’d much rather spend time on improving the curriculum than on placating bureaucrats.

2013 August 27

Impedance matching as instructional metaphor

Filed under: Uncategorized — gasstationwithoutpumps @ 05:38

Joe Redish, in his blog The Unabashed Academic, wrote a post (What should we tell a colleague about DBER?) about getting colleagues to listen to advice from education researchers.  He recounts incidents which help explain why so many teachers are reluctant to listen to education researchers, then summarizes the message that he would like to get across:

1. Think carefully about what your real goals are for the particular population of students you are teaching.

2. Find ways to get sufficient feedback from the students that you can figure out, not just whether they have learned what you have taught, but how they have interpreted it and what knowledge and perspectives they bring to your class.

3. Respect both the knowledge they are bringing and them as learners. “Impedance match” your instruction to where they are and what they have to work with.

4. Repeat. That is, go back and re-think your goals now that you know more about your students.

I rather like the metaphor of impedance matching for teaching students.  It explains quite clearly (at least to those with some electrical engineering background) the problem being addressed: adjusting the signal provided to the load so that as much of the signal as possible is absorbed by the load.
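For readers without the EE background, the matching condition is easy to see numerically.  Here is a minimal sketch in Python (the 50 Ω source value is arbitrary, chosen only for illustration) showing that a resistive load absorbs the most power when it equals the source resistance:

```python
# Maximum power transfer for a resistive source driving a resistive load:
# P_load = V^2 * R_load / (R_source + R_load)^2, which peaks at R_load = R_source.
def load_power(v_source, r_source, r_load):
    """Power dissipated in the load (watts)."""
    i = v_source / (r_source + r_load)   # series current
    return i * i * r_load

# Sweep load resistances on either side of the match and find the peak.
r_source = 50.0                          # ohms, an illustrative source impedance
loads = [r_source * f for f in (0.25, 0.5, 1.0, 2.0, 4.0)]
powers = [load_power(1.0, r_source, r) for r in loads]
best = loads[powers.index(max(powers))]  # the matched load, 50 ohms
print(best)
```

Loads that are too small and too large by the same factor absorb the same reduced power, which is part of what makes the metaphor appealing: pitching instruction too high or too low both waste the signal.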

Of course, there are several problems that this metaphor brings up.  One, we are rarely teaching a single student at a time, but we can only match one impedance.  The metaphor suggests that we have to pick one student as our “typical” student and match them.  While that is probably better than what some teachers do, it is far from adequate to teach the wide range of students we usually get even in tiny classes.  We need to provide a range of different instruction, so that everyone in the class is learning, though not necessarily all learning the same thing.

There is also the problem that impedance matching is a metaphor that assumes we are the source and the students passive receivers of information, which doesn’t work that well with his point 2, about getting sufficient feedback from the students.  When trying to match an unknown impedance, we usually adjust things so that we get no reflection from the load—exactly the opposite of what we want from students!

Impedance matching is done to minimize the power needed in the source to get sufficient signal in the load—often to maximize the signal-to-noise ratio at the receiver.  That seems like a worthy goal, either minimizing our effort to get a desired outcome, or maximizing what the students learn for a given level of effort.

But the real goal is maximizing the signal-to-noise ratio, and impedance matching is often not the most important part of that task.  Reducing the noise is often more important than increasing the signal, and if the noise is introduced early in the process, it can be difficult or impossible to remove later.  This comes back to his point 1: figuring out what your real goals are, and shaping your instruction to meet those goals.  If you add a bunch of extraneous stuff that does not help students toward the goals, it not only wastes time but may actively oppose their learning what is needed.

The signal-to-noise metaphor brings up another approach: active noise cancellation.  It is now fairly easy to get headphones that not only provide good reproduction of music, but also actively cancel the sounds from the environment, so that one can listen even on airplanes and in other noisy environments.  For active noise cancellation to work, you need to have microphones or other sensors to detect the noise, and processing to precisely counteract it.  This is the idea behind his point 2: finding out what misunderstandings the students are mixing with the ideas you are trying to get them to understand, and actively cancelling those misunderstandings.

Again, it is not an easy task for large-group instruction.  Even noise-cancelling headphones need separate microphones and processing for each ear, because the noise at one ear is not the same as the noise at the other.  If you try to cancel the wrong noise, you may end up adding more noise than you remove, creating a misunderstanding where there was none previously.

I was hoping to use a different metaphor for handling varying, unknown noise—the spread-spectrum technique, where the signal is not concentrated at a single frequency but spread over many, making it less susceptible to accidental or deliberate interference.  Unfortunately, the spread-spectrum technique requires close cooperation between the transmitter and receiver, with the receiver knowing the spreading code or the frequency-hopping pattern.  I don’t think students know the codes we use, and hopping around all over the place while expecting our students to know precisely where we will go next is not likely to be a good teaching metaphor!

What about you, readers? Can you come up with useful metaphors about teaching from electrical engineering, physics, or computer science?  (By “useful”, I mean ones that will help a teacher understand a pedagogic technique better.)

2013 July 6

Disseminating the applied circuits labs

Filed under: Circuits course — gasstationwithoutpumps @ 11:41

The post I just published about academic conferences led me to thinking (again) about how I should disseminate my course design (or individual lab designs) for the Applied Circuits course.  I suppose I should write up a paper and submit it to a conference or some journal like the Journal of Engineering Education.

One problem is that my course designs are not education research—they are attempts to solve particular curricular problems by taking advantage of my strengths as a teacher.  Some of the course design generalizes to other teachers and other institutions with somewhat different curricular needs, but there is no controlled experiment to say that my course design is “better” than another in some predefined, measurable way.

I’m not sure where this sort of here-are-some-good-ideas-you-may-be-able-to-use course design work fits in academic publishing.  If I were teaching physics, I would probably submit to The Physics Teacher, but I don’t know what the closest equivalent is in engineering—particularly interdisciplinary stuff like teaching circuits to bioengineers.  There don’t seem to be good journals intended for disseminating instructional labs and curricular design. Maybe Advances in Engineering Education would be a better fit than Journal of Engineering Education, even if I don’t feel that my course design contains “significant, proven innovations in engineering education practice, especially those that are best presented through the creative use of multimedia.”

Of course, distilling down the 200–400 pages of notes on my circuits course that I’ve collected on this blog to a conference presentation or a journal article is a daunting task—one that I’ll probably keep putting off until it is so stale that even I’m not interested in it any more.  It might even be easier to turn the notes into a book than into an article, since I would not have to do 100-to-1 compression.

I’m not sure who the right audience for such a book would be—instructors trying to create a new course, students taking my course, hobbyists wanting to learn the material at home, … ? That is, should I be writing about a case study in course design, should I be trying to create a textbook for the course, or should I be trying to put together a self-study book that could accompany a kit of parts for people to learn electronics at home?

Again, the book project is big enough that I’d probably keep putting it off indefinitely.  If I were sure there was an audience, I’d be more inclined to put in the effort to disseminate the material beyond this blog, though.

Another approach for disseminating the course materials would be to put together stand-alone kits for each lab (with detailed tutorials) that could be sold independently.  Releasing one lab at a time would be a more incremental effort than doing all 10 labs in one package, but would require some redesign, both for reduced expectations of lab equipment and to make the kits more modular.

I suspect that one could do most of the experiments of the circuits course for about $360 in lab equipment and tools, using something like the Velleman Lab2U unit (which PartsExpress sells for $210), a $40 multimeter, a $17 soldering station, an Arduino, and the $66 bag of tools and parts I put together for the course.  The Lab2U has only a single power supply, so some of the projects that used multiple supplies would need to be redesigned—most notably the class-D power amp.  Unfortunately, $360 is too big a chunk of money for anyone but a dedicated hobbyist, who quite likely already has most of the needed equipment.

Making the kits more modular might be difficult. For example, many of the labs require students to choose resistors and capacitors from a large set of possibilities, since their lab kit contained 10 each of 112 different resistors and 10 each of 25 different ceramic capacitors.  It is easy to justify the cost of those parts spread over 10 labs, but harder to provide that much selection for a single lab.  Perhaps one would have to sell a core kit (with breadboard, resistors, capacitors, …) to use with the Arduino and add-on kits for each lab.  The core kit would need to have some fundamental experiments (like RC time constants), so that it would be instructional even without add-ons.

I wonder if there is a market for such lab kits, and how I would find out if there were a market (without sinking months of my time and thousands of dollars). I wouldn’t want to assemble or market the resulting lab kits either, but would want to distribute them through a company like Sparkfun electronics, Adafruit Industries, or Seeedstudio, who have already set up the necessary business infrastructure.

I also wonder whether I’m capable of writing the tutorials at an introductory enough level to work for hobbyists, while still covering enough of the theory.  I’ve never cared for kits that have great assembly instructions, but treat the way the thing works as too difficult for the purchaser of the kit.

Writing instructions that included the use of an oscilloscope or multimeter, when there are many different ones that the person may be trying to use, would be a very challenging task.  Oscilloscopes in particular have evolved to have many radically different user interfaces, some of which are very complicated.

Also, all my writing has been for well-educated people: college professors, university students chosen from the top 10% of high school students, my loyal blog readers, … .  Can I make my writing intelligible for an average, or slightly above average, high school student, without sounding condescending or patronizing?  From the rather unsuccessful attempts I’ve seen in kit instructions in the past, that is not an easy task.

2013 July 4

Caballero on teaching physics with computation

Yesterday, Danny Caballero left a comment on my blog with pointers to three preprints of papers he’s been a co-author on:

I couldn’t agree more with the overall sentiment in this post. Computing is important for all students of science and for science teachers. However, as you say, I do think my dissertation work has been misconstrued a bit in this post. So, let me clarify.

The inventory you mention measures a small slice of mechanics taught in introductory physics. It’s been an important slice, but maybe now it’s time to think beyond it. What are students learning above and beyond this assessment? In M&I, they are learning about modeling, computing, and connecting the two. My work shows that the present implementation of M&I doesn’t produce great gains on this assessment, that students make mistakes in their code, and that they are less inclined towards computing after instruction.

So what? We are getting to teach students how science is done and they are using computing to investigate models. Now, this implementation is not the most polished one, which means we have a good way to go. But, that’s OK, because we are not going to fix all these issues overnight. We have been working on them for the last two years in various contexts. But, we need to figure out how to teach science and computing together. And we need to figure out how to do it well.

So, I’ll point you to a few other publications on computing in physics that I’ve written. Two concern high school, and another deals with physics majors. The high school work shows we can implement ideas from my dissertation work at the high school level (as others are doing). Moreover, we find that students who know physics and computing ideas can make good models of systems and are not memorizing lines of code. In my dissertation work, we didn’t do any qualitative work like student interviews, but it’s clear that doing so is necessary. The work with physics majors is one of the first forays into integrating computing in upper-division physics. We show that a new model for implementation can positively affect student attitudes.



This morning I read his three papers.  They all describe prototype courses that use computational modeling to teach physics, with some analysis of the outcomes.  They are not controlled experiments, but prototyping proof-of-principle projects (as are most educational research “experiments”).  This is not a criticism of the papers—one generally has to do a lot of prototyping before arriving at a method that is robust and repeatable enough to be worth the difficulty and expense of controlled experiments.  Engineers see the need for prototyping, but too many people in other fields think that things have to work perfectly the first time or be discarded forever.

These papers bracket the previous work Danny did that studied computational modeling for first-year calculus-based physics.  The two high-school papers are about a 9th grade physics class that uses the Modeling Instruction approach to teaching physics, to which was added computational modeling using VPython.  The “upper-division” paper discusses adding computational modeling to a 2nd-year classical mechanics course for physics majors, following a traditional 1st-year calculus-based physics course.

I was a little unclear on the level of the 9th-grade course.  In one place he refers to it as “conceptual physics”, but in other parts of the description it sounds more like an algebra-based high school physics course (covering the mechanics half of AP Physics B), a step-up from conceptual physics.

From his description, it seemed fairly straightforward to add a computational component to the Modeling Instruction approach, and it helped students see that all the different “models” taught in that approach are really special cases of the same underlying general model.  They used VPython with a couple of additional packages (PhysKit and PhysUtil) to make creating graphs and motion diagrams easier for beginning programmers.  The additional packages allow lines like

    graph.plot(t, cart.pos.x)

in the inner loop, simplifying the usual VPython interface a bit.

It sounds like the students finished the course as a mix of those who knew what they were doing and those who still hadn’t quite grasped the physics or hadn’t quite got the programming.  He did try analyzing some of the student work to see whether students were having difficulty with the physics or with VPython when making the simulations, but I found the results hard to interpret—raw numbers don’t mean much to me, because I don’t have a good prior expectation of what 9th graders at a private high school should be able to do.  I’m curious whether difficulties with programming correlated with difficulties in understanding the physics, and whether grading the computational homework gave insight into the misconceptions the students had about the physics.

One of the strong points of the computational approach is that it allowed the students to model phenomena usually beyond the scope of 9th-grade physics (like a soccer ball with linear drag forces).  I found this to be the case for calculus-based physics also, where we modeled pendulums without the small-angle approximation (problem 4.P.89 in Matter and Interactions) and the magnetic field lines of a helical solenoid.
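The linear-drag model is also a nice illustration of how little code such a simulation needs.  What follows is my own minimal plain-Python sketch of the Euler time-stepping involved (not the students’ VPython code; the drag coefficient and launch velocity are made up for illustration):

```python
# A minimal plain-Python sketch of the kind of time-step model students write
# in VPython: a ball with linear drag, integrated with simple Euler steps.
# The drag coefficient, mass, and launch velocity are illustrative only.
def simulate(vx, vy, b=0.3, m=0.45, g=9.8, dt=0.001):
    """Return the horizontal range of a ball launched from the origin."""
    x = y = 0.0
    while True:
        fx = -b * vx               # linear drag opposes velocity
        fy = -m * g - b * vy       # gravity plus vertical drag
        vx += (fx / m) * dt        # update velocity from net force
        vy += (fy / m) * dt
        x += vx * dt               # then update position from velocity
        y += vy * dt
        if y < 0.0:                # ball has returned to launch height
            return x

no_drag = simulate(10.0, 10.0, b=0.0)    # close to 2*v0x*v0y/g, about 20.4 m
with_drag = simulate(10.0, 10.0, b=0.3)  # noticeably shorter range
print(no_drag, with_drag)
```

The drag term is one extra line, which is what makes phenomena “usually beyond the scope of 9th-grade physics” accessible once the basic loop is in place.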

Some of his observations are unsurprising: “Students find debugging their programs difficult; that is, they have trouble determining whether they have made a coding error or a physics error and how to deal with that issue.”  He also noticed that some students found installing the software difficult—I believe that the VPython developers have been working on that problem, though it is not yet at the level where all 9th graders will find it easy.

Personally, I’d like to see the physics simulations for high school students use computations with units throughout—this would help them catch a lot of their physics errors earlier.  I see this lack of units as one of the biggest flaws in VPython as an instructional tool for physics.  I’ve blogged about this before in Physics with Units, and I’ve done some VPython programming using Unum.  Unfortunately, the VPython plotting and animation code does not play nicely with Unum, and having to strip out the units before plotting or drawing negates most of the advantages of keeping units around. I realize that for professional physics simulations, units are always implicit (in comments and variable names) rather than explicit, because that makes more efficient use of the computer, but for instructional purposes explicit units would be worth the inefficiency.
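To make the point concrete, here is a toy sketch of the kind of dimension checking I mean (my own illustration, not Unum’s actual API): quantities carry SI exponents, and a dimensionally inconsistent sum fails immediately instead of silently producing garbage.

```python
# Toy units-aware quantity: each value carries SI dimension exponents (m, s, kg).
# Real libraries like Unum or Pint do this properly; this only shows the idea.
class Q:
    def __init__(self, value, m=0, s=0, kg=0):
        self.value, self.dim = value, (m, s, kg)

    def __add__(self, other):
        # Addition only makes sense for matching dimensions.
        if self.dim != other.dim:
            raise TypeError(f"unit mismatch: {self.dim} + {other.dim}")
        return Q(self.value + other.value, *self.dim)

    def __mul__(self, other):
        # Multiplication adds the dimension exponents.
        return Q(self.value * other.value,
                 *(a + b for a, b in zip(self.dim, other.dim)))

pos = Q(2.0, m=1)             # metres
vel = Q(3.0, m=1, s=-1)       # metres per second
dt = Q(0.1, s=1)              # seconds
new_pos = pos + vel * dt      # dimensions agree: fine
try:
    bad = pos + vel           # m + m/s: caught immediately
except TypeError as e:
    print(e)
```

A student who writes `pos + vel` instead of `pos + vel * dt` gets an error at the buggy line, rather than a plausible-looking but wrong trajectory.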

The 2nd-year classical mechanics course used Mathematica to solve ordinary differential equations (ODEs), and provided only minimal instruction in Mathematica.  The main improvement to the course from my perspective was the addition of a final project that allowed students to study an open-ended physics question of their own choice using computational modeling.  This improvement was discarded in subsequent offerings, because it required too much instructor time. Caballero wrote, “For junior and research-focused faculty, the computational project is a significant investment of their time and energy given the large enrollment in CM 1. Developing authentic, scientific experiences for students that can be sustained with little faculty input is challenging.”

This is a theme that I see repeatedly in course design in all disciplines and at most universities: the really good parts of prototype courses take a lot of instructor time and get discarded.  I think that the goal “sustained with little faculty input” is a wrong goal, but it is one shared by many faculty and administrators, who think that teaching is a burden that should be given as little effort as they can get away with. I’ve decided, rather deliberately, not to design my courses that way, but to design them around high faculty involvement.  I believe that the value of a University education depends on high-contact courses, and I’m willing to resist the MOOCification of the university at least in my own courses.  This does take a lot of my time, and I’ve given up on writing grant proposals to make the time—not a choice that most junior faculty could afford to make, given that promotion these days is based more on how much money is brought in than on the quality of teaching or research.

2013 June 29

Physics for life-sciences majors

Filed under: Uncategorized — gasstationwithoutpumps @ 08:07

I just read an article about teaching physics to biology students:

Dawn C. Meredith and Edward F. Redish
Reinventing physics for life-sciences majors
Physics Today, July 2013, page 38

An introductory physics course that meets the needs of biology students must recognize that the two disciplines see the world differently.

My apologies about pointing to an article that lives behind a paywall.  It should be accessible to most college students and college physics teachers, as most universities have subscriptions to Physics Today, but it may not be accessible to high-school teachers or other readers of my blog.

The article points out “the even more powerful lesson that what life-sciences students bring to our classrooms is not just less skill in mathematics than the average engineering student but a deeply different perception of what science means and a deeply different expectation of how it is done.” If the difference between physics and biology perceptions of science is profound (based mainly on the model-driven vs. data-driven aspects of the fields), imagine increasing that difference further by adding the difference between science (how things work) and engineering (making things that work).

This is the challenge I’ll be facing next year as undergraduate director for our bioengineering program—trying to fix curricular problems in the bioengineering major which involves required courses from nine different departments (math, physics, chemistry, MCD biology, computer science, biomolecular engineering, electrical engineering, computer engineering, applied math and statistics), when the faculty involved do not have any common understanding of what the end goal is.  Even if every department changes its curriculum only once a decade (a very slow rate of change for engineering and science fields), the bioengineering program has to cope with a major change almost every year—usually one made without our students in mind at all.

The Physics Today article talks about changing intro physics courses at the University of New Hampshire in Durham and the University of Maryland in College Park to be more relevant to biology students.  Their problem is slightly different from the problem at UCSC, since the Physics Today authors already gave up on calculus-based physics and UCSC requires a calculus-based physics course for biology students, but they do have some good ideas about what parts of physics are important to teach and how to come up with biologically relevant problems that are still simple enough for physics modeling approaches to be useful. Their example “Assume a cylindrical worm” is a good example of scaling, but doesn’t teach much physics.

I’d like to pass on the article to the physics faculty who will be teaching the physics for biologists course for the next few years, but I have no idea who they will be (and quite likely the department doesn’t know either).  Even after 27 years at UCSC, I have no idea who in the physics department cares about teaching the biologists or even whether anyone does. The physics faculty I knew who cared about teaching retired years ago, and I’ve not been paying attention to who among the younger faculty have picked up that role (or even whether anyone has).

One decision the bioengineering program will face in doing any revisions is whether to require the physics for engineers and physicists course, or continue to allow the calculus-based physics for biologists course.  If the physics for biologists course had strong connections to biology, then arguments for either course could be made, and we could let the students choose. The bioengineering students would learn more physics if the physics were presented in ways that made it clearly relevant to what they were interested in. If it is just a watered-down physics course, though, taught by faculty who see it as a watered-down course, then the bioengineering students would be better off taking the more rigorous course taught by faculty who care about whether the students learn.

My only lens on the courses is what students tell me years later as they graduate—and each student has only taken one version of the course, so I get no direct comparisons.  Based on what I heard in the exit interviews this spring, the physics for biologists course is sometimes (often?) taught by faculty who have very low expectations of the students and who treat them as essentially incapable of doing physics—so why bother.  Of course, this is a student perception by students who did not see a clear connection between physics and biology, and may not reflect what the physics faculty were trying to do at all.

One problem is that the bioengineers need to take intro courses in so many disciplines that scheduling is a nightmare.  The physics course for engineers and physics majors is only offered once a year, so allowing the bioengineers to take the physics for biologists course offers them some extra scheduling flexibility. Unless, of course, they plan to do the bioelectronics concentration, where the extra calculus practice in the more rigorous physics course would prepare them better for the math-heavy electronics courses.

The physics piece of the bioengineering curriculum is perhaps the least problematic, as the physics courses have little connection to the rest of the curriculum (even the electronics courses assume that the students learned nothing useful from the physics E&M course).  We have bigger problems with biology, biochemistry, statistics, and computer programming—all of which the students need to know, but for which there is not enough time to do things the way each department wants to do them for its own majors.
