I gave about the same quiz as I did last year, changing the numbers, removing one of the harder questions, and making sure that some of the other questions reflected worked examples we had done in class. The quiz was again on the 12th day of instruction. I had intended to move it to the 10th day, but one of the students was called out of town, so I rescheduled it so that everyone could take it at the same time.

I expected a similar distribution to last year’s (last year the range was 3/32 to 12/32), but was hoping for slightly better. I saw a distinctly bimodal distribution this year, with half the class getting scores from 0/33 to 6/33 and the other half getting 11/33 or 12/33. This is a cleaner split than last year’s scores, which spread the students out more uniformly. I was still hoping that some of the better students would get over half the points on the quiz, but they topped out at 36%.

I worked this year’s quiz myself in about 24 minutes, which means the quiz was still a little too long: I want about a 3:1 ratio on time, and the students had only 70 minutes.

I was really depressed after last year’s quiz, because I had not been expecting such dismal performance. This year I was braced for it, but still hoping for better. Still, there were some surprises:

- There were a few questions that should have been free points (like asking for the impedance of a resistor with resistance R)—I was disappointed that some students missed even the trivial questions.
- I had a pair of questions which were identical, except that one asked for algebraic formulas for impedance and the other gave component values and asked for numbers. I put the algebraic ones first this year, so the numeric ones were just a matter of plugging the numbers into the algebraic ones (and doing a sanity check). The algebraic ones had a mean score of 2/4 with a standard deviation of 1.2, while the numeric ones had a mean of 1.22/4 and a standard deviation of 1.2. I had not expected a drop in performance on the numeric ones, since the received wisdom in the physics education community is that students do better with numeric examples than algebraic ones.
- No one got any points on the oscilloscope probe example, even though it was *identical* to an example we had worked in class.
- The average score on a load-line problem was 1/6 with a standard deviation of 1.3. This did not look like a normal distribution but an exponential one, with half the class getting no points.
- I had two low-pass RC filter questions. One asked for algebraic formulas; the other used the same circuit but asked for numeric answers using specific component values, voltages, and frequencies. The algebraic one was bimodal, with 2/3 of the class getting 0 and 1/3 getting the answers completely right. The numeric one was significantly worse, with only 2 out of 9 students getting any points (1/6 and 3/6).
- I asked a couple of voltage divider questions that required applying the voltage divider formula to circuits in which the voltmeter was connected between two nodes, neither of which was ground. One asked for an algebraic result (a Wheatstone bridge), the other for a numeric result (the voltage across the middle resistor of three in series). Students did very poorly on both: only one person got the voltage for the middle resistor (another got half credit for setting the problem up right but computing the answer wrong), and no one got more than 1/5 on the Wheatstone bridge.
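Both of those divider questions reduce to the same two-resistor formula applied twice and then differenced. Here is a minimal sketch of that reduction, with hypothetical component values (the quiz’s actual numbers are not reproduced here):

```python
# Hypothetical values for illustration -- not the quiz's actual numbers.

def divider(v_in, r_top, r_bottom):
    """Voltage at the junction of two resistors in series across v_in."""
    return v_in * r_bottom / (r_top + r_bottom)

# Middle resistor of three in series: the voltmeter reads the difference
# between the two internal node voltages, neither of which is ground.
v_in, r1, r2, r3 = 9.0, 1000.0, 2200.0, 3300.0
v_top = divider(v_in, r1, r2 + r3)      # node between R1 and R2
v_bottom = divider(v_in, r1 + r2, r3)   # node between R2 and R3
v_middle = v_top - v_bottom             # equals v_in * r2 / (r1 + r2 + r3)

# Wheatstone bridge: two dividers side by side; the meter reads the
# difference between the two midpoints.
ra, rb, rc, rd = 1000.0, 1000.0, 1000.0, 1100.0
v_bridge = divider(v_in, ra, rb) - divider(v_in, rc, rd)
print(v_middle, v_bridge)
```

The point of writing it as two calls to one helper is exactly the insight the questions were probing: neither meter terminal is ground, so the reading is a difference of two ordinary divider outputs.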

Last year I suggested several ways to handle the poor performance on the first quiz:

1. I could tell them to study and give them another quiz. That would be totally useless, as it would just repeat the problems on this quiz. They don’t know what it is that they need to know, and vague exhortations to study are pointless. I don’t think the problem is lack of effort on their part, and that’s the only problem for which pep talks are a potential solution.
2. I could go over the quiz question by question, explaining how I expected students to solve them. This is classic lecture mode and the approach I used to use. It would be easy to do, but I doubt that it would help much. I already did an interactive lecture on the material, and another approach is now needed.
3. The students could get the quiz back and be told to go home and look up in their notes and on-line anything they did not get right. They would find and write down the right answers, as if this were homework. (This “quiz correction” is a standard strategy in high school teaching, but not common in college teaching.) One difficulty here is that they might be able to find answers (say by copying from other students in the class) without understanding how to do the problems. It is probably a better approach than yet another lecture, but I’m not sure it will work well enough. If the students were trying to get from 80% understanding to 95%, it might be fine, but to get from 30% to 80%, something more directed is needed. More time and open notes would help, but maybe not enough.
4. I could break them into groups and give each group a couple of the problems to work on together in class. This peer instruction technique would be a good one if about 1/2 the students were getting the problems right, but with the top of the class getting only 1/3 right, I may need to give them more guidance than just setting them loose. For example, on some of the problems there was a fundamental misreading of the circuit schematics that was very common. I could clear up that misunderstanding in a minute or so and have them rework the problems that depended on it. *Then* I could send them home to write correct solutions.
5. I could give out lots of problem sets to drill them on the material. Of course, since it took me more than all day Sunday to make an 8-question quiz, it would take me forever to generate enough drill problems to be of any use.

I feel the same way about the possible teaching strategies this year, but I’m going to try a mix of methods 3 and 4, asking the students to redo the quizzes at home, working with others until they are satisfied that they can do the problems *and other similar problems* when asked. I’ll have them hand in the redone quiz as homework, and not go over it in class until after they turn it in. They need to take a more active role in trying to master the material, and not rely so much on my telling them what to do.

Monday we’ll cover inductors and loudspeakers, in preparation for the Tuesday measurement lab.

On Wednesday I was planning to do gnuplot analysis of the loudspeaker data, but I think I’ll keep that fairly short, so that we can also fit in an intro to sampling and aliasing before Thursday’s lab. I have to decide whether to bring in my son’s stroboscope and a moving object to demonstrate aliasing.
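The arithmetic behind the strobe demo is simple enough to sketch. This is my own illustration (the `alias` helper is hypothetical, not anything from the lab handouts): a motion at frequency f, sampled at rate f_s, appears to move at the distance from f to the nearest multiple of f_s.

```python
def alias(f, f_s):
    """Apparent frequency when a motion at f is sampled at rate f_s:
    f folds down to its distance from the nearest multiple of f_s."""
    return abs(f - f_s * round(f / f_s))

# A wheel turning 55 times a second under a 60 Hz strobe appears to
# creep along at 5 Hz; at exactly 60 Hz it appears frozen.
print(alias(55, 60), alias(60, 60), alias(35, 60))
```

This only gives the folded magnitude; whether the motion appears to creep forwards or backwards depends on the sign of f − f_s·round(f/f_s), which is part of what makes the stroboscope demo vivid.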

Friday, I’ll introduce op amps, with the intent of developing the block diagram for a simple op-amp microphone circuit in class on Monday, for the Tuesday lab. This weekend I need to rewrite that lab from last year: I decided last year to use the dual power supply with a center ground for their first op-amp design, rather than having them build a virtual ground (we’ll get to that in the next lab assignment).