# Gas station without pumps

## 2014 January 22

### Fifth day of freshman design seminar

Today we continued the design exercise I started last week (see also 4th class) designing a photospectrometer.

I started the class by collecting the work I had asked them to do on fleshing out the design of the photospectrometer, which I have not read yet.  Just glancing at the pile, it looks like most wrote very little, and just drew a picture of the components we talked about in class, not adding much. I’ll know more about the class once I’ve had a chance to look at what they did more carefully.  I have asked them to mail their work to the class e-mail list by tomorrow night, so that the rest of the class can see what everyone has done.

I then told them that we’d do some electronics and computer programming, since many of them had requested that on the first-day survey, and that we’d use Arduino boards to do that learning.  I suggested that they get Arduino Uno Rev 3 boards as being the current standard (about $30), but told them that almost any Arduino board that used an ATMega processor would be fine—the older boards, like a Uno that is not Rev 3, are often half the price. They can probably get the boards from the University through BELS (the lab support group for the engineering labs), but it might be cheaper online.

I then asked them for some details that they had added to the photospectrometer design. The first one to come up was the photodetector, so we started talking about different photodetectors. They came up with photodiode, photomultiplier, and photocell, to which I added phototransistor and photoresistor. I asked them what sort of characteristics might be important for a photodetector, and (with some prompting) got them to come up with the ideas of sensitivity and which wavelengths the detector was sensitive to. One student came up with “resolution” for the sensor, and I took that as an opportunity for a digression into the differences between accuracy, precision, and resolution. I also talked about resolution being a property of the whole system (how many digits were in the numbers), but that there was a related property of sensors—how much noise they had.

Sometime later in the class (I forget exactly when), I talked about Arduinos having a 10-bit analog-to-digital converter, and asked them to guess what that meant. The only guess was that it meant that there were 10 different levels. I was actually fairly pleased with this answer, as it got to the notion of resolution, and I could correct it from 10 to 2¹⁰ without making anyone feel stupid. I told them that they should remember 2¹⁰=1024, as that is a frequently used “magic” number in computers.
I quickly sketched rough sensitivity plots for phototransistors/photodiodes and photoresistors, and explained that that was why photoresistors were used as ambient light sensors—because of how well they matched human visual sensitivity. I also mentioned the slowness of photoresistors and said that they weren’t used much for other applications.

I told them how to find datasheets—either by Googling “photodiode data sheet” or by doing a search on Digi-Key, choosing the cheapest part that seemed to do what they want, and looking up its data sheet. I assigned them the task of finding and trying to read a photodiode and a phototransistor data sheet. I’ll make that more explicit on the class web page later tonight. I’ll probably give them a part number for one, and make them look for the other from just a description.

I showed them the schematic symbol for a phototransistor (though, unfortunately, not for a photodiode), but I didn’t attempt to explain how it works. I just told them that the current through the phototransistor was proportional to the light intensity as long as the voltage across the transistor was at least 0.7V. (I’ll have to tell them where to find that information on the data sheet.) I also mentioned the notion of “dark current”, which prevents the phototransistor from getting down to 0 current.

I then tried to get them to figure out how to convert the current into a voltage that could be measured by the Arduino. After a few tries, one of them finally remembered Ohm’s Law (V=IR), and they decided they needed a resistance. I told them that there were fairly constant-resistance devices available (called “resistors”) and mentioned the notions of resistor tolerance and thermal coefficients, so that the resistance changed as a function of temperature. I then gave them the phototransistor circuit below (though not the photodiode circuit):

Simple circuits for measuring light with an Arduino.
*Update 2014 Feb 6: Q1 is intended to be an NPN phototransistor, not PNP as shown here!*

We then spent a fairly long time before they figured out that they needed to know what value resistor to use in the circuit. I got them (eventually) to the point where they realized that the maximum light intensity determined the maximum current, and if we set the maximum voltage to be as high as possible while keeping the transistor properly biased, the desired resistor was determined by $R = V_{max}/I_{max}$. I showed them how increasing the value of R made the circuit more sensitive, but that if we made R too big the circuit would not be able to handle high light intensity.

We then looked at the overall spectrometer design, and saw how the optics coupled everything together: the brightness of the light, the absorbance of the sample, the efficiency of the monochromator, and the sensitivity of the photodetector. I introduced the notion of interface specifications, so that design problems could be divided up among a team, and the need for negotiating changes to the interface specs rather than just “throwing problems over the fence” for some other part of the team to solve. I gave the example of the lamp designer finding out that the initial spec called for an expensive, bright light (like an HID bulb) but being able to reduce the cost and power enormously with a somewhat less bright LED. The change to the spec might be accommodated by shortening the optical path in the sample (but that would need non-standard cuvettes) or by making the light sensor more sensitive (which is pretty cheap to do).

We ran out of time then, so did not get to looking at any other parts of the design. Some of the students were amazed at how much thinking went into just one little detail of the design (one resistor value). They are used to big-picture, fuzzy thinking, where getting the general idea is what is important, and are not used to sweating the details.
Part of getting them to “think like engineers” is getting them to realize that the details do matter. For homework, I’ll ask them to figure out good values for R1 and R2 for particular parts (or maybe for one part that I specify and one that they choose), based on the data sheets and some light spec. In class on Monday, I’ll build the design they specified, and we’ll test it with the Arduino data logger. If I happen to pick a bad resistor choice, either because I gave them a bad light intensity to design for or because their math was wrong, we’ll get bad results, and I’ll show them how to debug the design and iteratively improve it.

## 2014 January 16

### Fourth day of freshman design seminar

Today we continued the design exercise I started last week, designing a photospectrometer.

I started the class by asking the students if they had read about spectrometers, as I had assigned. They all assured me they had. I then asked them to take out a piece of paper and answer 4 questions:

1. What is the Beer-Lambert Law?
2. What is absorbance?
3. What is Bragg’s Law?
4. What is a molar extinction coefficient? (also known as molar absorption coefficient or molar absorptivity)

I gave them about 5 minutes to try to answer those questions, then I had them compare answers with the neighbors next to them. I then had them regroup at right angles and compare answers again. Finally I asked the whole class for their answers. Basically, no one knew any of the answers (so much for them having read about what spectrometers are used for or any details of how they work). The lack of answers led me into a small lecture on each of the points, so that they could see the connections to photospectrometry:

• Absorbance is a measure of how much light is absorbed (or diffused) by a sample. It is expressed as $A_{\lambda} = - \log \frac{I_{out}}{I_{in}}$. The λ is the wavelength of light at which the intensity is measured.
I pointed out (towards the end of the mini-lecture) that whether the logarithm is base 10 or base e depends on who is doing it—chemists tend to use base 10, physicists almost always base e, and biologists vary depending on who taught them about absorbance.

• The Beer-Lambert Law is an expression for computing absorbance: $A_{\lambda} = \epsilon_{\lambda} l c$, where $\epsilon_{\lambda}$ is the molar extinction coefficient (a property of the substance) in $cm^{-1} M^{-1}$, $l$ is the length of the light path (in cm), and $c$ is the concentration of the absorbing substance (in M). I pointed out that the concentration of a substance in solution could be computed from a known extinction coefficient and a measurement of the absorbance at the wavelength of the extinction coefficient. (This is the main use of absorbance in biomolecular labs.) I also pointed out that extinction coefficients had the same problem as absorbance of coming in base-e or base-10 scaling.

I also passed around a couple of disposable cuvettes for students to look at, talking about how they had a well-calibrated 1cm path inside (fixing the length). One thing I goofed on in class—I said that the cuvettes were made out of acrylic, but when I got home I checked the boxes and found that I had Brand polystyrene cuvettes, 100 of the macro size and 100 of the semi-micro size. The polystyrene cuvettes are a bit cheaper than the acrylic ones, but not quite as transparent in the UV (they go down to about 295nm, instead of 270nm for acrylic and 220nm for the UV semi-micro cuvettes) [http://www.brandtech.com/cuvettegraph.pdf]. I also measured the outside size of the cuvettes (which Brand does not document): the 2.5ml “macro” size is 1.24cm each way, and the 1.5ml “semi-micro” size is 1.22cm each way at the top, and 0.99cm by 1.20cm at the windows.
• Bragg’s Law expresses the amount that light with wavelength $\lambda$ and input angle $\theta_{i}$ from the normal is diffracted from a grating with line spacing $d$: $\sin(\theta_{i}) + \sin(\theta_{m}) = \frac{m\lambda}{d}$. I had students compute the diffraction angle for the first spot (m=1) for the green laser (520 nm) and a diffraction grating with 1000nm ruling. I also passed around a $7 spectroscope that I had bought for homeschool physics and had students look at the fluorescent lights with it.
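The three formulas from the mini-lecture can be checked numerically; this is just a sketch of the arithmetic (base-10 logs, with the example numbers chosen for illustration, and the in-class grating computation for the 520 nm laser):

```python
import math

# Absorbance (base-10, the chemists' convention): A = -log10(I_out / I_in).
def absorbance(i_out, i_in):
    return -math.log10(i_out / i_in)

# Beer-Lambert: A = epsilon * l * c, so c = A / (epsilon * l).
def concentration(A, epsilon, path_cm=1.0):
    return A / (epsilon * path_cm)

# Grating equation: sin(theta_i) + sin(theta_m) = m * lambda / d.
def diffraction_angle_deg(wavelength_nm, d_nm, m=1, theta_i_deg=0.0):
    s = m * wavelength_nm / d_nm - math.sin(math.radians(theta_i_deg))
    return math.degrees(math.asin(s))

A = absorbance(10.0, 100.0)               # 90% of the light absorbed -> A = 1
theta = diffraction_angle_deg(520, 1000)  # in-class example: about 31 degrees
```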

After the mini-lecture, we started the design exercise.  Since they were new to design thinking and needed scaffolding, I did this as a whole class exercise:

• First I reviewed what we knew of the inputs (a chemical sample in water in a cuvette) and the output (a plot of absorbance versus wavelength for about 300nm to 700nm, though polystyrene seems to be transparent into the near infrared, at least to 1000nm).
• Then I started asking them for components, explaining the concept of a block diagram (though I did not provide the connections between the blocks yet). They started with
    • a device to spread out the light according to wavelength,
    • (with a little prompting) a slit to get only one wavelength.

I pointed out that the light spreader and slit could be thought of as a single unit (a monochromator) and that it had to be adjustable to get different wavelengths. They then added a sample holder for the cuvette.  It took a lot of prompting to get them to add a measuring device for the light, but they got there.  A little more prompting got them to add a computer interface for recording the spectrum.

• We also talked about omitting the monochromator slit and using an array of sensors (like a cellphone camera) to record the whole spectrum at once.

At that point we were running short on time, so I had them go to the boards in groups of 3 to try to flesh out the design with more detail.  After five minutes, we really were out of time and I had another meeting  to go to, so I assigned them their first written homework—a fleshed out design for the spectrophotometer.  They can do it individually or in groups of up to three people, and it is due in one week.

I’m very curious what they come up with. I’m hoping that the students will add a lot of details to the design (partly by looking at examples on the web, partly by thinking about what the needs are for each component and how the components have to work together).  I’m afraid that the students are still in the habit of regurgitating what the teacher has told them, rather than finding things on their own or thinking things through without leading questions, so I’m trying not to get my hopes up too high.

## 2014 January 13

### Third day of freshman design seminar

We had a few minutes at the beginning of class today, discussing how to have on-line group discussions, while still respecting FERPA rules. It was decided that about the only medium that would work for everybody was e-mail, so I created an e-mail discussion list.  Currently, this list is invitation-only, but we’ll add mentors to it as needed.  If there are regular readers of the blog who would like to participate in the student discussions, comment on this post requesting to be added, giving some information that I can share with the class.  I’ll forward your request to the class, who will decide collectively whether adding you would benefit them.

As soon as that discussion was over, we followed our tour guides over to one of the buildings that contains bioengineering labs.  We were supposed to have 3 tour guides for 3 labs, but none of the tour guides had responded to my e-mail over the weekend.  One showed up, another was replaced by someone else from the same lab, and the third I still haven’t heard from.  As it turned out, we didn’t really even have time for the two labs, so I’ll probably have to schedule another tour later on.

The first lab we toured was for a brand-new faculty member in protein engineering.  The tour guide was a senior doing a senior thesis in chemistry, and she gave us a good description both of the work she was doing and of the various equipment in the lab, which was all fairly generic molecular biology equipment (incubators, freezers, centrifuges, vortexers, electrophoresis boxes, gel viewer, ultrapure water source, PCR machines, pipetters, …).  The only really specialized piece of equipment was an HPLC machine for purifying proteins.  She demonstrated the tiny amounts of liquid that are handled by showing her smallest pipetter (2µl max) and by using a different pipetter to put a 9µl drop of water on the bench top (which people could barely see).

This was an excellent first lab to tour, as almost every procedure done in the lab is one that the students on the biomolecular track will do themselves many times (only the HPLC and the protein gels are specialized), so all the tools are ones the students will learn to use.

Next we went to the lab where samples are prepared for DNA sequencing.  There was nothing much new there—again it was all standard molecular biology equipment.  That went quickly, and we moved over to the nanopipette labs.  Because the rooms were so small and there were two undergrads willing to talk about their work, we split the class in half, with one group seeing the grad students and postdocs (the main lab space) and hearing a bit about the point of the projects, and the other group getting a demo of making the nanopipettes and hooking one up to the electronics to record the response to a sine wave.  We then swapped groups, so everyone got both.  We ran out of time before seeing the DNA sequencing machines and never had time to try to hunt down someone from the nanopore lab for a tour there.

Overall, I think that the lab tours went well and were successful at what they tried to do.  I’ll have at least one more tour for the labs in the biomed building—hopefully stressing things that are different there (flow cytometry, hyperthermophiles, stem cells, …).

On Wednesday, though, it is back to doing a design exercise.  We’ll try the spectrometer exercise again, now that students have had a chance to learn on their own about spectrometers, and I’ll scaffold the “systems thinking” of dividing a complex system into communicating subsystems.

## 2014 January 9

### Second day of freshman design seminar

Several students had looked for interesting ideas for projects on the web, but only one had sent me the URLs.  I asked the students to send me annotated links by Friday, so that I could put them into a web page over the weekend—they’ve now started to trickle in.

I started the class by going around the room getting a project idea from each student, which we discussed very briefly.  Several had come up with interesting projects, and a couple had come up with ones that seemed a bit too big for a 2-unit freshman course.  There was one that I thought was too ambitious even for a senior design project, but I forget what it was now.  Two students came up with the same idea—a home-made centrifuge.

I then started a design exercise, which we’ll probably continue next week.  The exercise started with an idea—a spectrometer.  The exercise almost foundered right there, as almost no one in the class had ever heard of a spectrometer or a spectrum.  I ended up drawing a few colored lines on the board, which several students recognized as being spectral lines that they had seen in a chemistry or physics text book.  I then gave them the idea that one could plot the intensity of light as a function of wavelength.

Exercise 1: 2-minute writing, tell me what a spectrometer does.

Exercise 2: 2-minute writing, tell me what the inputs and outputs of a spectrometer are.

Exercise 3: 2-minute writing, tell me what a spectrometer might be used for—list use cases.

A lot of the students were very frustrated by this assignment, as they had no idea what I was asking for or what a spectrometer might be used for.  That surprised me somewhat, as I thought that spectroscopy would have been covered either in chemistry or physics, which everyone had had in high school.

I then had students get into groups of 3 and share their results.  In some groups, there was active sharing and people seemed to be getting a clearer idea of what a spectrometer did.  Other groups seemed a bit dead, with no one having anything to say.  After a few minutes of this, I asked each group to report one thing from their discussion.  The first group reported something trivial (the input is light), the second reported something subtly incorrect (the output is the wavelength of light), and the third group echoed the first two.  I then picked on the third group, asking them to clarify the vague response.  [I had decided in advance to pick on the third group, whoever they were, unless they gave an awesome response.]  After several questions to the group, and increasingly vague answers, I finally got an answer that we could build on (that the output was an array).  But then they couldn’t clarify what the subscripts of the array were nor what the data in the array were.

I moved on to the fourth group and asked them to clarify what the outputs were.  After a fair amount of unclear description, they finally came out with the output being a graph, but once again they couldn’t say what the x and y axes were.   (I had a graph on the white board behind me, with the x-axis labeled with wavelength and the y-axis labeled with intensity, so I thought I was asking freebie questions.)  Eventually I gave up, as the frustration level of the students was rising, and not in a productive way.  I then explained the notion of the intensity of the light being a function of the wavelength.

The fifth group surprised me in a good way, when I asked them for a statement.  They managed (in less clear wording) to come up with the idea that one may be measuring emission from a light source or absorbance of light.  This came up in the form of a use case—measuring UV in sunlight and determining how well sunblock blocks it.  I gave them the terminology.

Next I did a demo of a diffraction grating deflecting light, using 3 laser pointers (red, green, and violet).  I managed, with several students’ hands helping, to get all three lasers pointing through the 1000 lines/mm diffraction grating at once at almost the same spot on the projection screen.  The diffracted spots then showed up nicely spaced on the screen, with the violet (405±10nm) deflected least, the red (650nm) deflected most, and the green (532±10nm) in between.  Something I had not noticed before was that the green laser did not produce a single spot, but three distinct spots.  More on that below.
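For normal incidence, the first-order deflection angles of the three pointers follow directly from the grating equation; a quick sketch (using the quoted wavelengths and the 1000 lines/mm grating, and assuming the beams hit the grating roughly perpendicular):

```python
import math

D_NM = 1000  # 1000 lines/mm means a line spacing of 1000 nm

def first_order_angle_deg(wavelength_nm, d_nm=D_NM):
    """First-order (m=1) diffraction angle at normal incidence:
    sin(theta) = wavelength / d."""
    return math.degrees(math.asin(wavelength_nm / d_nm))

violet = first_order_angle_deg(405)  # about 24 degrees, deflected least
green = first_order_angle_deg(532)   # about 32 degrees
red = first_order_angle_deg(650)     # about 41 degrees, deflected most
```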

After the diffraction demo, I had the class break up into a different 5 groups of 3 and go to the white boards to start designing ways to convert what they now knew about diffraction into components for a spectrometer.  This exercise did not go well—I’ll need to scaffold it better next time we do it.  Students were stuck either on the output they wanted or on the physical phenomenon of diffraction, but no group came up with any connection between the two.  I think I tried to cram too much into the 70 minutes—I’ll revisit the spectrometer design problem after talking a bit about how one decomposes a design problem into subproblems.

In the meantime, I suggested that students look at Wikipedia articles about spectrometers, though the main article looks like it was lifted from a 30-year-old encyclopedia. The Spectrophotometry article looks better written.  I hope some of them think to Google the key words to find better explanations elsewhere.  Those taking physics may think to look in their physics texts, even if they are not taking the optics part of the physics course.

### Three dots

I took a photo at home of the three spots (a bit smeared, because I was holding the diffraction grating with one hand and operating the camera with the other). I tried again with the camera on a tripod and the diffraction grating in my Panavise Junior, but it did not come out any better, even after fussing with the exposure settings.

*The lowest, brightest dot is the least deflected.*

I don’t yet have a good explanation for why there are 3 dots for the green laser. They are clearly different diffractions and not just distortions of the laser spot, since they rotate with the diffraction grating and not with the laser.

I looked up green lasers on Wikipedia, and found out that the 532nm green laser pointer is probably a diode-pumped solid-state laser consisting of an 808nm IR laser diode, whose output is converted to even longer-wavelength 1064nm IR light by a neodymium-doped yttrium orthovanadate crystal laser, and then frequency-doubled in a non-linear potassium titanyl phosphate (KTiOPO4 or KTP) crystal to get 532nm output. I don’t see any mechanism there for producing other wavelengths, unless the 1064nm laser is generating more than one wavelength.

I wonder if the multiple spots are not from multiple wavelengths but from reflections off the front and back of the diffraction grating, resulting in different amounts of diffraction. If that were the case one would get multiple dots for the red and violet also, though the dots may be dim enough that we only see the brightest one.

Nope, that explanation doesn’t work—as the green laser runs for a while the extra spots disappear, but blowing on the laser to cool it brings them back. The extra spots are temperature-dependent!