Gas station without pumps

2016 September 15

Research Report | Siemens Competition

Filed under: Uncategorized — gasstationwithoutpumps @ 12:23

I was reading the guidelines for a research report for the Siemens competition for high-school science projects.  Overall, the guidelines are good, but I have one quibble with their description of the first section:

Introduction: the “why” section (2-3 pages)

  • Start with a broad picture of the problem you have chosen to study and why it is interesting. Provide a brief review of pertinent scientific literature, describe what information is missing and how your work addresses this gap in the literature. Previous relevant publications and patents must be properly cited in the text of the Research Report and included in the Reference section of your report.
  • Describe the specific problem to be solved, the research question to be answered, the hypothesis(es) to be tested, or the product to be developed (if any). Provide a brief rationale for the research and why the work is important.

I believe that they are encouraging a common mistake: burying the lede. Theses, grant proposals, student projects, and papers should start with a direct statement of the research question or design goal of the project, then provide the “broad picture” and “why it is interesting”. I’m very tired of wading through a page or more of mush trying to find out what a student project (or published research paper) is about.

Swapping the two points that they put in the first section would improve the quality immensely.

2016 July 6

Outcomes assessment

Filed under: Uncategorized — gasstationwithoutpumps @ 21:47

In Confessions of a Community College Dean: The Whole and the Sum of Its Parts, “Dean Dad” wrote about outcomes assessment:

The first was a discussion on campus of the difference between the “course mapping” version of outcomes assessment and the “capstone” version.  Briefly, the first version implies locating each desired student outcome in a given class—“written communication” in “English composition,” say—and then demonstrating that each class achieves its role.  The idea is that if a student fulfills the various distribution requirements, and each requirement is tied to a given outcome, then the student will have achieved the outcomes by the end of the degree.

Except that it doesn’t always work that way.  Those of us who teach (or taught) in disciplines outside of English have had the repeated experience of getting horrible papers from students who passed—and even did well in—freshman comp.  For whatever reason, the skill that the requirement was supposed to impart somehow didn’t carry over.  Given that the purpose of “general education” is precisely to carry over, the ubiquity of that experience suggests a flaw in the model.  The whole doesn’t necessarily equal the sum of the parts.

In a “capstone” model, students in an end-of-sequence course do work that gets assessed against the desired overall outcomes.  Can the student in the 200 level history class write a paper showing reasonable command of sources?  The capstone approach recognizes that the point of an education isn’t the serial checking of boxes, but the acquisition and refinement of skills and knowledge that can transfer beyond their original source.  

I have certainly experienced the phenomenon of students doing well in freshman writing courses, but being unable to write reasonably in upper-division and graduate engineering courses—indeed, that is why I insisted on UCSC’s computer engineering curriculum requiring a tech writing course 30 years ago (and why I taught that course for about 14 years). I continue to teach writing-intensive courses—my current main course, Applied Electronics for Bioengineers, requires about 5–10 pages of writing from each pair of partners each week (though that load will drop by half next year, as the course is split into two quarters). The writing level of students increases noticeably during the quarter, though a number of students continue to have problems with organization, with paragraph structure, with grammatical sentences, and with punctuation (particularly commas).

But evaluating writing just once in a capstone course is no solution—that just invites a lowering of standards so that the failure rate is not too high. Nor can one guarantee that capstones will necessarily be a good check of all the desired outcomes. Indeed, one of the capstone options for the bioengineering degree at UCSC does not involve major writing—the results are presented orally, and oral presentations are frequent in the course.

I recently wrote an evaluation of the “Program Learning Outcomes” (PLOs) for the bioinformatics program at UCSC (and refused to write one for the bioengineering program—it was hard enough getting the info needed for the bioinformatics assessment).  The assessment falls more in the “various distribution requirements” camp than in the “capstone” camp.  We did not have much trouble showing that the PLOs were assessed thoroughly, largely because the PLOs were chosen to be things that the faculty really cared about and included in their course designs, rather than “wouldn’t it be nice if students magically acquired this” outcomes.

Here is the report (minus any attachments):
Program Learning Outcome Assessment
Bioinformatics BS program
Spring 2016
The bioinformatics program was asked to assess at least one of our Program Learning Outcomes (PLOs) [https://www.soe.ucsc.edu/departments/biomolecular-engineering/programs/bs-bioinformatics]:

A bioinformatics student completing the program should

  • have a detailed knowledge of statistics, computer science, biochemistry, and genetics;
  • be able to find and use information from a variety of sources, including books, journal articles, and online encyclopedias;
  • be able to design and conduct computational experiments, as well as to analyze and interpret data;
  • be able to apply their knowledge to write programs for answering research questions in biology;
  • be able to communicate problems, experiments, and design solutions in writing, orally, and as posters; and
  • be able to apply ethical reasoning to make decisions about engineering methods and solutions in a global, economic, environmental, and societal context.

Because the program graduates so few students, it did not seem very productive to examine the output of just the most recent graduating class—we would need a decade’s worth of output to get statistical significance, and any information about changes in the curriculum would be lost in viewing such a long time scale. Instead, we asked the faculty in our department who teach the required courses of the major how they assess the students for the objectives that they cover.

A Google form was used to collect the information.  The faculty were prompted:

Please complete a separate form for each course that you teach that is a required or elective course for the Bioinformatics major (select from list below).  Only courses that are required (or are part of a small list of constrained choices) matter here, since we are looking for guarantees that all students are meeting the PLOs, not that there is some elective path that would cover them.

Please provide a sentence or two describing how your course(s) provide evidence that the student has met the outcome.  Be explicit (what assignment in what course provides the evidence)!  Your course does not have to provide evidence for all the PLOs—one or two PLOs supported strongly in a course is more convincing.

Responses were collected for 7 courses: BME 80G Bioethics, BME 110 Computational Biology Tools, BME 130 Genomes, BME 185 Technical Writing for Bioengineers, BME 205 Bioinformatics models & algorithms, BME 211 Computational Systems Biology, and BME 230/L Computational Genomics.  Each of these courses is required, except that bioinformatics students may take CMPE 185 instead of BME 185, and that only one of BME 211 and BME 230/L is required. Our hope is that all the PLOs are assessed thoroughly in these courses, so that we do not need to rely on courses outside our control for outcome assessment.

The responses to the questions are attached as a CSV file and will be summarized here [I’m not including the attachments in this blog version]. Despite the prompt, faculty did not always explain exactly how the outcome was assessed.

detailed knowledge of statistics, computer science, biochemistry, and genetics

All courses except BME 80G (bioethics) test some aspect of this objective. Most of the assignments in most of the courses depend heavily on this content knowledge, and the faculty are convinced that this objective is being adequately addressed.  Note that we did not include the courses from other departments that actually teach the fundamental material—just the courses within our department that rely on it.

able to find and use information from a variety of sources, including books, journal articles, and online encyclopedias

All the courses rely on students gathering information from a variety of sources, with different levels of search and different levels of interpretation needed in each course. All courses have at least one assignment that assesses students’ ability to use information from a variety of sources, and most have several.  Again, because of the pervasive nature of the objective in all our courses,  the faculty have no concern that the outcome is being inadequately assessed.

able to design and conduct computational experiments, as well as to analyze and interpret data

All the courses except Bioethics require some data analysis, and several require computational experiments, but only BME 211 and 230/L have the students doing extensive design of the experiments.

able to apply their knowledge to write programs for answering research questions in biology

BME 80G (Bioethics) and BME 110 (Bioinformatics tools) do not require programming, and BME 185 (Technical writing) has minimal programming, but all the other courses require writing computer programs, and the programming tasks are all directly related to research questions.  In BME 211 and BME 230/L the questions are genuine open research questions, not just classroom exercises.

able to communicate problems, experiments, and design solutions in writing, orally, and as posters

All courses except BME 110 (Bioinformatics tools) require written reports, and several of the courses require oral presentation. Only BME 185 (Technical writing) requires poster presentation, so we may want to institute a poster requirement in one of the other courses, to provide more practice at this form of professional communication, as posters are particularly important at bioinformatics conferences.

able to apply ethical reasoning to make decisions about engineering methods and solutions in a global, economic, environmental, and societal context

BME 80G (Bioethics) is specifically focused on this PLO and covers it thoroughly, with all the assessments in the course testing students’ ability to apply ethical reasoning.  There is also coverage of research and engineering ethics in BME 185 (Technical Writing).  Although most of the courses do not teach ethics, the writing assessment in each of the courses holds students to high standards of research citation and written acknowledgement of collaboration.

Overall, the faculty feel that the PLOs are more than adequately assessed by the existing courses, even without looking at assessments in obviously relevant courses outside the department (such as AMS 132 for statistical reasoning). Because so many of the objectives are repeatedly assessed in multiple courses, they see no point in collecting portfolios of student work to assess the objectives in yet another process.

Poster presentation and ethical reasoning are the only outcomes assessed in just one course each, and since practical research ethics is assessed in almost every course, poster presentation remains the only skill that might need to be reinforced in improvements to the curriculum.

2016 June 14

Things to do for book

Filed under: Circuits course — gasstationwithoutpumps @ 15:24

I’ve finally finished my grading for the quarter, after a solid week of grading, and so I can now catch up on some of my administrative tasks (like checking the articulation framework documents, trying to find an undergraduate director for bioengineering for Fall quarter, checking the 30 or so senior exit portfolios, and so forth).

I can also start thinking about my tasks for revising my book, which will take up big chunks of summer and fall:

  • Move the book files into a source-code control system, probably Mercurial, and use an off-site backup, probably Bitbucket.  I should have had the files in a source-code control system from the beginning, but I never got around to setting it up.  This is a couple of years overdue, and I shouldn’t make any more updates to the book until I’ve done it.
  • Rearrange book to put labs in new order, moving all the audio labs into the second half, and moving the instrumentation amplifier and transimpedance amplifier into the first half.
  • Revise parts list for next year’s labs.
    • May want to use a different phototransistor (without the filter that makes it less sensitive to visible light).
    • Choose nFET with lower threshold voltage (maybe pFET also).
    • Find better resistor assortment.
  • Add hobbyist add-ons to the labs, for people who want to go beyond what we can do in class.  For example, I could add
    • designing a triangle-wave generator to the class-D amplifier, so that it can be self-contained,
    • sound input from a phone jack to class-D amplifier (with info about TRRS plugs)
    • logarithmic transimpedance amplifier for optical pulse monitor, to make it tolerant of different light levels and finger thicknesses
    • optical pulse monitor using reflected (actually back-scattered) light instead of transmitted light, so all the optics is on one side
    • motor controller based on H-bridge used in class D
    • temperature controller using thermistor, FET, and power resistor
    • galvanic skin response measurement?
    • oscillator design other than Schmitt trigger relaxation oscillator?  Maybe a Colpitts oscillator with the big inductor (though even with 10μF and 220μH, the frequency would be rather high for audio use; see the frequency check after this list)?
    • make Schmitt trigger out of comparator chip
    • EMG controller (either with analog envelope detection or with software envelope detection)
  • Insist on LaTeX for design reports.  I had too many reports with terrible math typesetting, incorrect figure numbering, and bad font substitutions with Microsoft Word or Google docs reports.  I’ll need to include a short tutorial in the book, with pointers to more complete ones.
  • Make it clear in the book that design reports build on each other, but each report needs to be self-contained—for example, the class-D amplifier report should contain circuits, results, and some discussion from the microphone, loudspeaker, and preamp labs; and the EKG lab report should include some information from the blood pressure and pulse monitor labs.
  • Add more background physics and math at the beginning of the book, to review (or introduce, for some students) topics we need.
  • Should I add a short lab characterizing the I-vs-V curve for an nFET and a pFET?  If so, where would I fit it in?  What about for a diode (could be LED)?
  • Bypass capacitor discussion should be moved to between the preamp lab and the class-D lab.  I need to talk more about power routing and location of bypass capacitors for the class-D lab (it is important that the bypass capacitors be between the noise-generating FETs and the rest of the circuitry, which is noise-sensitive).  May need to introduce the concept of the power wiring not being a single node, so that “clean 5V” and “dirty 5V” are different nodes.
  • Class-D lab should have students measure and record the amount of current and power that their amplifier takes with no sound (removing the mic?) and with loud sound input, both with and without the LC filter.
  • Class-D lab should require students to show oscilloscope traces of the gate and drain of an nFET and of a pFET in the final H-bridge, both for turning on and for turning off.
  • EKG, blood pressure, and pulse monitor prelabs should have students compute the attenuation of 60Hz interference (relative to the signal in the passband) for low-pass filters that they design.
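To make that last item concrete, here is a minimal sketch of the kind of prelab computation I have in mind, assuming a first-order RC low-pass with an illustrative 10 Hz corner and a 1 Hz in-band signal (the function name, corner frequency, and signal frequency are my examples here, not values from the labs; the filters students design will differ):

```python
# Sketch of a prelab computation: attenuation of 60 Hz interference relative
# to the in-band signal for a first-order RC low-pass filter.
# The 10 Hz corner and 1 Hz signal frequency are illustrative assumptions.
from math import hypot, log10

def lowpass_gain(f, f_corner):
    """Magnitude of H(f) = 1 / (1 + j*f/f_corner) for a first-order RC low-pass."""
    return 1.0 / hypot(1.0, f / f_corner)

f_corner = 10.0   # illustrative corner frequency in Hz
f_signal = 1.0    # representative in-band signal frequency in Hz
f_noise = 60.0    # mains interference frequency in Hz

rel_atten_db = 20 * log10(lowpass_gain(f_signal, f_corner) / lowpass_gain(f_noise, f_corner))
print(f"60 Hz is attenuated {rel_atten_db:.1f} dB relative to the {f_signal:g} Hz signal")
```

With these example numbers the 60 Hz interference is only about 16 dB below the in-band signal; for cascaded, buffered stages the per-stage gains simply multiply, so the same comparison extends to higher-order filters.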

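As a quick check on the Colpitts-oscillator idea in the add-ons list above, the following back-of-the-envelope calculation (mine, not from the original list) estimates the LC resonant frequency for the values mentioned, 220 µH and 10 µF:

```python
# Rough check (not from the original post) of the Colpitts oscillator frequency
# with the parts mentioned above: L = 220 uH and 10 uF capacitors.
from math import pi, sqrt

L = 220e-6               # inductance in henries (220 µH)
C_single = 10e-6         # a single 10 µF capacitor
C_series = C_single / 2  # two equal 10 µF caps in series (the usual Colpitts tank)

for label, C in [("single 10 µF cap", C_single), ("two 10 µF caps in series", C_series)]:
    f = 1 / (2 * pi * sqrt(L * C))
    print(f"{label}: resonant frequency ≈ {f/1000:.1f} kHz")
```

Either way the oscillation lands in the 3–5 kHz range, consistent with the worry that it would be rather high for audio use.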
I should also review what students had to say about the course (look at discussion in previous post, for example).

 

2016 June 11

Teaching writing lab reports

Filed under: Circuits course — gasstationwithoutpumps @ 09:24

Greg Jacobs, in his post Jacobs Physics: Report from the AP reading: Teach your class to write concise laboratory procedures. Please., asks high-school physics teachers to teach students how to write concisely:

Part (a) of our question asks for a description of a laboratory procedure. It could be answered in 20 words: “Use a meterstick to measure the height of a dropped ball before and after it bounces. Repeat for multiple heights.”

But oh, no … when America’s physics students are asked to describe a procedure, they go all Better Homes and Gardens Cookery Manual on us. Folks, it’s not necessary to tell me to gather the materials, nor to remind me to first obtain a ball and a wall to throw it against. Nor do you have to tell me that I’m going to record all data in a lab notebook, nor that I’m going to do anything carefully or exactly. Just get to the point—what should I measure, and how should I measure it.

Please don’t underestimate the emotional impact on the exam reader of being confronted with a wall of text. We have to grade over a hundred thousand exams. When we turn the page and see dense writing through which we have to wade to find the important bits that earn points, we figuratively—sometimes literally, especially near 5:00 PM—hit ourselves in the forehead. Now, we’re professionals, and I know that we all take pride in grading each exam appropriately to the rubric. Nevertheless, don’t you think it’s worth making things easy for us, when we may be nearing brain fatigue? Just as good businesspeople make it easy for customers to give them money, a good physics student makes it easy for the grader to award points.

Don’t think I’m making fun of or whining about students here. Writing a wall of text where a couple of sentences would suffice is a learned behavior. The students taking the AP exam are merely writing the same kinds of procedures that they’ve been writing in their own physics classes. It is thus our collective responsibility as physics teachers to teach conciseness.

As I’ve been spending far too much time this week grading an 11-cm-thick stack of design reports from my applied electronics course, I have considerable sympathy with Greg Jacobs’s view.

Technical writing is all about the 4 Cs: clear, correct, concise, and complete. Although there is always some tension between clarity and correctness, and between completeness and being concise, I generally find pretty high correlations between the four properties. Often, the very long reports are muddled, incomprehensible bundles of improperly applied factoids, while the essential information is missing entirely.

Part of the reason I have such a huge stack of papers to grade at the end of the quarter is that I have been giving “redo” grades for any errors in non-redundant representations (like schematic diagrams), putting a very high premium on correctness. For the class-D amplifier lab, 80% of the class had to redo the reports, mostly because they had not gotten the orientation of the FET transistors right in the schematics (a serious error that could lead to fires in the amplifier). I must have done a worse job at explaining the FET symbols—several times—than I thought, or maybe it is one of those things that people don’t learn unless they make a mistake and have it pointed out to them, repeatedly. I’ll be trying to fix the book and the lectures next year to reduce this problem.

I’ve also been down-grading students for lack of clarity (especially when the writing seems to indicate a lack of understanding, and not just inability to communicate) and for leaving out essential material (like not providing the schematics for their preamplifier as part of their amplifier lab report, not providing the parameters of the models they fit, or not providing the models they used at all). So clarity and completeness have had a fairly big impact on grades.

But I have not been giving bonus points for being concise, which I probably should start doing, as some students have started using a kitchen-sink approach, throwing in anything that might be tangentially related to the subject. Unfortunately, these are the students most likely to have unclear and incorrect reports, and they leave out the essential material in an attempt to throw in useless background, so their attempts at completeness generally backfire. I need to discourage this behavior, undoubtedly learned in middle school and high school, and get them to focus on the stuff that is unique to their design, rather than telling me Ohm’s Law or the voltage-divider formula over and over.

2016 March 19

Introduction of a technical paper

Filed under: Uncategorized — gasstationwithoutpumps @ 12:19

I was recently pointed to a post, The 5 pivotal paragraphs in a paper | Dynamic Ecology, that gives advice about how to structure a scientific paper.  Most of the advice is good, but I disagree with one statement:

First paragraph of the introduction—you should use this paragraph to embed and contextualize your work in the context of a large classic, timeless, eternal question. What drives species richness. Or controls abundance or distribution. Or gives the best management outcome. Or explains why species are invasive. Or controls carbon flux. You of course are not going to fully answer this question. Indeed no one person, and probably even no generation of scientists will fully answer this question, but ask a really big question. You can then use this big question setup to spend the rest of the introduction summarizing past attempts to answer this question, and show how they have all failed to address the key issue you are about to address.

The first paragraph of the introduction should be the specific point of the paper, not general BS. I read far too many papers (particularly student papers) where there is a huge wad of background dumped before the author gets around to telling me what they are writing about.  It irritates me—especially when I already know most of the background.

Don’t bury the specific goal of the paper at the end of the introduction—your readers may never get that far if you start out with general BS. Start with the specific goal of this paper—not the overall goal of a long-term research project or (even worse) the fundamental dogma of biology.

There is a term for this in journalism—it is known as “burying the lede”, which is considered a major flaw in news reporting.  It is a similarly large flaw in scientific writing.

I recommend that the first paragraph of a scientific article give the specific research question being answered in the article, and that the rest of the introduction then be used for contextualizing that question—why is it important? how does this study answer it? For engineering reports, the first paragraph should give the main engineering design goal and constraints, again using the rest of the introduction to say why that goal is important.

If the conclusions of the paper do not answer the question raised in the first paragraph of the introduction, then the question is not specific enough.

 
