Mark Guzdial posted about computational thinking on his Computing Education blog, and a discussion ensued about the differences between algorithmic thinking and computational thinking. As a bioinformatician, I see plenty of value in both ways of thinking, but also some important distinctions between them. First, let me define the terms, as different assumptions about what they mean can derail any meaningful discussion.
Algorithmic thinking is thinking about how to accomplish a particular end. It is detail-oriented thinking about methods. For a wet-lab biologist, this is the sort of thinking that goes into developing a new lab protocol. For a computer programmer, this is what goes into the design of data structures and programs. I'm not including debugging as part of this concept, as I see that as a separate skill. Indeed, those whose designs are good may have had less need to develop strong debugging skills, while those who have to fix up other people's code may have developed strong debugging skills without having thought much about algorithms.
Computational thinking is thinking about data by using computers to summarize, massage, or transform data into a more easily understood form. The algorithms are not the key here, though good algorithms may well be needed to do the desired computation. In computational thinking, the focus is on the data and the interpretation of the data, and the algorithms are just tools available to help with that.
Here is a simple example: Computational thinking is needed to decide to plot the probability of passing a class as a function of the grade in the prerequisite class. Deciding whether to use sorting or hashing to make that plot and how to estimate the probability of passing from the raw data in the grade records requires algorithmic thinking.
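To make the algorithmic half of that example concrete, here is a minimal sketch of the estimation step, assuming hypothetical grade records as (prerequisite grade, passed) pairs. It uses hashing (a dict) rather than sorting to group the records, which is one of the algorithmic choices mentioned above; the data and function name are illustrative, not from any real grade database.

```python
from collections import defaultdict

def pass_probability_by_grade(records):
    """Estimate the probability of passing for each prerequisite grade.

    records: iterable of (prereq_grade, passed) pairs, e.g. ("B", True).
    Groups the records with a hash table (dict) -- the hashing option
    from the sorting-vs-hashing choice discussed in the text.
    """
    counts = defaultdict(lambda: [0, 0])  # grade -> [passes, total]
    for grade, passed in records:
        counts[grade][0] += int(passed)
        counts[grade][1] += 1
    # Raw relative frequency as the probability estimate.
    return {grade: passes / total for grade, (passes, total) in counts.items()}

# Hypothetical records for illustration:
records = [("A", True), ("A", True), ("B", True), ("B", False),
           ("C", False), ("C", False), ("C", True)]
probs = pass_probability_by_grade(records)
# probs["A"] == 1.0, probs["B"] == 0.5, probs["C"] == 1/3
```

The computational-thinking step is deciding that this table, plotted against grade, is the informative view of the data; the code above is just the tool that produces it.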
In bioinformatics, we often distinguish between two types of research (or students or curricula): tool-using and tool-building. (I touched on this in one of my first blog posts.) Algorithmic thinking is needed for tool building, while computational thinking is needed for tool using. Neither is inherently better or more valuable than the other: both are essential (though not always in the same person) to make progress.
In his post, Mark Guzdial gave some examples of empirical thinking (looking at data) and called it computational thinking, though he used no computation on the data. I think that this confusion on his part actually makes his main point stronger: high school students are not taught computational thinking, and have no idea how to go about using computers to analyze data. Students get some empirical thinking in science classes, and some students get algorithmic thinking in computer science or math classes, but rarely are they exposed to the need to do more with data than some pocket-calculator manipulations, and rarely do they have to think about how to transform their data into meaningful, interpretable information.