Chemical Education Roundup, 7-4-11

Happy Independence Day! A brand-spanking-new issue of Chemistry Education Research and Practice found its way into my RSS reader this week, so there's plenty to talk about in this week's roundup.

Let's begin with a paper that gives multiple-choice tests in chemistry a second look. A lot of educators are nagged by the feeling that multiple-choice tests reward factual recall and memorization rather than conceptual understanding. One reason for this, argues George DeBoer in a recent CERP paper, is that typical analyses of multiple-choice tests treat them as dichotomous: every answer is either right or wrong, and the incorrect choices are lumped together and thrown out. In most cases, however, instructors deliberately design the incorrect answer choices (also known as "distractors") to highlight specific incorrect lines of reasoning. If that's the case, we have a lot to learn from incorrect answers!

DeBoer applied Rasch modeling to a series of multiple-choice tests whose distractors were designed to pinpoint common chemistry misconceptions. Like other item response theory models, Rasch models assign ability levels to students and difficulty levels to items. The probability of a correct response on item x by student a depends on the difference between a's ability parameter and x's difficulty parameter. DeBoer's model is even more finely grained: it specifies a probability for each answer choice on each item. Because each choice highlights a different misconception, one can plot the relationship between overall ability level and the probability of exhibiting a given misconception (see the graph below). Cool stuff!

[Figure: Misconceptions as a function of ability level]
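The post doesn't spell out the math, but the basic machinery is easy to sketch. Below is a minimal Python illustration: the standard dichotomous Rasch model, plus a softmax-style choice-level extension in the spirit of nominal response models (which is the general idea behind modeling each distractor separately). The `choice_probabilities` function and all parameter values are made up for illustration; they are not taken from DeBoer's paper.

```python
import math

def rasch_p_correct(theta, b):
    """Dichotomous Rasch model: P(correct) for a student of ability
    theta on an item of difficulty b (both on the same logit scale)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def choice_probabilities(theta, choice_params):
    """Choice-level extension (nominal-response style): each answer
    choice k gets its own slope a_k and intercept c_k, and the
    probabilities come from a softmax over all choices.
    choice_params is a list of (a_k, c_k) pairs, one per choice."""
    scores = [math.exp(a * theta + c) for a, c in choice_params]
    total = sum(scores)
    return [s / total for s in scores]

# The probability of a correct answer rises with ability:
print(rasch_p_correct(theta=0.0, b=0.0))  # 0.5 exactly when theta == b
print(rasch_p_correct(theta=2.0, b=0.0))  # ~0.88

# Hypothetical 3-choice item; plotting these probabilities against
# theta traces which misconception dominates at each ability level.
params = [(1.0, 0.0),    # correct answer
          (-0.5, 0.5),   # distractor A (one misconception)
          (-0.5, -0.5)]  # distractor B (another misconception)
print(choice_probabilities(theta=1.0, choice_params=params))
```

Sweeping `theta` over a range and plotting each choice's probability reproduces the kind of misconception-vs-ability curves shown in the graph above.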

In other news, Penn et al. have validated concept maps as a measure of understanding in organic chemistry, showing that map scores correlate with problem-set scores and final course grades. To generate the maps, students used IHMC's freely available concept-mapping tool, CmapTools.

The Journal of Computing in Higher Education has launched a special issue on interaction in distance education, and the first paper from that issue folds together two studies of how different instructional strategies facilitate group interaction in online classrooms. The studies used the SOLO taxonomy and the Community of Inquiry framework to evaluate the strategies; the results were largely complementary and fit together nicely in Kanuka's article.

Finally, check out my friends’ blog on surviving in the wonderful, wild midwest at Adventures in the Midwest!
