Academic Performance and Concept Maps

“Mildly creepy yet thorough, with a number of ‘duh’ moments.”

That’s how I’d describe Szu et al.’s latest cross-university look, in the Journal of Chemical Education, at the factors that influence academic performance in organic chemistry courses. Creepily, the researchers asked student volunteers to keep a diary of their daily studying activities, indicating at fifteen-minute resolution “when and where they were studying, with whom, and what materials they used.” As is so often the case (see my recent micoach lament), this rather invasive data produced some of their most interesting results. Does the data-supported conclusion that “higher-performing students study earlier, not more” constitute something interesting in an absolute sense? I’ll let you be the judge. To me it was a “duh” moment—staying ahead of course material means that concepts make more sense when you hear them for the “first” time.

The Szu paper continues the recent trend (a fad of Gaga-esque proportions) of using concept maps to measure students’ conceptual understanding of a subject. I’m still not aboard the concept-map bandwagon myself. Strangely, most human graders seem to treat concept maps like glorified open-response essays during assessment. How does it make sense to grade something containing discrete, explicit connections between concepts on a scale like “0 = total nonsense, 4 = scientifically complete”? It only makes sense if one “reads” a concept map as one would read an essay response, mentally talking out “[concept A] [verb expression] [concept B]” for each link. There must be a better way to grade these damn things!
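Since each link in a concept map is already a discrete (concept, verb, concept) triple, one alternative to essay-style reading is to grade edge-by-edge against an instructor’s reference map. Here is a minimal Python sketch of that idea; the chemistry propositions and the simple fraction-matched scoring rule are my own invented illustration, not anything from the Szu paper:

```python
# Hypothetical sketch: grade a student's concept map link-by-link against
# an instructor's reference map, rather than "reading" it like an essay.
# All propositions and the scoring rule below are invented for illustration.

reference = {
    ("acid", "donates", "proton"),
    ("base", "accepts", "proton"),
    ("acid", "reacts with", "base"),
}

student = {
    ("acid", "donates", "proton"),
    ("acid", "reacts with", "base"),
    ("proton", "donates", "acid"),  # reversed link: earns no credit
}

# Set intersection finds exactly the links the student got right.
matched = reference & student
score = len(matched) / len(reference)
print(f"matched {len(matched)} of {len(reference)} links: {score:.0%}")
```

A real rubric would surely need partial credit for near-miss link phrases, but even this crude version makes the grading criteria explicit instead of leaving them inside a grader’s head.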

Let me put on my Nostradamus cap for a second: visualization libraries for directed graphs on the web are not quite “there” yet, but once they get “there,” network analysis will bust onto the concept map scene in a big way. Humans aided by computer analysis of concept map networks will take education’s latest short-answer proxies to the next level as assessment tools. Right now, for instance, the distance between concepts is meaningless (related only to aesthetic concerns). Even basic network metrics, such as relative in- and out-degrees, are impossible to get a grip on visually. Can you imagine the incredible statistics educators could gain by pooling machine-readable concept maps from universities all over the country? It blows the mind!
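To make the network-analysis point concrete, here is a small Python sketch computing the in- and out-degrees mentioned above for a toy concept map stored as machine-readable triples. The concepts and link phrases are hypothetical examples of my own, not data from any study:

```python
# Sketch: basic network metrics (in- and out-degree) for a concept map
# stored as (source, link phrase, target) triples. The organic-chemistry
# concepts below are invented examples, not from the Szu et al. paper.

from collections import Counter

triples = [
    ("alkene", "reacts with", "HBr"),
    ("alkene", "attacks as", "nucleophile"),
    ("HBr", "acts as", "electrophile"),
    ("alkene", "forms", "carbocation"),
    ("carbocation", "is stabilized by", "hyperconjugation"),
]

# Out-degree: how many links a concept originates.
out_degree = Counter(src for src, _, dst in triples)
# In-degree: how many links point at a concept.
in_degree = Counter(dst for src, _, dst in triples)

for concept in sorted(set(out_degree) | set(in_degree)):
    print(f"{concept}: in={in_degree[concept]}, out={out_degree[concept]}")
```

A hub concept like “alkene” jumps out immediately from its high out-degree, which is exactly the kind of signal that is invisible when a map is graded by eye.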


One Comment

  1. My problem with surveys and the like is understanding exactly what it is they are measuring. They assume perfect self-knowledge, perfect rigor in answering truthfully, and perfect congruence between the intentions of the questioner and the understanding of the answerer.
    I too look forward to being able to measure conceptual understanding of organic chemistry in a clearer fashion.

