“It was the best of times; it was the worst of times.” This sentiment nicely sums up the state of chemical education right now. While sequestration threatens the largest sources of funding for chemical education researchers in the US, the literature has been on fire in the past few weeks with some intriguing studies. There’s a lot to talk about, so let’s get right into it!
First, the bad news. STEM education takes a painful hit in the President’s budget for FY 2014.
The single biggest consolidation proposed this year is in the area of science, technology, engineering, and mathematics (STEM) education, where the Administration is proposing a bold restructuring of STEM education programs—consolidating 90 programs and realigning ongoing STEM education activities to improve the delivery, impact, and visibility of these efforts.
Don’t be fooled by the rhetoric: this is almost certainly bad news for American chem ed researchers. It will be interesting to see how existing NSF-funded programs respond to these changes, but the restructuring is almost certain to hurt the launch of new programs. It’s also worth noting that this is only a proposed budget, but if President Obama is throwing STEM education under the bus, I don’t see Congress fighting back.
Enough with the bad news! The bright side is that a lot of interesting research is happening these days. I’ve been digging into the general chemistry literature lately for professional reasons, and a very recent study out of Middle Tennessee State University caught my eye. The research addressed student conceptions of gases, focusing on a question about the effects of a temperature change on the particulate nature of helium gas (originally studied by Nurrenbern and Pickering). The conclusion is typical: scaffolded, schema-activating assessment designs improve performance on conceptual problems relative to vaguer designs, but the authors were unable to pin down the exact source of the performance boost (despite a few controls).
One clue comes from another recent study: Behmke and Atwood’s implementation of problems sensitive to cognitive load theory in an electronic homework system. The authors converted single multi-step problems into sequences of related problems that “fade” from nearly complete in the first problem to fully incomplete in the last. Using an analytical approach based on item response theory, the authors observed that students exposed to the “statically fading” questions tended to perform better on subsequent related problems. Breaking a multi-step problem down and exposing its process over multiple problems can improve performance.
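For readers who haven’t run into item response theory before, the core of most IRT analyses is a logistic model linking a latent student ability to the probability of a correct response. A minimal sketch using the standard two-parameter logistic (2PL) model (the particular variant Behmke and Atwood fit is spelled out in their paper):

```latex
% Two-parameter logistic (2PL) item response model: the probability
% that student j answers item i correctly, given latent ability
% \theta_j, item difficulty b_i, and item discrimination a_i.
P(X_{ij} = 1 \mid \theta_j) \;=\; \frac{1}{1 + e^{-a_i(\theta_j - b_i)}}
```

Fitting a model like this to response data lets item parameters be estimated somewhat independently of which students happened to answer, which is part of what makes IRT attractive for comparing performance across faded and unfaded problem sequences.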
Jennifer Lewis and colleagues at USF have written a very important summary of the state of the art in psychometric measurement for chemistry education research. In addition to pointing out the typical methods researchers use to argue for the validity and reliability of survey results, Lewis et al. note that chemistry education research is becoming more interdisciplinary as evidence mounts for theoretical overlap between sub-fields of science education. They also draw attention to the need for qualitative research to complement quantitative efforts (see the MTSU study for a nice recent example of this idea). A nice read right after Lewis’s review is Barbera’s recent psychometric analysis of the Chemical Concepts Inventory.
In other news: a simple approach to assessing general chemistry laboratories; an investigation of apprenticeship in research groups; differential item functioning in science assessments; the evolution of online video in an organic chemistry course; teaching gas laws to blind students. Mouse over the links for full article titles!
What’s new in the world of chemical education in 2013? In this edition of the CE Roundup, I’ll engage in a bit of shameless self-promotion, and we’ll look at articles that shed new light on the costs of publishing, innovations in laboratory instruction, student evaluations, and more.
Let’s get the shameless self-promotion out of the way first. Two weeks ago, the Introductory Organic Chemistry MOOC (massive open online course) kicked off on Coursera. The materials for this course were prepared by my colleagues and me at UIUC for use with our organic chemistry 1 course for non-majors. I’m leading the Intermediate Organic Chemistry (organic chemistry 2) effort, and although that class hasn’t started yet, I’ve been knee-deep in the MOOC world for a while now. I’ve got a whole series of blog posts planned on the MOOC experience, so stay tuned!
What is it about the winter months and great literature articles? Perhaps the cold bores people into writing. Who knows? Either way, the literature’s been very interesting in early 2013.
First, teacher reflection and cognition in the classroom. Reflective teachers generally see better student evaluations than unreflective ones. No surprise there: drivers who actually watch the road are better than those who don’t! But how much reflection is enough? A recent study in Brit. J. Educ. Technol. sheds some light on the question. The authors found that formative (weekly) student evaluations increased teachers’ reflective practice, and that increased reflective practice led to higher student evaluations over a multi-year period. Some would say that formative student evaluations could promote a “consumer culture” in education, however, so there’s an interesting debate brewing there. In a study focused on science teachers, a team of researchers writing in J. Res. Sci. Teach. found that teachers’ “noticing patterns”—patterns in their attention during class—indicate the ways in which they frame the classroom. Particular noticing patterns point to particular frames. Furthermore, the authors add, a given teacher is capable of multiple frames, depending on the classroom’s context. Their theoretical ideas are elegantly demonstrated in a video-based study of a high school biology teacher in action.
Laboratory instruction came under the qualitative microscope this month in a report by Bretz, Towns, and co-workers. They studied how instructors of different laboratories prioritize cognitive (thinking), affective (feeling), and psychomotor (doing) learning goals. This work draws attention to a potentially concerning decline in affective learning goals as students move from general chemistry to organic chemistry. In other laboratory news, a simple apparatus for flash chromatography gives results comparable to traditional columns and “obviates the need for students to handle silica gel”, and instructors at South Dakota State University have reported on instructional design for a laboratory sequence aimed at producing student researchers.
The editor-in-chief of J. Chem. Educ. has written an editorial describing the costs of publishing and rationalizing some recent price increases. It’s worth a look, particularly if you’re interested in the broader forces acting on academic journals these days. Also interesting are the editorial’s citations, which include familiar language from the journal’s past editors.
I’m sitting in the airport in San Diego, ready to head back home after a stint at the 243rd National ACS Meeting. All I can say is…wow. It’s been an amazing two days of presentations, posters, and networking. I finally met some long-time Twitter followers in real life, and got the chance to talk shop with some of my heroes in chemical education. Very cool. Some of my favorite highlights from the chemical education programming follow.
A symposium on Sunday “by grad students, for grad students” focused on research in chemical education (and featured yours truly…). Taken as a whole, the symposium highlighted the amazing breadth and focus that chemical education research has gained over the past few years. Clear research paradigms are emerging that, in the long run, are going to fundamentally alter how we teach chemistry. Although I think it’s sometimes hard to feel excited watching the literature and working day to day, the wheels are in motion and the community is alive and well.
This week has been an interesting one in chemical education. I know I promised interactive concept mapping on the web like two weeks ago, but things have gotten a little crazy as I’ve gotten stuff together for a publication (and realized the massive amount of work I have to do to make the publication complete). I’m going to go ahead and stamp it with a “Coming Soon” label.
At any rate, the Journal of Chemical Education was abuzz this week with a debate about the role of the rate-limiting step assumption in enzyme kinetics. Definitely worth a read if you’re a biology-leaning chemist with an interest in Michaelis-Menten kinetics.
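For context, the debate turns on how the familiar Michaelis-Menten rate law is derived. A quick sketch of the two routes (standard textbook treatment, not a summary of the JCE exchange itself):

```latex
% Michaelis-Menten mechanism:
%   E + S <=> ES   (k_1 forward, k_{-1} reverse)
%   ES -> E + P    (k_2)
% Briggs-Haldane steady-state treatment gives the rate law
v_0 = \frac{V_{\max}[\mathrm{S}]}{K_M + [\mathrm{S}]},
\qquad K_M = \frac{k_{-1} + k_2}{k_1}
% If breakdown of ES is rate-limiting (k_2 \ll k_{-1}), K_M
% reduces to the substrate dissociation constant:
K_M \approx \frac{k_{-1}}{k_1} = K_S
\quad \text{(rapid-equilibrium limit)}
```

The rate-limiting step (rapid-equilibrium) assumption is thus a special case of the steady-state treatment, which is roughly what the back-and-forth in the journal hinges on.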
In Science, experimental philosophy in the social sciences came under the gun this week, as Shaun Nichols and David Carmel debate the role of surveys in social-science experiments. In education, the value of triangulation has been recognized for a long time as a means to support survey data. Student performance data, qualitative observations, student interviews, focus groups, and a loooong list of assessment techniques (including, but not limited to, surveys) may all be used to judge the effectiveness of a classroom intervention or change. In fact, when such data are missing, raised eyebrows are the norm. Personally, I learned this lesson the hard way on my first publication…
Speaking of classroom assessment, this JCE paper outlines a qualitative approach to assessing “inquiry-based” teaching methods, which involve open-ended problems that demand application of the scientific method to reach a reasonable solution. The authors argue that most current assessment techniques are inappropriate for inquiry-based activities (IBAs), advancing the “mental models” framework as a theoretical basis for assessment of IBAs. Critically, the goal is to shift the focus of assessment toward the learner and away from content exposure (and other irrelevant measuring sticks).