July 28, 2011

More reflections on CAQDAS

A few years ago while working on my master's in Instructional Technology, I was searching for creative uses of the ubiquitous PowerPoint and stumbled upon a column by artist/musician David Byrne in which he described his first encounter with the presentation software.  Byrne hated the application, calling it "limiting, inflexible, and biased." Despite this, Byrne decided to take up the medium anyway, in order to satirize it.

Then something strange happened.

Byrne realized he could make PowerPoint function as a "metaprogram" in which he could organize all his multimedia content into something "beautiful." He wrote, "I could bend the program to my own whim and use it as an artistic agent.... I could make works that were 'about' something, something beyond themselves, and that they could even have emotional resonance."

Guided by his curiosity and artistic vision, Byrne effectively co-opted a piece of evil business software and turned it into a creative platform. This idea really appealed to me at the time, because we classroom teachers have been doing that sort of thing for a long time (gradebooks in Excel spreadsheets, writing workshops in MS Word).

I was reminded of Byrne's artwork this week as I completed the EP604 course readings on CAQDAS, several of which made mention of the historical distrust of computers among some qualitative researchers. In their 1996 paper Qualitative Data Analysis: Technologies and Representations, which, interestingly, was published in what had to have been one of the very first digital journals of social science research, Coffey, Holbrook and Atkinson describe a tension between the increasingly diverse methods of contemporary ethnographic research and a trend towards homogeneity imposed by the "computing moment."

A section of Seale's chapter in Doing Qualitative Research poses this question: "Do computers impose a narrowly exclusive approach to the analysis of qualitative data?" (p. 257)

Following this week's readings and then last night's class discussion, I am convinced the answer is "no." The early fears about "orthodoxy" and "homogeneity" are unfounded. It seems that the qualitative researcher, confident in and consistent with her own methodology, can leverage the power of these seemingly positivist tools to do some powerful meaning making (Friese, 2011; Seale, 2010).

As a former public school teacher, I know a little about oppressive orthodoxies. There is an insidious strain of orthodoxy that pervades K-12 education, and it goes by the innocuous name of "best practice." It's actually very odd. Teachers are told, on the one hand, to implement research-based best practices, while, on the other hand, most progressive education reforms focus on making instruction individualized and learner-centered, not scripted and standardized. In other words, what's "best" for one child may not be "best" for another.  The only "best" practice is what works at a given time, in a given context, with a given student.

Now, as a novice qualitative researcher, I am visiting my classmates' blogs, attending software webinars, participating in EP604 class discussions, and thinking about "best practices" again, this time in light of "digital convergence" (Brown). The digital tools, by their very design, explode that notion. Rather than imposing a singular and "right" way, the tools invite us to explore, evaluate, and adapt them to fit our epistemological and methodological needs (Brown; Friese; Seale).

For example:

  • Last night we were provided an overview of two CAQDAS tools, QDAMiner and Transana. Both programs imposed an a priori approach to coding, but our instructor suggested a trick to bypass that: simply create a generic code such as "quotes" or "clips" to use during the first cycle of coding.
  • In her article Using Atlas.ti for Analyzing the Financial Crisis Data, Friese describes in detail how, feeling the grounded theory approach to be inadequate, she devised her own idiosyncratic analytic procedures, which slowly evolved into what she calls "computer-assisted NCT analysis."  In her conclusion, Friese asserts that her coding and analysis process would be the same regardless of the software package she chose.
  • In an earlier post I shared what I was learning about Mendeley, which performs double duty as both a citation manager and a collaborative platform for scholarly research. According to its developers, Mendeley is highly individualized to fit the "idiosyncratic processes of researchers." I wondered about Mendeley as a place for self-publishing. Could it be the very embodiment of Brown's "scenario," in which the "massification" of "combined technologies could...provide the opportunity for the proliferation and democratization of the production and dissemination of qualitative research knowledge"?

Mendeley as a "scholarly Facebook"? Probably not. But I like to think about the possibilities. Even Coffey, Holbrook and Atkinson (1996) concluded that the contemporary ethnographer should give the proliferation of digital tools "serious and systematic" attention or risk becoming a "dreadful anachronism."

P.S. Just for fun, here is a PowerPoint animation set to a remix of Canon in D.  Nothing anachronistic here.


1 comment:

  1. Great post. I think that scholars are using the term "appropriation" or "reappropriation" of technology to describe/study that phenomenon/tendency humans have of repurposing technologies to meet our own needs, regardless of how/what they were originally designed for. If you are interested in writing around/exploring the historical discourses around qualitative research & technology, let me know and maybe we can work on something together.

