March 25, 2013

ATLAS.ti and network views: No magic wands

After reading Susanne Friese's book, I knew what functions of the ATLAS.ti program I most wanted to put to work: memos, queries, and network views. I have tried unsuccessfully in the past to use these features; they require a degree of finesse and insight into their underlying logic that I simply did not possess.

My usual strategy for learning a new technology (pushing a lot of buttons, clicking a lot of links, and generally just fooling around with it until it works) just wasn't cutting it.

So, I turned to Friese, whose book outlines the strategic use of memos for running data queries and integrating findings. Memos are instrumental to the entire analytical process but play an especially important role during writing and production of the final report (read: DISSERTATION).

Friese encourages readers: "In order to see the benefits of it, you have to try it." She says some people may call it "magic," but it is really just "the added value of approaching analysis in a systematic way utilizing the options available" (p. 142).

Stop expecting "magic." Start applying a "systematic" approach. That is what I needed to do.

Memos
So, I started small, with one primary document (Watt's 2007 article on researcher reflexivity), one memo, and one writing task. I wanted to use the memo tool to develop some discursive text -- a blog post -- based on Watt's article.  Before I began reading and coding the article in ATLAS, I opened the Memo Manager, created a memo, gave it a "proper" title, and categorized it by type.

Meanwhile, I also created, titled, and categorized a free memo to record my process of using memos in ATLAS. Yes. A memo about memo'ing -- a "meta memo," which, in turn, became this blog post.

According to Friese, memos can be linked to quotations, to codes, and to other memos. I tried this and linked the Watt memo to the PDoc itself, to specific quotations within the PDoc, and to significant codes. Now what?

Queries & Network Views
Friese says memos serve an analytic function (a "container of ideas") and a technical function. After coding the data (in my case, the Watt literature), the user may probe more deeply by running queries. Queries serve a data retrieval function, surfacing connections among PDocs, quotes, codes, and memos. The memo serves as a place to record descriptions, interpretations, and ideas based on these probes. Friese calls it "second-level conceptual analysis" (p. 7).

However, for the task at hand, I simply had one PDoc, one memo, a few codes, and a handful of quotes. Upon further consideration, I decided to save the query function for another day. For purposes of developing my blog post, I turned my attention to the network view and memo output functions. I was especially intrigued by the potential of network views for displaying data, creating concept maps, and developing writing heuristics.

Projects created within ATLAS.ti are referred to as hermeneutic units (HUs), after hermeneutics, the science of text interpretation (I know because I finally took the time to look it up). The HU consists of links that the user creates between all sorts of object nodes: PDocs, quotes, codes, and memos. Thus, the HU is really a network, and network views are detailed perspectives on different aspects of the network.

The thing is, there truly is nothing "magical" about network views, which is why I never had any success with them in the past! I had a fundamental misconception about this tool: I thought that when I clicked the display network option, the network would magically populate the screen like a flow chart, with all the nodes logically mapped out and connected for me.

On the contrary, when the user displays the network for an item and all its nodes, the user must manipulate the objects, linking and labeling them in a manner that makes sense to the user. This is why Friese has an entire chapter (Chapter 7) devoted to the creation and manipulation of network views.

My Process
I took some time to explore the quotes and codes associated with the Watt PDoc. How could I use the network view to glean new insights from Watt (2007) in relation to my current project, the dissertation? This exploration was messy and recursive. I spent a lot of time clicking on the quote nodes, which allowed me to read them in full. I found that it was more fruitful to right-click on the quotations and Display in Context. This resulted in more reading and, in some cases, more coding. Finally, under the Display menu, I opted for Quotation Verbosity, Full Text, so I could work with the quotations in full as I arranged them in the network view.

Arranging the codes and experimenting with the Code-to-Code Relations Editor helped build my understanding of exactly what I wanted to say in my Watt reflection. Naming the semantic relationships between the code nodes led me to see how most of the codes had one thing in common: they each connected somehow to the "trustworthiness" code.

I discovered I could double-click on the memo object to display its text in full, and I could double-click on the memo text to edit it. So, as I arranged the network view, I revised and wrote whole new portions of the memo. To illustrate points in the memo, I copied and pasted selected quotes from the context of the article. Toggling in this manner, from network view to memo and back again, became tedious, so I broke my paperless rule, exported the (mostly) finalized network view as a graphic file, and printed it. It sat on my physical desktop as I finalized the blog post. I would not have needed to do this if I had had a larger computer screen or dual monitors.

As can be seen in the "before" and "after" screenshots, I only scratched the surface of what is possible with network views. Using the network view did not make my writing process more efficient -- it took as much time, possibly more, to compose the blog post -- but it did help me conceptualize something new about an article I had already read multiple times over the last three years.

Network View BEFORE

Network View AFTER

I drafted the Watt blog post entirely in ATLAS. I am sure there are multiple ways to go about this; I saved the memo as an .rtf file to my desktop, then copied, pasted, and did final edits in my blog editor.

A final word about Memo Output
So, in the end, I did not utilize the memo output function. With the memo output function, the user selects the memo, right-clicks, selects Output and Selected Memo with Quotations, and chooses Editor as the destination. An editable document is generated that "includes everything you need to write the results chapter of your research report or paper" (p. 146). However, I can see that if I were working with multiple memos and needed to string them together into one coherent piece of text, such as a chapter of findings, I might want to use this option to pull everything into a more robust word processing program.

March 21, 2013

Reflecting on Watt (2007)

Watt's 2007 piece, "On Becoming a Qualitative Researcher," is a first-person narrative of a novice researcher who "put reflexivity to the test by keeping a research journal" (pp. 82-83).

Unlike positivist studies, which are assessed by familiar standards of validity and reliability, the rigor of qualitative inquiry is determined more subjectively, beginning with the researcher herself through a process of reflexivity. In the absence of precise designs and formulas to guide their practice, qualitative researchers, Watt explains, must be willing to self-scrutinize. 

Reflexivity, as defined by Glesne (2006), "involves critical reflection on how researcher, research participants, setting, and phenomenon of interest interact and influence each other" (p. 6). Reflexivity statements "provide a kind of map to the decisions you make so that the reader can better understand (and question) the interpretations you make" (p. 127).

As a student, Watt understood reflexivity on a theoretical level but wasn't sure what it looked like in practice.

This is the third time in as many years that I have read Watt's article, and with each new reading, I gain new insights. Not surprisingly, this time around, my insights were colored by the fact that I am now immersed in my own dissertation project.

What did surprise me was just how intimately connected the practice of reflexivity is to improving trustworthiness in research, a topic I will try to address at the end of this post.

A refined sense of the reflexive process
First, putting aside textbook definitions, my understanding of reflexivity has grown more refined, especially as it relates to the act of writing and developing identity. I was really struck by the influence of a (somewhat) regular writing regime on Watt's personal development as much as on her research process. This is the kind of exploratory and personal journal writing that Laurel Richardson advocates in "Writing: A Method of Inquiry," which Watt refers to several times.

In her research journal, Watt summarized her readings from the literature, reflected on events in the field, recorded participant data, and made notes about her methodology. "Through using writing as a method of inquiry I was able to make links between how I carried out my study, reflective journal entries, and the literature on qualitative methodology" (p. 98).  In other words, she developed the deliberative stance as advocated by Piantanida and Garman (2009) and came to recognize the "centrality of writing as a way of coming to know" (Ch. 9, “Experiential Text as a Content for Theorizing,” para 12).

More importantly, through writing, the deliberative stance becomes internalized: "Reflective writing allowed me to meaningfully construct my own sense of what it means to become a qualitative researcher" (Watt, p. 83).

On becoming an expert
Writing is one way to negotiate moments of conflict and disequilibrium. Like discussion and other forms of peer mediation, reflective writing creates a feedback loop -- a conversation with oneself -- for improving practice and developing expertise.

Yet, as Watt points out, it is during moments of crisis and tension that novice researchers typically slow down or shut down their writing entirely.

For instance, when faced with the task of data analysis, Watt was at a "complete loss" (p. 95), and so she made lots of charts and diagrams. She explains, "I was so focused on the need to do something with the data that I did not consider journaling as a means to think things through, on both a personal and a research level. That was a mistake. In retrospect, this was perhaps the time I needed it most" (p. 96).

The mistake is common in my field, education. Like novice qualitative researchers, new teachers experience a profound sense of disequilibrium. Overwhelmed and isolated, they put reflection on the back burner.

But reflexivity is the antidote to repeating mistakes, and building upon a foundation of subjective experience is a hallmark of expertise. Watt cites Eisner's (1991) concept of educational "connoisseurship," in which seasoned practitioners leverage their subjectivities in productive ways. Piantanida and Garman also cite Eisner's work in their Chapter 5 discussion of deliberation as a means of attaining phronesis, "a valuing of wisdom that can guide action within the complexities of unfolding experience."

Whether it's "critical subjectivity," "connoisseurship," or "phronesis," these ideas are very useful to me because they are nuanced versions of what I would simply call "developing expertise," and teachers developing expertise about technology is the topic of my dissertation. For purposes of my study, I have found the CHAT concept known as "expansive learning" (Engeström) to be helpful, but I like how Eisner's term goes directly at the teacher-learner, teacher-researcher experience. I may need to borrow or develop a more precise term than "expertise" (something else to reflect upon later).

At any rate, I have noted before in this blog the parallel developmental trajectories of novice qualitative researchers and teachers-as-learners insofar as stance, disposition, and expertise are concerned. And after re-reading Watt, I have a renewed sense of how reflexivity mediates this process.

The Multiple Realities perspective
I have also referred previously in this blog to the Multiple Realities perspective, a central notion in my theoretical frameworks. I was excited to make a connection between Watt's article and Labbo and Reinking's (1999) seminal theoretical piece, "Negotiating the Multiple Realities of Technology in Literacy Research and Instruction."

Although Labbo and Reinking do not specifically use the term "reflexivity," they put forth the Multiple Realities perspective as a way for New Literacies researchers to monitor and leverage subjectivities in a way that strengthens research-to-practice connections:
For example, when a question related to instructional practice begins with the phrase "What does the research say about...?" we believe it should be followed by an explicit consideration of which reality or set of realities is being considered. Doing so means that the answer will inherently be more complex than citing a string of studies and drawing conclusions from them. It also suggests that it may be important to identify the realities to which a potential answer does not apply or why a question is not a relevant or particularly good one within certain realities. (p. 488)
The Multiple Realities perspective, then, is a framework for guiding reflexive thinking within New Literacies research projects. I had never thought about it in this way until now! Maybe I need to be more intentional about referencing it in the methods section of my Chapter 3.

Implications for use of digital tools
Finally, a note about digital technologies and the role they serve in supporting reflexive practice: Watt's descriptions of her struggles to corral and make sense of the data, the codes, and the categories highlighted the value (for me) of digital tools generally, and CAQDAS in particular. This last insight will only become more compelling with the passage of time (Watt's article is now six years old) and my own increasing awareness of how to use digital tools for qualitative inquiry.

Trustworthiness
The common thread that connects each of the above ideas is how they promote trustworthiness in qualitative research. The discipline of regular, reflexive writing -- "chronicling one's thinking" -- helps the researcher mediate the meaning of her experiences and continually develop her expertise.

As Watt explains, "Revisiting my study has strengthened my confidence in my ability to negotiate the complex process of qualitative inquiry, and I now see myself as a researcher. The multiple layers of reflection drawn upon in writing and revising this paper have made me more cognizant of how far I have come, and have taken me further along the path to becoming a qualitative researcher" (p. 98).

Reflexivity develops an internal authority of self-knowledge (identity) and self-efficacy, alternate "measures," if you will, of excellence and trustworthiness in research.  Reflexivity develops an authentically authoritative voice that delivers the research findings and interpretations in a way that resonates with readers.

References

Glesne, C. (2006). Becoming qualitative researchers: An introduction (3rd ed.). Boston: Allyn & Bacon.
Labbo, L. D., & Reinking, D. (1999). Negotiating the multiple realities of technology in literacy research and instruction. Reading Research Quarterly, 34(4), 478–492. doi:10.1598/RRQ.34.4.5
Piantanida, M., & Garman, N. B. (2009). The qualitative dissertation: A guide for students and faculty (2nd ed., Kindle version). Thousand Oaks, CA: Corwin.
Watt, D. (2007). On becoming a qualitative researcher: The value of reflexivity. The Qualitative Report, 12(1), 82–101.



March 14, 2013

A question about my reflexivity statement and assumptions

Last week I presented my Chapter 3 to my workshop group. I wanted feedback on growing the logic-of-justification for case study method within a Cultural-Historical Activity Theory (CHAT) approach. To do this part well, I know I must first clear the CHAT hurdle: does the reader understand what it is and how my use of case study relates to it?

As an evolving theoretical framework, CHAT is not widely used in education studies, even less so in the field of literacy studies. Roth and Lee (2007) call it a "best kept secret."

So, the biggest item of feedback I received during workshop last week had nothing to do with the incorporation of case study theory into my research design. It had to do with my explication of the underlying assumptions of my study, namely CHAT. The group wanted to know more about the assumptions of CHAT, which I only allude to in Chapter 3. It is within Chapter 1 that I discuss -- by way of a reflexivity statement -- the underlying assumptions of CHAT, as well as other elements of my frameworks (New Literacies theory and the multiple realities perspective).

My statement of reflexivity, as it currently appears in Chapter 1, outlines who I am as a researcher (background, interests, professional experience, and worldview) and how these aspects of my identity influence my choice of topic, question, frameworks, and methodological approach. Then, I get into the nitty-gritty of the underlying assumptions of my approach. But since these are shared assumptions, assumptions belonging to an entire research tradition/genre, should they not be in the methods section of the study?

My question: does it make sense to split my reflexive section? In other words, should I keep the personal statement of reflexivity where it is, but save the built-in assumptions of my research genre for Chapter 3? 

I would like to submit my Chapter 3 revisions as part of my mid-term project report for EP 659, so feedback on this point will be greatly appreciated.

Additional project update: I read most of Susanne Friese's (2012) book about ATLAS.ti last week, and posted my reading notes.

References
Roth, W.-M., & Lee, Y.-J. (2007). "Vygotsky's neglected legacy": Cultural-historical activity theory. Review of Educational Research, 77(2), 186–232. doi:10.3102/0034654306298273




March 12, 2013

Notes on ATLAS.ti from Friese (2012)

For my dissertation, I have created a hermeneutic unit (HU) in ATLAS.ti. It resides in a folder on my desktop, and I am gradually pulling in documents related to my project, starting with annotated literature and reading notes on Cultural-Historical Activity Theory (CHAT), identity theory, case study, and interviewing methods. I am grouping my literature using the document family function.

For now, my focus is on the document families for case study and interviewing, which I am reading, analyzing, and coding for purposes of writing a more substantial logic-of-justification (Piantanida & Garman, 2009). As part of my project contract for EP659, I will incorporate this synthesis of my readings into my Chapter 3 overhaul.

This is only the second time I have used ATLAS.ti to conduct a literature review. The first time, I found that most of my analysis occurred outside of the software, and I used ATLAS to simply organize and categorize my annotations. The software was helpful, but I did not allot enough time to achieve a true synthesis with it.

Now, as before, I feel my attempts with ATLAS are constrained by a lack of time, and I am tempted to revert to my "old ways." Why bother taking time to learn a digital tool for a process that people have conducted successfully for years with paper and pencil?

But I want my use of CAQDAS to count for something this time around. So, with that in mind, I began reading the eBook version of Susanne Friese's (2012) Qualitative Data Analysis with ATLAS.ti.

Friese's book appeals to those novices and skeptics who may ask, "...[I]f the computer doesn’t do the coding, then what is it good for?" (p. 1). She argues that CAQDAS opens up data coding to a host of new possibilities, and new users must understand the potential value-added of CAQDAS software or risk a cursory application of it. Friese describes an all-too-familiar scenario: "...[W]hen I started to use software to analyze qualitative data in 1992, I did what most novices probably do: I looked at the features, played around a bit, muddled my way through the software and the data, gained some insights and wrote a report. It worked – somehow. But it wasn’t very orderly or systematic" (p. 2).

Friese argues that nowhere in the extant literature on CAQDAS is there a systematic guide for its use. In her book on ATLAS.ti, she refrains from being overly prescriptive -- how can she be with a system that, at last count, offered more than 400 sub-menus? Friese simply outlines the approach she has honed over the last 20 years. Passages laden with technical how-to are couched in methodological terms. The “how” is linked to the “why,” so readers can appreciate the intended analytical rigor behind each skill-building exercise.

In consideration of my next big foray into ATLAS.ti, I read with an eye toward learning new methodological and technical practices and techniques. Here are some key take-aways:

Methodological advice and ah-ha's:


  • Learn to interpret the numerals inside brackets following each code. The first number refers to the number of times a code has been used (its "groundedness"); the second number relates to the code's "density" and reflects how the code is linked within the network of other codes. For example, a code displayed as trustworthiness {12-3} would have been applied 12 times and linked to three other codes. I never understood what the second number signified, until now. Interestingly, ATLAS offers network views, but the networks are created manually through the user's interpretive process. This is discussed in more depth in Chapter 7.
  • As an early analytic move, comment on each PDoc and group PDocs into families as you add them to the HU. Document families are a prerequisite for running certain kinds of data queries later on. 
  • Refer to Chapter 4 for an interesting discussion of how to use the coding tools in ATLAS. The tools are modeled after principles of Corbin and Strauss' grounded theory, but that does not limit how they are applied. Chapter 4 does an excellent job of weaving methodological advice with technical how-to, such as the use and pitfalls of in vivo coding. 
  • Be diligent about defining codes with the comment tool. Code definitions evolve over time, and sometimes over the course of a project, it is possible to forget what a code originally stood for. This is another reason to use the drag-and-drop method afforded by the Code Manager (Friese's preference) instead of the list option (what I am accustomed to using). By keeping the Code Manager open in the workspace, definitions are constantly viewable as you select and apply each code. 
  • Manage memo settings (p. 138). This step is crucial, as writing memos in ATLAS goes hand-in-hand with use of the query tool. I have been taking field notes and writing memos in Evernote, and I have been using Evernote's tagging utility to label my notes thus: methodological note, theoretical note, personal note, and observational note (based on Richardson's scheme, which I wrote about in a previous post). I like using the Evernote app on my iPad in the field; it's less intrusive. And I generally avoid using my PC laptop (which runs ATLAS) for any form of data entry because I dislike the keyboard. Following Friese's suggestion, I adjusted the memo types in ATLAS to align with the memo types/tags I already use. Now, as I begin coding and analyzing my texts with ATLAS, I may continue generating and labeling memos using my personal scheme. Next, I need to consider how to get my Evernote data into ATLAS. Copy and paste into internal text documents within ATLAS? (See pp. 55-56.) I sketch one possible export route just after this list.
  • Create analytic memos based on the research questions (pp. 143-145, p. 148). It is likely that most of my subsequent memos, while working in ATLAS, will be theoretical or methodological in nature. Friese suggests a special class of analytic memos called "research question" memos for when the user enters into a second level of analysis that involves querying the data and finding relations. Research question memos may be generated at the start of the project and added to and revised over the duration of the project. She explains, "In your first research question memos, the answers will probably be descriptive. But in time they will become more abstract as you get ideas for further questions, add new research question memos and basically take it one step further at a time, gaining more and more understanding, exploring more and more details of your data landscape and starting to see relations between them" (p. 145).
  • After coding the texts/data, review Chapter 6 for ideas about how to query the data. The last half of Chapter 6 is very procedural and skills-based and will make more sense to a reader who already has a list of codes and has started to conceptualize those codes. Friese reviews the query tool, the co-occurrence explorer, and the Codes-Primary-Documents-Table, which can be used to find relationships and patterns. At this stage of analysis, the research question memos can be used to keep a record of queries and results (as pictured on p. 144). Further, the memos, if set up correctly and linked to quotations, can be used to generate output that serves as "building blocks" for the findings chapter.
  • Consider incorporating ATLAS.ti into the dissertation defense. ATLAS can support the presentation of findings in a number of ways listed on pp. 219-221.
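Since I already tag my memos in Evernote, one route worth trying is Evernote's own export function. What follows is a minimal sketch of that route -- my own experiment, not anything Friese prescribes -- assuming a standard Evernote .enex export (an XML file) and hypothetical file and folder names. It writes each note to a tagged plain-text file that could then be assigned to the HU as a primary document, or pasted into an internal text document per pp. 55-56:

# evernote_to_atlas.py -- a sketch only; assumes Evernote's .enex export format.
import html
import re
import xml.etree.ElementTree as ET
from pathlib import Path

EXPORT = Path("field_notes.enex")   # hypothetical export file (File > Export in Evernote)
OUTDIR = Path("HU_data")            # the "one folder for all data"

def to_plain_text(en_note_xhtml: str) -> str:
    """Strip the XHTML wrapper Evernote places around note content."""
    text = re.sub(r"<[^>]+>", " ", en_note_xhtml)
    return html.unescape(re.sub(r"\s+", " ", text)).strip()

OUTDIR.mkdir(exist_ok=True)
for i, note in enumerate(ET.parse(EXPORT).getroot().iter("note"), start=1):
    title = note.findtext("title", default=f"untitled_{i}")
    tags = [t.text or "" for t in note.findall("tag")]  # e.g. "theoretical note"
    body = to_plain_text(note.findtext("content", default=""))
    # Analytically helpful file names, per Friese's naming advice
    tag_part = re.sub(r"[^\w-]+", "-", "-".join(tags)) or "untyped"
    safe = re.sub(r"[^\w-]+", "_", title)[:40]
    out = OUTDIR / f"memo_{tag_part}_{safe}.txt"
    out.write_text(f"{title}\n\n{body}\n", encoding="utf-8")
    print("wrote", out)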
Methodologically, this passage from the first page of Chapter 6 says it all:
A lot of the analysis happens as you write, not by clicking on some buttons and outputting some results. You need to look at what the software retrieves, read through it and write it up in your own words in order to gain an understanding of what is happening in the data. Most of the time insights need to be worked at and are not revealed to you immediately by looking at the results. Simply seeing that there are, say, 10 quotations is not enough; numbers are sometimes useful, they hint that there might be something interesting there, but the important step is to take a closer look and to see what’s behind them. (p. 133)  
Technological advice and ah-ha's:  
  • When the Code Manager is open, navigate through a long list of codes by placing the cursor in the Code Manager and typing the first few letters of the desired code. If I ever amass a typical list of 120-200 codes, this practice will be useful.
  • When preparing transcripts in a word processor, save them as rich text (.rtf files) so that ATLAS does not need to convert them. Rich text is the standard file format in ATLAS, not .doc or .docx. Typically, after transcribing in InqScribe, I format transcripts in MS Word. It would not be difficult for me to Save As .rtf. This is something to consider, although in the past I have used .doc files in ATLAS and did not incur any problems.
  • Chapter 2 offers many suggestions for formatting transcripts! I have not yet loaded any transcripts into my dissertation HU, and Friese's formatting guidelines should be easy to implement. These include marking speakers with unique identifiers that do not appear elsewhere in the text (e.g., "INT" for "interviewer," and so on), double spacing between turns, and breaking up long turns with empty lines. Some of these preparations facilitate use of the ATLAS automatic coding tool, which I don't know if I even want or need to use, but Friese suggests getting into the habit anyway. (I try a rough sketch of these guidelines just after this list.)
  • Learn the ATLAS.ti file protocol. Friese calls it "the biggest hurdle" in ATLAS project management, and on pp. 36-37 she thoroughly demystifies this aspect of the program. It's a matter of understanding "external references," a system of document storage and retrieval that prevents a project file from becoming too big and unwieldy. I now have a firmer understanding of how the "one folder for all data" rule works and why I should follow it. More importantly, the name of each data file should be analytically helpful, not generic like "transcript1," "transcript2," and so on. Choose a file-naming system early. It will add transparency and efficiency to the project. The data files (primary documents) can be sorted in the document manager, providing a quick glimpse of your sampling.
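To make the transcript guidelines concrete for my own workflow, here is a rough sketch -- again, my own experiment under stated assumptions, not Friese's procedure. It assumes InqScribe transcripts saved as plain text with turns on lines like "Karen: ..." (the names and paths are hypothetical), swaps speaker names for unique IDs, double-spaces the turns, breaks up very long turns, drops the result into the single data folder, and hands the .rtf conversion off to LibreOffice (if installed) so nothing has to pass through Word:

# prepare_transcript.py -- a sketch only, not Friese's procedure.
import re
import subprocess
from pathlib import Path

SPEAKER_IDS = {"Interviewer": "INT", "Karen": "P01"}  # unique IDs that do not appear in the talk

def format_turns(raw: str) -> str:
    turns = []
    for line in raw.splitlines():
        m = re.match(r"(\w+):\s*(.*)", line)
        if not m:
            continue  # skip headers, timestamps, blank lines
        speaker, talk = m.groups()
        sid = SPEAKER_IDS.get(speaker, speaker.upper()[:3])
        # Break long turns into shorter paragraphs for finer-grained quotations
        chunks = re.findall(r".{1,400}(?:\s|$)", talk)
        turns.append(f"{sid}: " + "\n\n".join(c.strip() for c in chunks))
    return "\n\n".join(turns) + "\n"  # empty line between turns

src = Path("interview_karen_2013-03.txt")   # hypothetical raw transcript
out = Path("HU_data") / src.name            # the one folder for all data
out.parent.mkdir(exist_ok=True)
out.write_text(format_turns(src.read_text(encoding="utf-8")), encoding="utf-8")

# Convert to rich text, ATLAS's preferred format (requires LibreOffice on the PATH)
subprocess.run(["soffice", "--headless", "--convert-to", "rtf",
                "--outdir", str(out.parent), str(out)], check=True)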
One immensely telling detail: the total number of menus and submenus in ATLAS.ti equaled 443 at the time Friese published her handbook. The point is that working with ATLAS can be highly individualized. The interface is designed to provide users with multiple options for performing the same function. Thus, through practice, the user develops his or her own preferred workflow and routines.  (I like this as it basically sums up my whole approach to teaching and learning with technology.)

References 
Friese, S. (2012). Qualitative data analysis with ATLAS.ti (eBook). Los Angeles: SAGE.

Piantanida, M., & Garman, N. B. (2009). The qualitative dissertation: A guide for students and faculty (2nd ed., Kindle version). Thousand Oaks, CA: Corwin.

March 5, 2013

Drawing implications from Potter & Hepburn

As preeminent voices in the field of discursive psychology and interaction studies, Potter and Hepburn began their careers researching interviews and interactions. It is through this lens that they offer advice and implications to qualitative researchers on the use of open-ended interview data.

The authors contextualize their argument with 20 years of conversation analytic research on the Q&A format in institutional settings and everyday life. On top of that, researchers are now studying the use of Q&A in the social sciences, turning the "analytic searchlight" on themselves (p. 4).  

Potter and Hepburn want to bring rigor to a form of data collection that has been "too easy, too obvious, and too little studied," and, thus, has been the basis of much poor research (p. 3). They say their goal is to critique the ubiquitous open-ended interview so as to improve it. Considering their backgrounds, can they do this without coming across as theoretically and methodologically imperialist?

To their critics, Potter and Hepburn simply say "...[R]esearchers defending the status quo will need to show that the close dependence of both the form and content of the ‘answer' on the design and delivery of the ‘question' is not of general consequence" (p. 17). Who can argue with that?

A theory-to-practice feedback loop
The chapter is organized into two sections: reporting interview data and analyzing interview data. Personally, I would have preferred more discussion about the design and conduct of interviews themselves, as that is what I'm in the process of doing right now. It seems that it is during the interview itself that so many procedural and methodological missteps can occur.

I guess it is Potter and Hepburn's belief that the researcher has more control over the analysis and reporting stages. We have greater opportunity at these stages to practice reflexivity and exercise conscious control over methodological choices than during the unstructured free-flow of qualitative interviews. Their point is that theory feeds practice. The more carefully we attend to the challenges in our analysis and presentation of findings, the more thoughtful we will be in our future interview set-ups. Maybe we will reconsider our choice of data generation altogether. (True to their roots, Potter and Hepburn recommend giving more consideration to natural interactions.)

Each section of the chapter addresses four challenges. The first section deals with challenges in reporting and representing data; the second section discusses analytical challenges. As I summarize these, I will note specific implications for my current project.

Challenges and opportunities for reporting interview data
1) Make the interview set-up explicit. Describe how the participants were recruited and the explicit language and categories used to do so. Further, what kind of task, instruction, and guidance were the interviewees given? I did not conduct a blind recruitment. I simply sent out a blanket invitation to all the study participants asking them to talk to me about their experience. I have the wording of that email, which I can include as an appendix along with the interview protocol that I read verbatim at the outset of each interview.

2) Display the active role of the interviewer. Rather than inserting only illustrative quotes from participants, carefully include extracts that capture the entire interactional exchange between interviewer and interviewee. This enables readers to see "how the form of the answer may be occasioned by the form of the question" (p. 10). Fair enough. This seems more doable in a dissertation study than in a journal article with space constraints. Potter and Hepburn are building a case for a total departure from how participants' voices are depicted in qualitative studies. The playscript format, itself not widely used in research articles, is still inadequate, according to the authors, because it does not capture the "delivery of the talk" (p. 10). Are they really suggesting that all qualitative researchers use Jeffersonian transcription?

3) Represent talk in a way that captures action. And there it is: Jeffersonian transcription "allows the identification of a number of potentially consequential interviewer actions" (p. 13). Potter and Hepburn suggest that we give our extracts (not the full transcripts) the Jeffersonian treatment. This is also completely doable in a dissertation study and would likely be appreciated by committee members, so long as an explicit explanation is provided and a key to Jeffersonian notation appears in the appendices. I am currently revising portions of Chapter 3, and I wonder if this isn't something I should consider doing?
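For anyone who has not seen it, here is a rough, invented illustration -- the exchange is hypothetical, but the symbols are standard Jeffersonian conventions: numbers in parentheses are timed pauses in seconds, (.) is a micropause, square brackets mark overlapping talk, colons mark sound stretches, and degree signs mark quiet talk.

INT:  and how did you fe:el about [that ]
P01:                              [we:ll] (0.8) at first I was
      (.) honestly a bit °lost°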

4) Tie analytic observations to specific features of interviews. This suggestion is based on the claim that interview data is under-analyzed. Instead of combining interview data with large chunks of analysis in one paragraph, Potter and Hepburn recommend bolstering the analysis with simple formatting changes, such as representing turns on separate lines and numbering the lines.  Again, not hard to do when one is not constrained by space limitations or word counts, as in a dissertation study. But I wonder how these representational practices would be received by journal editors and article reviewers?

Challenges and opportunities during analysis
5) Flooding. This is the phenomenon of the interviewer's agenda creeping into the interview in benign ways, such as follow-up questions and probes formulated on the fly, as well as "acknowledgement tokens" and agreement statements which potentially lead interviewees down a desired path. While these interactional events are natural and cannot necessarily be curtailed, they need to be acknowledged, or the analysis risks "chasing its own tail" (p. 20). I am not sure what the implications of this might be other than to heighten awareness and reflexivity and to perhaps include interpretive commentary to that effect when including extracts with multiple turns.

6) Footing. Footing refers to the various positions or categories that participants may speak from during an interview. In Extract 3 the interviewer poses a question that the interviewee may answer from a personal or institutional standpoint or some combination thereof. Likewise, the interviewer may be positioned as someone who is personally interested in the question or one who is just asking the question so it can go "on the record," like a disinterested reporter. This is fascinating. At this point in the discussion, it is safe to say that the extract Potter and Hepburn have used extensively throughout this article, the one illustrating a "put together on the spot" question about teachers (p. 19), is really just pretty terrible. It is like the proverbial "How did you feel when _____" questions that TV news reporters ask disaster survivors.  Does the question really bear asking? Potter and Hepburn don't belabor that point; they simply call attention to the lack of interviewer engagement, no "ohs" or "hmm-hmms." To an extent, these conversational elements are "subtle, complex and potentially consequential" (p. 24). When they are missing altogether, it suggests that the interviewer perhaps asked a question for which he or she already knew the answer. To be considered a credible social scientist, I had better ask thoughtful questions I genuinely care about, lest I be compared to a "news anchor."

7) Stake and interest. Unlike corporate-sponsored focus group surveys, qualitative research is typically conducted by someone who feels passionately about the topic, and interviewees are recruited because they ostensibly have an interest in the topic too. It is a wise analytic move to pay attention to how interviewer and interviewee manage their stake in the discussion. It seems that stake and interest are the next logical progression from footing. If I openly acknowledge my interest in the topic, listen attentively, and engage with my participants, I must be reflexive about the consequences of these actions.

8) Cognitivism and individualism. Potter and Hepburn call qualitative researchers to task -- particularly constructivists -- for the ironic "privileging of conceptual mediation over action and the treatment of cognitive language as referential" (p. 28). Qualitative researchers put their interviewees in an impossible "epistemic position" to speak their minds and the minds of others, as if that kind of knowledge can be objectively reported and stated without performing a "range of ambitious cognitive judgments and feats of memory and analysis" (p. 29). Cognitive language (reminding me of the "residue" of positivism from Piantanida and Garman's book) creeps into our interactions, carrying with it the assumption that somehow people have "immediate and privileged access to their own opinions and attitudes" (p. 30). A discourse or conversation analysis approach would obviously be Potter and Hepburn's desired route to mediate this challenge, but that isn't the only way. I am interested in the analytical tools of Cultural-Historical Activity Theory, which enable the researcher to explore "agentive dimensions" (emotions, identity, motivation) (Roth, 2009, p. 53). In one such study, Roth and colleagues (2004) drew from case study data, including transcripts, to theorize identity formation as predicated upon actions and outcomes. The authors contended that identity is not stable and is the outcome of participation in social activity. The key here is that interview data was not privileged over other forms (videotapes, journal reflections, field notes, and emails), a methodological move that Potter and Hepburn would endorse.

References
Potter, J., & Hepburn, A. (in press). Eight challenges for interview researchers. In J. F. Gubrium, & J. A. Holstein (Eds.), Handbook of interview research (2nd ed.). London: Sage Publications.
Roth, W.-M. (2009). On the inclusion of emotions, identity, and ethico-moral dimensions of actions. In A. Sannino, H. Daniels, & K. D. Gutierrez (Eds.), Learning and expanding with activity theory (pp. 53–71). Cambridge: Cambridge University Press.
Roth, W.-M. (2004). Activity theory and education: An introduction. Mind, Culture, and Activity, 11(1), 1–8. doi:10.1207/s15327884mca1101_1
