Reflection Post: Interviewing

This week’s exercise was eye-opening in a couple of ways. First, I have discovered that I am not a good interviewer. I failed to have enough probes in my critical incident interview to keep my subject talking for even the five minutes required for the exercise. The topic was something I am incredibly interested in and could talk about for hours on end, but I did not account for the fact that any subjects I would have might not be as engrossed as I am. This is an excellent lesson to learn. As researchers, we are, of course, generally very involved in the topics we are studying and can be oblivious to other people’s indifference or lukewarm attitude toward the subject.

Second, as the interviewee, I became aware of the difficulty of answering pretty random questions on the fly. Hopefully, I can adjust for my own shortcomings in the area of interviews, because the plan is to have interviews in my dissertation and some other research.

Reflection 4: Interviewing

Each of the three interview instruments (i.e., unstructured, critical incident, and semi-structured) has its own strengths. An unstructured interview allows for the discovery of topics that the interviewer might not conceive of on their own. For example, I asked about online video viewing habits but had not thought about the primary language of the video shaping the choice of website. I found this technique the easiest for keeping the interviewee talking for the allotted time. One factor that allows this is that the specifics of the topic are controlled by the interviewee, who is unlikely to choose a direction of little interest to themselves.

I found the example-driven element of the critical incident interview helpful in discovering details. This instrument provides a storytelling aspect that grounds the topic in a concrete example, encouraging talk about behaviors in a specific situation rather than a more conceptual account of how someone looks for videos. For example, I found that someone had watched the same video four times in a row. This behavior of watching a video multiple consecutive times had not occurred to me, even though I engage in it myself.

The semi-structured interview was the most difficult. My troubles started with formulating the questions. I found that I am most comfortable with quantitative question forms: the type of questions that elicit a yes/no answer, a category, or a point on a Likert scale. Thankfully, I was provided an example, which allowed me to imitate the form and insert my chosen topic. Even with the more open-ended form, I found it hard to keep the interviewee talking. My question writing is probably not to blame, since it was the last interview round; my interviewee and I were showing signs of fatigue and starting to mentally move toward the next part of class.

In general, this activity gave me a taste of what is involved in interviewing that I would not necessarily get from reading a chapter in a book. Many sources stress the need to take notes immediately after the interview. As demonstrated in class, many of us did not want to take the time for note-taking and jumped into the next interview.

Interview Reflection post (#4)

My research question was “What types of information behaviors do people engage in when selecting a restaurant when they go out for dinner?” Yeah, I know – very creative and inspired.  Not even sure why I picked it, since I never really go out myself.  Oh well – I guess it was the best I could come up with in the middle of a brain-freeze.

Anyway – despite the less-than-inspired subject, I did glean a few insights into the various interview techniques we discussed.  Perhaps the most interesting of these (at least to me) was that, although I came up with questions mirroring each “type” (unstructured/ semi-structured/ critical incident), in the end I found myself asking for essentially the same information.  So, for example, my “semi-structured” lead-in question was “Please describe for me how you chose the restaurant you went to the last time you went out to dinner” – and I asked about typical problems in choosing and sources of information used for selection.  My unstructured question was “Tell me how you select the restaurant you go to when you go out to dinner.”  In speaking with my interviewee, I found myself almost reflexively asking the same sub-questions I had for my semi-structured and even my critical incident question.  While I certainly didn’t want to “lead the witness” in that instance – and I found myself suddenly very conscious of that problem as I started speaking with her – I nonetheless found it necessary at times to “prompt” her a bit, just by way of interacting with her.  As can be imagined, the semi-structured and critical-incident interview questions did not require this.

Speaking of the critical incident and semi-structured interview questions – I think as between the two I got basically the same “types” of response – that is, I don’t think I really got anything significant from the one that I didn’t get from the other.  This could be a function of the fact that, as it turned out, at least 2 of the 3 people I interviewed are like me in that they don’t go out to dinner very often (mostly special occasions).  From that perspective, I’d have to say that the unstructured interview was probably the most lively and interesting – and, although I did have to remind myself to “not lead”, I did find myself feeling freer to interact and discuss the topic with her, as opposed to sort of mechanically asking a series of questions.  I guess I’d have to say I thought that the “critical incident” method was next after that in terms of feeling more “interactive” with the interviewee.  I think that when you ask someone to recall a specific personal experience that something does click to help you connect with them.  As you might imagine from this discussion, I found the “semi-structured” approach to be probably the most mechanical and “rote” of these three techniques.

I suppose the conclusion I arrived at is that I can see how a blend of these approaches could be useful, even in one interview.  For example, you might start with some fairly unstructured questions, but then follow up later on with more specific critical-incident questions or semi-structured questions around the same subject to expound on the earlier discussion and, hopefully, serve as opportunities to ensure consistency of response.  I also would note that it became clear that, just because you have an unstructured interview question, it doesn’t mean you as the interviewer shouldn’t still have some prompt questions in your back-pocket!

Just for completeness – I’ll say a few words about the “flip side” in conclusion.  From the perspective of an interviewee, I think I also found the unstructured interview method a bit easier to “work with” – Sometimes when I was asked more specific/ direct questions I really couldn’t think of an example or a response right off and I could feel myself sort of tightening up a bit trying to think of “something” responsive.  In the “unstructured” question format (um – we can discuss that oxymoron later…) I just feel I could sort of get myself started and warm up to the subject a bit more easily.

Interview Activity Reflection

For this week’s activity I interviewed three people about how they prepare to shop for a new vehicle.  At the beginning of the activity I felt I would try to probe the different activities in which people engage while shopping for a new car. However, I quickly realized that the questions for the semi-structured and critical incident interviews were too similar to elicit different answers.  The similar answers could also be a result of the notion that most people in the United States take similar approaches to purchasing a vehicle, from initial research, to test driving, to negotiating, etc.

The semi-structured and critical incident interviews were too restrictive to me.  I did not feel as though I could probe for more in-depth information to responses that I found intriguing.  I felt as though I had to return to the interview protocol immediately and could not ask more questions that I had not already written down.  I know the purpose of these types of interviews is to attempt to answer certain questions and keep the interview moving in a specific direction, but the formats do seem restrictive when it comes to probing for more information.  This could also be a result of the extremely limited time in which we had to complete the activity.  There would undoubtedly be more time in the field to probe for more information.

The unstructured interview, however, allowed for more probing on my part without feeling guilty about abandoning my interview protocol, mainly because there was only one question on the protocol for this type of interview. This interview did not feel as awkward to me and seemed more conversational.  Despite its relative freedom, I can easily see the danger of this type of interview.  The interviewer could easily become lost in the details of a specific line of inquiry and no longer see the forest for the trees.  Not having at least a semi-defined protocol could lead to the interviewer wasting his or her time as well as the participant’s, and coming home empty-handed with no valuable data.  As long as the interviewer is capable of keeping the interview on track, however, I think this format allows a greater ability to probe interesting responses for more information.

Reflection 4 (Interviewing)

This week’s interviewing assignment was insightful for many reasons. For one, interviewing appears easier than it is. Secondly, my research project centers on oral history research, and the readings that I have gathered so far agree on one thing: oral history is more than an interview. To this end, I hoped to see if I could perceive what the differences are, based on my rudimentary experience. Unfortunately, I didn’t fully embrace the “interviewer” mode, which was a disservice. Before I describe my experience, I should mention that my research question was “How do American adults seek vacation travel information?” I asked the subjects: “Think of the last time that you went on a vacation. How did you locate information?”

I began with the structured, critical incident interview. For all intents and purposes, this was to be the most formal of the interviews. I scheduled it first because I anticipated that it would be taxing. On the contrary, I accomplished very little and personal conversation took over (which Suchman and Jordan advise against). We “lost the plot” by praising Ethiopian food as well as discussing Ethiopian restaurants in Atlanta and Columbus, Ohio. We were enjoying a pretty neat exchange on Jacksonville’s Ethiopian restaurants (and lamenting the fact that there aren’t any in Tallahassee) when someone reminded me that I was supposed to be keeping time and five minutes had passed. This is all to say that I learned the following: don’t inject too much of your own experience, even if it is plenteous; it is better to make a note and return to the “juicy convo” after the interview has ended; the researcher should tame their enthusiasm, especially if the research is their “baby”; and one must balance the pleasantries of conversation with hard fact-finding. In my opinion, the critical incident technique is best suited for investigating non-personal, non-individualistic phenomena – that is, work-related, domain-related, or skill-related processes. Maybe Flanagan mentioned this; I can’t recall at the moment.

The second, semi-structured interview was also very laid-back. It reminded me of the reference interview. I asked one or two questions, but, for the most part, the interviewee was self-directed. I was a bit more focused, yet the interview still leaned toward conversation. I talk too much! I learned that researchers should foster dialogue but not conversation, which is hard to do without being cold. I agree with Suchman and Jordan, who noted that “standardization” can make the interview process awkward.

The last interview, the unstructured, was the most successful. I introduced myself, stated the purpose of the interview and discussed the goals. I listened more than I spoke, but this had to do with the fact that I was restricted in terms of the number of questions that I could ask. Though it was unstructured, the interview was more professional because the interviewee was briefed. She understood the mission and the data I was after and answered accordingly.

A final note: the interviews collectively taught me that ramblings and digressions are pretty inevitable. The researcher would do well to anticipate distractions. These blurbs might be handy:
“I’m glad you mentioned that because it leads perfectly into my next question: ”
“Ah, that’s extremely important and, along those lines, I was wondering…”
“So if I understand you correctly, you’re saying _______. Let me ask you, how do you feel about…”

Reflection 4: Interviews

For this week’s exercise, I asked my classmates about their usage practices of Wikipedia. My initial idea was to examine users’ criteria for identifying high-quality answers on social Q&A sites, which is closer to my major area. However, it turned out that the first interviewee I picked had never used social Q&A and thus didn’t understand what I was asking. This reminded me that delivering your interview question to the right population is the first thing you need to be sure of, especially if you would like to ask questions about attitudes, beliefs, behaviors, and feelings. Everyone may have experience going grocery shopping, but obviously only a small group of people uses social Q&A!

Once I changed my topic to Wikipedia, the “population” problem still wasn’t completely solved, due to the high level of homogeneity among my subjects. I got quite similar reflections from all three interviewees: they all used Wikipedia as a “starting point,” used the references and external links as indicators to evaluate the quality of an article (the validity dimension), and mainly focused on the “objective” information the article provided. Dawn said, “it will be an issue if any PhD student trusts the Wikipedia article too much and does not care what resources it referenced!” That is why all the interviewees reached such a high degree of agreement – because all of them are PhD students. However, such bias in the population should be avoided in a research project. I didn’t have a chance to share in class my experience with the interviews I’m currently conducting – to examine the multilingual searching behavior of Chinese speakers, Shu and I are trying our best to select our interviewees based on the communities they represent (e.g., Chinese people living in different parts of the world) and their professions. The purpose of this selection is to avoid the population bias I mentioned above.

As I mentioned in class, I feel the unstructured interview turned out to be the most effective. I assumed very few people would actually look at Wikipedia’s history or discussion pages, so I didn’t include related questions in my semi-structured interview with Cheryl. However, in the unstructured interview with Tim, he mentioned that in some cases he would check those pages and use them as indicators of quality. This provided a new perspective from the interviewee’s side and helped me add questions to my interview instrument.

 

Reflection 4: Interviews

In this week’s exercise, I asked people about how they balance work and life.  During the course of the interviews, several things occurred to me.

First, it is very important to choose a population for whom your topic is relevant and engaging.  As we discussed in class, you would not ask a group of non-runners about how they structure their running schedule.  I got good responses because I tailored my question to the population.  When the research question comes before the population, as it does much of the time, it is important to seek out the appropriate population in order to get meaningful data.

Second, I found that the critical incident question, for me, would be better used as one of many questions rather than a stand-alone.  However, there might be instances where this would not be the case.  For example, suppose one were to interview eye witnesses to JFK’s assassination.  Probes might still be needed to get more detail during the course of the interview, but the initial question would very likely need to be a critical incident type question.

Third, as I also mentioned in class, my experience with I-Corps taught me a number of things about interviewing: it is best to have more than one interviewer if possible, because different people notice different things; it is always a good idea to debrief immediately after an interview, especially one where recording was not possible; the more thorough your notes are, the easier time you will have with the debriefing; and, it is helpful to practice both with familiar and unfamiliar faces before the “real” interviews begin.

Fourth, and perhaps the most important thing that I learned today, is to know thyself.  Dawn mentioned not thinking she was a good interviewer (she was!) and Dr. K mentioned that she could only do 1-2 research interviews per day and Tim mentioned that he was not that great with open-ended questioning.  These are the kinds of things we need to discover about ourselves in order to make the appropriate adjustments to get optimal results, or to choose a different methodological approach for our research.

Ultimately, conducting interviews today got me excited all over again to start my own dissertation research.  Granted, that’s probably a good year away, but I was reminded how much I actually enjoy talking to people and asking questions.  It was sort of an affirmation that I’m headed in the right direction.

Project Update 1: Preliminary Bibliography

I have decided to focus my project on the idea of the coming out narrative as information as process. There are innumerable sources discussing the act of coming out but few if any in LIS discussing it as an act of information. This first update is simply the bibliography of sources I have been gathering.

Preliminary Project Resource List

  • Barton, B. (2012). Pray the gay away: The extraordinary lives of Bible Belt gays.
    New York, NY: New York University Press.
  • Brown, M.A. (2011). Coming out narratives: Realities of intersectionality
    (Doctoral dissertation, Georgia State University). Retrieved from
    http://scholarworks.gsu.edu/sociology_diss/63
  • Buckland, M.K. (1991). Information as thing. Journal of the American Society for
    Information Science, 42(5), 351-361.
  • Corrigan, P.W., Kosyluk, K.A., & Rusch, N. (2013). Reducing self-stigma by
    coming out proud. American Journal of Public Health, 103(5), 794-800.
  • Denes, A., & Afifi, T.D. (2014). Coming out again: Exploring GLBQ individuals’
    communication with their parents after the first coming out. Journal of
    GLBT Family Studies, 10(3), 298-325.
  • Dunlap, A. (2014). Coming out narratives across generations. Journal of Gay
    and Lesbian Social Services, 26(3), 318-335. doi:
    10.1080/10538720.2014.924460
  • Goldman, L. (2008). Coming out, coming in: Nurturing the well-being and
    inclusion of gay youth in mainstream society. New York, NY: Routledge.
  • Gray, M.L. (2009). Negotiating identities/queering desires: Coming out online
    and the remediation of the coming-out story. Journal of Computer-
    Mediated Communication, 14(4), 1162-1189. doi:10.1111/j.1083-
    6101.2009.01485.x
  • Gray, M.L. (2009). Out in the country: Youth, media, and queer visibility in
    rural America. New York, NY: New York University Press.
  • Plummer, K. (1995). Telling sexual stories: Power, change, and social worlds.
    New York, NY: Routledge.
  • Rhoads, R.A. (1994). Coming out in college: The struggle for a queer identity.
    Westport, CT: Bergin & Garvey.
  • Riley, B.H. (2010). GLB adolescent’s “coming out.” Journal of Child and
    Adolescent Psychiatric Nursing, 23(1), 3-10.
  • Savin-Williams, R.C. (2001). “Mom, Dad. I’m gay.”: How families negotiate
    coming out. Washington, DC: American Psychological Association.
  • Signorile, M. (1993). Queer in America: Sex, media, and the closets of power.
    New York, NY: Random House.
  • Vargo, M.E. (1998). Acts of disclosure: The coming-out process of
    contemporary gay men. New York, NY: The Haworth Press.

Ana – Project Update 1

My project is on the oral history methodology. This is a natural fit since I was fortunate to be a part of a neat oral history project while working as a librarian at Florida Memorial University, a small HBCU in Miami, Florida. Oral history research resurfaced during last semester’s Research Methods seminar and I am glad that I have the chance to delve deeper in this course.

That said, so far I have compiled 19 articles and one book on oral history theory and best practice. Here is a Google Docs link. The readings examine everything from applications in various domains (nursing, religion, feminist studies, police work, post-colonial African studies); the philosophical underpinnings of oral history; step-by-step procedures for conducting fieldwork; ways of handling exceptional circumstances (humor, interviews with individuals who have survived or participated in mass atrocities); a case study of the Civil Rights Oral History Survey Project (CROHSP); and even a survey of noteworthy international oral history projects. I have discovered that the journal Oral History Review is a great resource, though I have also gathered articles from other publications.

Next up, I will synthesize the readings in order to identify major themes pertaining to the oral history methodology.

Project Update #1

Methods Statement: Content Analysis

1. Definition:

A research technique for making replicable and valid inferences from texts (or other meaningful matter) to the contexts of their use. (Krippendorff, 2012, p. 24)

2. Basic Procedure/Framework

What are the texts to analyze?
What is your research question?
What is the population or context in which the texts are involved?
What is the coding book or analytical construct?
What are the inferences to retrieve from the texts?
How can you validate your results?

(Krippendorff, 2012, p. 35)

3. Differences between quantitative and qualitative approaches in content analysis:
Quantitative: systematic; word counts; computer-assisted techniques; an analytical path (Franzosi, 2008)
Qualitative: requires close reading of relatively small amounts of textual matter (Krippendorff, 2012)
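The quantitative side of this contrast (systematic, computer-assisted word counts) can be sketched with a minimal example. The codebook categories and sample text below are invented for illustration; a real codebook would be derived from the research question:

```python
from collections import Counter
import re

# Hypothetical codebook: each category maps to indicator words.
CODEBOOK = {
    "positive": {"good", "great", "helpful"},
    "negative": {"bad", "poor", "useless"},
}

def code_text(text):
    """Count how many tokens of `text` fall into each codebook category."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for token in tokens:
        for category, indicators in CODEBOOK.items():
            if token in indicators:
                counts[category] += 1
    return dict(counts)

print(code_text("The replies were helpful and the search was good, "
                "but the mobile app is useless."))
# → {'positive': 2, 'negative': 1}
```

Run over a corpus of documents, counts like these yield the frequency tables that quantitative content analysis then subjects to statistical comparison.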

3.1 When should we apply a quantitative approach, and when a qualitative one?
(“QuantContentAnalysis” n.d.)

4. Qualitative approaches:
A. inductive and deductive (Graneheim & Lundman, 2004)
B. conventional, directed, or summative (Hsieh & Shannon, 2005).
C. discourse analysis, rhetorical analysis, ethnographic content analysis, and conversation analysis (Krippendorff, 2012)

5. What is the analyzing unit?
6. Sampling
7. Coding construct
8. Analytical techniques
9. Reliability and validity measurements
10. Examples of content analysis in online community settings.
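Item 9 (reliability) is commonly operationalized as intercoder agreement; Krippendorff (2004) surveys the available indices. A minimal sketch of one widely used statistic, Cohen’s kappa, with invented codings of six items by two coders:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: chance-corrected agreement between two coders
    who each assigned one category to the same sequence of items."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    # Expected agreement if both coders assigned categories at random
    # according to their own marginal frequencies.
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

a = ["pos", "pos", "neg", "neg", "pos", "neg"]
b = ["pos", "pos", "neg", "pos", "pos", "neg"]
print(round(cohens_kappa(a, b), 3))  # → 0.667
```

Kappa of 1.0 means perfect agreement and 0 means agreement no better than chance; Krippendorff’s alpha generalizes this idea to multiple coders and missing data.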

References

Berelson, B. (1952). Content analysis in communication research. New York, NY: Free Press.

Elo, S., & Kyngäs, H. (2008). The qualitative content analysis process. Journal of Advanced Nursing, 62(1), 107–115. doi:10.1111/j.1365-2648.2007.04569.x

Franzosi, R. (2008). Content analysis: Objective, systematic, and quantitative description of content. In SAGE benchmarks in social research methods: Content analysis (pp. 2–43).

Gleave, E., Welser, H. T., Lento, T. M., & Smith, M. A. (2009). A conceptual and operational definition of “social role” in online community. In 42nd Hawaii International Conference on System Sciences (HICSS ’09) (pp. 1–11). doi:10.1109/HICSS.2009.6

Graneheim, U. H., & Lundman, B. (2004). Qualitative content analysis in nursing research: Concepts, procedures and measures to achieve trustworthiness. Nurse Education Today, 24(2), 105–112. doi:10.1016/j.nedt.2003.10.001

Hsieh, H.-F., & Shannon, S. E. (2005). Three approaches to qualitative content analysis. Qualitative Health Research, 15(9), 1277–1288. doi:10.1177/1049732305276687

Kožuh, I., Hintermair, M., & Debevc, M. (2014). Examining the characteristics of deaf and hard of hearing users of social networking sites. In K. Miesenberger, D. Fels, D. Archambault, P. Peňáz, & W. Zagler (Eds.), Computers Helping People with Special Needs (pp. 498–505). Springer International Publishing. Retrieved from http://link.springer.com/chapter/10.1007/978-3-319-08599-9_74

Krippendorff, K. (2004). Reliability in content analysis. Human Communication Research, 30(3), 411–433. doi:10.1111/j.1468-2958.2004.tb00738.x

Krippendorff, K. (2012). Content analysis: An introduction to its methodology. Thousand Oaks, CA: SAGE.

Marra, R. M., Moore, J. L., & Klimczak, A. K. (2004). Content analysis of online discussion forums: A comparative analysis of protocols. Educational Technology Research and Development, 52(2), 23–40. doi:10.1007/BF02504837

QuantContentAnalysis < MoM < digitalmethods.net. (n.d.). Retrieved February 16, 2015, from https://www.digitalmethods.net/MoM/QuantContentAnalysis

Shea, P., Hayes, S., Vickers, J., Gozza-Cohen, M., Uzuner, S., Mehta, R., … Rangan, P. (2010). A re-examination of the community of inquiry framework: Social network and content analysis. The Internet and Higher Education, 13(1–2), 10–21. doi:10.1016/j.iheduc.2009.11.002

Shoham, S., & Heber, M. (2012). Characteristics of a virtual community for individuals who are d/deaf and hard of hearing. American Annals of the Deaf, 157(3), 251–263.