
Research Methodology

    Chapter three describes in detail the procedures of the study. This involves relating the research methodology, restating the purpose statement and research questions, describing the participants, recounting the instruments, defining the study’s population and sample, providing steps for data collection, and detailing the procedures used in analyzing the research data.

Research Methodology

            For this study, the candidate undertook a phenomenological study.  A phenomenological study seeks to understand “the subjects’ experiences and how they make sense of those experiences.”[1]  Therefore, the researcher sought the most effective way to understand his subjects’ experiences and how those subjects understood those experiences.  The remainder of this chapter explains the methodology the student carried out.

Statement of Purpose

        The researcher commenced this project-dissertation for the following purposes:

        1.        To assess the attitudes of elders in churches of Christ in Kanawha and Lincoln Counties of West Virginia regarding inactive Christians

        2.        To assess elders’ views of their ability to work with inactive Christians

        3.        To assess elders’ willingness to learn how to deal with inactive Christians more effectively

Research Questions

        The researcher posed the following research questions regarding his study:

        1.        How do elders in the churches of Christ in Kanawha and Lincoln Counties of West Virginia view inactive Christians?

        2.        What encounters with inactive Christians have elders in the churches of Christ in Kanawha and Lincoln Counties of West Virginia experienced?

        3.        What have the elders learned from those encounters?

        4.        How do the selected elders view their role as a shepherd regarding inactive Christians?

        5.        How willing would the selected elders be to enroll in a seminar to learn skills to work with inactive Christians?


        The researcher contacted all mainstream churches of Christ in Kanawha or Lincoln County, West Virginia and invited the elders in those congregations to participate in the study.  Eighteen mainstream churches of Christ operate in the two counties under consideration.[2]  Because the student had preached repeatedly about restoring inactive Christians in the Alum Creek Church of Christ, he excluded the two elders serving that congregation.  Six of the remaining seventeen congregations do not have elders; therefore, eleven congregations remained for study.

        The researcher mailed a letter to each of the remaining congregations explaining the project and seeking the elders’ participation.[3]  One week following the letter’s mailing, the candidate called each elder requesting a face-to-face meeting during which the candidate explained the research project.[4]  Through the face-to-face meetings the researcher indicated both the importance of the elders’ participation and the importance of his research; the meetings also provided the researcher with first-hand knowledge of the participants’ surroundings.  At the face-to-face meetings, the student introduced the informed consent form.[5]

        The investigator included all willing elders in Lincoln and Kanawha Counties of West Virginia in the study.  The candidate telephoned sixteen elders whose names and phone numbers he collected from various sources.  The researcher personally knew six of the elders prior to the study.  Six other names were collected from websites of congregations listing contact information for the elders.  The four additional names were collected from ministers or members of the congregation the student knew.

        Of the sixteen elders telephoned, six refused participation in the study.  One additional elder agreed to participate, but he relocated before an interview could be conducted.  Nine elders participated in the study, for a participation rate of 56.25 percent.
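The participation rate reported above follows directly from the counts in this paragraph; a minimal arithmetic sketch:

```python
# Counts reported in this section
contacted = 16       # elders telephoned
refused = 6          # refused participation
relocated = 1        # agreed but relocated before an interview

participated = contacted - refused - relocated
rate = participated / contacted * 100

print(participated)  # 9
print(rate)          # 56.25
```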

        The student broke demographic data into the following categories to make comparisons between the participants easier: age, length of time as an elder, size of the congregation served, and religious background.  The table below provides the demographics of the participants.

Table 1.  Demographics of Participants

Subject       Time as an Elder    Size of Congregation    Religious Background
Subject #1    4 years                                     Churches of Christ
Subject #2    40 years
Subject #3    25 years                                    Churches of Christ
Subject #4    5 years                                     Churches of Christ
Subject #5    10 years                                    Churches of Christ
Subject #6    21 years                                    Churches of Christ
Subject #7    5 years                                     Churches of Christ
Subject #8    15 years                                    Churches of Christ
Subject #9    15 years                                    Churches of Christ


        The researcher served as the primary instrument for this study.  Leedy and Ormrod wrote:
Qualitative researchers believe that the researcher’s ability to interpret and make sense of what he or she sees is critical for understanding any social phenomenon.  In this sense, the researcher is an instrument in much the same way that a sociogram, rating scale, or intelligence test is an instrument.[6]
Although, in many respects, the researcher served as the instrument for this study, he needed direction in gathering usable data; therefore, he developed an interview guide.[7]

        In conducting the interviews, the candidate did not utilize an established guide, for no such instrument exists.  Instead, the researcher developed an interview guide for use in his interviews.  The interview guide allowed the candidate to make his interviews “more systematic and comprehensive by delimiting in advance the issues to be explored.”[8]  Additionally, using an interview guide allowed the candidate to compare more easily the data obtained from the different participants.

        Including all sub-questions, the interview guide contained forty-nine questions.  The first set of questions allowed the researcher to explore with the participants their current situation in light of the research problem; the questions allowed the elders to reflect on their past and see how that past might impact the way they currently work with inactive Christians.  The second set of questions explored the elders’ current situation with inactive Christians.  The final set of questions examined what the elders would like to change about the way they work with inactive Christians.

        The candidate used his interview guide for standardized open-ended interviews.[9]  In using standardized open-ended interviews, the researcher made certain each participant was asked the same questions in the same order as all the others.  Utilizing the interview guide in a standardized manner provided the student with several benefits.[10]  Variation among the interviews was kept to a minimum, allowing the researcher to categorize data more easily.  The standardized open-ended interviews also allowed the candidate to keep each interview highly focused, which provided for efficient use of time.


In quantitative studies, an instrument’s validity matters, for validity allows researchers to know “the extent to which an empirical measure adequately reflects the real meaning of the concept under consideration.”[11]  Therefore, without validity, researchers would never know with certainty whether their instruments actually measured what they purported to measure.

However, because of divergent methodological suppositions, research standards differ significantly between qualitative and quantitative studies.  Two writers argued, “Advocates of the antirealist position argue that qualitative research represents a distinctive paradigm and as such it cannot and should not be judged by conventional measures of validity, generalisability, and reliability.”[12]  The term “trustworthiness” better describes the standard appropriate to the present study.  In summarizing the research of others, another author wrote, “At the extreme, some qualitative researchers have suggested that the traditional quantitative criteria of reliability and validity are not relevant to qualitative research.”[13]  At the same time, scientific research must meet the rigors of the discipline in which it takes place.  Thorleif Lund, professor of Special Needs Education at the University of Oslo, wrote the following:
If “something” is to be counted as knowledge, it has to attain a satisfactory level of certainty or validity.  Hence, validity issues are fundamental ones in knowledge construction, and such issues are focused upon in methodological and substantive research.[14]
Therefore, the present study required conformity to the standards of qualitative research.

Yet, in qualitative research, a debate rages as to which terminology should be applied to the whole issue of validity.[15]  The candidate chose the terminology of “trustworthiness,” for trustworthiness asks the question, “How can an inquirer persuade his or her audience (including self) that the findings of an inquiry are worth paying attention to, worth taking account of?”[16]  Therefore, the researcher’s task was to establish the merits of the study; the paragraphs that follow describe the steps he took to do so.

The researcher sought interpretative trustworthiness, “the degree to which the research participants’ viewpoints, thoughts, feelings, intentions, and experiences are accurately understood.”[17]  The candidate utilized two techniques to obtain such trustworthiness.  First, the student used a method commonly called “member checking.”  Member checking serves to ensure “the themes or categories make sense, whether they are developed with sufficient evidence, and whether the overall account is realistic and accurate.”[18]  To carry out this procedure, the candidate provided all participants a copy of the data analysis and asked them to confirm that the student had reached justifiable conclusions.

The second procedure the student used to infuse his report with interpretative trustworthiness involved “using many low inference descriptors” in presenting the data.[19]  Using low inference descriptors involves utilizing many verbatim quotations throughout the report.  Therefore, Chapter IV contains many verbatim quotes from the participants.

The researcher also sought honest answers from the participants.[20]  Each elder contacted had the opportunity to refuse to participate, and six did refuse participation.  The researcher gave the remaining nine elders ample time to read the informed consent and understand its promises—the right to withdraw from the study and the pledge of absolute confidentiality.  The researcher has a high degree of certainty that the participants gave honest answers.  Three elders revealed the names of inactive members with whom they were working, and two elders even shared with the candidate information they wished to remain out of the written report.

Another method the candidate used to enhance the trustworthiness of this report was to examine previous research findings.[21]  While the researcher knows of no other phenomenological study along the same lines as this study, the research outlined in Chapter II discusses the responsibility of elders, causes of inactivity, and how churches can reclaim inactive members.  Chapter V discusses the results of this study in light of previous research.



        The researcher planned to employ a three-interview model proposed by Irving Seidman, professor of qualitative research at the University of Massachusetts at Amherst.[22]  However, the researcher conducted only one interview with each participant, for he was able to ask all the relevant questions while providing participants enough time to supply usable data.

        Because the candidate used one interview rather than the planned three, the project does lack some of the credibility it would otherwise have had.[23]  Had the student interviewed each participant three times, he could easily have examined the data gleaned from each elder for internal consistency.  In other words, the researcher could have explored whether or not what an elder told him in one interview matched what he told him in another interview.  The candidate sought to overcome that deficiency by the “member checking” technique described above.

        Each interview lasted approximately ninety minutes.  “Given that the purpose of this approach is to have the participants reconstruct their experience, put it in the context of their lives, and reflect on its meaning, anything shorter than ninety minutes for each interview seems too short.”[24]  Employing a set time limit lessened the participants’ anxiety; the participants understood how much time was asked of them.[25]  Interviewees generally appreciate the ninety-minute period, for the time frame expresses an interest in what the participant has to say.[26]

        The researcher divided each interview into three sections.  The first section sought “to put the participant’s experience in context by asking him or her to tell as much as possible about him or herself in light of the topic up to the present time.”[27]  In the present study, the student attempted to understand the elders’ past experiences with church leaders.  The candidate focused on the following in the first section: previous experiences the participants had with church leaders, how they had witnessed other church leaders handle inactive Christians, and how they became elders.

        The second section concentrated “on the concrete details of the participants’ present experience” relating to shepherding inactive Christians.[28]  The interview sought details of how the elders deal with inactive Christians in the congregations where they serve.  The researcher directed attention on the following in the second section: the procedures of the congregations the elders serve in relation to inactive Christians and experiences they have encountered in dealing with inactive Christians.

        The third section encouraged participants “to reflect on the meaning of their experience.”[29]  Reflecting on the meaning of working with inactive Christians required the elders to “look at how the factors in their lives interacted to bring them to their present situation.”[30]  The researcher targeted the following: what participants would change about the way they have worked with inactive Christians and how the elders plan to work with inactive Christians in the future.


        “Listening is the most important skill in interviewing.”[31]  Therefore, the student’s listening skills played a vital role in the quality of the data he gathered.  The student used the skills listed in the Personal Evangelism Improvement Course by Leon Estep, professor of ministerial leadership at Regions University.  Estep encourages the following listening skills:[32]

        1.        Recognize with Socrates: “Speak in order that I may see you.”

        2.        Prepare to listen, which requires: concentration—patience—comprehension—participation—honesty—eagerness—sharing—abandoning prejudices—openness—humility—acknowledging understanding or not understanding.

        3.        Be interested in speaker’s topic.

        4.        Hear new ideas readily.

        5.        Accept speaker’s personality.

        6.        Properly deal with emotionally laden words.

        7.        Wait to hear all speaker has to say.

        8.        Hear central idea—hear feelings instead of words—listen below the level of the dictionary definition of words or self-meanings.

        9.        Utilize notes—summarize.

        10.      Question self/speaker as: “What is speaker trying to say?” etc.

        11.      Listen “in between lines.”

        12.      Expend energy.

        13.      No-No: Do not take mental detours—do not finish sentences for speaker—do not interrupt—do not dominate.

        14.      Use time variation between speaking speed and thought speed to better understand speaker.

        The researcher used truly open-ended questions, for “qualitative inquiry—strategically, philosophically, and therefore, methodologically—aims to minimize the imposition of predetermined responses when gathering data.”[34]  In order to ask truly open-ended questions, the candidate avoided phrasing questions as a dichotomy, where the respondent could answer “yes” or “no.”[35]

        In asking truly open-ended questions, the candidate needed several skills to elicit valuable data.  The candidate inquired of participants when he failed to understand what an elder had said.[36]  The researcher solicited more information when he feared the participants had not revealed the whole story or when they had simply given generalizations.[37]  While the student did not interrupt the participants, he followed up on what participants said when doing so would not stop their current train of thought.[38]  The candidate also avoided reinforcing the participants’ answers with “uh-huh” or “O.K.” or “yes,” lest such responses influence future answers.[39]

Data Collection

        The researcher recorded each interview with the various participants, for the interview responses provided the raw data for the researcher’s study.[40]  Had the candidate substituted his own wording for the participants’ wording, he would have framed their experience from his own perspective, not their own;[41] tape recording the interviews allowed the student to possess all his raw data.  To assure the quality of the recordings, the candidate used an electrical outlet rather than batteries, for the outlet offered far more reliability; he always took extra cassettes to the interviews; he employed an external microphone; and he placed the recorder on a stable surface.[42]

        Even though the candidate recorded the interviews, he took notes during the interactions.  Taking notes provided several important benefits.  First, note taking conveyed to the participants that what they said had value to the investigator.[43]  Second, note taking aided the student in focusing on what the participants told him.[44]  Third, note taking allowed the researcher to keep track of what material the participant had covered and enabled him to return to that material at an appropriate time.[45]  Fourth, note taking provided the researcher with an immediate “feedback loop”; that is, the candidate could write down his impression of what a participant had said and then check the accuracy of his impressions.

        Once the candidate possessed the raw data, he transcribed the interviews himself.  Although the candidate could have hired a transcriptionist, transcribing his own interviews provided the researcher an opportunity to become quite familiar with the interviews.[46]  Some researchers choose to transcribe only those parts of interviews that they initially find interesting; however, valuable data can be lost in the process.[47]  Therefore, the student transcribed each interview in its entirety so that he possessed all the data in the project’s data analysis stage.

Data Analysis

        The candidate used four steps in organizing his data.[48]  First, after transcribing the interviews, the candidate organized the data by cutting and pasting significant sections of the transcripts in Microsoft Word.  Second, the researcher read through all transcripts several times to get a feel for the data as a whole; at this point, he began to make notes in the data for possible categories into which the data might fit.  Third, the candidate identified the themes of the interviews and classified his data by those identified themes.  Finally, the researcher organized the data as they appear in Chapter IV.

        These tasks took place specifically through hermeneutics, “the theory and practice of interpretation,”[49] a vital task in analyzing verbal data.  The candidate identified themes within the transcripts by utilizing factors of significance delineated by Joyce G. Love in her doctoral dissertation at Nova University.[50]  First, the researcher examined the transcripts individually and together for repetition, themes mentioned by more than one participant or mentioned by a single participant more than once.  The candidate also listened again to the actual recordings while reading the transcripts that he might be conscious of a change in tone or volume.  At this point, the researcher also utilized notes taken during the interviews which described the participants’ nonverbal communication.  The student also took careful note of historical narratives which informed him of what had shaped the participants’ current perceptions.
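The repetition criterion just described lends itself to a simple tally. The sketch below illustrates that criterion only; the participant-theme pairings are hypothetical, not data from the study:

```python
from collections import Counter

# Hypothetical (participant, theme) codings noted during transcript review
codings = [
    ("Subject #1", "personal visits"),
    ("Subject #2", "personal visits"),
    ("Subject #2", "prayer"),
    ("Subject #5", "prayer"),
    ("Subject #5", "prayer"),
]

# Themes mentioned by more than one participant
theme_counts = Counter(theme for participant, theme in set(codings))
cross_participant = {t for t, n in theme_counts.items() if n > 1}

# Themes mentioned more than once by a single participant
pair_counts = Counter(codings)
repeated_by_one = [pair for pair, n in pair_counts.items() if n > 1]

print(cross_participant)
print(repeated_by_one)
```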

        While reading the transcripts as a whole, the researcher began coding them in Microsoft Word.[51]  Instead of using numbers, the candidate decided to code the data by colors around the hypotheses formulated before the study began.  He used the following color scheme: red for hypothesis one, yellow for hypothesis two, blue for hypothesis three, bright green for hypothesis four, rose for hypothesis five, and orange for hypothesis six.  The researcher then used the cut-and-paste function in Microsoft Word to create a document for each of the six hypotheses.
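The grouping of coded excerpts into one document per hypothesis can be mirrored in a few lines. In this sketch the excerpt strings are hypothetical placeholders, while the color scheme matches the one just described:

```python
from collections import defaultdict

# Color scheme from this section: one color per hypothesis
colors = {1: "red", 2: "yellow", 3: "blue",
          4: "bright green", 5: "rose", 6: "orange"}

# Hypothetical coded excerpts: (hypothesis number, transcript excerpt)
coded = [
    (1, "Excerpt from Subject #1 on inactive members..."),
    (3, "Excerpt from Subject #4 on shepherding..."),
    (1, "Excerpt from Subject #7 on inactive members..."),
]

# One "document" per hypothesis, analogous to the cut-and-paste
# documents the researcher built in Microsoft Word
documents = defaultdict(list)
for hypothesis, excerpt in coded:
    documents[hypothesis].append(excerpt)

for hypothesis in sorted(documents):
    print(f"Hypothesis {hypothesis} ({colors[hypothesis]}): "
          f"{len(documents[hypothesis])} excerpt(s)")
```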

        The candidate followed a quite similar method for coding data for findings he did not expect and for findings in light of the literature search.  To categorize and code these data, the researcher used the comment function in Microsoft Word to keep track of data as he came across them.  Those data were then placed in individual documents arranged by subject headings using the cut-and-paste feature of Microsoft Word.

        The next chapter presents the data in light of the hypotheses formed in the first chapter.


1. Rubin and Babbie, Research Methods, 389 (see chap. 1, no. 39).

2. Mac Lynn, Churches of Christ in the United States, 2000 ed. (Nashville, TN: 21st Century Christian, 2000).

3. The letter the researcher sent is located in Appendix A.

4. See Irving Seidman, Interviewing as Qualitative Research, 2nd ed.  (New York and London: Teachers College Press, 1998), 40-41.

5. See Appendix B for a copy of the informed consent.

6. Paul D. Leedy and Jeanne Ellis Ormrod, Practical Research, 8th ed. (Upper Saddle River, NJ: Pearson Education, 2005), 133.

7. Please see Appendix C for a copy of the interview guide.

8. Michael Quinn Patton, Qualitative Research & Evaluation Methods, 3rd ed. (Thousand Oaks, CA: Sage Publications, 2002).

9. Ibid., 344-347.

10. Ibid., 346.

11. Rubin and Babbie, Research Methods, 193.

12. Nicholas Mays and Catherine Pope, “Qualitative Research in Health Care: Assessing Quality in Qualitative Research,” British Medical Journal 320 (2000): 50.

13. R. Burke Johnson, “Examining the Validity Structure of Qualitative Research,” Education 118 (1997): 282.

14. Thorleif Lund, “The Qualitative-Quantitative Distinction: Some Comments,” Scandinavian Journal of Educational Research 49 (2005): 120.

15. E.g., see Jeffrey P. Aguinaldo, “Rethinking Validity in Qualitative Research from a Social Constructionist Perspective: From ‘Is This Valid Research?’ to ‘What Is This Research Valid For?’” The Qualitative Report 9 (2004): 127-136; Nahid Golafshani, “Understanding Reliability and Validity in Qualitative Research,” The Qualitative Report 8 (2003): 597-607; and Glyn Winter, “A Comparative Discussion of the Notion of ‘Validity’ in Qualitative and Quantitative Research,” The Qualitative Report 4, nos. 3 & 4 (2000), (accessed November 28, 2006).

16. Yvonna S. Lincoln and Egon G. Guba, Naturalistic Inquiry (Beverly Hills, CA: SAGE, 1985), 290.

17. Johnson, “Examining the Validity Structure of Qualitative Research,” 284.

18. John W. Creswell and Dana L. Miller, “Determining Validity in Qualitative Inquiry,” Theory into Practice 39 (2000): 127.

19. Johnson, “Examining the Validity Structure of Qualitative Research,” 284.

20. Andrew K. Shenton, “Strategies for Ensuring Trustworthiness in Qualitative Research Projects,” Education for Information 22 (2004): 66-67.

21. Ibid., 69.

22. Seidman, Interviewing as Qualitative Research.

23. Ibid., 17.

24. Ibid., 14.

25. Ibid.

26. Ibid.

27. Ibid., 11.

28. Ibid., 12.

29. Ibid.

30. Ibid.

31. Ibid., 63.

32. Leon Estep, Personal Evangelism Improvement Course (Ozark, AL: privately printed, 1980), B-6 – B-7.

33. The concept of feedback added to the above illustration from Estep is adapted from the discussion of the transactional model of communication found in Michael Z. Hackman and Craig E. Johnson, Leadership: A Communication Perspective (Prospect Heights, IL: Waveland Press, 2000), 9-10.  Hackman and Johnson write, “Effective communicators pay close attention to the messages being sent to them as they talk with others” (p. 10).

34. Patton, Qualitative Research, 353.

35. Ibid.

36. Seidman, Interviewing as Qualitative Research, 66.

37. Ibid., 67-68.

38. Ibid., 70-71.

39. Ibid., 74.

40. Patton, Qualitative Research, 380.

41. Ibid.

42. Ibid., 382.

43. Ibid., 383.

44. Seidman, Interviewing as Qualitative Research, 64.

45. Ibid.

46. Ibid., 98.

47. Ibid.

48. These four steps are recommended in Leedy and Ormrod, Practical Research, 150-151.

49. Margo Paterson and Joy Higgs, “Using Hermeneutics as a Qualitative Research Approach in Professional Practice,” The Qualitative Report 10 (2005): 342.

50. Joyce G. Love, “The Hermeneutics of Transcript Analysis,” The Qualitative Report 2, no. 1 (1994), (accessed November 28, 2006).

51. For another approach to coding in a word processor, see John H. Carney, Joseph F. Joiner, and Helen Tragou, “Categorizing, Coding, and Manipulating Qualitative Data Using the WordPerfect® Word Processor,” The Qualitative Report 3, no. 1 (1997), (accessed November 28, 2006).

Copyright © Dr. Justin Imel