Dedoose Publications


Dedoose has been field-tested and journal-proven by leading academic institutions and market researchers worldwide. Thousands of prominent researchers across the US and abroad have benefited from early versions of Dedoose in their qualitative and mixed methods work and have laid an outstanding publication and report trail along the way.

Education Based Publications

Managing Data in CAQDAS

Fielding, Nigel & Lee, Ray M. (1998)

Chapter 4 in Fielding & Lee, Computer Analysis and Qualitative Research, pp. 86-118

From Computer Assisted Qualitative Data Analysis Software: A Practical Perspective for Applied Research, by Joseph B. Baugh, Anne Saber Hallcom, and Marilyn E. Harris:

Computer assisted qualitative data analysis software (CAQDAS) holds a chequered reputation to date in academia, but can be useful to develop performance metrics in the field of corporate social and environmental responsibility and other areas of contemporary business. Proponents of CAQDAS cite its ability to save time and effort in data management by extending the researcher's ability to organize, track, and manage data. Opponents decry the lack of rigor and robustness in the resultant analyses. Research reveals that these opinions tend to be divided by “the personal biography and the philosophical stance of the analyst” (Catterall & Maclaran, 1998, p. 207), as well as “age, computer literacy, and experience as a qualitative researcher” (Mangabeira, Lee & Fielding, 2004, p. 170). A more recent article (Atherton & Elsmore, 2007) discussed the continuing debate on CAQDAS in qualitative research: “The two perspectives both indicate that CAQDAS should be used with care and consideration; in ways that explicitly demonstrate a ‘fit’ between the ethos and philosophical perspective(s) underpinning a research study, on the one hand, and the means of ordering and manipulating the data within CAQDAS on the other” (p. 75).

Despite the ongoing literary debate on the merits of CAQDAS, the use of computer-aided qualitative data analysis has become acceptable to most qualitative researchers (Lee & Esterhuizen, 2000; Morison & Moir, 1998; Robson, 2002). However, writers advise that researchers avoid the trap of letting the software control the data analysis (Catterall & Maclaran, 1998). Morison and Moir counseled that CAQDAS is merely one tool in the qualitative data analysis toolbox; no tool should replace the researcher's capacity to think through the data and develop his or her emergent conclusions (Atherton & Elsmore, 2007). On the other hand, Morison and Moir, among others (e.g., Blank, 2004; Catterall & Maclaran, 1998; Mangabeira et al., 2004), found that qualitative data analysis software can also free up significant amounts of time formerly spent on data management and encoding, allowing the researcher to spend more time in deeper and richer data evaluation.

Qualitative research studies to develop performance metrics can create huge amounts of raw data (Miles & Huberman, 1994; Robson, 2002). Organizing, tracking, encoding, and managing the data are not trivial tasks, and the effort should not be underestimated by the applied researcher. Two methodologies exist to handle these activities and manage the data during the data analysis phase. The first is a manual process, which must be done at times to avoid missing critical evidence and to provide trustworthiness in the process (Malterud, 2001); the second uses technology to manage the data and avoid being overwhelmed by the sheer amount of raw data (Lee & Esterhuizen, 2000). It is the experience of the authors that some manual processing must be interspersed with CAQDAS. This provides an intimacy with the data that leads to the drawing of credible and defensible conclusions. Thus, a mixed approach that melds manual and automated data analyses seems most appropriate.
A basic approach for applying traditional qualitative research methodologies lies in the ability of CAQDAS to support data reduction through the use of a “provisional start list” (Miles & Huberman, 1994, p. 58) of data codes that are often developed manually from the research question. A rise in the use of CAQDAS for applied research and other nonacademic research fields has been identified (Fielding & Lee, 2002). Since CAQDAS is becoming more prevalent in nonacademic researcher populations and can be useful for developing performance metrics for corporate social and environmental responsibility and solving other complex business issues, it seems prudent at this juncture to discuss how to use the software appropriately rather than rehash the argument for or against using CAQDAS. Selection of and training with an appropriate CAQDAS package can help the researcher manage the mountains of data derived from qualitative research data collection methods (Lee & Esterhuizen, 2000).
Education Based Publications

Qualitative Interviewing

Patton, Michael Quinn (1980)

Thousand Oaks: Sage Publications, In Michael Quinn Patton, Qualitative Evaluation Methods, pp. 195-263

We interview people to find out from them those things we cannot directly observe. The issue is not whether observational data are more desirable, valid, or meaningful than self-report data. The fact is that we cannot observe everything. We cannot observe feelings, thoughts, intentions, behaviors that took place at some previous point in time, situations that preclude the presence of an observer, or how people have organized the world and the meanings they attach to what goes on in the world. We have to ask people questions about those things. Thus, the purpose of interviewing is to allow us to enter into the other person's perspective. Qualitative interviewing begins with the assumption that the perspective of others is meaningful, knowable, and able to be made explicit. We interview to find out what is in and on someone else's mind, to gather their stories.

Program evaluation interviews, for example, aim to capture the perspectives of program participants, staff, and others associated with the program. What does the program look and feel like to the people involved? What are their experiences? What thoughts do people knowledgeable about the program have concerning the program? What are their expectations? What changes do participants perceive in themselves as a result of their involvement in the program? It is the responsibility of the evaluator to provide a framework within which people can respond comfortably, accurately, and honestly to these kinds of questions. Evaluations can enhance the use of qualitative data by generating relevant and high quality findings. As Hermann Sudermann said in Es Lebe das Leben I, ‘I know how to listen when clever men are talking. That is the secret of what you call my influence.’ Evaluators must learn how to listen when knowledgeable people are talking. That may be the secret of their influence.

An evaluator or qualitative or mixed methods research interviewer faces the challenge of making it possible for the person being interviewed to bring the interviewer into his or her world. The quality of the information obtained during an interview is largely dependent on the interviewer. This chapter discusses ways of obtaining high-quality information by talking with people who have that information. We’ll be delving into the ‘art of hearing’ (Rubin & Rubin, 1995). This chapter presents three different types of interviews. Later sections consider the content of interviews: what questions to ask and how to phrase questions. The chapter ends with a discussion of how to record the responses obtained during interviews. This chapter emphasizes skill and technique as ways of enhancing the quality of interview data, but no less important is a genuine interest in and caring about the perspectives of other people. If what people have to say about the world is generally boring to you, then you will never be a great interviewer. On the other hand, a deep and genuine interest in learning about people is insufficient without disciplined and rigorous inquiry based on skill and technique.
Education Based Publications

Students' Perceptions of Characteristics of Effective College Teachers: A Validity Study of a Teaching Evaluation Form Using a Mixed Methods Analysis

Onwuegbuzie, A. J., Witcher, A. E., Collins, K. M. T., Filer, J. D., Wiedmaier, C. D., & Moore, C. W. (2007)

American Educational Research Journal, 44(1): 113-160

This study used a multistage mixed-methods analysis to assess the content-related validity (i.e., item validity, sampling validity) and construct-related validity (i.e., substantive validity, structural validity, outcome validity, generalizability) of a teaching evaluation form (TEF) by examining students’ perceptions of characteristics of effective college teachers. Participants were 912 undergraduate and graduate students (10.7% of the student body) from various academic majors enrolled at a public university. A sequential mixed-methods analysis led to the development of the CARE-RESPECTED Model of Teaching Evaluation, which represented characteristics that students considered to reflect effective college teaching, comprising four meta-themes (communicator, advocate, responsible, empowering) and nine themes (responsive, enthusiast, student centered, professional, expert, connector, transmitter, ethical, and director). Three of the most prevalent themes were not represented by any of the TEF items; also, endorsement of most themes varied by student attribute (e.g., gender, age), calling into question the content- and construct-related validity of the TEF scores.

Also cited by Harris, Ingle, & Rutledge (2014), 'How Teacher Evaluation Methods Matter for Accountability: A Comparative Analysis of Teacher Effectiveness Ratings by Principals and Teacher Value-Added Measures.' Abstract: Policymakers are revolutionizing teacher evaluation by attaching greater stakes to student test scores and observation-based teacher effectiveness measures, but relatively little is known about why they often differ so much. Quantitative analysis of thirty schools suggests that teacher value-added measures and informal principal evaluations are positively, but weakly, correlated. Qualitative analysis suggests that some principals give high value-added teachers low ratings because the teachers exert too little effort and are “lone wolves” who work in isolation and contribute little to the school community. The results suggest that the method of evaluation may not only affect which specific teachers are rewarded in the short term, but also shape the qualities of teachers and teaching that students experience in the long term.
Education Based Publications

Focus Groups

Morgan, David L. (2004)

S. N. Hesse-Biber and P. Leavy (Eds.), Approaches to Qualitative Research: A Reader on Theory and Practice, pp. 263-285. New York, NY: Oxford University Press

Written by a long-time authority on focus groups, this chapter presents a brief history of focus group application up to, and including, information on the variety of current uses across many disciplines. It includes a strong section on the use of focus groups in combination with other methods, with a full compare-and-contrast discussion. Finally, it goes into the specifics of how to plan and conduct effective focus group data collection.
Education Based Publications

Quantitative and Qualitative Inquiry in Educational Research: Is There a Paradigmatic Difference Between Them?

Niglas, Katrin (1999)

Paper presented at the European Conference on Educational Research, Lahti, Finland, September 22-25

Discusses the distinctions between qualitative and quantitative methodological approaches in educational research. Compares and contrasts the characteristics and assumptions of these approaches, with the aim of dispelling the notion of paradigm ‘wars’ and improving the quality of research in education.
Education Based Publications

Unleashing Frankenstein’s Monster? The Use of Computers in Qualitative Research.

Hesse-Biber, Sharlene (2004)

In H. R. Bernard (Ed.), Handbook of Methods in Cultural Anthropology, pp. 549-593; also in S. N. Hesse-Biber and P. Leavy (Eds.), Approaches to Qualitative Research: A Reader on Theory and Practice, pp. 535-545.

The use of qualitative data analysis software has been increasing in recent years. A number of qualitative researchers have raised questions concerning the effect of such software on the research process. Fears have been expressed that the use of the computer for qualitative analysis may interfere with the relationship between the researcher and the research process itself by distancing the researcher from both the data and the respondent. Others have suggested that the use of a quantitative tool, the computer, would lead to data dredging, quantification of results, and loss of the "art" of qualitative analysis. In this study of 12 qualitative researchers, including both faculty members and graduate students, we have found that these fears are exaggerated. Users of qualitative data analysis software in most cases use the computer as an organizational, time-saving tool and take special care to maintain close relationships with both the data and the respondents. It is an open question, however, whether the amount of time and effort saved by the computer enhances research creativity. The research findings are mixed in this area. At issue is the distinction between creativity and productivity when computer methods are used.

Computer packages targeted at qualitative and mixed methods research data are readily available, and the methodology sections of research articles indicate that they are being utilised by some health researchers. The purpose of this article is to draw together concerns which have been expressed by researchers and critics and to place these within the perspective of 'framing' (MacLachlan & Reid, 1994). Here, the focus becomes the frame that these computer programs impose on qualitative data. Inevitably, all data sets are disturbed by the techniques of collection and the conceptual and theoretical frames imposed, but computer framing not only distorts physically but also imposes an often minimally acknowledged frame constructed by the metaphors and implicit ideology of the program. This frame is in opposition to most of the recent changes in qualitative data interpretation, which have emphasised context, thick description and exposure of the minimally disturbed voices of participants.
Education Based Publications

Scientific Foundations of Qualitative Research

Ragin, Charles C., Nagel, Joane, & White, Patricia (2004)

National Science Foundation Report

Report generated by an NSF workshop on qualitative research methods. The report is organized into two major sections: general guidance for developing qualitative research projects and recommendations for strengthening qualitative research. The first section is intended to serve as a primer to guide both investigators developing qualitative proposals and reviewers evaluating qualitative research projects. The second section presents workshop recommendations for designing, evaluating, supporting, and strengthening qualitative research.
Education Based Publications

Barriers to Integrating Quantitative and Qualitative Research

Bryman, Alan (2007)

Journal of Mixed Methods Research, Sage Publications

This article is concerned with the possibility that the development of mixed methods research is being hindered by the tendency that has been observed by some researchers for quantitative and qualitative findings either not to be integrated or to be integrated to only a limited extent. It examines findings from 20 interviews with U.K. social researchers, all of whom are practitioners of mixed methods research. From these interviews, a wide variety of possible barriers to integrating mixed methods findings are presented. The article goes on to suggest that more attention needs to be given to the writing of mixed methods articles.
Education Based Publications

Concordance Between Ethnographer and Folk Perspectives: Observed Performance and Self-Ascription of Sibling Caretaking Roles

Weisner, T. S., Gallimore, R., & Tharp, R. (1982)

Human Organization, 41(3): 237-244

Compares the ethnographer's observations of sibling caretaking roles with cultural members' own self-ascriptions of those roles.
Education Based Publications

Designing Qualitative Studies

Patton, Michael Quinn (2001)

Thousand Oaks: Sage Publications, In Michael Quinn Patton, Qualitative Research & Evaluation Methods, 3rd edition, pp. 209-257

Practical guide to study design with good attention to taxonomy of research approaches by purpose and sampling issues.