Dedoose Publications


Dedoose has been field-tested and journal-proven by leading academic institutions and market researchers worldwide. Thousands of prominent researchers across the US and abroad have benefited from early versions of Dedoose in their qualitative and mixed methods work and have laid an outstanding publication and report trail along the way.

Education Based Publications

Students' Perceptions of Characteristics of Effective College Teachers: A Validity Study of a Teaching Evaluation Form Using a Mixed Methods Analysis

Onwuegbuzie, A. J., Witcher, A. E., Collins, K. M. T., Filer, J. D., Wiedmaier, C. D., & Moore, C. W. (2007)

American Educational Research Journal, 44(1): 113-160

This study used a multistage mixed-methods analysis to assess the content-related validity (i.e., item validity, sampling validity) and construct-related validity (i.e., substantive validity, structural validity, outcome validity, generalizability) of a teaching evaluation form (TEF) by examining students' perceptions of characteristics of effective college teachers. Participants were 912 undergraduate and graduate students (10.7% of the student body) from various academic majors enrolled at a public university. A sequential mixed-methods analysis led to the development of the CARE-RESPECTED Model of Teaching Evaluation, which represented characteristics that students considered to reflect effective college teaching—comprising four meta-themes (communicator, advocate, responsible, empowering) and nine themes (responsive, enthusiast, student centered, professional, expert, connector, transmitter, ethical, and director). Three of the most prevalent themes were not represented by any of the TEF items; also, endorsement of most themes varied by student attribute (e.g., gender, age), calling into question the content- and construct-related validity of the TEF scores.

Also cited by Harris, Ingle, & Rutledge (2014), 'How Teacher Evaluation Methods Matter for Accountability: A Comparative Analysis of Teacher Effectiveness Ratings by Principals and Teacher Value-Added Measures.' Abstract: Policymakers are revolutionizing teacher evaluation by attaching greater stakes to student test scores and observation-based teacher effectiveness measures, but relatively little is known about why they often differ so much. Quantitative analysis of thirty schools suggests that teacher value-added measures and informal principal evaluations are positively, but weakly, correlated.
Qualitative analysis suggests that some principals give high value-added teachers low ratings because the teachers exert too little effort and are “lone wolves” who work in isolation and contribute little to the school community. The results suggest that the method of evaluation may not only affect which specific teachers are rewarded in the short term, but shape the qualities of teacher and teaching students experience in the long term.

Integrating Quantitative and Qualitative Research: How is it Done?

Bryman, Alan (2006)

Qualitative Research, 6(1), 97-113

Draws on a content analysis of the methods and designs of 232 articles using combined methods. Examines and discusses the rationales provided for employing mixed methods and whether they correspond to actual practice.

Mixed Methods Sampling - A Typology with Examples

Teddlie, Charles, & Yu, Fen (2007)

Journal of Mixed Methods Research, 1(1): 77-100

Discusses mixed methods sampling techniques in creative and effective ways.

Unleashing Frankenstein’s Monster? The Use of Computers in Qualitative Research.

Hesse-Biber, Sharlene (2004)

H. R. Bernard (Ed.), Handbook of Methods in Cultural Anthropology, pp. 549-593; also in S. N. Hesse-Biber and P. Leavy (Eds.), Approaches to Qualitative Research: A Reader on Theory and Practice, pp. 535-545.

The use of qualitative data analysis software has been increasing in recent years. A number of qualitative researchers have raised questions concerning the effect of such software in the research process. Fears have been expressed that the use of the computer for qualitative analysis may interfere with the relationship between the researcher and the research process itself by distancing the researcher from both the data and the respondent. Others have suggested that the use of a quantitative tool, the computer, would lead to data dredging, quantification of results, and loss of the "art" of qualitative analysis. In this study of 12 qualitative researchers, including both faculty members and graduate students, we have found that these fears are exaggerated. Users of qualitative data analysis software in most cases use the computer as an organizational, time-saving tool and take special care to maintain close relationships with both the data and the respondents. It is an open question, however, whether or not the amount of time and effort saved by the computer enhance research creativity. The research findings are mixed in this area. At issue is the distinction between creativity and productivity when computer methods are used.

Computer packages targeted at qualitative and mixed methods research data are readily available and the methodology sections of research articles indicate that they are being utilised by some health researchers. The purpose of this article is to draw together concerns which have been expressed by researchers and critics and to place these within the perspective of 'framing' (MacLachlan & Reid, 1994). Here, the focus becomes the frame that these computer programs impose on qualitative data. 
Inevitably, all data sets are disturbed by the techniques of collection and the conceptual and theoretical frames imposed, but computer framing not only distorts physically but also imposes an often minimally acknowledged frame constructed by the metaphors and implicit ideology of the program. This frame is in opposition to most of the recent changes in qualitative data interpretation, which have emphasized context, thick description and exposure of the minimally disturbed voices of participants.

A Framework for the Study

Creswell, John W. (1994)

J. W. Creswell, Research Design: Qualitative and Quantitative Approaches, pp. 1-19.

How do you decide whether to use a qualitative or a quantitative approach for the design of a research study? How do you write up the results of a study for a scholarly journal article or dissertation? This book addresses these issues by providing a guide to major design decisions, such as deciding a paradigm, stating a purpose for the study, identifying the research questions and hypotheses, using theory, and defining and stating the significance of the study. Research Design is aimed at upper division to graduate level research methods courses that are taught to prepare students to plan and write up independent research studies. In the past two decades, research approaches have multiplied to a point at which investigators or inquirers have many choices. For those designing a proposal or plan, I recommend that a general framework be adopted to provide guidance about all facets of the study, from assessing the general philosophical ideas behind the inquiry to the detailed data collection and analysis procedures. Using an extant framework also allows researchers to lodge their plans in ideas well grounded in the literature and recognized by audiences (e.g., faculty committees) that read and support proposals for research. What frameworks exist for designing a proposal? Although different types and terms abound in the literature, I will focus on three: quantitative, qualitative, and mixed methods approaches. The first has been available to the social and human scientist for years, the second has emerged primarily during the last three or four decades, and the last is new and still developing in form and substance. This chapter introduces the reader to the three approaches to research. 
I suggest that to understand them, the proposal developer needs to consider three framework elements: philosophical assumptions about what constitutes knowledge claims; general procedures of research, called strategies of inquiry; and detailed procedures of data collection, analysis, and writing, called methods. Qualitative, quantitative, and mixed methods approaches frame each of these elements differently, and these differences are identified and discussed in this chapter. Then typical scenarios that combine the three elements are advanced, followed by the reasons why one would choose one approach over another in designing a study. This discussion will not be a philosophical treatise on the nature of knowledge, but it will provide a practical grounding in some of the philosophical ideas behind research.

Focus Groups

Morgan, David L. (2004)

S. N. Hesse-Biber and P. Leavy (Eds.), Approaches to Qualitative Research: A Reader on Theory and Practice, pp. 263-285. New York, NY: Oxford University Press

Written by a long-time authority on focus groups, presents a brief history of focus group application up to, and including, information on the variety of current uses across many disciplines. Great section on the uses of focus groups in combination with other methods with a full compare/contrast discussion. Finally, goes into the specifics on 'how to' plan and conduct effective group data collection. My own preference (Morgan, 1996) is for a more inclusive approach that broadly defines focus groups as a research technique that collects data through group interaction on a topic determined by the researcher. In essence, it is the researcher's interest that provides the focus, whereas the data themselves come from the group interaction. One reason for favoring an inclusive approach is that the exclusive approaches do not really exclude very much. Other than focus groups, the primary categories of group interviews in the existing typologies are things that are manifestly different from focus groups. On the one hand, there are nominal groups and Delphi groups (Stewart & Shamdasani, 1990), which do not involve actual group interaction. On the other hand, there is the observation of naturally occurring groups, which typically do not involve the researcher in determining the topic of discussion. Thus, little is gained by excluding these categories of data collection because they already fall outside the broad definition of focus groups offered here. Among the more specific criteria that could be used to distinguish focus groups from other types of group interviews, both Frey and Fontana (1989) and Khan and Manderson (1992) assert that focus groups are more formal. In particular, they argue that focus groups are likely to involve inviting participants to the discussion and they also stress the distinctive role of the moderator. 
Although there is no doubt that group interviews vary along a continuum from more formally structured interaction to more informal gatherings, I do not believe it is possible to draw a line between formal and informal group interviews in a way that defines some as focus groups and others as something else. Instead, I find it more useful to think that the degree of formal structure in a focus group is a decision that the researcher makes according to the specific purposes of the research project. In particular, the use of either a more formal or a less formal approach will depend on the researcher's goals, the nature of the research setting, and the likely reaction of the participants to the research topic. Among the other criteria that have been offered as distinguishing features of focus groups are their size and the use of specialized facilities for the interview (McQuarrie, 1996). Again, however, these supposedly exclusive criteria are mostly a matter of degree. Who is to say when a group is too large or too small to be called a focus group or when a setting is too casual to qualify? Rather than generate pointless debates about what is or is not a focus group, I prefer to treat focus groups as a "broad umbrella" or "big tent" that can include many different variations. Of course, this approach requires researchers to make choices about doing focus groups one way rather than another. Fortunately, this need to make explicit decisions about data collection strategies is a familiar concern to social scientists, and it comes under the heading of "research design." As social scientists have gained increasing experience with focus groups, we also have produced insights into the situations in which different research designs are either more or less likely to be effective (e.g., Krueger, 1993; Morgan, 1992a, 1995).

HIV/STD Stigmatization Fears as Health-Seeking Barriers in China

Lieber, E., et al. (2006)

Internationally, stigma prohibits effective HIV/STD identification, prevention, and care. Interviews were conducted with 106 persons in an urban center in Eastern China, some known to have engaged in stigmatized risk acts (sex workers, STD clinic patients) and some vulnerable to stigmatization fears influencing their health-seeking behaviors (market employees, rural-to-urban migrants). Interviews focused on community norms, values, beliefs, and emotional and behavioral reactions to HIV/STD stigmatization related events. Attributions for infection were found to mark an individual's failure to adhere to sexuality norms; define a condition warranting the avoidance of infected persons and dismissal by medical professionals; and promote anticipation of negative emotions (i.e., shame, fear, and embarrassment) and devalued social roles and status.

Measures of Interobserver Agreement: Calculation Formulas and Distribution Effects

House, Alvin E., House, Betty J., & Campbell, Martha B. (1981)

Journal of Behavioral Assessment, 3(1): 37-57

Discusses issues, types, and calculations for inter-rater reliability. Seventeen measures of association for observer reliability (interobserver agreement) are reviewed and computational formulas are given in a common notational system. An empirical comparison of 10 of these measures is made over a range of potential reliability check results.
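The flavor of these agreement measures can be illustrated with Cohen's kappa, one of the most widely used chance-corrected statistics for two raters. This is a minimal sketch for illustration only; the function name and example labels are hypothetical and not drawn from the article, which covers seventeen measures in a common notation.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of agreement
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal label frequencies
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    pe = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (po - pe) / (1 - pe)

# Hypothetical example: two coders assign codes "x"/"y" to 10 excerpts
a = ["x", "x", "y", "y", "x", "y", "x", "x", "y", "x"]
b = ["x", "x", "y", "x", "x", "y", "x", "y", "y", "x"]
kappa = cohens_kappa(a, b)  # observed agreement 0.8, corrected for chance
```

Because kappa subtracts the agreement expected from the raters' marginal code frequencies, its value depends on how codes are distributed — the kind of distribution effect the article examines across measures.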

Mixed Methods Research: A Research Paradigm Whose Time has Come

Johnson, R. B., & Onwuegbuzie, A. J. (2004)

Educational Researcher, 33(7): 14-26

Positions mixed methods research as a natural complement to traditional qualitative and quantitative research, presents pragmatism as an attractive philosophical foundation for mixed methods research, and provides a framework for designing and conducting mixed methods research. In doing this, we briefly review the paradigm "wars" and incompatibility thesis, we show some commonalities between quantitative and qualitative research, we explain the tenets of pragmatism, we explain the fundamental principle of mixed research and how to apply it, we provide specific sets of designs for the two major types of mixed methods research (mixed-model designs and mixed-method designs), and, finally, we explain mixed methods research as following (recursively) an eight-step process.

Pragmatism and the Choice of Research Strategy

Tashakkori, Abbas & Teddlie, Charles (1998)

A. Tashakkori & C. Teddlie, Mixed Methodology: Combining Qualitative and Quantitative Approaches, pp. 3-19.

Introduces and traces the history of the methodological paradigm wars and brings readers up to the state of affairs (as of 1998). Discusses the 'warring' positions and the evolution of thinking regarding pragmatism and the development of mixed methods approaches to social science research.