Dedoose Publications

PUBLICATIONS

Dedoose has been field-tested and journal-proven by leading academic institutions and market researchers worldwide. Thousands of prominent researchers across the US and abroad have benefited from early versions of Dedoose in their qualitative and mixed methods work and have laid an outstanding publication and report trail along the way.

Education Based Publications

Research Design Issues for Mixed Method and Mixed Model Studies

Tashakkori, Abbas & Teddlie, Charles (1998)

A. Tashakkori & C. Teddlie, Mixed Methodology: Combining Qualitative and Quantitative Approaches, pp. 40-58

Discusses the concept of triangulation from various perspectives and the variety of approaches to implementing mixed methods research. Extends Patton’s (1990) discussion of ‘mixed form’ designs into a broader model and develops a taxonomy for distinguishing various mixed method designs and approaches.
Education Based Publications

Introduction to Mixed Method and Mixed Model Studies in the Social and Behavioral Sciences: Paradigm Wars and Mixed Methodologies.

Tashakkori, Abbas & Teddlie, Charles (1998)

A. Tashakkori & C. Teddlie, Mixed Methodology: Combining Qualitative and Quantitative Approaches, pp. 20-39.

Encourages a focus on the research question as a guide to deciding on the methods to apply in a particular study, and comfort in crossing the boundaries between pure interpretations of particular paradigmatic characteristics. Describes the nature and limitations of various mono-methods. Suggests that incorporating a pragmatic approach with a variety of appropriate methods helps gain a broader and more comprehensive perspective on the research question.
Education Based Publications

Integrating Quantitative and Qualitative Research: How is it Done?

Bryman, Alan (2006)

Qualitative Research, 6(1), 97-113

This article seeks to move beyond typologies of the ways in which quantitative and qualitative research are integrated to an examination of the ways they are combined in practice. Draws on a content analysis of the methods and designs of 232 articles using combined methods. Examines and discusses the rationales provided for employing mixed methods and whether they correspond to actual practice.
Education Based Publications

Mixed Methods Sampling - A Typology with Examples

Teddlie, Charles, & Yu, Fen (2007)

Journal of Mixed Methods Research, 1(1): 77-100

Presents a typology of mixed methods sampling techniques, with examples of how they can be used creatively and effectively.
Education Based Publications

Measures of Interobserver Agreement: Calculation Formulas and Distribution Effects

House, Alvin E., House, Betty J., & Campbell, Martha B. (1981)

Journal of Behavioral Assessment, 3(1): 37-57

Discusses issues, types, and calculations for inter-rater reliability. Seventeen measures of association for observer reliability (interobserver agreement) are reviewed and computational formulas are given in a common notational system. An empirical comparison of 10 of these measures is made over a range of potential reliability check results.
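As a rough illustration of two of the simplest agreement measures in this family (not House et al.'s own notation), percent agreement and Cohen's kappa can be computed from two observers' trial-by-trial records along the following lines; the function names and example data are hypothetical:

```python
from collections import Counter

def percent_agreement(coder_a, coder_b):
    """Proportion of trials on which the two observers record the same code."""
    assert len(coder_a) == len(coder_b)
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    """Agreement corrected for chance: (p_o - p_e) / (1 - p_e)."""
    n = len(coder_a)
    p_o = percent_agreement(coder_a, coder_b)
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    # Expected chance agreement from each observer's marginal code frequencies.
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(coder_a) | set(coder_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical observation records: 1 = behavior present, 0 = absent.
obs_a = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
obs_b = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]
print(percent_agreement(obs_a, obs_b))  # 0.8
print(cohens_kappa(obs_a, obs_b))       # ~0.58
```

The distribution effects the article examines arise precisely because measures like these respond differently to how often each code occurs, which is why the authors compare them across a range of reliability check results.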
Education Based Publications

Managing Data in CAQDAS

Fielding, Nigel & Lee, Ray M. (1998)

Chapter 4 in Fielding & Lee, Computer Analysis and Qualitative Research, pp. 86-118

From “Computer Assisted Qualitative Data Analysis Software: A Practical Perspective for Applied Research,” by Joseph B. Baugh, Anne Saber Hallcom, and Marilyn E. Harris:

Computer assisted qualitative data analysis software (CAQDAS) holds a chequered reputation to date in academia, but can be useful to develop performance metrics in the field of corporate social and environmental responsibility and other areas of contemporary business. Proponents of CAQDAS cite its ability to save time and effort in data management by extending the ability of the researcher to organize, track, and manage data. Opponents decry the lack of rigor and robustness in the resultant analyses. Research reveals that these opinions tend to be divided by “the personal biography and the philosophical stance of the analyst” (Catterall & Maclaran, 1998, p. 207), as well as “age, computer literacy, and experience as a qualitative researcher” (Mangabeira, Lee & Fielding, 2004, p. 170). A more recent article (Atherton & Elsmore, 2007) discussed the continuing debate on CAQDAS in qualitative research: “The two perspectives both indicate that CAQDAS should be used with care and consideration; in ways that explicitly demonstrate a ‘fit’ between the ethos and philosophical perspective(s) underpinning a research study, on the one hand, and the means of ordering and manipulating the data within CAQDAS on the other” (p. 75).

Despite the ongoing literary debate on the merits of CAQDAS, the use of computer-aided qualitative data analysis has become acceptable to most qualitative researchers (Lee & Esterhuizen, 2000; Morison & Moir, 1998; Robson, 2002). However, writers advise that researchers avoid the trap of letting the software control the data analysis (Catterall & Maclaran, 1998). Morison and Moir counseled that CAQDAS is merely one tool in the qualitative data analysis toolbox; no tool should replace the researcher's capacity to think through the data and develop his or her emergent conclusions (Atherton & Elsmore, 2007). On the other hand, Morison and Moir, among others (e.g., Blank, 2004; Catterall & Maclaran, 1998; Mangabeira et al., 2004), found that qualitative data analysis software can also free up significant amounts of time formerly spent on data management and encoding, allowing the researcher to spend more time in deeper and richer data evaluation.

Qualitative research studies to develop performance metrics can create huge amounts of raw data (Miles & Huberman, 1994; Robson, 2002). Organizing, tracking, encoding, and managing the data are not trivial tasks, and the effort should not be underestimated by the applied researcher. Two methodologies exist to handle these activities and manage the data during the data analysis phase. The first is a manual process, which must be done at times to avoid missing critical evidence and to provide trustworthiness in the process (Malterud, 2001); the second uses technology to manage the data and avoid being overwhelmed by the sheer amount of raw data (Lee & Esterhuizen, 2000). It is the experience of the authors that some manual processing must be interspersed with CAQDAS; this provides an intimacy with the data which leads to the drawing of credible and defensible conclusions. Thus, a mixed approach that melds manual and automated data analyses seems most appropriate.
A basic approach for applying traditional qualitative research methodologies lies in the ability of CAQDAS to support data reduction through the use of a “provisional start list” (Miles & Huberman, 1994, p. 58) of data codes that are often developed manually from the research question. A rise in the use of CAQDAS for applied research and other nonacademic research fields has been identified (Fielding & Lee, 2002). Since CAQDAS is becoming more prevalent in nonacademic researcher populations and can be useful for developing performance metrics for corporate social and environmental responsibility and solving other complex business issues, it seems prudent at this juncture to discuss how to use the software appropriately rather than rehash the argument for or against using CAQDAS. Selection of and training with an appropriate CAQDAS package can help the researcher manage the mountains of data derived from qualitative research data collection methods (Lee & Esterhuizen, 2000).
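As a minimal sketch of what a “provisional start list” of codes might look like in practice (the research question, code names, and helper function below are invented for illustration, not taken from Miles & Huberman or from any particular CAQDAS package):

```python
# A provisional start list drafted from a hypothetical research question about
# corporate sustainability practices; codes are expected to be revised as analysis proceeds.
start_list = {
    "drivers": ["regulatory pressure", "customer expectations", "cost savings"],
    "barriers": ["budget constraints", "lack of expertise"],
    "outcomes": ["reported metrics", "behavior change"],
}

def tag_segment(segment_text, parent, child):
    """Attach a start-list code to a text segment for later retrieval and data reduction."""
    if child not in start_list.get(parent, []):
        raise ValueError(f"{parent}/{child} is not on the provisional start list")
    return {"text": segment_text, "code": f"{parent}/{child}"}

print(tag_segment("We report energy use to the board quarterly...", "outcomes", "reported metrics"))
```

The point of such a list is data reduction: early coding against a small, question-driven scheme keeps the volume of excerpts manageable while leaving room for new codes to emerge.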
Education Based Publications

Qualitative Interviewing

Patton, Michael Quinn (1980)

In Michael Quinn Patton, Qualitative Evaluation Methods, pp. 195-263. Thousand Oaks: Sage Publications.

We interview people to find out from them those things we cannot directly observe. The issue is not whether observational data are more desirable, valid, or meaningful than self-report data. The fact is that we cannot observe everything. We cannot observe feelings, thoughts, intentions, behaviors that took place at some previous point in time, situations that preclude the presence of an observer, or how people have organized the world and the meanings they attach to what goes on in the world. We have to ask people questions about those things. Thus, the purpose of interviewing is to allow us to enter into the other person's perspective. Qualitative interviewing begins with the assumption that the perspective of others is meaningful, knowable, and able to be made explicit. We interview to find out what is in and on someone else's mind, to gather their stories.

Program evaluation interviews, for example, aim to capture the perspectives of program participants, staff, and others associated with the program. What does the program look and feel like to the people involved? What are their experiences? What thoughts do people knowledgeable about the program have concerning the program? What are their expectations? What changes do participants perceive in themselves as a result of their involvement in the program? It is the responsibility of the evaluator to provide a framework within which people can respond comfortably, accurately, and honestly to these kinds of questions. Evaluations can enhance the use of qualitative data by generating relevant and high quality findings. As Hermann Sudermann said in Es Lebe das Leben I, ‘I know how to listen when clever men are talking. That is the secret of what you call my influence.’ Evaluators must learn how to listen when knowledgeable people are talking. That may be the secret of their influence.

An evaluator or qualitative or mixed method research interviewer faces the challenge of making it possible for the person being interviewed to bring the interviewer into his or her world. The quality of the information obtained during an interview is largely dependent on the interviewer. This chapter discusses ways of obtaining high-quality information by talking with people who have that information. We’ll be delving into the ‘art of hearing’ (Rubin and Rubin 1995). This chapter presents three different types of interviews. Later sections consider the content of interviews: what questions to ask and how to phrase questions. The chapter ends with a discussion of how to record the responses obtained during interviews. This chapter emphasizes skill and technique as ways of enhancing the quality of interview data, but no less important is a genuine interest in and caring about the perspectives of other people. If what people have to say about the world is generally boring to you, then you will never be a great interviewer. On the other hand, a deep and genuine interest in learning about people is insufficient without disciplined and rigorous inquiry based on skill and technique.
Geography Based Publications

Health geography II ‘Dividing’ health geography

Rosenberg, Mark (2015)

Over the years, various observers of health geography have sought to ‘divide’ the sub-discipline mainly along theoretical lines or to argue for a broadening of its theoretical base. Paralleling the growing theoretical pluralism within health geography has been a growing methodological pluralism. As in other parts of human geography, health geographers have embraced historical research, quantitative and qualitative methods, and computer mapping and geographic information science (GIS). Analysing recent contributions by health geographers, the question I seek to answer is whether the growing theoretical and methodological pluralism has paradoxically led to increasing divisions in the topics of study based mainly, but not solely, on what methods are employed in the research. While there are topical overlaps (e.g. quantitative and qualitative studies of particular vulnerable groups), it is less obvious as to how research using one methodology is informing research using the other methodology.
Education Based Publications

Intercoder Reliability for Validating Conclusions Drawn from Open-Ended Interview Data

Kurasaki, Karen S. (2000)

Field Methods, 12(3): 179-194

Intercoder reliability is a measure of agreement among multiple coders for how they apply codes to text data. Intercoder reliability can be used as a proxy for the validity of constructs that emerge from the data. Popular methods for establishing intercoder reliability involve presenting predetermined text segments to coders. Using this approach, researchers run the risk of altering meanings by lifting text from its original context, or making interpretations about the length of codable text. This article describes a set of procedures that was used to develop and assess intercoder reliability with free-flowing text data, in which the coders themselves determined the length of codable text segments. Discusses procedures for developing and assessing intercoder reliability with free-flowing text.
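As a hedged sketch of the underlying idea (not Kurasaki's published procedure), agreement over free-flowing text, where coders themselves decide where codable segments begin and end, can be approximated by comparing the character spans each coder marks for a given code and measuring how much they overlap; the span data and code name below are invented for illustration:

```python
def spans_to_charset(spans):
    """Expand (start, end) character spans into the set of covered character positions."""
    covered = set()
    for start, end in spans:
        covered.update(range(start, end))
    return covered

def overlap_agreement(spans_a, spans_b):
    """Jaccard-style overlap between the text two coders marked with the same code."""
    a, b = spans_to_charset(spans_a), spans_to_charset(spans_b)
    if not a and not b:
        return 1.0  # neither coder applied the code, so they trivially agree
    return len(a & b) / len(a | b)

# Hypothetical character spans marked by two coders for the code "social support".
coder_a = [(120, 215), (400, 460)]
coder_b = [(118, 210), (405, 470)]
print(round(overlap_agreement(coder_a, coder_b), 2))
```

A segment-overlap view like this avoids presenting coders with predetermined text segments, which is the risk the article identifies with more conventional intercoder reliability procedures.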
Geography Based Publications

Community through the eyes of children: blending child-centered research and qualitative geovisualization

Jung, Jin-Kyu (2014)

Community is an ambiguous concept, and the meanings of community as a subject of study have received a great deal of attention across various disciplines. This paper discusses how children's diverse meanings of community shape and are shaped by the social, cultural, and physical environments of their everyday lives. To explore these meanings I combine principles of child-centered research and qualitative geovisualization into a research methodology. I demonstrate that this integration displays the transformative nature of qualitative analysis and visualization to support interpretive analysis of various forms of qualitative and spatial data together, and offers us a hybrid methodological framework for gaining insights into the diverse meanings of community held by the children. The main case study is drawn from a multi-year research collaboration called the Children's Urban Geography (ChUG), in which I participated along with children who lived in a relatively poor but emerging multi-cultural Hispanic neighborhood in Buffalo, NY.