Dedoose Blog



Inter-Rater Reliability

Inter-rater reliability (IRR) helps the researcher work toward a more consistent code tree. Read this article to learn what is important to keep in mind when assessing IRR and to find useful resources for further reading.

Best Practices in Excerpting and Coding and Capitalizing on Dedoose Features

Excerpting and coding your qualitative data to maximize their value isn't always as straightforward as it seems, particularly when you are integrating with quantitative data and carrying out qualitative and mixed methods analysis. At Dedoose we see wide variation in these practices, so this post offers guidance based on what we know from our academic research and teaching experience. We hope it proves helpful to a wide swath of the Dedoose user community and makes crystal clear the value of context for your data analysis and of capitalizing on Dedoose analytic features.

Inter-Rater Reliability – Calculating Kappa

Reliability is the "consistency" or "repeatability" of your measures (William M.K. Trochim, Reliability) and, from a methodological perspective, is central to demonstrating that you’ve employed a rigorous approach to your project. There are a number of approaches to assess inter-rater reliability—see the Dedoose user guide for strategies...
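For two raters applying nominal codes, Cohen's kappa is a common choice: it compares observed agreement p_o against the agreement p_e expected by chance from each rater's label frequencies, via kappa = (p_o - p_e) / (1 - p_e). As a rough illustration (this is a generic sketch, not Dedoose's own implementation, and the function name and example labels are hypothetical):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning one nominal code per item."""
    n = len(rater_a)
    # Observed agreement: fraction of items where the raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal label proportions,
    # summed over all labels.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(counts_a[label] * counts_b[label] for label in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters code six excerpts as "yes" or "no".
a = ["yes", "yes", "no", "yes", "no", "no"]
b = ["yes", "no", "no", "yes", "no", "yes"]
print(round(cohens_kappa(a, b), 3))  # prints 0.333
```

Values near 1 indicate strong agreement beyond chance, 0 indicates chance-level agreement, and negative values indicate systematic disagreement; thresholds for "acceptable" kappa vary by field, so consult the methodological literature cited above.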