ARTICLES - Inter Rater Reliability

Interrater Reliability – Calculating Kappa

Description: Reliability is the “consistency” or “repeatability” of your measures (William M.K. Trochim, Reliability) and, from a methodological perspective, is central to demonstrating that you’ve employed a rigorous approach to your project.
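As background for the article above, Cohen's kappa compares the observed agreement between two raters against the agreement expected by chance from each rater's marginal code frequencies. A minimal sketch of that calculation (the function name and sample codes here are illustrative, not taken from the article):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical codes on the same items."""
    assert len(rater_a) == len(rater_b) and rater_a, "need paired ratings"
    n = len(rater_a)
    # Observed agreement: proportion of items both raters coded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal proportions per code.
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    p_e = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes applied by two raters to ten excerpts.
a = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
b = ["yes", "no", "no", "yes", "no", "yes", "yes", "yes", "no", "yes"]
print(round(cohens_kappa(a, b), 3))  # prints 0.583
```

Here the raters agree on 8 of 10 items (p_o = 0.8) but would agree on 52% by chance (p_e = 0.52), so kappa = 0.28 / 0.48 ≈ 0.583, a more conservative figure than raw percent agreement.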

Best Practices in Excerpting and Coding and Capitalizing on Dedoose Features

Description: Qualitative data allow us to learn about the rich, natural, complex, and contextualized ways in which our research participants experience their lives — the ‘how’ and ‘why’ of life, beyond the ‘what.’ So, simply, context is king.

Inter-Rater Reliability

Description: Inter-rater reliability (IRR) enables the researcher to work toward a more consistent code tree. Read this article to learn what is important to keep in mind when assessing IRR and to find some useful resources for further reading.
