This document discusses techniques for analyzing content, including coding methods in which codes should be exhaustive and mutually exclusive so that they capture the key characteristics or concepts, and assessing inter-coder reliability with Cohen's kappa statistic, where values of roughly 0.5 to 0.7 are typically taken to indicate acceptable agreement between coders.
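As a minimal sketch (not taken from the document itself), Cohen's kappa can be computed directly from two coders' labels as kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e is the agreement expected by chance. The coder names and example codes below are hypothetical.

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa for two coders labeling the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement rate and p_e is the agreement expected by chance,
    derived from each coder's marginal label frequencies.
    """
    if len(codes_a) != len(codes_b) or not codes_a:
        raise ValueError("Both coders must label the same nonempty set of items")
    n = len(codes_a)

    # Observed agreement: fraction of items both coders labeled identically.
    p_o = sum(a == b for a, b in zip(codes_a, codes_b)) / n

    # Chance agreement: sum over codes of the product of each coder's
    # marginal probability of using that code.
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n)
              for c in freq_a.keys() | freq_b.keys())

    if p_e == 1.0:  # degenerate case: both coders always use the same single code
        return 1.0
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two coders assign one code per document.
coder1 = ["conflict", "conflict", "economy", "health", "economy", "conflict"]
coder2 = ["conflict", "economy", "economy", "health", "economy", "health"]
print(f"kappa = {cohens_kappa(coder1, coder2):.3f}")  # ~0.52, within the 0.5-0.7 range
```

Here the coders agree on 4 of 6 items (p_o ≈ 0.67), but because some of that agreement would occur by chance (p_e ≈ 0.31), kappa lands near 0.52 rather than 0.67, which is exactly the correction the statistic is designed to make.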