
Intercoder Reliability Techniques: Holsti Method

Content analysis is a research method widely applied across social science disciplines such as media studies, communication, psychology, marketing, and sociology. It can be used to analyze the manifest content of various documents in an objective, systematic, and quantitative way. With increasing access to online media content, content analysis is gaining popularity in mass communication research in particular.

Intercoder reliability (also referred to as intercoder agreement or interrater agreement) is an important methodological issue in content analysis. It refers to the level of agreement among two or more independent coders who use the same coding scheme to evaluate characteristics of communication messages or artifacts. Intercoder reliability serves as a critical criterion for evaluating the validity and reliability of a content analysis report because it indicates how reliable and reproducible the coding scheme is. Coding schemes with a high level of intercoder reliability have the practical benefit that multiple coders can each carry out part of the overall coding work. Dividing a large amount of coding work into smaller parts can expedite the coding process and make large content analysis projects manageable. A low level of intercoder reliability is considered an important methodological limitation in content analysis because it may indicate poor operationalization of key concepts, unreasonable categories, or insufficient coder training.

Intercoder reliability is measured by having two or more coders use the same coding scheme or categories to code the content of a set of documents (e.g., news articles, stories, online posts) and then calculating the level of agreement among the coders. A common practice is to test intercoder reliability on a representative sample of about 10% of the overall data. Coders typically receive training on the coding scheme and then code the data independently. Different indices are used to calculate intercoder reliability; those most commonly used and discussed in communication and media research include Holsti’s method, Scott’s pi (π), Cohen’s kappa (κ), and Krippendorff’s alpha (α).
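The reliability subsample described above can be drawn with simple random sampling. The sketch below is illustrative rather than part of the original entry; the `reliability_sample` helper is hypothetical, and it assumes simple random sampling, whereas some designs may call for stratified sampling instead.

```python
import random

def reliability_sample(units, fraction=0.10, seed=42):
    """Draw a simple random sample of coding units for a reliability check.

    `fraction` reflects the common practice of testing about 10% of the data;
    a fixed seed keeps the subsample reproducible across runs.
    """
    k = max(1, round(len(units) * fraction))
    rng = random.Random(seed)
    return rng.sample(units, k)

# Hypothetical corpus of 200 documents; 10% -> a 20-unit reliability sample.
articles = [f"article_{i}" for i in range(200)]
subsample = reliability_sample(articles)
print(len(subsample))  # 20
```

Both coders would then independently code every unit in `subsample` before agreement is calculated.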

This entry introduces the procedure and formula of Holsti’s method of intercoder reliability, discusses the method’s strengths and limitations, and explains how it can be applied and reported. Examples are provided to demonstrate its application.

Holsti’s Method Formula

There are two ways to measure simple agreement: Holsti’s method and percent agreement (also referred to as raw percent agreement or crude agreement). The two methods are very similar, as their mathematical computation shows: Holsti’s method yields the same value as percent agreement when the two coders code the same set of units (as Holsti’s method recommends). The formula for percent agreement is as follows:

Percent agreement = A / N.

A is the number of agreements between the two coders, and N is the number of units coded by the two coders (the maximum number of agreements that can be achieved).
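As a concrete illustration, percent agreement can be computed directly from two coders’ parallel decision lists. The function below is a minimal sketch, not part of the original entry; the category labels in the example are invented.

```python
def percent_agreement(coder1, coder2):
    """Percent agreement = A / N, where A is the number of matching
    decisions and N is the number of units coded by both coders."""
    if len(coder1) != len(coder2):
        raise ValueError("Both coders must code the same units")
    agreements = sum(a == b for a, b in zip(coder1, coder2))  # A
    return agreements / len(coder1)                           # A / N

# Two coders classify the same five units; they disagree on unit 3.
coder1 = ["pos", "neg", "pos", "neu", "pos"]
coder2 = ["pos", "neg", "neu", "neu", "pos"]
print(percent_agreement(coder1, coder2))  # 0.8
```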

Holsti proposed his formula for calculating intercoder reliability in 1969, and it differs slightly from the percent agreement formula. In this method, if two coders independently code a set of documents using the same coding scheme, the coefficient of reliability of the two coders is the proportion of agreements out of the total number of coding decisions made by both coders. The following formula can be used to calculate the intercoder reliability:

Coefficient of reliability = 2M / (N1 + N2).

M is the number of coding decisions on which the two coders agree, and N1 and N2 are the numbers of coding decisions made by the first and second coder, respectively.
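Holsti’s coefficient is commonly written as CR = 2M / (N1 + N2), where M is the number of agreements and N1 and N2 are the numbers of coding decisions made by each coder. The following Python sketch of that computation is illustrative and not part of the original entry.

```python
def holsti_cr(coder1, coder2):
    """Holsti's coefficient of reliability: CR = 2M / (N1 + N2),
    where M counts agreements and N1, N2 count each coder's decisions."""
    m = sum(a == b for a, b in zip(coder1, coder2))  # M: agreements
    return 2 * m / (len(coder1) + len(coder2))       # 2M / (N1 + N2)

# Same five units coded by both coders; one disagreement (unit 3).
c1 = ["pos", "neg", "pos", "neu", "pos"]
c2 = ["pos", "neg", "neu", "neu", "pos"]
print(holsti_cr(c1, c2))  # 0.8
```

When both coders code the identical set of units (N1 = N2 = N), the expression reduces to M / N, which is simply percent agreement.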

...
