
Coding comparison and kappa coefficient



#1 Knukes

    Casual Member

  • Members
  • 3 posts

Posted 10 January 2017 - 01:31 PM

Hi,

I'm comparing inter-rater reliability in the coding of a transcript between me and my colleague. I ran a coding comparison query at all nodes, based on sentence calculations. However, the kappa coefficient is negative for almost all nodes, even ones we have coded in exactly the same way. I can see manually that we have each coded the source at a particular node in the same way, yet, for example, the percentage agreement is 96% and the disagreement is 4%, giving a kappa of -0.02. Furthermore, when I check the "show coding comparison content" box, none of the coded content is green, even though we have coded at that node with 100% agreement and 0% disagreement.

How do I make sense of this comparison?

Thanks.



#2 QSR Support

    Advanced Member

  • QSR Staff
  • 2,081 posts

Posted 11 January 2017 - 02:17 PM

Hello Knukes,

A coding comparison takes into account not only the content that was coded by both users, but also the content that was coded by only one user or by neither user.

For example, if the source is a document with 1000 characters, where:

  • 50 of these characters have been coded by both users
  • 150 of these characters have been coded by only one of these users, and
  • the remaining 800 characters have not been coded by either user

then the percentage agreement is calculated as (800 + 50) ÷ 1000 = 85%.
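
To make that arithmetic concrete, here is a minimal sketch in Python of the percentage agreement calculation, using the character counts from the example above (the variable names are just for illustration):

    # Percentage agreement over the 1000 characters of the example:
    # characters coded by both users and characters coded by neither
    # user both count as agreement.
    both = 50        # coded by both users
    one_user = 150   # coded by only one of the users
    neither = 800    # coded by neither user

    total = both + one_user + neither       # 1000 characters
    agreement = (both + neither) / total    # (800 + 50) / 1000
    print(f"{agreement:.0%}")               # prints 85%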

 

So, although you can see that there is content which both users have coded in the same way, there will also be content that has been coded by neither user or by only one user. This changes the percentage agreement, as demonstrated in the example above.

With regard to the coding comparison content, please check the values in the table, especially the "A and B (%)" column. The coded content is shown in green only if there is a positive value in this column. So although you may have 96% in the Agreement column, the "A and B (%)" value could still be zero.

The values in these columns are calculated as follows:

  • Agreement = (A and B) + (Not A and Not B)

  • A and B = the percentage of data item content coded to the selected node by both Project User Group A and Project User Group B

  • Not A and Not B = the percentage of data item content coded by neither Project User Group A nor Project User Group B
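
As for the negative kappa itself: the kappa here is Cohen's kappa, which compares the observed agreement with the agreement expected by chance given how much of the source each user coded. When both users code very little at a node, the chance-expected agreement is itself very high, so even 96% observed agreement can fall slightly below it and produce a small negative kappa. Below is a minimal sketch, assuming the standard Cohen's kappa formula over character counts; the particular split of characters is hypothetical, chosen only to reproduce the figures in your post.

    # Cohen's kappa from a 2x2 split of a source's characters.
    def cohens_kappa(both, a_only, b_only, neither):
        total = both + a_only + b_only + neither
        observed = (both + neither) / total   # percentage agreement
        p_a = (both + a_only) / total         # fraction coded by user A
        p_b = (both + b_only) / total         # fraction coded by user B
        # agreement expected by chance: both code a character, or neither does
        expected = p_a * p_b + (1 - p_a) * (1 - p_b)
        return (observed - expected) / (1 - expected)

    # Hypothetical split: 96% agreement, but no characters coded by both users.
    print(round(cohens_kappa(both=0, a_only=20, b_only=20, neither=960), 2))
    # prints -0.02

Note that in this split both = 0, so the "A and B (%)" column would also be zero, which is why none of the content displays in green despite the high agreement figure.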

 

Please refer to the following link for detailed information: http://help-nv10.qsr...iniTOCBookMark9

Kind Regards,

Sameer S


QSR Support

QSR International Pty Ltd
2nd Floor, 651 Doncaster Road | Doncaster Victoria 3108 Australia

Find answers to your support questions or raise new support requests online at:

http://www.qsrintern...om/support.aspx




