
Table 3 Hubert and Arabie’s Rand index, Kappa and Cramer’s V statistics for men and women identified as EI-PR (ExBefore vs. ExAfter) and all men and women (Inclusion vs. InclusionNN) based on the revised-Goldberg and pTEE methods

From: The effect of different methods to identify, and scenarios used to address energy intake misestimation on dietary patterns derived by cluster analysis

Agreement between ExBefore and ExAfter

| Group | Method | n | Rand Index^a | Kappa^b | Cramer's V^c |
|---|---|---|---|---|---|
| EI-PR Men | Revised-Goldberg | 5128 | 0.34 | 0.53 | 0.57 |
| EI-PR Men | pTEE | 4751 | 0.33 | 0.52 | 0.57 |
| EI-PR Women | Revised-Goldberg | 8627 | 0.53 | 0.71 | 0.71 |
| EI-PR Women | pTEE | 8184 | 0.44 | 0.63 | 0.64 |

Agreement between Inclusion and InclusionNN

| Group | Method | n | Rand Index^a | Kappa^b | Cramer's V^c |
|---|---|---|---|---|---|
| All Men | Revised-Goldberg | 9847 | 0.34 | 0.53 | 0.57 |
| All Men | pTEE | 9847 | 0.33 | 0.52 | 0.57 |
| All Women | Revised-Goldberg | 16,241 | 0.53 | 0.71 | 0.71 |
| All Women | pTEE | 16,241 | 0.44 | 0.63 | 0.64 |

  1. EI-PR: Energy Intake Plausible Reporters; ExBefore: energy intake misreporters excluded before the cluster analysis; ExAfter: energy intake misreporters excluded after the cluster analysis; Inclusion: energy intake misreporters included in the cluster analysis; InclusionNN: energy intake misreporters excluded before the cluster analysis and then added to the ExBefore cluster solution using the nearest-neighbor method; pTEE: Predicted Total Energy Expenditure
  2. a Hubert and Arabie's adjusted Rand index is a modified version of the Rand index that measures the similarity between two cluster assignments by counting pairwise agreements and disagreements between them. It can take negative values and has an upper bound of 1; the closer a positive value is to 1, the better the agreement between cluster assignments (see the formula and computational sketch after these notes)
  3. b The Kappa statistic is a measure of interrater agreement, used in this study as a measure of agreement between cluster assignments. It generally ranges between 0 and 1, although it can be negative when the observed probability of agreement is lower than that expected by chance. Complete agreement corresponds to a Kappa of 1, so the statistic should be maximized
  4. c Cramer's V statistic measures the strength of association between cluster assignments. It varies between 0 and 1, except in the two-cluster case, where it ranges from −1 to 1. Values far from 0 (i.e., closer to −1 or 1) indicate a stronger association between cluster assignments
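
For reference, the adjusted Rand index in footnote a is conventionally written as follows (this is the standard Hubert and Arabie formulation, supplied here for context rather than taken from the article):

```latex
\mathrm{ARI} =
\frac{\sum_{ij}\binom{n_{ij}}{2}
      - \left[\sum_i \binom{a_i}{2}\sum_j \binom{b_j}{2}\right] \Big/ \binom{n}{2}}
     {\tfrac{1}{2}\left[\sum_i \binom{a_i}{2} + \sum_j \binom{b_j}{2}\right]
      - \left[\sum_i \binom{a_i}{2}\sum_j \binom{b_j}{2}\right] \Big/ \binom{n}{2}}
```

where n_ij are the cell counts of the contingency table crossing the two cluster assignments, a_i and b_j are its row and column totals, and n is the number of participants.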
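
The sketch below is a minimal illustration, not the authors' code: it computes all three table statistics for a pair of cluster assignments, using adjusted_rand_score and cohen_kappa_score from scikit-learn and a small Cramer's V helper built on scipy.stats.chi2_contingency. The vectors ex_before and ex_after are hypothetical stand-ins for the study's cluster solutions.

```python
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency
from sklearn.metrics import adjusted_rand_score, cohen_kappa_score


def cramers_v(labels_a, labels_b):
    """Cramer's V for the contingency table of two label vectors.

    Returns the usual non-negative chi-square-based form; the signed
    variant mentioned in footnote c applies only to the 2x2 case.
    """
    table = pd.crosstab(np.asarray(labels_a), np.asarray(labels_b)).to_numpy()
    chi2 = chi2_contingency(table, correction=False)[0]
    n = table.sum()
    return np.sqrt(chi2 / (n * (min(table.shape) - 1)))


# Hypothetical cluster assignments for ten participants under two scenarios.
ex_before = [0, 0, 1, 1, 2, 2, 0, 1, 2, 2]
ex_after = [0, 0, 1, 2, 2, 2, 0, 1, 2, 1]

# Hubert and Arabie's adjusted Rand index (invariant to relabeling clusters).
print(adjusted_rand_score(ex_before, ex_after))
# Kappa treats the two solutions as raters, so cluster labels must already
# be matched across solutions (cluster 0 in one = cluster 0 in the other).
print(cohen_kappa_score(ex_before, ex_after))
# Cramer's V: strength of association in the cross-tabulation.
print(cramers_v(ex_before, ex_after))
```

Unlike the adjusted Rand index, Kappa is not invariant to relabeling clusters, so the two solutions' cluster labels must be aligned before Kappa is meaningful.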