BMC Med Res Methodol. 2014 Oct 2;14:113. doi: 10.1186/1471-2288-14-113

Table 5. Overall summary of three clustering techniques

|   | TwoStep | Latent Gold | SNOB |
|---|---------|-------------|------|
| Method | Distance-based, agglomerative hierarchical cluster analysis | Finite mixture modeling to probabilistically identify latent classes | Finite mixture modeling to probabilistically identify latent classes |
| Stopping rule to identify number of subgroups | Automated, using either the ‘Bayesian information criterion’ or ‘Akaike’s information criterion’ | Analyst’s choice among various criteria, including the ‘Bayesian information criterion’, unexplained variance, and chi-square p-value | Automated, using the ‘minimum message length’ principle |
| Suitable data types | Ordinal data must be recoded as dichotomous or treated as if interval | All types | All types |
| Reports classification probability of individuals | No | Yes | Yes |
| Sensitivity to subgroups | Least | Intermediate | Most |
| Reproducibility | Very high | Very high | Very high |
| Accuracy | Very high | Very high | Very high |
| Cost | Most expensive | Less expensive | Free |
| Support | Extensive documentation; fee-based support available | Extensive documentation and some free support available | Some documentation but minimal support |
| Interpretability of results | Presented numerically and graphically (charts of the certainty of the subgroup structure, bar and pie charts of cluster frequencies, and charts showing the importance of specific variables to subgroups) | Presented numerically and graphically (including a tri-plot displaying the relationships between subgroups) | Mostly numeric (although a tree diagram shows the relationship between ‘mother’ and ‘daughter’ subgroups) |
| Learning curve (subjective judgement) | Easy | Moderate | Hard |
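
For illustration only, the sketch below shows the general workflow summarized in the table: fitting finite mixture models, selecting the number of subgroups with the Bayesian information criterion, and reporting per-individual classification probabilities. It is not taken from the study and does not use TwoStep, Latent Gold, or SNOB; it uses scikit-learn’s GaussianMixture on simulated continuous data, whereas the packages above also handle mixed data types. All variable names and parameter values are hypothetical.

```python
# Minimal sketch (assumed setup, not the paper's analysis): finite mixture
# modeling with a BIC stopping rule and per-individual subgroup probabilities.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Simulated data: two latent subgroups, three continuous variables (hypothetical).
X = np.vstack([
    rng.normal(loc=0.0, scale=1.0, size=(100, 3)),
    rng.normal(loc=3.0, scale=1.0, size=(100, 3)),
])

# Fit mixtures with 1-6 components and keep the model with the lowest BIC,
# mirroring the 'Bayesian information criterion' stopping rule in Table 5.
models = [GaussianMixture(n_components=k, random_state=0).fit(X) for k in range(1, 7)]
best = min(models, key=lambda m: m.bic(X))
print("Number of subgroups selected by BIC:", best.n_components)

# Posterior classification probabilities for each individual
# (cf. the 'Reports classification probability of individuals' row).
probs = best.predict_proba(X)
print("Subgroup probabilities, first 3 individuals:\n", probs[:3].round(3))
```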