Table 6.

| | Full data | | Listwise deletion | | Log weights | | CART weights | | Prune weights | | RF weights | | MI | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | 30% | 50% | 30% | 50% | 30% | 50% | 30% | 50% | 30% | 50% | 30% | 50% | 30% | 50% |
| N = 100 | | | | | | | | | | | | | | |
| Linear | .045 | .055 | .060 | .070 | .055 | .095 | .040 | .070 | .050 | .080 | .050 | .075 | .040 | .115 |
| One split | .035 | .050 | .035 | .050 | .055 | .085 | .035 | .060 | .070 | .075 | .075 | .095 | .060 | .120 |
| Two splits | .095 | .045 | .045 | .040 | .075 | .100 | .060 | .055 | .090 | .140 | .085 | .140 | .110 | .135 |
| Three splits | .095 | .045 | .095 | .055 | .095 | .095 | .095 | .050 | .075 | .105 | .075 | .115 | .095 | .170 |
| N = 250 | | | | | | | | | | | | | | |
| Linear | .045 | .035 | .025 | .050 | .040 | .050 | .045 | .060 | .050 | .050 | .050 | .075 | .040 | .075 |
| One split | .055 | .075 | .055 | .065 | .100 | .085 | .080 | .085 | .095 | .100 | .090 | .105 | .110 | .105 |
| Two splits | .060 | .060 | .040 | .035 | .055 | .065 | .050 | .060 | .080 | .065 | .100 | .075 | .095 | .125 |
| Three splits | .075 | .035 | .085 | .075 | .075 | .085 | .060 | .065 | .110 | .085 | .095 | .105 | .085 | .105 |
| N = 500 | | | | | | | | | | | | | | |
| Linear | .025 | .055 | .045 | .040 | .050 | .050 | .055 | .055 | .055 | .045 | .055 | .055 | .055 | .055 |
| One split | .050 | .075 | .030 | .040 | .065 | .065 | .060 | .045 | .050 | .065 | .050 | .050 | .070 | .065 |
| Two splits | .070 | .055 | .090 | .050 | .085 | .045 | .085 | .065 | .085 | .050 | .095 | .055 | .095 | .055 |
| Three splits | .035 | .060 | .060 | .060 | .070 | .055 | .060 | .090 | .055 | .070 | .065 | .065 | .055 | .085 |
Note. Log = logistic regression; CART = classification and regression trees; Prune = pruned CART analysis; RF = random forests; MI = multiple imputation.