BMC Bioinformatics. 2011 Nov 18;12:450. doi: 10.1186/1471-2105-12-450

Table 2.

Two-stage variable backward elimination procedure for Random KNN

Stage 1: Geometric Elimination
q ← proportion of features to be dropped at each iteration;
p ← number of features in data;
ni ← ⌈ln(4/p) / ln(1−q)⌉; /* number of iterations; minimum dimension 4 */
initialize rknn_list[ni]; /* stores the feature supports for each Random KNN */
initialize acc[ni]; /* stores the accuracy of each Random KNN */
for i from 1 to ni do
  if i == 1 then
    rknn ← compute supports via Random KNN from all variables of data;
  else
    p ← p(1−q);
    rknn ← compute supports via Random KNN from the p top important variables of rknn;
  end if
  rknn_list[i] ← rknn;
  acc[i] ← accuracy of rknn;
end for
max ← argmax_{1 ≤ k ≤ ni} acc[k];
pre_max ← max − 1;
rknn ← rknn_list[pre_max]; /* this Random KNN goes to Stage 2 */
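The geometric schedule of Stage 1 can be sketched in Python. The function below (`stage1_schedule` is a hypothetical name, not from the paper) only reports the feature counts visited at each iteration, with ni = ⌈ln(4/p) / ln(1−q)⌉ as above; rounding the fractional counts to integers is an implementation choice left open by the pseudocode.

```python
import math

def stage1_schedule(p, q):
    """Feature counts visited by the geometric-elimination stage.

    p -- number of features in the data
    q -- proportion of features dropped at each iteration
    Returns the feature counts used at iterations 1..ni, where
    ni = ceil(ln(4/p) / ln(1-q)) keeps the count from falling far
    below the minimum dimension of 4.
    """
    ni = math.ceil(math.log(4 / p) / math.log(1 - q))
    counts, current = [], float(p)
    for i in range(ni):
        if i > 0:
            current = current * (1 - q)   # keep the top (1-q) fraction
        counts.append(max(4, round(current)))
    return counts
```

For example, with p = 1000 and q = 0.5 this yields ni = 8 iterations, halving the feature count each time: 1000, 500, 250, 125, and so on.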
Stage 2: Linear Reduction
d ← number of features to be dropped each time;
p ← number of variables of rknn;
ni ← ⌈(p − 4) / d⌉; /* number of iterations */
for i from 1 to ni do
  if i ≠ 1 then
    p ← p − d;
  end if
  rknn ← compute supports via Random KNN from the p top important variables of rknn;
  acc[i] ← accuracy of rknn;
  rknn_list[i] ← rknn;
end for
best ← argmax_{1 ≤ k ≤ ni} acc[k];
best_rknn ← rknn_list[best]; /* this gives the final Random KNN model */
return best_rknn;