Author manuscript; available in PMC 2012 Nov 19. Published in final edited form as: J Speech Lang Hear Res. 2012 Jan 9;55(3):892–902. doi: 10.1044/1092-4388(2011/11-0088)

Figure 2.


A) Schematic of a multilayer perceptron neural network. Each parameter of interest in the input vector has a corresponding node in the input layer. The hidden layer contains nodes whose number was varied during the experiment. The output layer represents the possible classifications of the data, which in this study were normal and disordered. B) Schematic of a learning vector quantization neural network. Each of the four codebook vectors (1–4) represents the average position of one of the four possible data classes; the regions established for these classes are outlined. In our study there were only two classes. C) Schematic of a support vector machine classifier. Support vectors lying on the periphery of the data clusters (circled) define the hyperplane by locating the maximum margin between the classes.
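
The three classifiers in the figure can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes scikit-learn's MLPClassifier (panel A) and SVC (panel C), a small hand-rolled LVQ1 update rule for panel B, and a hypothetical synthetic two-class dataset standing in for the normal/disordered feature vectors.

```python
# Minimal sketch of the three classifier types in Figure 2.
# The data and parameter choices are illustrative, not from the study.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier  # panel A: multilayer perceptron
from sklearn.svm import SVC                        # panel C: support vector machine

# Synthetic stand-in for the feature vectors (class 0 = "normal", 1 = "disordered").
X, y = make_classification(n_samples=200, n_features=6, n_informative=4,
                           n_classes=2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A) Multilayer perceptron: one hidden layer whose node count is the tunable parameter.
mlp = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
mlp.fit(X_train, y_train)

# B) Learning vector quantization (LVQ1): one codebook vector per class, pulled
# toward samples it classifies correctly and pushed away from those it does not.
def train_lvq1(X, y, n_classes=2, lr=0.05, epochs=50, seed=0):
    rng = np.random.default_rng(seed)
    codebooks = np.array([X[y == c].mean(axis=0) for c in range(n_classes)])
    labels = np.arange(n_classes)
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            j = np.argmin(np.linalg.norm(codebooks - X[i], axis=1))  # nearest codebook
            step = lr * (X[i] - codebooks[j])
            codebooks[j] += step if labels[j] == y[i] else -step
    return codebooks, labels

def predict_lvq(codebooks, labels, X):
    dists = np.linalg.norm(codebooks[None, :, :] - X[:, None, :], axis=2)
    return labels[np.argmin(dists, axis=1)]

codebooks, cb_labels = train_lvq1(X_train, y_train)

# C) Support vector machine: a linear kernel exposes the support vectors that
# define the maximum-margin hyperplane between the two classes.
svm = SVC(kernel="linear")
svm.fit(X_train, y_train)

print("MLP accuracy:", mlp.score(X_test, y_test))
print("LVQ accuracy:", (predict_lvq(codebooks, cb_labels, X_test) == y_test).mean())
print("SVM accuracy:", svm.score(X_test, y_test), "| support vectors:", len(svm.support_))
```

In this sketch the MLP's hidden_layer_sizes plays the role of the varied hidden-node count in panel A, the LVQ codebook vectors correspond to the numbered class prototypes in panel B, and svm.support_ indexes the circled samples that fix the hyperplane in panel C.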