2018 Jul 13;19:90. doi: 10.1186/s13059-018-1462-9

Table 1. Explanation of technical terms

Term | Description | Reference(s)
Beam search | A heuristic search algorithm. In Chiron, the beam search decoder with beam width W maintains a list of the W most probable sequences up to position i and computes the probabilities of all possible sequence extensions at position i + 1. | [32]
Connectionist Temporal Classification (CTC) decoder | A type of neural network output layer and scoring function for labeling sequence data with RNNs. It requires neither pre-segmented training data nor post-processed outputs. | [56]
Convolutional Neural Network (CNN) | A type of neural network often used for image analysis. It can recognize patterns by applying different filters to an image. | [57]
Forward algorithm | An algorithm that computes the probability P(x) of a sequence x given a certain HMM. | [58]
Hidden Markov Model (HMM) | A stochastic model of a sequence of unobserved events underlying a sequence of observations. HMMs assume that each event depends only on the previous event. | [58, 59]
Long short-term memory (LSTM) unit | A type of RNN unit that can serve as a building block in larger networks. It has dedicated input, output, and forget gates that allow it to retain or discard information passed on from a previous state. | [60, 61]
Partial Order Alignment (POA) graph | A graph representation of a multiple alignment in which each base may have multiple predecessors. Different paths through the graph represent different alignments. | [62]
Recurrent Neural Network (RNN) | A type of neural network that takes into account information passed on from previous states. | [63]
Training data | A dataset used to optimize (i.e., train) the parameters of a model. Training is required for both HMMs and RNNs, so the training dataset strongly influences the performance of the model. | [58, 63]
Viterbi decoding | An algorithm that finds the most likely sequence of events given a certain HMM. | [58]
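To illustrate the beam search entry above, the following is a minimal Python sketch of a beam search decoder over per-position base probabilities. The probabilities and beam width are invented for the example; this is not the decoder used in Chiron.

```python
import math

def beam_search(probs, alphabet="ACGT", beam_width=3):
    """Keep only the beam_width most probable sequences at each position."""
    beams = [("", 0.0)]  # (sequence, log probability)
    for dist in probs:
        # Extend every surviving sequence by every symbol in the alphabet.
        candidates = [(seq + sym, logp + math.log(dist[sym]))
                      for seq, logp in beams for sym in alphabet]
        # Prune: retain only the beam_width best extensions.
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = candidates[:beam_width]
    return beams[0][0]

# Toy per-position base probabilities (invented values).
probs = [{"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.1},
         {"A": 0.2, "C": 0.5, "G": 0.2, "T": 0.1},
         {"A": 0.1, "C": 0.1, "G": 0.1, "T": 0.7}]
print(beam_search(probs))  # prints "ACT"
```

With beam width 1 this reduces to a greedy decoder; a wider beam can recover sequences whose best base at one position is not the globally best choice.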
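The CTC entry can be illustrated by its collapse step: a CTC output assigns one label (or a blank) per input frame, and the final sequence is obtained by merging repeated labels and dropping blanks. The sketch below shows only this greedy collapse; practical CTC decoders typically combine it with a beam search over the network's per-frame probabilities.

```python
def ctc_greedy_collapse(frame_labels, blank="-"):
    """Merge repeated frame labels, then drop the blank symbol."""
    out = []
    prev = None
    for sym in frame_labels:
        if sym != prev and sym != blank:
            out.append(sym)
        prev = sym
    return "".join(out)

print(ctc_greedy_collapse("AA--AGG-T"))  # prints "AAGT"
```

Note that a blank between two identical labels keeps them distinct ("C-C" collapses to "CC"), which is how CTC represents genuine repeats.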
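The forward algorithm and Viterbi decoding entries operate on the same HMM quantities and differ only in whether path probabilities are summed or maximized. A minimal Python sketch, using an invented two-state AT-rich/GC-rich model as the example HMM:

```python
def forward(obs, states, start_p, trans_p, emit_p):
    """Forward algorithm: total probability P(obs), summed over all state paths."""
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: sum(alpha[r] * trans_p[r][s] for r in states) * emit_p[s][o]
                 for s in states}
    return sum(alpha.values())

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Viterbi decoding: the single most likely state path for obs."""
    layer = {s: (start_p[s] * emit_p[s][obs[0]], [s]) for s in states}
    for o in obs[1:]:
        nxt = {}
        for s in states:
            # Best predecessor for state s at this position.
            p, path = max((layer[r][0] * trans_p[r][s], layer[r][1]) for r in states)
            nxt[s] = (p * emit_p[s][o], path + [s])
        layer = nxt
    prob, path = max(layer.values())
    return path, prob

# Invented toy HMM: two hidden states emitting DNA bases.
states = ["AT-rich", "GC-rich"]
start_p = {"AT-rich": 0.5, "GC-rich": 0.5}
trans_p = {"AT-rich": {"AT-rich": 0.9, "GC-rich": 0.1},
           "GC-rich": {"AT-rich": 0.1, "GC-rich": 0.9}}
emit_p = {"AT-rich": {"A": 0.4, "T": 0.4, "G": 0.1, "C": 0.1},
          "GC-rich": {"A": 0.1, "T": 0.1, "G": 0.4, "C": 0.4}}

path, p_best = viterbi("AATT", states, start_p, trans_p, emit_p)
p_total = forward("AATT", states, start_p, trans_p, emit_p)
```

Both recursions rely on the Markov assumption from the HMM entry: each step needs only the probabilities carried forward from the previous position.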
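The RNN and LSTM entries both rest on the same recurrence: each update mixes the current input with state passed on from the previous step. The scalar toy below (invented weights, no gates) shows only that recurrence, not a real network; an LSTM additionally wraps such a state in input, output, and forget gates.

```python
import math

def rnn_step(x, h_prev, w_x=1.2, w_h=0.8, b=0.0):
    """One recurrent update: the new hidden state depends on the current
    input x and the hidden state h_prev from the previous step."""
    return math.tanh(w_x * x + w_h * h_prev + b)

h = 0.0  # initial hidden state
for x in [0.5, -1.0, 0.25]:  # made-up input signal
    h = rnn_step(x, h)
```

Because h is threaded through every step, the final state carries information about the whole input sequence, which is what makes RNNs suited to sequence data such as nanopore signals.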