BMC Bioinformatics. 2010 Apr 9;11:179. doi: 10.1186/1471-2105-11-179

Table 1. Definition of terms used in describing the MEME algorithm

Symbol                       Definition
n                            number of input sequences
L                            length of the input sequences
X = {X1, ..., Xn}            the set of n input sequences
w                            width of a MEME motif
m = L - w + 1                number of possible start positions for a site
γ                            probability of a site occurring in any sequence
θ                            position-specific probability matrix (PSPM) model of the motif
P = {Pi,j}                   position-specific prior (PSP)
w0                           width for which the input PSP is defined
Z = {Zi,j}                   missing information variables, i ∈ [1, n], j ∈ [-L, L]
Z(t)                         expectation of Z at EM iteration t
Pr(Zi,j = 1 | ϕ(t))          prior probability of a site starting at position j of sequence i, given the PSP and the current model
ϕ(t)                         model parameters at EM iteration t
ϕ = {θ, γ, P}                all sequence model parameters
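
The terms above are the quantities MEME's EM iteration manipulates. The Python sketch below is an illustration rather than the paper's implementation: it computes Z(t), the E-step expectation of the missing indicators Zi,j, under simplifying assumptions of a single DNA strand, a uniform 0-order background, and the OOPS model (exactly one site per sequence). The function name e_step and the array shapes chosen for theta and P are illustrative assumptions, not part of MEME's API.

```python
import numpy as np

ALPHABET = "ACGT"

def e_step(X, theta, P, w):
    """Sketch of the E-step: compute Z^(t)[i, j] = Pr(Z_{i,j} = 1 | X_i, phi^(t)).

    X     : list of n equal-length DNA strings over ACGT (length L)
    theta : (w, 4) PSPM, theta[k, a] = Pr(letter a at motif column k)
    P     : (n, m) position-specific prior, P[i, j] = prior weight of a site at j
    w     : motif width
    """
    n, L = len(X), len(X[0])
    m = L - w + 1                      # number of possible start positions for a site
    background = 0.25                  # assumed uniform 0-order background model
    Z = np.zeros((n, m))
    for i, seq in enumerate(X):
        idx = [ALPHABET.index(c) for c in seq]
        for j in range(m):
            # Likelihood ratio of the window X_i[j:j+w] under the motif model theta
            # versus the background model.
            site = np.prod([theta[k, idx[j + k]] / background for k in range(w)])
            # The PSP enters here as a prior weight on each candidate start.
            Z[i, j] = site * P[i, j]
        Z[i] /= Z[i].sum()             # normalize: exactly one site per sequence (OOPS)
    return Z
```

The point of the sketch is only to show where the PSP enters: P[i, j] multiplies the motif-versus-background likelihood of each candidate start before normalization, so the position-specific prior reweights the posterior site probabilities. MEME itself additionally handles the ZOOPS and TCM sequence models, both strands, and numerically safer log-space arithmetic.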