
Algorithm 1. CT-HMM parameter learning (Soft/Hard)

1: Input: data $O = (o_0, \ldots, o_V)$ and $T = (t_0, \ldots, t_V)$, state set $S$, edge set $L$, initial guess of $Q$
2: Output: transition rate matrix $Q = (q_{ij})$
3: Find all distinct time intervals $t_\Delta$, $\Delta = 1, \ldots, r$, from $T$
4: Compute $P(t_\Delta) = e^{Q t_\Delta}$ for each $t_\Delta$
5: repeat
6:  Compute $p(v, k, l) = p(s(t_v) = k, s(t_{v+1}) = l \mid O, T, Q)$ for all $v$, and the complete/state-optimized data likelihood $l$, by using Forward-Backward (soft) or Viterbi (hard)
7:  Create the soft count table $C(\Delta, k, l)$ from $p(v, k, l)$ by summing the probabilities from visits that share the same $t_\Delta$
8:  Use the Expm, Unif, or Eigen method to compute $E[n_{ij} \mid O, T, Q]$ and $E[\tau_i \mid O, T, Q]$
9:  Update $q_{ij} = \dfrac{E[n_{ij} \mid O, T, Q]}{E[\tau_i \mid O, T, Q]}$, and $q_{ii} = -\sum_{j \neq i} q_{ij}$
10: until the likelihood $l$ converges
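
As a rough illustration of how steps 4 and 7-9 fit together, the following Python sketch performs one M-step of the soft (EM) variant using the Expm approach, i.e., the Van Loan auxiliary-matrix construction for the end-state-conditioned integrals. It assumes the soft count tables C[d] (one $|S| \times |S|$ array per distinct interval $t_\Delta$, the output of step 7) are already available; all names here (expected_stats_expm, m_step, intervals, C) are illustrative and are not taken from the paper's code.

import numpy as np
from scipy.linalg import expm

def expected_stats_expm(Q, t, i, j):
    """Van Loan auxiliary-matrix trick: returns the n x n matrix whose (k, l)
    entry is [int_0^t e^{Qs} e_i e_j^T e^{Q(t-s)} ds]_{kl}."""
    n = Q.shape[0]
    A = np.zeros((2 * n, 2 * n))
    A[:n, :n] = Q
    A[n:, n:] = Q
    A[i, n + j] = 1.0                       # upper-right block = e_i e_j^T
    return expm(A * t)[:n, n:]

def m_step(Q, intervals, C, eps=1e-12):
    """One soft-EM update of Q (steps 8-9), given soft count tables C[d][k, l]
    for each distinct interval intervals[d] (the output of step 7)."""
    n = Q.shape[0]
    exp_n = np.zeros((n, n))                # E[n_ij | O, T, Q]
    exp_tau = np.zeros(n)                   # E[tau_i | O, T, Q]
    for d, t in enumerate(intervals):
        P_t = expm(Q * t)                   # step 4: P(t_Delta) = e^{Q t_Delta}
        W = C[d] / np.maximum(P_t, eps)     # soft counts divided by P_kl(t)
        for i in range(n):
            exp_tau[i] += np.sum(W * expected_stats_expm(Q, t, i, i))
            for j in range(n):
                if j == i or Q[i, j] == 0.0:
                    continue                # only edges in L carry a rate
                exp_n[i, j] += Q[i, j] * np.sum(W * expected_stats_expm(Q, t, i, j))
    Q_new = exp_n / np.maximum(exp_tau[:, None], eps)    # q_ij = E[n_ij] / E[tau_i]
    np.fill_diagonal(Q_new, 0.0)
    np.fill_diagonal(Q_new, -Q_new.sum(axis=1))          # q_ii = -sum_{j != i} q_ij
    return Q_new

The sketch spends one matrix exponential of a $2|S| \times 2|S|$ auxiliary matrix per (edge, interval) pair, which is why the grouping by distinct intervals in step 3 matters: identical gaps are processed once rather than once per visit. The Unif and Eigen options of step 8 compute the same expectations via uniformization or an eigendecomposition of $Q$ instead of these auxiliary-matrix exponentials.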