A subset of the extraction patterns that tend to extract the seed words.
The candidate nouns extracted by these patterns are placed in this set.
Total correlation, also called multi-information, quantifies the redundancy or dependency among a set of random variables.
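
For reference, with generic symbols $X_1, \dots, X_n$ (not necessarily the notation used here), total correlation is defined as
\[
TC(X_1, \dots, X_n) = \sum_{i=1}^{n} H(X_i) - H(X_1, \dots, X_n)
= D_{\mathrm{KL}}\!\left( p(x_1, \dots, x_n) \,\middle\|\, \prod_{i=1}^{n} p(x_i) \right),
\]
which is zero exactly when the variables are jointly independent.
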
Kullback–Leibler divergence, also called relative entropy, is a measure of how one probability distribution differs from a second, reference probability distribution [50].
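
For two discrete distributions $P$ and $Q$ on a common support (generic notation), the standard form is
\[
D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}.
\]
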
Probability densities of the corresponding random variables.
The mutual information between two random variables.
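
In standard (generic) notation, the mutual information of $X$ and $Y$ can be written as
\[
I(X; Y) = \sum_{x, y} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)}
= D_{\mathrm{KL}}\big( p(x, y) \,\|\, p(x)\, p(y) \big),
\]
i.e. the Kullback–Leibler divergence between the joint distribution and the product of the marginals.
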
Its dependence on the other variables can be written in terms of a linear number of parameters, which are just the estimated marginals.
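
As an illustration only (the symbols below are generic and not taken from the notation here), a naive-Bayes-style factorization of a variable $y$ given $x_1, \dots, x_n$,
\[
p(y \mid x_1, \dots, x_n) \propto p(y) \prod_{i=1}^{n} \frac{p(y \mid x_i)}{p(y)},
\]
depends on the data only through the $n$ marginals $p(y \mid x_i)$, i.e. through a linear number of parameters.
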
The Kronecker delta, a function of two variables that is 1 if the variables are equal and 0 otherwise.
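
In symbols, with generic indices $i$ and $j$,
\[
\delta_{ij} =
\begin{cases}
1 & \text{if } i = j,\\
0 & \text{otherwise.}
\end{cases}
\]
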
A constant used to ensure normalization; it can be calculated by summing an initial parameter.
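
Generically, if $\tilde{p}(y \mid x)$ denotes an unnormalized conditional distribution (hypothetical notation, not the symbols used here), the corresponding normalization constant is
\[
Z(x) = \sum_{y} \tilde{p}(y \mid x),
\]
so that $p(y \mid x) = \tilde{p}(y \mid x) / Z(x)$ sums to one over $y$ for each $x$.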