. Author manuscript; available in PMC: 2011 Apr 1.
Published in final edited form as: Stat Probab Lett. 2010 Apr 1;80(7-8):670–677. doi: 10.1016/j.spl.2009.12.025

Compatibility of conditionally specified models

Hua Yun Chen 1
PMCID: PMC2861368  NIHMSID: NIHMS168118  PMID: 20436935

Abstract

A conditionally specified joint model is convenient to use in fields such as spatial data modeling, Gibbs sampling, and missing data imputation. One potential problem with this approach is that the conditionally specified models may be incompatible, which can lead to serious problems in applications. We propose an odds ratio representation of a joint density to study the issue and derive conditions under which conditionally specified distributions are compatible and yield a joint distribution. Our conditions are simpler to verify than those proposed in the literature. The proposal also explicitly constructs joint densities that are fully compatible with the conditionally specified densities when the conditional densities are compatible, and partially compatible with them when they are incompatible. The construction is then applied to checking the compatibility of conditionally specified models. We also discuss ways to modify conditionally specified models, based on the construction of the joint models, when they are incompatible.

Keywords: Density decomposition, Density representation, Odds ratio function, Semiparametric models

1 Introduction

A joint model of the data is often desired in many statistical applications, especially in Bayesian inference. However, when the dimension of the data is high, specifying a joint model that captures many features of the data can be much harder than specifying conditional models that capture those features separately. See, for example, Besag (1974, 1986) for spatial data modeling, Hobert and Casella (1998) for Gibbs sampling, and Van Buuren, Boshuizen, and Knook (1999), Van Buuren (2007), and Raghunathan et al. (2001) for missing data imputation. One issue with the conditional specification approach is that a joint model compatible with all the specified conditional models may not exist. Incompatibility of the conditionally specified models can lead to serious problems for statistical inference and interpretation in spatial data analysis, for the convergence of the Gibbs sampler, and for the validity of imputed values.

The study of the compatibility of conditionally specified models is usually carried out through the study of the compatibility of the conditionally specified densities. The first issue is to check the compatibility of the densities. This has been studied extensively by Besag (1974), Arnold and Press (1989), Hobert and Casella (1998), and more recently Wang and Ip (2008), among others. Besag (1974) concentrated on the form of joint densities that satisfy specific conditional dependence requirements. Arnold and Press (1989) and Arnold and Gokhale (1994) mainly dealt with the case of two variables. Arnold, Castillo, and Sarabia (2001) and their discussants gave an excellent introduction to the problem and related issues. Gourieroux and Montfort (1979) discussed the uniqueness of the determined joint distribution. Hobert and Casella (1998) studied the issue in Gibbs sampling and the impact of incompatibility on the convergence of the Gibbs sampler. Most of the results on checking compatibility are based on the density ratios formulated in Besag (1974), which can be cumbersome when the dimension of the data is moderate to high. Wang and Ip (2008) simplified the conditions for checking compatibility by using dependence functions rather than density ratios. The second issue is the modification of incompatible conditionally specified models so that the modified models do determine a proper joint model for the data. This issue has received less attention in the literature.

In this article, we propose a new way to study the compatibility of conditionally specified densities through the odds ratio representation of a joint density. The odds ratio function was used jointly with the marginal densities to study bivariate distributions by Osius (2005). Arnold and Gokhale (1994) also used it implicitly through the log-linear model representation of the distribution of an r × c contingency table. It is unclear how to generalize their approaches to handle high-dimensional distributions. Joe (1997, chapter 9) gave the result closest to ours on the compatibility check. However, like results obtained by others, his result is based on the density ratio and does not yield a concise expression for the joint density. Our representation differs from those in the literature in that we use odds ratio functions along with conditional densities at a fixed condition to represent the density. The representation generalizes readily to high-dimensional problems. We obtain necessary and sufficient conditions for compatibility in terms of odds ratio functions. The conditions resemble those of Hobert and Casella (1998) and Wang and Ip (2008), but are simpler and more transparent. We show that it is sufficient to verify only a few selected permutations of the indices of the odds ratio functions in the compatibility check. Compared with the results of Hobert and Casella (1998) and Wang and Ip (2008), the number of equations to check in our result is much smaller. Furthermore, we propose a simple and natural way to construct the joint density when the conditionally specified densities are compatible. When they are incompatible, we propose ways to modify them so that the modified conditionally specified densities determine a proper joint density.
This is done by constructing a joint density that is partially compatible with the originally given conditional densities, meaning that some but not all of the originally specified conditional densities are compatible with the constructed joint density. The odds ratio framework can be viewed intuitively in the following way. We first decompose each of the conditionally specified densities into smaller parts, namely the odds ratio function and the conditional density at a fixed condition. Those parts are then reassembled into a functioning machine, the joint density. In the process of assembly, some parts may have to be modified to fit into the machine. If all the parts were well manufactured (compatible), no modification is needed and different ways of assembling yield the same machine. However, if modifications are necessary, the resulting machines may vary depending on the order in which the parts are assembled and how they are modified. One advantage of the proposed framework is that the results extend easily from conditionally specified densities to conditionally specified models, i.e., families of conditionally specified densities.

The remainder of the article is organized in the following way. In section 2, we propose to represent a joint density in terms of the odds ratio functions and conditional densities at a fixed condition. We show that the components of the decomposition have the property of variation independence. In section 3, we apply the odds ratio representation of a joint density to study the issue of compatibility in conditionally specified densities and models. A sufficient and necessary condition for the compatibility of the conditionally specified densities is obtained. Results are then applied to the conditionally specified models. Construction of joint models that are partially compatible with the conditionally specified models is demonstrated by a typical example in section 4. The article concludes with a brief discussion on the extension of the proposed approach to the compatibility problem with more complicated ways of model specification.

2 Decomposition of a joint density in the odds ratio framework

Let Y be the random vector whose distribution is of interest. Suppose that Y is divided into t groups $Y_j$, j = 1, ···, t, where $Y_j$ has dimension $d_j$. As in Besag (1974), we make the following positivity assumption on the joint distribution of $(Y_1, \dots, Y_t)$: if the marginal densities satisfy $p(y_j) > 0$ for all j, then the joint density satisfies $p(y_1, \dots, y_t) > 0$.

Consider first the case t = 2. Let $dy_j$ denote the reference measure with respect to which the density of $Y_j$ is defined. For a given joint density $p(y_1, y_2)$, as in Chen (2003, 2004, 2007), define the odds ratio function

$$\eta(y_1, y_1^0; y_2, y_2^0) = \frac{p(y_1, y_2)\, p(y_1^0, y_2^0)}{p(y_1^0, y_2)\, p(y_1, y_2^0)},$$

where $(y_1^0, y_2^0)$ is a fixed point in the sample space. In the following, we suppress $(y_1^0, y_2^0)$ from the odds ratio expression and write $\eta(y_1; y_2)$ for the odds ratio function. It is easy to see from the definition that

$$p(y_1, y_2) = \eta(y_1; y_2)\, g_2(y_2 \mid y_1^0)\, g_1(y_1 \mid y_2^0)\, \frac{\int p(y_1, y_2^0)\, dy_1 \int p(y_1^0, y_2)\, dy_2}{p(y_1^0, y_2^0)},$$

where $g_1$ and $g_2$ are the conditional densities of $Y_1$ given $Y_2$ and of $Y_2$ given $Y_1$, respectively. Since $\iint p(y_1, y_2)\, dy_1\, dy_2 = 1$, it follows that

$$p(y_1, y_2) = \frac{\eta(y_1; y_2)\, g_2(y_2 \mid y_1^0)\, g_1(y_1 \mid y_2^0)}{\iint \eta(y_1; y_2)\, g_2(y_2 \mid y_1^0)\, g_1(y_1 \mid y_2^0)\, dy_1\, dy_2}. \quad (1)$$

Note that the odds ratio functions are the same for the two conditional densities and the joint density, that is,

$$\eta_1(y_1; y_2) = \frac{g_1(y_1 \mid y_2)\, g_1(y_1^0 \mid y_2^0)}{g_1(y_1^0 \mid y_2)\, g_1(y_1 \mid y_2^0)}, \qquad \eta_2(y_2; y_1) = \frac{g_2(y_2 \mid y_1)\, g_2(y_2^0 \mid y_1^0)}{g_2(y_2^0 \mid y_1)\, g_2(y_2 \mid y_1^0)},$$

and $\eta(y_1; y_2) = \eta_1(y_1; y_2) = \eta_2(y_2; y_1)$. By similar arguments (Chen, 2003, 2007), we can represent $g_1(y_1 \mid y_2)$ and $g_2(y_2 \mid y_1)$ respectively as

$$g_1(y_1 \mid y_2) = \frac{\eta(y_1; y_2)\, g_1(y_1 \mid y_2^0)}{\int \eta(y_1; y_2)\, g_1(y_1 \mid y_2^0)\, dy_1},$$

and

$$g_2(y_2 \mid y_1) = \frac{\eta(y_1; y_2)\, g_2(y_2 \mid y_1^0)}{\int \eta(y_1; y_2)\, g_2(y_2 \mid y_1^0)\, dy_2}.$$
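As a sanity check outside the paper's development, representation (1) can be verified numerically on a strictly positive discrete joint pmf; the grid sizes and the random table below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# A strictly positive joint pmf p(y1, y2) on a 4 x 5 grid;
# the reference point (y1^0, y2^0) is taken at index (0, 0).
p = rng.uniform(0.5, 2.0, size=(4, 5))
p /= p.sum()

# Odds ratio function: eta(y1; y2) = p(y1,y2) p(0,0) / (p(0,y2) p(y1,0)).
eta = p * p[0, 0] / np.outer(p[:, 0], p[0, :])

# Conditional pmfs at the fixed condition: g1(y1 | y2^0) and g2(y2 | y1^0).
g1_0 = p[:, 0] / p[:, 0].sum()
g2_0 = p[0, :] / p[0, :].sum()

# Representation (1): p is proportional to eta * g1(.|y2^0) * g2(.|y1^0),
# normalized to sum to one.
num = eta * np.outer(g1_0, g2_0)
p_rec = num / num.sum()

assert np.allclose(p_rec, p)
```

The same three ingredients also recover each conditional density via the two representations displayed above.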

One advantage of these representations is that they clarify the common components shared by the different densities, namely the odds ratio function and the two conditional densities at fixed conditions. The three components in the expressions uniquely identify the densities, and they are variation independent if no restriction is imposed on the joint density. Now, if we are given an odds ratio function and two densities for $Y_1$ and $Y_2$ subject to weak integrability conditions, we can easily construct a joint density from the three components. Before we proceed, we give sufficient conditions for a given function of $(y_1, y_2)$ to be an odds ratio function, i.e., the ratio of the odds of a density function. Let $\eta(y_1; y_2)$ be a given function of $(y_1, y_2)$. If it is an odds ratio function, there should exist $(y_1^0, y_2^0)$ in the domain of $\eta$ such that $\eta(y_1; y_2^0) = 1 = \eta(y_1^0; y_2)$ for all $y_1$ and $y_2$ such that $(y_1^0, y_2)$ and $(y_1, y_2^0)$ are in the domain of $\eta$. The following lemma gives easily verifiable sufficient conditions for a given function to be an odds ratio function.

Lemma 1

For a given function $\eta(y_1; y_2) \ge 0$, suppose that $\{y_1 \mid \eta(y_1; y_2) > 0\}$ is the same for every given $y_2$ and that there exists $(y_1^0, y_2^0)$ such that $\eta(y_1; y_2^0) = 1 = \eta(y_1^0; y_2)$ for all $y_1$ and $y_2$. Denote the common domain by $\mathcal{Y}_1$. Assume that there is a density $f_0(y_1)$ on $\mathcal{Y}_1$ with respect to a known measure $dy_1$ (e.g., counting or Lebesgue measure) such that $0 < \int \eta(y_1; y_2) f_0(y_1)\, dy_1 < +\infty$ for all $y_2$. Then $\eta(y_1; y_2)$ is an odds ratio function.

Proof

Define

$$f(y_1 \mid y_2) = \frac{\eta(y_1; y_2)\, f_0(y_1)}{\int \eta(y_1; y_2)\, f_0(y_1)\, dy_1}.$$

It is easy to check that $f(y_1 \mid y_2)$ thus defined is a density function on $\mathcal{Y}_1$ and that $\eta$ is the odds ratio function of this density.
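The construction in the proof can be sketched numerically. The candidate odds ratio function below, $\eta(y_1; y_2) = \exp\{\theta (y_1 - y_1^0)(y_2 - y_2^0)\}$ on a finite grid with a uniform baseline $f_0$, is a hypothetical example chosen to satisfy the conditions of Lemma 1:

```python
import numpy as np

# Hypothetical ingredients: a grid, a candidate odds ratio function with
# the reference point at the first grid indices, and a uniform baseline f0.
y1 = np.linspace(-1.0, 1.0, 6)
y2 = np.linspace(-1.0, 1.0, 5)
theta = 0.8

# eta(y1; y2) = exp{theta (y1 - y1[0]) (y2 - y2[0])} is positive and
# satisfies eta(y1; y2[0]) = 1 = eta(y1[0]; y2).
eta = np.exp(theta * np.outer(y1 - y1[0], y2 - y2[0]))
f0 = np.full(len(y1), 1.0 / len(y1))

# Lemma 1 construction: f(y1 | y2) = eta * f0 / normalizer(y2).
f_cond = eta * f0[:, None]
f_cond /= f_cond.sum(axis=0, keepdims=True)

# Each column is a pmf, and its odds ratio function is eta again.
rec = f_cond * f_cond[0, 0] / np.outer(f_cond[:, 0], f_cond[0, :])
assert np.allclose(rec, eta)
```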

Lemma 2

Given an odds ratio function $\eta(y_1; y_2) > 0$ and two density functions $f_1$ and $f_2$, if $\iint \eta(y_1; y_2) f_1(y_1) f_2(y_2)\, dy_1\, dy_2 < +\infty$, then $p(y_1, y_2)$ defined in (1) with $g_1(y_1 \mid y_2^0) = f_1(y_1)$ and $g_2(y_2 \mid y_1^0) = f_2(y_2)$ is a joint density function whose conditional densities for $Y_1$ given $Y_2$ and for $Y_2$ given $Y_1$ are $g_1(y_1 \mid y_2)$ and $g_2(y_2 \mid y_1)$, respectively.

Proof

Straightforward.

One consequence of the lemmas is that, given two conditional densities $g_1(y_1 \mid y_2)$ and $g_2(y_2 \mid y_1)$, there exists a joint density $p(y_1, y_2)$ with $g_1$ and $g_2$ as its conditional densities if and only if the two odds ratio functions computed from the two conditional densities are equal and the integrability condition on the odds ratio function is satisfied. The compatibility check in the case t = 2 is relatively easy to perform, and various authors have derived essentially equivalent results. However, the generalizability of an approach, both for carrying out the compatibility check and for constructing a joint density in high-dimensional problems, is much more important in practice (Besag, 1974, 1986; Besag and Kooperberg, 1995). The odds ratio representation proposed here extends easily to high-dimensional problems, while other approaches either encounter difficulties in the generalization or become cumbersome in their generalized forms.
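A minimal numerical sketch of this check for t = 2 on finite grids (the random tables are illustrative, not from the paper): compatible conditionals share one odds ratio function, while a perturbed conditional fails the check.

```python
import numpy as np

rng = np.random.default_rng(1)

def odds_ratio(g):
    """Odds ratio of a conditional pmf g[v, c] = g(value v | condition c),
    with the reference point at indices (0, 0)."""
    return g * g[0, 0] / np.outer(g[:, 0], g[0, :])

# Compatible pair: both conditionals derived from one positive joint pmf.
p = rng.uniform(0.5, 2.0, size=(4, 4))
p /= p.sum()
g1 = p / p.sum(axis=0, keepdims=True)   # g1(y1 | y2): columns sum to 1
g2 = p / p.sum(axis=1, keepdims=True)   # g2(y2 | y1): rows sum to 1

eta1 = odds_ratio(g1)                   # eta1(y1; y2), indexed [y1, y2]
eta2 = odds_ratio(g2.T).T               # eta2(y2; y1), re-indexed [y1, y2]
assert np.allclose(eta1, eta2)          # compatible: odds ratios agree

# Incompatible pair: perturb g2; its odds ratio no longer matches eta1.
g2_bad = g2 * rng.uniform(0.5, 2.0, size=g2.shape)
g2_bad /= g2_bad.sum(axis=1, keepdims=True)
assert not np.allclose(eta1, odds_ratio(g2_bad.T).T)
```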

For the general case with t > 2, a joint density on (y1, ···, yt) can be first represented as

$$p(y_1, \dots, y_t) = \frac{\eta_t(y_t; \{y_{t-1}, \dots, y_1\})\, g_t(y_t \mid y_{t-1}^0, \dots, y_1^0)\, g_{t-1}(y_{t-1}, \dots, y_1 \mid y_t^0)}{\int \eta_t(y_t; \{y_{t-1}, \dots, y_1\})\, g_t(y_t \mid y_{t-1}^0, \dots, y_1^0)\, g_{t-1}(y_{t-1}, \dots, y_1 \mid y_t^0)\, dy_t\, dy_{t-1} \cdots dy_1}. \quad (2)$$

We can then apply a similar representation to $g_{t-1}(y_{t-1}, \dots, y_1 \mid y_t^0)$ and repeat this step. Eventually, we obtain

$$p(y_1, \dots, y_t) = \frac{\prod_{j=2}^{t} \eta_j(y_j; \{y_{j-1}, \dots, y_1\} \mid y_t^0, \dots, y_{j+1}^0) \prod_{j=1}^{t} g_j(y_j \mid y_{-j}^0)}{\int \prod_{j=2}^{t} \eta_j(y_j; \{y_{j-1}, \dots, y_1\} \mid y_t^0, \dots, y_{j+1}^0) \prod_{j=1}^{t} \{g_j(y_j \mid y_{-j}^0)\, dy_j\}}, \quad (3)$$

where and henceforth $y_{-j} = (y_l, l \ne j)$ and $y_{-j}^0 = (y_l^0, l \ne j)$, and

$$\eta_j(y_j; \{y_{j-1}, \dots, y_1\} \mid y_t^0, \dots, y_{j+1}^0) = \frac{g_j(y_j \mid y_t^0, \dots, y_{j+1}^0, y_{j-1}, \dots, y_1)\, g_j(y_j^0 \mid y_{-j}^0)}{g_j(y_j \mid y_{-j}^0)\, g_j(y_j^0 \mid y_t^0, \dots, y_{j+1}^0, y_{j-1}, \dots, y_1)} = \eta_j(y_j; \{y_t^0, \dots, y_{j+1}^0, y_{j-1}, \dots, y_1\}).$$

The joint density can also be equivalently represented as

$$p(y_1, \dots, y_t) = \frac{\prod_{j=2}^{t} \eta_j(y_j; \{y_t^0, \dots, y_{j+1}^0, y_{j-1}, \dots, y_1\}) \prod_{j=1}^{t} g_j(y_j \mid y_{-j}^0)}{\int \prod_{j=2}^{t} \eta_j(y_j; \{y_t^0, \dots, y_{j+1}^0, y_{j-1}, \dots, y_1\}) \prod_{j=1}^{t} \{g_j(y_j \mid y_{-j}^0)\, dy_j\}}. \quad (4)$$

3 Compatibility of conditionally specified densities and models

Consider first the compatibility of conditionally specified densities. More specifically, given a set of conditional densities $g_j(y_j \mid y_{-j})$, j = 1, ···, t, we want to know whether there exists a joint distribution such that all the given conditional densities are its corresponding conditional densities. Let

$$\eta_j(y_j; y_{-j}) = \frac{g_j(y_j \mid y_{-j})\, g_j(y_j^0 \mid y_{-j}^0)}{g_j(y_j^0 \mid y_{-j})\, g_j(y_j \mid y_{-j}^0)}.$$

Then $g_j(y_j \mid y_{-j})$ is determined by $\eta_j(y_j; y_{-j})$ and $g_j(y_j \mid y_{-j}^0)$ as

$$g_j(y_j \mid y_{-j}) = \frac{\eta_j(y_j; y_{-j})\, g_j(y_j \mid y_{-j}^0)}{\int \eta_j(y_j; y_{-j})\, g_j(y_j \mid y_{-j}^0)\, dy_j}.$$

Theorem 1

For a given set of conditional densities $g_j(y_j \mid y_{-j})$, there exists a joint density $p(y_1, \dots, y_t)$ with conditional densities $g_j$, j = 1, ···, t, if and only if

$$\prod_{l=2}^{t} \eta_l(y_l; \{y_t^0, \dots, y_{l+1}^0, y_{l-1}, \dots, y_1\}) = \prod_{l=2}^{t} \eta_{\sigma(l)}(y_{\sigma(l)}; \{y_{\sigma(t)}^0, \dots, y_{\sigma(l+1)}^0, y_{\sigma(l-1)}, \dots, y_{\sigma(1)}\}) \quad (5)$$

for the permutations $\sigma_j$ such that $\sigma_j(\{t, \dots, 1\}) = \{j, t, \dots, j+1, j-1, \dots, 1\}$, j = 1, ···, t − 1 (or, equivalently, for all permutations σ of (1, ···, t)), and

$$\int \prod_{j=2}^{t} \eta_j(y_j; \{y_t^0, \dots, y_{j+1}^0, y_{j-1}, \dots, y_1\}) \prod_{j=1}^{t} \{g_j(y_j \mid y_{-j}^0)\, dy_j\} < +\infty. \quad (6)$$

Proof

If $g_j(y_j \mid y_{-j})$, j = 1, ···, t, are compatible, then it follows from the odds ratio representation of a joint density that (6) holds, and from the representations derived in the previous section that the joint density can be represented as

$$p_\sigma(y_1, \dots, y_t) = \frac{\prod_{j=2}^{t} \eta_{\sigma(j)}(y_{\sigma(j)}; \{y_{\sigma(t)}^0, \dots, y_{\sigma(j+1)}^0, y_{\sigma(j-1)}, \dots, y_{\sigma(1)}\}) \prod_{j=1}^{t} g_j(y_j \mid y_{-j}^0)}{\int \prod_{l=2}^{t} \eta_{\sigma(l)}(y_{\sigma(l)}; \{y_{\sigma(t)}^0, \dots, y_{\sigma(l+1)}^0, y_{\sigma(l-1)}, \dots, y_{\sigma(1)}\}) \prod_{l=1}^{t} \{g_l(y_l \mid y_{-l}^0)\, dy_l\}}, \quad (7)$$

for any permutation σ. That is, the $p_\sigma$ are equal for all permutations σ. Setting $y_j = y_j^0$ for all j in the numerator of the right-hand side of (7), it follows from $\eta_j(y_j^0; \{y_t^0, \dots, y_{j+1}^0, y_{j-1}^0, \dots, y_1^0\}) = 1$ that

$$\frac{\prod_{j=1}^{t} g_j(y_j^0 \mid y_{-j}^0)}{C(\sigma)} = p_\sigma(y_1^0, \dots, y_t^0) = p(y_1^0, \dots, y_t^0),$$

where C(σ) denotes the denominator of the right-hand side of (7). This implies that C(σ) is a constant independent of σ. Next, canceling C(σ) from the denominator and $\prod_{j=1}^{t} g_j(y_j \mid y_{-j}^0)$ from the numerator in these equations, we obtain that (5) holds for all permutations.

To show the converse, assume that (5) holds for all $\sigma_j$, j = 1, ···, t − 1. Then the $p_{\sigma_j}$, j = 1, ···, t, are all equal because of (5), with $\sigma_t$ taken as the identity permutation. Denote the common density by $p(y_1, \dots, y_t)$. The conditional density for $Y_{\sigma_j(t)} = Y_j$ given the other variables computed from the joint density $p_{\sigma_j}$ is

$$\frac{p(y_1, \dots, y_t)}{\int p(y_1, \dots, y_t)\, dy_j} = \frac{p_{\sigma_j}(y_1, \dots, y_t)}{\int p_{\sigma_j}(y_1, \dots, y_t)\, dy_{\sigma_j(t)}} = \frac{\eta_j(y_j; y_{-j})\, g_j(y_j \mid y_{-j}^0)}{\int \eta_j(y_j; y_{-j})\, g_j(y_j \mid y_{-j}^0)\, dy_j} = g_j(y_j \mid y_{-j}).$$

The second equality holds because $\sigma_j(t) = j$ and all the other odds ratio functions and conditional densities at a fixed condition cancel from the numerator and denominator. Hence each $g_j(y_j \mid y_{-j})$ is a conditional density of the single density $p(y_1, \dots, y_t)$, and it follows that $\{g_j(y_j \mid y_{-j}), j = 1, \dots, t\}$ are compatible.

Compared to the conditions in Theorems 1 and 2 of Hobert and Casella (1998), our Theorem 1 offers two improvements. First, t − 1 rather than t! − 1 equations need to be checked to see whether the conditional densities are compatible; the reduction is substantial even for t = 3. Second, the conditions in terms of odds ratio functions are simpler because the redundancy arising from the conditional densities at a fixed condition in the conditional density expressions is canceled. Compared with Theorem 4 of Wang and Ip (2008), our result is also simpler because we need to check t − 1 equations based on odds ratio functions rather than t(t − 1)/2 equations based on the conditional densities.

Condition (5) in Theorem 1 is relatively easy to check. For example, when t = 2, the condition reduces to η1(y1;y2) = η2(y2;y1). When t = 3, the conditions become

$$\eta_3(y_3; \{y_2, y_1\})\, \eta_2(y_2; y_1 \mid y_3^0) = \eta_2(y_2; \{y_1, y_3\})\, \eta_3(y_3; y_1 \mid y_2^0) = \eta_1(y_1; \{y_2, y_3\})\, \eta_3(y_3; y_2 \mid y_1^0),$$

or equivalently

$$\eta_3(y_3; \{y_2, y_1\})\, \eta_2(y_2; \{y_3^0, y_1\}) = \eta_2(y_2; \{y_1, y_3\})\, \eta_3(y_3; \{y_1, y_2^0\}) = \eta_1(y_1; \{y_3, y_2\})\, \eta_3(y_3; \{y_2, y_1^0\}).$$

Note that, to establish incompatibility of the conditional models, we only need to show that one of the equalities fails or that the integral does not converge.
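For t = 3, the equalities above can be verified numerically when the conditionals come from a single strictly positive three-way pmf (an illustrative sketch; the reference point is taken at index 0 in each coordinate):

```python
import numpy as np

rng = np.random.default_rng(2)
p = rng.uniform(0.5, 2.0, size=(3, 4, 5))
p /= p.sum()                              # p[y1, y2, y3] > 0
p000 = p[0, 0, 0]

# eta3(y3; {y2, y1}) and eta2(y2; y1 | y3^0).
eta3_12 = p * p000 / (p[:, :, :1] * p[0, 0, :][None, None, :])
eta2_fix3 = p[:, :, 0] * p000 / np.outer(p[:, 0, 0], p[0, :, 0])
lhs = eta3_12 * eta2_fix3[:, :, None]

# eta2(y2; {y1, y3}) and eta3(y3; y1 | y2^0).
eta2_13 = p * p000 / (p[:, :1, :] * p[0, :, 0][None, :, None])
eta3_fix2 = p[:, 0, :] * p000 / np.outer(p[:, 0, 0], p[0, 0, :])
mid = eta2_13 * eta3_fix2[:, None, :]

# eta1(y1; {y2, y3}) and eta3(y3; y2 | y1^0).
eta1_23 = p * p000 / (p[0, :, :][None, :, :] * p[:, 0, 0][:, None, None])
eta3_fix1 = p[0, :, :] * p000 / np.outer(p[0, :, 0], p[0, 0, :])
rhs = eta1_23 * eta3_fix1[None, :, :]

# All three products reduce to p * p000^2 / (p[i,0,0] p[0,j,0] p[0,0,k]).
assert np.allclose(lhs, mid) and np.allclose(lhs, rhs)
```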

The joint densities constructed in Theorem 1 are very useful for checking the compatibility of conditionally specified models. Compatibility of conditionally specified models can be defined as the existence of a joint model such that each of the conditionally specified models can be obtained as the corresponding conditional model of the joint model. Given a set of conditionally specified models $\{g_j(y_j \mid y_{-j}, \theta_j), \theta_j \in \Theta_j\}$, j = 1, ···, t, it is not difficult to construct a joint model $\{p_\sigma(y_1, \dots, y_t, \theta), \theta \in \Theta\}$ from Theorem 1, where $\Theta$ is the range of θ. We can then recompute the conditional models of the joint model. If all the conditional models based on the joint model are equivalent to the conditionally specified models after possible reparametrization, we can conclude that the conditionally specified models are compatible. Otherwise, they are incompatible. For incompatible conditionally specified models, we can likewise construct different joint models based on the conditionally specified models; each constructed joint model is partially compatible with them.

4 Modifications to incompatible conditionally specified models based on the joint model construction

When the conditionally specified models are compatible, the constructed joint model is very useful in reparameterizing the conditional models. This is almost always needed in practice. In the case that the conditionally specified models are incompatible, the construction approach may yield several joint models, each of which is partially compatible with the conditionally specified models. Those constructed joint models can also be useful in practice. We use the following conditionally specified normal models to illustrate the idea.

Suppose that Y1, ···, Yt are a set of variables modeled by conditional normal models. For simplicity of presentation, assume that Y1, ···, Yt are all continuous; similar arguments apply equally well to all-discrete variables or to a mixture of discrete and continuous variables. Assume the fully conditional models are

$$Y_j = \beta_{j0} + \sum_{k \ne j} \beta_{jk} Y_k + \varepsilon_j,$$

where $\varepsilon_j \sim N(0, \sigma_j^2)$, j = 1, ···, t. Note that the $\varepsilon_j$, j = 1, ···, t, are usually not independent (Besag, 1974). The regression form implies that the conditional densities are

$$g_j(y_j \mid y_{-j}) = \frac{1}{\sqrt{2\pi}\,\sigma_j} \exp\left\{-\frac{1}{2\sigma_j^2}\Big(y_j - \beta_{j0} - \sum_{k \ne j} \beta_{jk} y_k\Big)^2\right\}.$$

The odds ratio function and the conditional density at a fixed condition are respectively

$$\eta_j(y_j; y_{-j}) = \exp\left\{\sum_{k \ne j} \frac{\beta_{jk}}{\sigma_j^2}(y_j - y_j^0)(y_k - y_k^0)\right\},$$

and $g_j(y_j \mid y_{-j}^0)$. A joint model obtained from Theorem 1 has density proportional to

$$\exp\left\{\sum_{j=2}^{t} \sum_{k=1}^{j-1} \frac{\beta_{jk}}{\sigma_j^2}(y_j - y_j^0)(y_k - y_k^0)\right\} \prod_{j=1}^{t} \frac{1}{\sqrt{2\pi}\,\sigma_j} \exp\left\{-\frac{1}{2\sigma_j^2}\Big(y_j - \beta_{j0} - \sum_{k \ne j} \beta_{jk} y_k^0\Big)^2\right\}.$$

From this joint density, it can be derived that the conditional density for $Y_j$ given $Y_{-j}$ is

$$g_j^*(y_j \mid y_{-j}) = \frac{1}{\sqrt{2\pi}\,\sigma_j} \exp\left\{-\frac{1}{2\sigma_j^2}\Big(y_j - \beta_{j0} - \sum_{k=1}^{j-1} \beta_{jk} y_k - \sum_{k=j+1}^{t} \beta_{jk} y_k^0 - \sum_{k=j+1}^{t} \frac{\beta_{kj}\sigma_j^2}{\sigma_k^2}(y_k - y_k^0)\Big)^2\right\}.$$

Without further restrictions on the parameters, $g_j$ and $g_j^*$ define the same model under different parametrizations. This implies that the conditionally specified models are compatible. When the parameters satisfy the constraints

$$\beta_{kj}/\sigma_k^2 = \beta_{jk}/\sigma_j^2$$

for all kj, the equivalent models become identical. In practice, if the data were generated from the joint normal models and no restriction is imposed on the parameters in the conditionally specified models, the consistent estimator of the parameters based on the conditionally specified models automatically satisfies the constraints. However, taking the constraints into consideration in estimation will increase the estimation efficiency.

Conditionally specified normal models may also include interaction and/or second-order terms. This seemingly harmless and natural addition to the conditional models, however, can instantly destroy their compatibility. The question then becomes how to amend the conditionally specified models to make them compatible. To illustrate the idea, suppose that the conditionally specified normal model includes one higher-order term in the model for Y1 given Y2, ···, Yt, while the other conditionally specified models remain unchanged. That is,

$$Y_1 = \beta_{10} + \sum_{k=2}^{t} \beta_{1k} Y_k + \alpha_{12} Y_2^2 + \varepsilon_1.$$

It is easy to see that the odds ratio function obtained from this model is

$$\log \eta_1\{y_1; (y_2, \dots, y_t)\} = \sum_{k=2}^{t} \frac{\beta_{1k}}{\sigma_1^2}(y_1 - y_1^0)(y_k - y_k^0) + \frac{\alpha_{12}}{\sigma_1^2}(y_1 - y_1^0)\big(y_2^2 - (y_2^0)^2\big),$$

and the odds ratio functions from the other conditional models are

$$\log \eta_j\{y_j; (y_l, l \ne j)\} = \sum_{k \ne j} \frac{\beta_{jk}}{\sigma_j^2}(y_j - y_j^0)(y_k - y_k^0),$$

for j = 2, ···, t. All the conditional densities at a fixed condition obtained from the odds ratio decomposition of the conditional densities are normal densities.
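The conflict can be made concrete numerically. The sketch below uses hypothetical parameter values and only t = 2 for brevity: even when the linear cross terms of log η1 and log η2 match, the α12 term leaves the two odds ratio functions unequal, so no joint density can have both conditionals.

```python
import numpy as np

# Hypothetical parameters with matching linear parts:
# b12 / s1^2 == b21 / s2^2, but a12 != 0 adds a quadratic cross term.
b12, s1 = 0.6, 1.0
b21, s2 = 0.15, 0.5
a12 = 0.4
y10, y20 = 0.0, 0.0

def log_eta1(y1, y2):
    # From the conditional Y1 | Y2 ~ N(b10 + b12*y2 + a12*y2**2, s1**2).
    return (b12 / s1**2) * (y1 - y10) * (y2 - y20) \
        + (a12 / s1**2) * (y1 - y10) * (y2**2 - y20**2)

def log_eta2(y1, y2):
    # From the conditional Y2 | Y1 ~ N(b20 + b21*y1, s2**2).
    return (b21 / s2**2) * (y1 - y10) * (y2 - y20)

y1g, y2g = np.meshgrid(np.linspace(-2, 2, 9), np.linspace(-2, 2, 9))
assert np.isclose(b12 / s1**2, b21 / s2**2)       # linear parts agree
assert not np.allclose(log_eta1(y1g, y2g), log_eta2(y1g, y2g))
```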

The foregoing conditionally specified models are incompatible because, when α12 ≠ 0, the conditional odds ratio functions of Y1 and Y2 given the rest of the variables, obtained from g1 and g2 respectively, are not equal. It is hard to correct the conflict by modifying the conditionally specified models directly. By applying Theorem 1, we can construct many joint models that are partially compatible with the conditionally specified models. Note that if the second-order term in the model for Y1 given Y−1 is important to keep, we can place Y1 before Y2 in the order of the odds ratio combination so that this higher-order association term is retained. To resolve the incompatibility, we can redefine log η2 as

$$\log \eta_2^*(y_2; y_{-2}) = \sum_{k \ne 2} \frac{\beta_{2k}}{\sigma_2^2}(y_2 - y_2^0)(y_k - y_k^0) + \frac{\alpha_{12}}{\sigma_1^2}(y_1 - y_1^0)\big(y_2^2 - (y_2^0)^2\big).$$

In addition, we need to modify $g_2(y_2 \mid y_{-2}^0)$ so that it differs from the normal density and the integral in the denominator of the odds ratio representation of the joint density is finite. This can be done easily by modifying $g_2(y_2 \mid y_{-2}^0)$ to

$$g_2^*(y_2 \mid y_{-2}^0) \propto \exp(-y_2^4/\sigma_2).$$

As a result, $g_2(y_2 \mid y_{-2})$ is modified to

$$g_2^*(y_2 \mid y_{-2}) = \frac{\eta_2^*(y_2; y_{-2})\, g_2^*(y_2 \mid y_{-2}^0)}{\int \eta_2^*(y_2; y_{-2})\, g_2^*(y_2 \mid y_{-2}^0)\, dy_2}.$$

The modified conditional models can now be seen to be compatible because each of them can be obtained as a conditional model of the joint model $p^*(y_1, \dots, y_t)$, which is

$$p^*(y_1, \dots, y_t) = \frac{\prod_{l=1, l \ne 2}^{t-1} \eta_l(y_l; \{y_t, \dots, y_{l+1}, y_{l-1}^0, \dots, y_1^0\})\, \eta_2^*(y_2; \{y_t, \dots, y_3, y_1^0\}) \prod_{l \ne 2} g_l(y_l \mid y_{-l}^0)\, g_2^*(y_2 \mid y_{-2}^0)}{\int \prod_{l=1, l \ne 2}^{t-1} \eta_l(y_l; \{y_t, \dots, y_{l+1}, y_{l-1}^0, \dots, y_1^0\})\, \eta_2^*(y_2; \{y_t, \dots, y_3, y_1^0\}) \prod_{l \ne 2} g_l(y_l \mid y_{-l}^0)\, g_2^*(y_2 \mid y_{-2}^0) \prod_{l} dy_l},$$

when $\beta_{kj}/\sigma_k^2 = \beta_{jk}/\sigma_j^2$ for all k ≠ j. Note also that different orders of the odds ratio combination in the foregoing expression do not alter the joint model; rather, they determine different parameterizations of it. The joint model can also be expressed as

$$p(y_1, \dots, y_t) = \frac{\prod_{l=1}^{t-1} \eta_l(y_l; \{y_t, \dots, y_{l+1}, y_{l-1}^0, \dots, y_1^0\}) \prod_{l \ne 2} g_l(y_l \mid y_{-l}^0)\, g_2^*(y_2 \mid y_{-2}^0)}{\int \prod_{l=1}^{t-1} \eta_l(y_l; \{y_t, \dots, y_{l+1}, y_{l-1}^0, \dots, y_1^0\}) \prod_{l \ne 2} g_l(y_l \mid y_{-l}^0)\, g_2^*(y_2 \mid y_{-2}^0) \prod_{l} dy_l}.$$

However, the order of the odds ratio combination in this expression cannot be changed arbitrarily without changing the joint model.

It can be seen that $p^*$ is a joint model compatible with all of the conditionally specified models except $g_2(y_2 \mid y_{-2})$. Note that we have imposed the change of $g_2(y_2 \mid y_{-2}^0)$ here; it is also possible to let the observed data determine which distribution to use. It can also be seen from the above arguments that when the conditionally specified models are close to compatible, the modifications to the original models can be kept small in making them compatible, so that most features of the original specification are retained. On the other hand, if the conditionally specified models are highly in conflict, many modifications may be needed to achieve compatibility. Even when some features of the conditionally specified models must be eliminated in the modifications, Theorem 1 gives us flexibility in choosing which features to keep. Theorem 1 also suggests other modifications for the example. For instance, if the order of Y1 and Y2 in the odds ratio combination of the last displayed expression is switched, the constructed joint model becomes the joint normal model with the second-order term removed. This construction may be unfavorable given the importance of the second-order term in the conditional model.

The foregoing example provides us with a general strategy to modify conditionally specified models to achieve compatibility and to obtain the joint models. In summary, when the conditionally specified models are incompatible, the following guidelines may be applied to the modification of the incompatible conditional densities to achieve compatibility.

  1. We first modify the odds ratio functions of the conditionally specified models. In doing so, we can choose the important features in the models and try to keep as many of them as possible in the joint model.

  2. To make the integral in the denominator of the potential joint density finite, we may need to modify the conditional density at a fixed condition also.

From a practical point of view, we may leave the conditional densities at a fixed condition unspecified. Such a modification can generate many interesting models for fitting the data, and the interpretation of those models can also be attractive in practice.

5 Discussion

We proposed to study the compatibility issue in conditionally specified models in the framework of the odds ratio representation of a joint density. The framework allows us to obtain the simplest conditions to check in verifying the compatibility of a given set of conditionally specified models. It also suggests ways to construct joint models that are partially or fully compatible with the conditional models, and modified conditionally specified models can be obtained from the constructed joint models. Note that, in the definitions and derivations, each $y_j$, j = 1, ···, t, can be a vector, so the results apply equally well to blocks of variables in the conditional specification. We have concentrated mainly on fully conditionally specified models; the proposed framework can also be applied to the compatibility issue in more complicated model specifications, such as partially conditional specification and overlapping-block conditional specification. The framework applies to variables that are discrete, continuous, or a mixture of both.


References

  1. Arnold BC, Castillo E, Sarabia JM. Conditionally specified distributions: an introduction (with discussion). Statistical Science. 2001;16:249–274.
  2. Arnold BC, Gokhale DV. On uniform marginal representation of contingency tables. Statistics and Probability Letters. 1994;21:311–316.
  3. Arnold BC, Press S. Compatible conditional distributions. Journal of the American Statistical Association. 1989;84:152–156.
  4. Besag J. Spatial interaction and the statistical analysis of lattice systems (with discussion). Journal of the Royal Statistical Society, Ser. B. 1974;36:192–236.
  5. Besag JE. On the statistical analysis of dirty pictures (with discussion). Journal of the Royal Statistical Society, Ser. B. 1986;48:259–302.
  6. Besag JE, Kooperberg C. On conditional and intrinsic autoregressions. Biometrika. 1995;82:733–746.
  7. Chen HY. A note on prospective analysis of outcome-dependent samples. Journal of the Royal Statistical Society, Ser. B. 2003:575–584.
  8. Chen HY. Nonparametric and semiparametric models for missing covariates in parametric regressions. Journal of the American Statistical Association. 2004;99:1176–1189.
  9. Chen HY. A semiparametric odds ratio model for measuring association. Biometrics. 2007;63:413–421. doi:10.1111/j.1541-0420.2006.00701.x.
  10. Gourieroux C, Montfort A. On the characterization of a joint probability distribution by conditional distributions. Journal of Econometrics. 1979;10:115–118.
  11. Hobert JP, Casella G. Functional compatibility, Markov chains, and Gibbs sampling with improper posteriors. Journal of Computational and Graphical Statistics. 1998;7:42–60.
  12. Joe H. Multivariate Models and Dependence Concepts. New York: Chapman and Hall; 1997.
  13. Osius G. The association between two random elements: a complete characterization and odds ratio models. Metrika. 2005;60:261–277.
  14. Raghunathan TE, Lepkowski JM, van Hoewyk J, Solenberger P. A multivariate technique for multiply imputing missing values using a sequence of regression models. Survey Methodology. 2001;27:85–95.
  15. Van Buuren S, Boshuizen HC, Knook DL. Multiple imputation of missing blood pressure covariates in survival analysis. Statistics in Medicine. 1999;18:681–694. doi:10.1002/(sici)1097-0258(19990330)18:6<681::aid-sim71>3.0.co;2-r.
  16. Van Buuren S. Multiple imputation of discrete and continuous data by fully conditional specification. Statistical Methods in Medical Research. 2007;16:219–242. doi:10.1177/0962280206074463.
  17. Wang YJ, Ip EH. Conditionally specified continuous distributions. Biometrika. 2008;95:735–746.
