Sequential approach to Bayesian linear inverse problems in reservoir modeling using Gaussian mixture models Dario Grana Department of Geophysics Stanford University Tapan Mukerji Department of Energy Resources Engineering Stanford University
Abstract
We present here a method for generating realizations of the posterior probability density
function of a Gaussian Mixture linear inverse problem in the combined discrete‐continuous
case. This task is achieved by extending the sequential simulations method to the mixed
discrete‐continuous problem. The sequential approach allows us to generate a Gaussian
Mixture random field that honors the covariance functions of the continuous property and the
available observed data. The traditional inverse theory results, well known for the Gaussian
case, are first summarized for Gaussian Mixture models: in particular, the analytical expressions for the means, covariance matrices, and weights of the conditional probability density function are derived. However, the computation of the weights of the conditional distribution requires the
evaluation of the probability density function values of a multivariate Gaussian distribution at each conditioning point. As an alternative solution to the Bayesian inverse Gaussian Mixture problem, we then introduce the sequential approach to inverse problems and extend it to the
Gaussian Mixture case. The Sequential Gaussian Mixture Simulation (SGMixSim) approach is
presented as a particular case of the linear inverse Gaussian Mixture problem, where the linear
operator is the identity. Similar to the Gaussian case, in Sequential Gaussian Mixture Simulation
the means and the covariance matrices of the conditional distribution at a given point
correspond to the kriging estimate, component by component, of the mixture. Furthermore,
Sequential Gaussian Mixture Simulation can be conditioned by secondary information to
account for non‐stationarity. Examples of applications with synthetic and real data are presented in the reservoir modeling domain, where realizations of facies distribution and reservoir properties, such as porosity or net‐to‐gross, are obtained using the Sequential Gaussian Mixture Simulation approach. In these examples, reservoir properties are assumed to be distributed as a Gaussian Mixture model. In particular, reservoir properties are Gaussian within each facies, and the weights of the mixture are identified with the point‐wise probability of the facies.
Introduction
Inverse problems are common in many different domains such as physics, engineering, and
earth sciences. In general, solving an inverse problem consists of estimating the model
parameters given a set of observed data. The operator that links the model and the data can be
linear or nonlinear. In the linear case, estimation techniques generally provide smoothed
solutions. Kriging, for example, provides the best estimate of the model in the least‐squares
sense. Simple kriging is in fact identical to a linear Gaussian inverse problem where the linear operator is the identity, in which the posterior mean and covariance matrices are estimated from direct observations of the model space. Monte Carlo methods can be applied as well to solve
inverse problems (Mosegaard and Tarantola, 1995) in a Bayesian framework to sample from
the posterior; but standard sampling methodologies can be inefficient in practical applications.
Sequential simulations have been introduced in geostatistics to generate high‐resolution models and to provide a number of realizations of the posterior probability function honoring both the prior information and the observed values. Deutsch and Journel (1992) and Goovaerts (1997) give detailed descriptions of kriging and sequential simulation methods. Hansen et al. (2006) propose a methodology that applies sequential simulations to linear Gaussian inverse problems to incorporate the prior information on the model and honor the observed data. We propose here to extend the approach of Hansen et al. (2006) to the Gaussian Mixture case.
Gaussian Mixture models are convex combinations of Gaussian components that can be used to
describe the multi‐modal behavior of the model and the data. Sung (2004), for instance,
introduces Gaussian Mixture distributions in multivariate nonlinear regression modeling; while
Hastie and Tibshirani (1996) propose mixture discriminant analysis as an extension of linear discriminant analysis using Gaussian Mixtures and the Expectation‐Maximization algorithm (Hastie et al., 2009). Gaussian Mixture models are common in statistics (see, for example,
Hasselblad, 1966 and Dempster et al., 1977) and they have been used in different domains:
digital signal processing (Reynolds et al., 2000 and Gilardi et al., 2002), engineering (Alspach
and Sorenson, 1972), geophysics (Grana and Della Rossa, 2010) and reservoir history matching
(Dovera and Della Rossa, 2011). In this paper we first present the extension of the traditional
results valid in the Gaussian case to the Gaussian Mixture case; we then propose the sequential
approach to linear inverse problems under the assumption of Gaussian Mixture distribution;
and we finally show some examples of applications in reservoir modeling. If the linear operator
is the identity, then the methodology provides an extension of the traditional Sequential
Gaussian Simulation (SGSim, see Deutsch and Journel, 1992 and Goovaerts, 1997) to a new
methodology that we call Sequential Gaussian Mixture Simulation (SGMixSim). The applications
we propose refer to mixed discrete‐continuous problems of reservoir modeling, and they provide, as their main result, sets of models of reservoir facies and porosity. The key point of the
application is that we identify the weights of the Gaussian Mixture describing the continuous
random variable (porosity) with the probability of the reservoir facies (discrete variable).
Theory: Linearized Gaussian Mixture Inversion
In this section we provide the main propositions of linear inverse problems with Gaussian
Mixtures (GMs). We first recap the well‐known analytical result for posterior distributions of
linear inverse problems with Gaussian prior; then we extend the result to the Gaussian
Mixture case. In the Gaussian case, the solution of the linear inverse problem is well‐known
(Tarantola, 2005). If $\mathbf{m}$ is a Gaussian random vector, $\mathbf{m} \sim N(\boldsymbol{\mu}_m, \boldsymbol{\Sigma}_m)$, with mean $\boldsymbol{\mu}_m$ and covariance $\boldsymbol{\Sigma}_m$; and $\mathbf{G}$ is a linear operator that transforms the model $\mathbf{m}$ into the observable data $\mathbf{d}$,

$$\mathbf{d} = \mathbf{G}\mathbf{m} + \boldsymbol{\varepsilon} \quad (1)$$

where $\boldsymbol{\varepsilon}$ is a random vector that represents an error with Gaussian distribution $N(\mathbf{0}, \boldsymbol{\Sigma}_\varepsilon)$, independent of the model $\mathbf{m}$; then the posterior conditional distribution of $\mathbf{m}\,|\,\mathbf{d}$ is Gaussian with mean and covariance given by

$$\boldsymbol{\mu}_{m|d} = \boldsymbol{\mu}_m + \boldsymbol{\Sigma}_m \mathbf{G}^T \left(\mathbf{G}\boldsymbol{\Sigma}_m \mathbf{G}^T + \boldsymbol{\Sigma}_\varepsilon\right)^{-1} \left(\mathbf{d} - \mathbf{G}\boldsymbol{\mu}_m\right) \quad (2)$$

$$\boldsymbol{\Sigma}_{m|d} = \boldsymbol{\Sigma}_m - \boldsymbol{\Sigma}_m \mathbf{G}^T \left(\mathbf{G}\boldsymbol{\Sigma}_m \mathbf{G}^T + \boldsymbol{\Sigma}_\varepsilon\right)^{-1} \mathbf{G}\boldsymbol{\Sigma}_m \quad (3)$$
This result is based on two well‐known properties of Gaussian distributions: (A) the linear transform of a Gaussian distribution is again Gaussian; (B) if the joint distribution of $(\mathbf{m}, \mathbf{d})$ is Gaussian, then the conditional distribution $\mathbf{m}\,|\,\mathbf{d}$ is again Gaussian. These two properties can be extended to the Gaussian Mixture case. We assume that $\mathbf{x}$ is a random vector distributed according to a Gaussian Mixture with $N_c$ components, $f(\mathbf{x}) = \sum_{k=1}^{N_c} \pi_k N(\mathbf{x}; \boldsymbol{\mu}_x^k, \boldsymbol{\Sigma}_x^k)$, where the $\pi_k$ are the weights and the distributions $N(\mathbf{x}; \boldsymbol{\mu}_x^k, \boldsymbol{\Sigma}_x^k)$ represent the Gaussian components with means $\boldsymbol{\mu}_x^k$ and covariances $\boldsymbol{\Sigma}_x^k$ evaluated in $\mathbf{x}$. By applying property (A) to the Gaussian components
of the mixture, we can conclude that, if $\mathbf{L}$ is a linear operator, then $\mathbf{y} = \mathbf{L}\mathbf{x}$ is distributed according to a Gaussian Mixture. Moreover, the pdf of $\mathbf{y}$ is given by $f(\mathbf{y}) = \sum_{k=1}^{N_c} \pi_k N(\mathbf{y}; \mathbf{L}\boldsymbol{\mu}_x^k, \mathbf{L}\boldsymbol{\Sigma}_x^k \mathbf{L}^T)$. Similarly, we can extend property (B) to conditional Gaussian
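Property (A) amounts to mapping each component separately while leaving the weights untouched; a minimal sketch (our own helper, not from the paper):

```python
import numpy as np

def gm_linear_transform(weights, means, covs, L):
    """Property (A) for mixtures: if x ~ sum_k pi_k N(mu_k, S_k), then
    y = L x ~ sum_k pi_k N(L mu_k, L S_k L^T); the weights are unchanged."""
    return (list(weights),
            [L @ mu for mu in means],
            [L @ S @ L.T for S in covs])
```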
Mixture distributions. The well‐known result for the conditional multivariate Gaussian distribution has already been extended to multivariate Gaussian Mixture models (see, for example, Alspach and Sorenson, 1972). In particular, if $(\mathbf{x}_1, \mathbf{x}_2)$ is a random vector whose joint distribution is a Gaussian Mixture,

$$f(\mathbf{x}_1, \mathbf{x}_2) = \sum_{k=1}^{N_c} \pi_k f_k(\mathbf{x}_1, \mathbf{x}_2) \quad (4)$$

where the $f_k$ are the Gaussian densities, then the conditional distribution of $\mathbf{x}_2\,|\,\mathbf{x}_1$ is again a Gaussian Mixture,

$$f(\mathbf{x}_2\,|\,\mathbf{x}_1) = \sum_{k=1}^{N_c} \lambda_k f_k(\mathbf{x}_2\,|\,\mathbf{x}_1) \quad (5)$$

and its parameters (weights, means, and covariance matrices) can be analytically derived. The coefficients $\lambda_k$ are given by

$$\lambda_k = \frac{\pi_k f_k(\mathbf{x}_1)}{\sum_{j=1}^{N_c} \pi_j f_j(\mathbf{x}_1)}, \qquad f_k(\mathbf{x}_1) = N(\mathbf{x}_1; \boldsymbol{\mu}_{x_1}^k, \boldsymbol{\Sigma}_{x_1}^k) \quad (6)$$

and the means and the covariance matrices are

$$\boldsymbol{\mu}_{x_2|x_1}^k = \boldsymbol{\mu}_{x_2}^k + \boldsymbol{\Sigma}_{x_2,x_1}^k \left(\boldsymbol{\Sigma}_{x_1}^k\right)^{-1} \left(\mathbf{x}_1 - \boldsymbol{\mu}_{x_1}^k\right) \quad (7)$$

$$\boldsymbol{\Sigma}_{x_2|x_1}^k = \boldsymbol{\Sigma}_{x_2}^k - \boldsymbol{\Sigma}_{x_2,x_1}^k \left(\boldsymbol{\Sigma}_{x_1}^k\right)^{-1} \left(\boldsymbol{\Sigma}_{x_2,x_1}^k\right)^T \quad (8)$$

where $\boldsymbol{\Sigma}_{x_2,x_1}^k$ is the cross‐covariance matrix. By combining these propositions, the main result of linear inverse problems with Gaussian Mixtures can be derived.
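The conditioning step of equations (6)–(8) translates directly into NumPy; the helper names below are our own, and the Gaussian density is written out explicitly.

```python
import numpy as np

def gauss_pdf(x, mu, S):
    """Multivariate Gaussian density N(x; mu, S)."""
    r = np.atleast_1d(x - mu)
    S = np.atleast_2d(S)
    q = r @ np.linalg.solve(S, r)
    return np.exp(-0.5 * q) / np.sqrt((2.0 * np.pi) ** r.size * np.linalg.det(S))

def gm_conditional(weights, mu1, mu2, S11, S22, S21, x1):
    """Conditional mixture of x2 | x1 (eqs. 6-8): per-component Gaussian
    conditioning plus the Bayesian update of the weights."""
    lam = np.array([w * gauss_pdf(x1, m, S) for w, m, S in zip(weights, mu1, S11)])
    lam = lam / lam.sum()                                          # eq. 6
    mu_c = [mu2[k] + S21[k] @ np.linalg.solve(S11[k], x1 - mu1[k])
            for k in range(len(weights))]                          # eq. 7
    S_c = [S22[k] - S21[k] @ np.linalg.solve(S11[k], S21[k].T)
           for k in range(len(weights))]                           # eq. 8
    return lam, mu_c, S_c
```

Note that when the cross-covariance of a component is zero, its conditional mean and covariance are unchanged; only the weights are updated.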
Theorem 1. Let $\mathbf{m}$ be a random vector distributed according to a Gaussian Mixture, $f(\mathbf{m}) = \sum_{k=1}^{N_c} \pi_k N(\mathbf{m}; \boldsymbol{\mu}_m^k, \boldsymbol{\Sigma}_m^k)$, with $N_c$ components and with means $\boldsymbol{\mu}_m^k$, covariances $\boldsymbol{\Sigma}_m^k$, and weights $\pi_k$, for $k = 1, \dots, N_c$. Let $\mathbf{G}$ be a linear operator, and $\boldsymbol{\varepsilon}$ a Gaussian random vector independent of $\mathbf{m}$ with zero mean and covariance $\boldsymbol{\Sigma}_\varepsilon$, such that $\mathbf{d} = \mathbf{G}\mathbf{m} + \boldsymbol{\varepsilon}$; then the posterior conditional distribution $\mathbf{m}\,|\,\mathbf{d}$ is a Gaussian Mixture.
Moreover, the posterior means and covariances of the components are given by

$$\boldsymbol{\mu}_{m|d}^k = \boldsymbol{\mu}_m^k + \boldsymbol{\Sigma}_m^k \mathbf{G}^T \left(\mathbf{G}\boldsymbol{\Sigma}_m^k \mathbf{G}^T + \boldsymbol{\Sigma}_\varepsilon\right)^{-1} \left(\mathbf{d} - \mathbf{G}\boldsymbol{\mu}_m^k\right) \quad (9)$$

$$\boldsymbol{\Sigma}_{m|d}^k = \boldsymbol{\Sigma}_m^k - \boldsymbol{\Sigma}_m^k \mathbf{G}^T \left(\mathbf{G}\boldsymbol{\Sigma}_m^k \mathbf{G}^T + \boldsymbol{\Sigma}_\varepsilon\right)^{-1} \mathbf{G}\boldsymbol{\Sigma}_m^k \quad (10)$$

where $\boldsymbol{\mu}_m^k$ and $\boldsymbol{\Sigma}_m^k$ are, respectively, the prior mean and covariance of the $k$th Gaussian component of $\mathbf{m}$. The posterior coefficients $\lambda_k$ of the mixture are given by

$$\lambda_k = \frac{\pi_k f_k(\mathbf{d})}{\sum_{j=1}^{N_c} \pi_j f_j(\mathbf{d})} \quad (11)$$

where the Gaussian densities $f_k(\mathbf{d})$ have means $\boldsymbol{\mu}_d^k = \mathbf{G}\boldsymbol{\mu}_m^k$ and covariances $\boldsymbol{\Sigma}_d^k = \mathbf{G}\boldsymbol{\Sigma}_m^k \mathbf{G}^T + \boldsymbol{\Sigma}_\varepsilon$.
Theory: Sequential Approach
Based on the results presented in the previous section, we introduce here the sequential
approach to linearized inversion in the Gaussian Mixture case. The main result for the Gaussian
case is presented in Hansen et al. (2006). The solution of the linear inverse problem with the sequential approach requires some additional notation. Let $m_i$ represent the $i$th element of the random vector $\mathbf{m}$, and let $\mathbf{m}_s$ represent a known sub‐vector of $\mathbf{m}$, extracted by a selection matrix $\mathbf{A}$ so that $\mathbf{m}_s = \mathbf{A}\mathbf{m}$; similarly, $m_i = \mathbf{A}_i \mathbf{m}$, where $\mathbf{A}_i$ selects the $i$th element. This notation will generally be used to describe the neighborhood of $m_i$ in the context of sequential simulations. Finally, we assume that the measured data $\mathbf{d}$ are known, having been obtained as a linear transformation of $\mathbf{m}$ according to some linear operator $\mathbf{G}$.
Theorem 2. Let $\mathbf{m}$ be a random vector distributed according to a Gaussian Mixture, $f(\mathbf{m}) = \sum_{k=1}^{N_c} \pi_k N(\mathbf{m}; \boldsymbol{\mu}_m^k, \boldsymbol{\Sigma}_m^k)$, with $N_c$ components and with means $\boldsymbol{\mu}_m^k$, covariances $\boldsymbol{\Sigma}_m^k$, and weights $\pi_k$, for $k = 1, \dots, N_c$. Let $\mathbf{G}$ be a linear operator between the model $\mathbf{m}$ and the random data vector $\mathbf{d}$ such that $\mathbf{d} = \mathbf{G}\mathbf{m} + \boldsymbol{\varepsilon}$, with $\boldsymbol{\varepsilon}$ a random error vector independent of $\mathbf{m}$ with zero mean and covariance $\boldsymbol{\Sigma}_\varepsilon$. Let $\mathbf{m}_s = \mathbf{A}\mathbf{m}$ be the sub‐vector with direct observations of the model $\mathbf{m}$, and $m_i = \mathbf{A}_i \mathbf{m}$ the $i$th element of $\mathbf{m}$. Then the conditional distribution of $m_i\,|\,(\mathbf{m}_s, \mathbf{d})$ is again a Gaussian Mixture. Moreover, the means and variances of the components of the posterior conditional distribution are

$$\mu_{m_i|(\mathbf{m}_s,\mathbf{d})}^k = \mu_{m_i}^k + \left[\mathbf{A}_i \boldsymbol{\Sigma}_m^k \mathbf{A}^T,\; \mathbf{A}_i \boldsymbol{\Sigma}_m^k \mathbf{G}^T\right] \left(\boldsymbol{\Sigma}_{(\mathbf{m}_s,\mathbf{d})}^k\right)^{-1} \begin{bmatrix} \mathbf{m}_s - \mathbf{A}\boldsymbol{\mu}_m^k \\ \mathbf{d} - \mathbf{G}\boldsymbol{\mu}_m^k \end{bmatrix} \quad (12)$$
$$\left(\sigma_{m_i|(\mathbf{m}_s,\mathbf{d})}^k\right)^2 = \left(\sigma_{m_i}^k\right)^2 - \left[\mathbf{A}_i \boldsymbol{\Sigma}_m^k \mathbf{A}^T,\; \mathbf{A}_i \boldsymbol{\Sigma}_m^k \mathbf{G}^T\right] \left(\boldsymbol{\Sigma}_{(\mathbf{m}_s,\mathbf{d})}^k\right)^{-1} \begin{bmatrix} \mathbf{A}\boldsymbol{\Sigma}_m^k \mathbf{A}_i^T \\ \mathbf{G}\boldsymbol{\Sigma}_m^k \mathbf{A}_i^T \end{bmatrix} \quad (13)$$

where

$$\mu_{m_i}^k = \mathbf{A}_i \boldsymbol{\mu}_m^k, \qquad \left(\sigma_{m_i}^k\right)^2 = \mathbf{A}_i \boldsymbol{\Sigma}_m^k \mathbf{A}_i^T \quad (14)$$

and

$$\boldsymbol{\Sigma}_{(\mathbf{m}_s,\mathbf{d})}^k = \begin{bmatrix} \mathbf{A}\boldsymbol{\Sigma}_m^k \mathbf{A}^T & \mathbf{A}\boldsymbol{\Sigma}_m^k \mathbf{G}^T \\ \mathbf{G}\boldsymbol{\Sigma}_m^k \mathbf{A}^T & \mathbf{G}\boldsymbol{\Sigma}_m^k \mathbf{G}^T + \boldsymbol{\Sigma}_\varepsilon \end{bmatrix} \quad (15)$$

The posterior coefficients of the mixture are given by

$$\lambda_k = \frac{\pi_k f_k(\mathbf{m}_s, \mathbf{d})}{\sum_{j=1}^{N_c} \pi_j f_j(\mathbf{m}_s, \mathbf{d})} \quad (16)$$

where the Gaussian components $f_k(\mathbf{m}_s, \mathbf{d})$ have means

$$\boldsymbol{\mu}_{(\mathbf{m}_s,\mathbf{d})}^k = \begin{bmatrix} \mathbf{A}\boldsymbol{\mu}_m^k \\ \mathbf{G}\boldsymbol{\mu}_m^k \end{bmatrix} \quad (17)$$

and covariances $\boldsymbol{\Sigma}_{(\mathbf{m}_s,\mathbf{d})}^k$.
In the case where the linear operator is the identity, the associated inverse problem reduces to
the estimation of a Gaussian Mixture model with direct observations of the model space at
given locations. In other words, if the linear operator is the identity, the theorem provides an
extension of the traditional Sequential Gaussian Simulation (SGSim) to the Gaussian Mixture
case. We call this methodology Sequential Gaussian Mixture Simulation (SGMixSim), and we
show some applications in the next section.
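The sequential loop just described can be sketched in a few lines. This is a deliberately minimal 1-D version, with assumptions stated up front: identity operator, no hard data, a one-point search neighborhood (the nearest previously simulated node), and an exponential correlation standing in for the paper's spherical variogram; all names are our own, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

def corr(h, r=4.0):
    """Exponential correlation, standing in for the paper's spherical
    variogram model (range r in grid blocks)."""
    return np.exp(-3.0 * abs(h) / r)

def sgmixsim_1d(n, weights, means, var):
    """Minimal 1-D SGMixSim sketch. At each node of a random path, the
    conditional mixture is built component by component (simple kriging of
    mean and variance) and the weights are updated with the component
    densities at the conditioning value; a facies and a value are drawn."""
    path = rng.permutation(n)
    vals = np.full(n, np.nan)
    fac = np.zeros(n, dtype=int)
    visited = []
    for i in path:
        if not visited:
            w = np.array(weights, dtype=float)
            mu = np.array(means, dtype=float)
            v = np.full(len(means), var)
        else:
            j = min(visited, key=lambda q: abs(q - i))   # nearest simulated node
            rho = corr(abs(j - i))
            # per-component simple kriging (eqs. 7-8 with scalar blocks)
            mu = np.array([m + rho * (vals[j] - m) for m in means])
            v = np.full(len(means), var * (1.0 - rho ** 2))
            # weight update (eq. 6); the common 1/sqrt(2 pi var) factor
            # cancels in the normalization
            w = np.array(weights) * np.array(
                [np.exp(-0.5 * (vals[j] - m) ** 2 / var) for m in means])
            w = w / w.sum()
        k = rng.choice(len(w), p=w)                      # draw the facies
        fac[i] = k
        vals[i] = rng.normal(mu[k], np.sqrt(v[k]))       # draw the value
        visited.append(i)
    return vals, fac
```

A full implementation would use a multi-point neighborhood and general $\mathbf{G}$, in which case the scalar kriging step above is replaced by the block expressions of equations (12)–(17).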
Application
We describe here some examples of applications with synthetic and real data, in the context of
reservoir modeling. First, we present the results of the estimation of a Gaussian Mixture model
with direct observations of the model space as a special case of Theorem 2 (SGMixSim). In our
example, the continuous property is the porosity of a reservoir, and the discrete variable
represents the corresponding reservoir facies, namely shale and sand. This means that we
identify the weights of the mixture components with the facies probabilities. The input
parameters are then the prior distribution of porosity and a variogram model for each
component of the mixture. The prior is a Gaussian Mixture model with two components and its
parameters are the weights, the means, and the covariance matrices of the Gaussian
components. We assume facies prior probabilities equal to 0.4 and 0.6 respectively, and for
simplicity we assume the same variogram model (spherical and isotropic) with the same
parameters for both. We then simulate a 2D map of facies and porosity according to the
proposed methodology (Figure 1). The simulation grid is 70 × 70 and the variogram range of
porosity is 4 grid blocks in both directions. The simulation can be performed with or without
conditioning hard data; in the example of Figure 1, we introduced four porosity values at four
locations that are used to condition the simulations, and we generated a set of 100 conditional
realizations (Figure 1). When hard data are assigned, the weights of the mixture components are determined by evaluating the prior Gaussian components at the hard data locations, and the discrete property values are determined by selecting the most likely component.
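The hard-data rule above (prior weights times the component densities at the datum, then the most likely component) can be sketched as follows; the function name and the example porosity values in the usage note are hypothetical, for illustration only.

```python
import numpy as np

def classify_hard_datum(value, weights, means, variances):
    """Facies assignment at a hard data location: mixture weights are the
    prior weights times the Gaussian component densities at the datum
    value; the facies is the most likely component."""
    dens = np.array([w * np.exp(-0.5 * (value - m) ** 2 / v) / np.sqrt(2.0 * np.pi * v)
                     for w, m, v in zip(weights, means, variances)])
    post = dens / dens.sum()
    return int(np.argmax(post)), post
```

For instance, with (hypothetical) shale and sand porosity components centered at 0.05 and 0.25, a hard datum of 0.07 falls in the shale component.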
As we previously mentioned, the methodology is similar to that of Hansen et al. (2006), but the use of Gaussian Mixture models allows us to describe the multi‐modality of the data and to simulate both the continuous and the discrete variable at the same time. SGMixSim requires a spatial
model of the continuous variable, but not a spatial model of the underlying discrete variable:
the spatial distribution of the discrete variable only depends on the conditional weights of the
mixture. However, if the mixture components have very different probabilities and very different variances (i.e., when components with relatively low probability have relatively high variances), the simulations may not accurately reproduce the global statistics. If we assume, for instance, two components with prior probabilities equal to 0.2 and 0.8, and we assume at the same time that the variance of the first component is much larger than the variance of the second one, then the prior proportions may not be honored. This problem is intrinsic to the sequential simulation approach, but it is emphasized in the case of multi‐modal data. For large
datasets or for reasons of stationarity, we often use a moving searching neighborhood to take
into account only the points closest to the location being simulated (Goovaerts, 1997). If we use a global searching neighborhood (i.e., the whole grid), the computational time could significantly increase for large datasets. In the localized sequential algorithm, the neighborhood is
selected according to a fixed geometry (for example, ellipsoids centered on the location to be
estimated) and the conditioning data are extracted by the linear operator (Theorem 2) within
the neighborhood. When no hard data are present in the searching neighborhood and the
sample value is drawn from the prior distribution, the algorithm could generate isolated points within the simulation grid. For example, a point drawn from the first component could be surrounded by data, subsequently simulated, belonging to the second component, or vice versa. This problem is particularly relevant in the case of multi‐modal data, especially in the initial steps of the sequential simulation (in other words, when only a few values have been previously simulated) and when the searching neighborhood is small.
Figure 1: Conditional realizations of porosity and reservoir facies obtained by SGMixSim. The prior distribution of porosity and the hard data values are shown on top. The second and third rows show three realizations of porosity and facies (gray is shale, yellow is sand). The fourth row shows the posterior distribution of facies and the ensemble average of 100 realizations of facies and porosity.
To avoid isolated points in the simulated grid, a post‐processing step has been included (Figure
2). The simulation path is first revisited, and the local conditional probabilities are re‐evaluated
at all the grid cells where the sample value was drawn from the prior distribution. Then the component is drawn again from the weights of the re‐evaluated conditional probability. Finally, we introduce a kriging correction of the continuous property values that had low probabilities in the neighborhood.
Figure 2: Comparison of SGMixSim results shown in Figure 1 with and without post‐processing.
Next, we show two applications of linearized sequential inversion with Gaussian Mixture
models obtained by applying Theorem 2. The first example is a rock physics inverse problem
dealing with the inversion of acoustic impedance in terms of porosity. The methodology
application is illustrated by using a 2D grid representing a synthetic system of reservoir
channels (Figure 3). In this example we made the same assumptions about the prior distribution
as in the previous example. As in traditional sequential simulation approaches, the spatial
continuity of the inverted data depends on the range of the variogram and the size of the
searching neighborhood; however, Figure 3 clearly shows the multi‐modality of the inverted
data. Gaussian Mixture models can not only describe the multi‐modality of the data, but also better honor the data correlation within each facies.
Figure 3: Linearized sequential inversion with Gaussian Mixture models for the estimation of porosity map from acoustic impedance values. On top we show the true porosity map and the acoustic impedance map; on the bottom we show the inverted porosity and the estimated facies map.
The second example is the acoustic inversion of seismic amplitudes in terms of acoustic
impedance. In this case, in addition to the usual input parameters (prior distribution and
variogram models), we have to specify a low frequency model of impedance, since seismic
amplitudes only provide relative information about elastic contrasts and the absolute value of
impedance must be computed by combining the estimated relative changes with the low
frequency model (often called prior model in seismic modeling). Once again, the discrete
variable is identified with the reservoir facies classification. In this case shales are characterized
by high impedance values, and sands by low impedances. The results are shown in Figure 4. We observe that even though we used a very smooth low frequency model, the inverted impedance log matches the actual data well (Figure 4), and the prediction of the
discrete variable is satisfactory compared to the actual facies classification performed at the
well. In particular, if we perform 50 realizations and compute the maximum a posteriori estimate of the ensemble of inverted facies profiles, we match the actual classification exactly (Figure 4).
However, the quality of the results depends on the separability of the Gaussian components in
the continuous property domain.
Figure 4: Sequential Gaussian Mixture inversion of seismic data (ensemble of 50 realizations). From left to right: acoustic impedance logs and seismograms (actual model in red, realization 1 in blue, inverted realizations in gray, dashed line represents low frequency model), inverted facies profile corresponding to realization 1, maximum a posteriori of 50 inverted facies profiles and actual facies classification (sand in yellow, shale in gray).
Finally we applied the Gaussian Mixture linearized sequential inversion to a layer map extracted
from a 3D geophysical model of a clastic reservoir located in the North Sea (Figure 5). The
application has been performed on a map of P‐wave velocity corresponding to the top horizon
of the reservoir. The parameters of the variogram models have been assumed from existing
reservoir studies in the same area. In Figure 5 we show the map of the conditioning velocity
and the corresponding histogram, two realizations of porosity and facies, and the histogram of
the posterior distribution of porosity derived from the second realization. The two realizations
have been performed using different prior proportions: 30 % of sand in the first realization and
40 % in the second one. Both realizations honor the expected proportions, the multi‐modality
of the data, and the correlations with the conditioning data within each facies.
Figure 5: Application of linearized sequential inversion with Gaussian Mixture models to a reservoir layer. The conditioning data is P‐wave velocity (top left). Two realizations of porosity and facies are shown: realization 1 corresponds to a prior proportion of 30 % of sand, realization 2 corresponds to 40 % of sand. The histograms of the conditioning data and the posterior distribution of porosity (realization 2) are shown for comparison.
Conclusions
In this paper, we proposed a methodology to simultaneously simulate both continuous and
discrete properties by using Gaussian Mixture models. The method is based on the sequential
approach to the Gaussian Mixture linear inverse problem, and it can be seen as an extension of sequential simulations to multi‐modal data. Thanks to the sequential approach used for the inversion, the method is computationally efficient for solving multi‐modal linear inverse problems; it is applied here to reservoir modeling and seismic reservoir characterization. We presented four different applications: conditional simulations of
porosity and facies, porosity‐impedance inversion, acoustic inversion of seismic data, and
inversion of seismic velocities in terms of porosity. The proposed examples show that we can
generate actual samples from the posterior distribution, consistent with the prior information
and the assigned data observations. Using the sequential approach, we can generate a large
number of samples from the posterior distribution, which in fact are all solutions to the
Gaussian Mixture linear problem.
Acknowledgements
We would like to thank Ernesto Della Rossa and Laura Dovera (Eni E&P) for the helpful
collaboration.
References
Alspach, D.L., and Sorenson, H.W., 1972, Nonlinear Bayesian estimation using Gaussian sum approximation, IEEE Transactions on Automatic Control, 17, 439–448.
Dempster, A.P., Laird, N.M., and Rubin, D.B., 1977, Maximum likelihood from incomplete data via the EM algorithm, Journal of the Royal Statistical Society, Series B (Methodological), 39(1), 1–38.
Deutsch, C., and Journel, A.G., 1992, GSLIB: geostatistical software library and user’s guide, Oxford University Press, London.
Dovera, L., and Della Rossa, E., 2011, Multimodal ensemble Kalman filtering using Gaussian mixture models, Computational Geosciences, 15(2), 307–323.
Gilardi, N., Bengio, S., and Kanevski, M., 2002, Conditional Gaussian mixture models for environmental risk mapping, Proceedings of IEEE workshop on neural networks for signal processing, 777–786.
Goovaerts, P., 1997, Geostatistics for natural resources evaluation, Oxford University Press, London.
Grana, D., and Della Rossa, E., 2010, Probabilistic petrophysical‐properties estimation integrating statistical rock physics with seismic inversion, Geophysics, 75(3), O21–O37.
Hansen, T.M., Journel, A.G., Tarantola, A., and Mosegaard, K., 2006, Linear inverse Gaussian theory and geostatistics, Geophysics, 71, R101–R111.
Hasselblad, V., 1966, Estimation of parameters for a mixture of normal distributions, Technometrics 8(3), 431–444.
Hastie, T., and Tibshirani, R., 1996, Discriminant analysis by Gaussian mixtures, Journal of the Royal Statistical Society, Series B (Methodological), 58(1), 155–176.
Hastie, T., Tibshirani, R., and Friedman, J., 2009, The elements of statistical learning, 2nd ed., Springer, New York.
Mosegaard, K., and Tarantola, A., 1995, Monte Carlo sampling of solutions to inverse problems, Journal of Geophysical Research, 100, 12431–12447.
Reynolds, D.A., Quatieri, T.F., and Dunn, R.B., 2000, Speaker verification using adapted Gaussian mixture models, Digital Signal Processing, 10(1–3), 19–41.
Sung, H.G., 2004, Gaussian mixture regression and classification, PhD thesis, Rice University.
Tarantola, A., 2005, Inverse problem theory, SIAM, Philadelphia.