
Original article

Bayesian analysis of genetic change due to selection using Gibbs sampling

DA Sorensen CS Wang J Jensen D Gianola

1 National Institute of Animal Science, Research Center Foulum, PB 39, DK-8830 Tjele, Denmark;

2 Department of Meat and Animal Science, University of Wisconsin-Madison,

Madison, WI 53706-1284, USA

(Received 7 June 1993; accepted 10 February 1994)

Summary - A method of analysing response to selection using a Bayesian perspective is presented. The following measures of response to selection were analysed: 1) total response in terms of the difference in additive genetic means between the last and first generations; 2) the slope (through the origin) of the regression of mean additive genetic value on generation; 3) the linear regression slope of mean additive genetic value on generation. Inferences are based on marginal posterior distributions of the above-defined measures of genetic response, and uncertainties about fixed effects and variance components are taken into account. The marginal posterior distributions were estimated using the Gibbs sampler. Two simulated data sets with heritability levels of 0.2 and 0.5 and 5 cycles of selection were used to illustrate the method. Two analyses were carried out for each data set, with partial data (generations 0-2) and with the whole data. The Bayesian analysis differed from a traditional analysis based on best linear unbiased predictors (BLUP) with an animal model when the amount of information in the data was small. Inferences about selection response were similar with both methods at high heritability values and when all the data were used in the analysis. The Bayesian approach correctly assessed the degree of uncertainty associated with insufficient information in the data. A Bayesian analysis using 2 different sets of prior distributions for the variance components showed that inferences differed only when the relative amount of information contributed by the data was small.

response to selection / Bayesian analysis / Gibbs sampling / animal model

Résumé - Bayesian analysis of genetic response to selection using Gibbs sampling. This article presents a method for analysing selection experiments from a Bayesian perspective. The following measures of response to selection were analysed: i) total response, ie the difference in mean additive genetic values between the last and first generations; ii) the slope (through the origin) of the regression of mean additive genetic value on generation; iii) the slope of the linear regression of mean additive genetic value on generation. Inferences are based on the marginal posterior distributions of the measures of genetic response defined above, taking into account the uncertainties about fixed effects and variance components. The marginal posterior distributions were estimated using Gibbs sampling. Two simulated data sets with heritabilities of 0.2 and 0.5 and 5 cycles of selection were used to illustrate the method. Two analyses were carried out on each data set, with partial data (generations 0-2) and with the complete data. The Bayesian analysis differed from the traditional analysis, based on BLUP with an animal model, when the amount of information used was small. Inferences about response to selection were similar with the 2 methods when heritability was high and all the data were included in the analysis. The Bayesian approach correctly assessed the degree of uncertainty associated with insufficient information in the data. A Bayesian analysis with 2 different prior distributions for the variance components showed that inferences differed only when the share of information contributed by the data was small.

response to selection / Bayesian analysis / Gibbs sampling / animal model

INTRODUCTION

Many selection programs in farm animals rely on best linear unbiased predictors (BLUP) computed with Henderson's (1973) mixed-model equations in order to predict breeding values and rank candidates for selection. With the increasing computing power available and with the development of efficient algorithms for writing the inverse of the additive relationship matrix (Henderson, 1976; Quaas, 1976), 'animal' models have gradually replaced the originally used sire models. The appeal of 'animal' models is that, given the model, use is made of the information provided by all known additive genetic relationships among individuals. This is important to obtain more precise predictors and to account for the effects of certain forms of selection on prediction and estimation of genetic parameters.

A natural application of 'animal' models has been prediction of the genetic means of cohorts, for example, groups of individuals born in a given time interval such as a year or a generation. These predicted genetic means are typically computed as the average of the BLUP of the genetic values of the appropriate individuals. From these, genetic change can be expressed as, for example, the regression of the mean predicted additive genetic value on time or on appropriate cumulative selection differentials (Blair and Pollak, 1984). In common with selection index, it is assumed in BLUP that the variances of the random effects, or ratios thereof, are known, so the predictions of breeding values and genetic means depend on such ratios. This, in turn, causes a dependency of the estimators of genetic change derived from 'animal' models on the ratios of the variances of the random effects used as 'priors' for solving the mixed-model equations. This point was first noted by Thompson (1986), who showed, in simple settings, that an estimator of realized heritability given by the ratio between the BLUP of total response and the total selection differential leads to estimates that are highly dependent on the value of heritability used as 'prior' in the BLUP analysis. In view of this, it is reasonable


to expect that the statistical properties of the BLUP estimator of response will depend on the method with which the 'prior' heritability is estimated.

In the absence of selection, Kackar and Harville (1981) showed that when the variances in the mixed-model equations are replaced by estimators that are even, translation-invariant functions of the data, unbiased predictors of breeding value are obtained. No other properties are known, and these would be difficult to derive because of the nonlinearity of the predictor. In selected populations, frequentist properties of predictors of breeding value based on estimated variances have not been derived analytically using classical statistical theory. Further, there are no results from classical theory indicating which estimator of heritability ought to be used, even though the restricted maximum likelihood (REML) estimator is an intuitively appealing candidate. This is so because the likelihood function is the same with or without selection, provided that certain conditions are met (Gianola et al, 1989; Im et al, 1989; Fernando and Gianola, 1990). However, frequentist properties of likelihood-based methods under selection have not been unambiguously characterized. For example, it is not known whether the maximum likelihood estimator is always consistent under selection. Some properties of BLUP-like estimators of response computed by replacing unknown variances by likelihood-type estimates were examined by Sorensen and Kennedy (1986) using computer simulation, and the methodology has been applied recently to analyse designed selection experiments (Meyer and Hill, 1991). Unfortunately, sampling distributions of estimators of response are difficult to derive analytically, and one must resort to approximate results whose validity is difficult to assess. In summary, the problem of exact inferences about genetic change when variances are unknown has not been solved via classical statistical methods.

However, this problem has a conceptually simple solution when framed in a Bayesian setting, as suggested by Sorensen and Johansson (1992), drawing from results in Gianola et al (1986) and Fernando and Gianola (1990). The starting point is that if the history of the selection process is contained in the data employed in the analysis, then the posterior distribution has the same mathematical form with or without selection (Gianola and Fernando, 1986). Inferences about breeding values (or functions thereof, such as selection response) are made using the marginal posterior distribution of the vector of breeding values or from the marginal posterior distribution of selection response. All other unknown parameters, such as 'fixed effects' and variance components or heritability, are viewed as nuisance parameters and must be integrated out of the joint posterior distribution (Gianola et al, 1986). The mean of the posterior distribution of additive genetic values can be viewed as a weighted average of BLUP predictions, where the weighting function is the marginal posterior density of heritability.

Estimating selection response by giving all weight to an REML estimate of heritability has been given theoretical justification by Gianola et al (1986). When the information in an experiment about heritability is large enough, the marginal posterior distribution of this parameter should be nearly symmetric; the modal value of the marginal posterior distribution of heritability is then a good approximation to its expected value. In this case, the posterior distribution of selection response can be approximated by replacing the unknown heritability by the mode of its marginal


posterior distribution. However, this approximation may be poor if the experiment has little informational content on heritability.

A full implementation of the Bayesian approach to inferences about selection response relies on having the means of carrying out the necessary integrations of joint posterior densities with respect to the nuisance parameters. A Monte-Carlo procedure for carrying out these integrations numerically, known as Gibbs sampling, is now available (Geman and Geman, 1984; Gelfand and Smith, 1990). The procedure has been implemented in an animal breeding context by Wang et al (1993a; 1994a,b) in a study of variance component inferences using simulated sire models and in analyses of litter size in pigs. Application of the Bayesian approach to the analysis of selection experiments yields the marginal posterior distribution of response to selection, from which inferences can be made even when variances are unknown. In this paper, we describe Bayesian inference about selection response using animal models, where the marginalizations are achieved by means of Gibbs sampling.

MATERIALS AND METHODS

Gibbs sampling

The Gibbs sampler is a technique for generating random vectors from a joint distribution by successively sampling from the conditional distributions of all random variables involved in the model. A component of the resulting vector is a random sample from the appropriate marginal distribution. This numerical technique was introduced in the context of image processing by Geman and Geman (1984) and has since received much attention in the statistical literature (Gelfand and Smith, 1990; Gelfand et al, 1990; Gelfand et al, 1992; Casella and George, 1992). To illustrate the procedure, suppose there are 3 random variables, X, Y and Z, with joint density p(x, y, z); we are interested in obtaining the marginal distributions of X, Y and Z, with densities p(x), p(y) and p(z), respectively. In many cases, the necessary integrations are difficult or impossible to perform algebraically, and the Gibbs sampler provides a means of sampling from such a marginal distribution. The sampler proceeds as follows. Let (x0, y0, z0) represent an arbitrary set of starting values for the 3 random variables of interest. The first sample is:

x1 ~ p(x | Y = y0, Z = z0),

where p(x | Y = y0, Z = z0) is the conditional distribution of X given Y = y0 and Z = z0. The second sample is:

y1 ~ p(y | X = x1, Z = z0),

where x1 is the value updated in the first sampling. The third sample is:

z1 ~ p(z | X = x1, Y = y1),

where y1 is the realized value of Y obtained in the second sampling. This constitutes the first iteration of the Gibbs sampling. The process of sampling and updating is


repeated k times, where k is known as the length of the Gibbs sequence. As k → ∞, the points of the kth iteration (x_k, y_k, z_k) constitute 1 sample point from p(x, y, z) when viewed jointly, or from p(x), p(y) and p(z) when viewed marginally. In order to obtain m samples, Gelfand and Smith (1990) suggest generating m independent 'Gibbs sequences' from m arbitrary starting values and using the final value at the kth iterate of each sequence as the sample values. This method of Gibbs sampling is known as a multiple-start sampling scheme, or multiple chains. Alternatively, a single long Gibbs sequence (initialized therefore only once) can be generated and every dth observation extracted (eg, Geyer, 1992), with the total number of samples saved being m. Animal breeding applications of the short- and long-chain methods are given by Wang et al (1993, 1994a,b).
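A minimal runnable sketch of the sampler just described, for a case with 2 variables where the full conditionals are known exactly: a bivariate normal with correlation rho. The distribution, seed, burn-in and thinning settings are illustrative only, not taken from the paper.

```python
import numpy as np

# Single-long-chain Gibbs sampler for a bivariate normal with zero means,
# unit variances and correlation rho: the full conditionals are
# X | Y=y ~ N(rho*y, 1 - rho^2) and symmetrically for Y.
rng = np.random.default_rng(1)
rho = 0.8
burn_in, thin, m = 500, 10, 2000    # discard, spacing d, samples kept

x, y = 5.0, -5.0                    # arbitrary starting values
samples = []
for k in range(burn_in + thin * m):
    x = rng.normal(rho * y, np.sqrt(1 - rho**2))   # sample p(x | y)
    y = rng.normal(rho * x, np.sqrt(1 - rho**2))   # sample p(y | x)
    if k >= burn_in and (k - burn_in) % thin == 0:
        samples.append((x, y))                     # keep every d-th draw

samples = np.array(samples)
# Marginally, each coordinate should be close to N(0, 1).
```

The extreme starting values (5, -5) are forgotten quickly; the burn-in and thinning interval mimic the single-long-chain scheme described above.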

Having obtained the m samples from the marginal distributions, possibly correlated, features of the marginal distribution of interest can be obtained by appealing to the ergodic theorem (Geyer, 1992; Smith and Roberts, 1993):

S_m = (1/m) Σ_{i=1}^m g(x_i) → u,  as m → ∞,   [1]

where the x_i (i = 1, ..., m) are the samples from the marginal distribution of X, u is any feature of the marginal distribution (eg, mean, variance or, in general, any feature of a function of x), and g(·) is an appropriate operator. For example, if u is the mean of the marginal distribution, g(·) is the identity operator, and a consistent estimator of the variance of the marginal distribution is:

(1/m) Σ_{i=1}^m (x_i − S_m)²

(Geyer, 1992). Note that S_m is a Monte-Carlo estimate of u, and the error of this estimator can be made arbitrarily small by increasing m. Another way of obtaining features of a marginal distribution is first to estimate the density using [1], and then compute summary statistics (features) of that distribution from the estimated density.
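The ergodic-average estimator can be checked numerically. The sketch below feeds it deliberately correlated draws: an AR(1) chain standing in for Gibbs output, an assumption made purely for illustration, whose stationary distribution is N(2, 1).

```python
import numpy as np

# Ergodic averaging: features of a marginal distribution estimated as
# u_hat = (1/m) * sum of g(x_i) over (possibly correlated) draws.
rng = np.random.default_rng(0)
phi, m = 0.6, 50_000
x = np.empty(m)
x[0] = 2.0
for i in range(1, m):
    # AR(1) chain with stationary distribution N(2, 1) -- a stand-in
    # for correlated Gibbs samples.
    x[i] = 2.0 + phi * (x[i - 1] - 2.0) + rng.normal(0, np.sqrt(1 - phi**2))

mean_hat = x.mean()                      # g = identity: the mean
var_hat = ((x - mean_hat) ** 2).mean()   # squared deviations: the variance
```

Despite the autocorrelation, both ergodic averages converge to the stationary mean (2) and variance (1) as m grows, which is exactly the content of the theorem cited above.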

There are at least 2 ways to estimate a density from the Gibbs samples. One is to use the random samples x_i to estimate p(x). We consider the normal kernel estimator (eg, Silverman, 1986):

p̂(x) = (1/(mh)) Σ_{i=1}^m φ((x − x_i)/h),   [2]

where φ(·) is the standard normal density, p̂(x) is the estimated density at x, and h is a fixed constant (called the window width) chosen by the user. The window width determines the smoothness of the estimated curve.
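A minimal sketch of the normal kernel estimator. The window width below uses Silverman's rule of thumb, an illustrative choice rather than one prescribed in the text, and the draws are direct normal samples standing in for Gibbs output.

```python
import numpy as np

# Normal kernel density estimator:
# p_hat(x) = 1/(m*h) * sum_i phi((x - x_i) / h), window width h.
def kernel_density(x, draws, h):
    u = (x - draws) / h
    return np.mean(np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)) / h

rng = np.random.default_rng(2)
draws = rng.normal(0.0, 1.0, size=5000)          # stand-in Gibbs samples
h = 1.06 * draws.std() * len(draws) ** (-1 / 5)  # Silverman's rule of thumb
density_at_0 = kernel_density(0.0, draws, h)     # true N(0,1) density: ~0.3989
```

A larger h gives a smoother but more biased curve; a smaller h gives a rougher, noisier one, which is the trade-off the text attributes to the window width.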

Another way of estimating a density is based on averaging conditional densities (Gelfand and Smith, 1990). From the standard result

p(x) = ∫ p(x | y, z) p(y, z) dy dz = E[p(x | Y, Z)],


and given knowledge of the full conditional distribution, an estimate of p(x) is obtained from:

p̂(x) = (1/m) Σ_{i=1}^m p(x | y_i, z_i),   [3]

where y_1, z_1, ..., y_m, z_m are the realized values of the final Y, Z samples from each Gibbs sequence. Each pair (y_i, z_i) constitutes 1 sample, and there are m such pairs. Notice that the x_i are not used to estimate p(x). Estimation of the density of a function of the original variables is accomplished either by applying [2] directly to the samples of the function, or by applying the theory of transformation of random variables to an appropriate conditional density, and then using [3]. In this case, the Gibbs sampling scheme does not need to be rerun, provided the needed samples are saved.
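The averaging-of-conditionals route can be illustrated with a case where everything is known in closed form: a standard bivariate normal with correlation rho, so that p(x | y) = N(rho·y, 1 − rho²) and the true marginal is N(0, 1). The y_i below are direct draws standing in for saved Gibbs samples.

```python
import numpy as np

# Estimating p(x) by averaging full conditional densities:
# p_hat(x) = (1/m) * sum_i p(x | y_i).
rng = np.random.default_rng(3)
rho, m = 0.8, 5000
y = rng.normal(0.0, 1.0, size=m)     # stand-ins for saved Gibbs samples

def conditional_density(x, y, rho):
    # Density of X | Y=y for a standard bivariate normal: N(rho*y, 1-rho^2).
    s2 = 1.0 - rho**2
    return np.exp(-0.5 * (x - rho * y) ** 2 / s2) / np.sqrt(2 * np.pi * s2)

p_hat_at_0 = conditional_density(0.0, y, rho).mean()   # true p(0) ~ 0.3989
```

Unlike the kernel estimator, this estimator involves no window width and no smoothing bias, which is why the text presents it as an alternative; its only error is Monte-Carlo noise.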

Model

In the present paper we consider a univariate mixed model with 2 variance

components for ease of exposition. Extensions to more general mixed models are given by Wang et al (1993b; 1994) and by Jensen et al (1994). The model is:

y = Xb + Za + e,

where y is the data vector of order n by 1; X is a known incidence matrix of order n by p; Z is a known incidence matrix of order n by q; b is a p by 1 vector of uniquely defined 'fixed effects' (so that X has full column rank); a is a q by 1 'random' vector representing individual additive genetic values of animals; and e is an n by 1 vector of random residuals.

The conditional distribution that pertains to the realization of y is assumed to be:

y | b, a, σ²_e ~ N(Xb + Za, Hσ²_e),

where H is a known n by n matrix, which here will be assumed to be the identity matrix, and σ²_e (a scalar) is the unknown variance of the random residuals.

We assume a genetic model in which genes act additively within and between loci, and that there is effectively an infinite number of loci. Under this infinitesimal model, and assuming further initial Hardy-Weinberg and linkage equilibrium, the distribution of additive genetic values conditional on the additive genetic variance is multivariate normal:

a | A, σ²_a ~ N(0, Aσ²_a).   [6]

In [6], A is the known q by q matrix of additive genetic relationships among animals, and σ²_a (a scalar) is the unknown additive genetic variance in the conceptual base population, before selection took place.

The vector b will be assumed to have a proper uniform prior distribution:

p(b) ∝ constant,  b_min ≤ b ≤ b_max,   [7]

where b_min and b_max are, respectively, the minimum and maximum values which b can take, a priori. Further, a and b are assumed to be independent, a priori.


To complete the description of the model, the prior distributions of the variance components need to be specified. In order to study the effect of different priors on inferences about heritability and response to selection, 2 sets of prior distributions will be assumed. Firstly, σ²_a and σ²_e will be assumed to follow independent proper uniform prior distributions of the form:

p(σ²_i) ∝ constant,  0 < σ²_i ≤ σ²_{i,max},  i = a, e,   [8]

where σ²_{a,max} and σ²_{e,max} are the maximum values which, according to prior knowledge, σ²_a and σ²_e can take. In the rest of the paper, all densities which are functions of b, σ²_a and σ²_e will implicitly take the value zero if the bounds in [7] and [8] are exceeded.

Secondly, σ²_a and σ²_e will be assumed to follow a priori scaled inverted chi-square distributions:

p(σ²_i | ν_i, S²_i) ∝ (σ²_i)^−(ν_i/2 + 1) exp[−ν_i S²_i / (2σ²_i)],  i = a, e,   [9]

where ν_i and S²_i are parameters of the distribution. Note that a uniform prior can be obtained from [9] by setting ν_i = −2 and S²_i = 0.
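A quick numerical check of the scaled inverted chi-square prior [9], with arbitrary illustrative parameter values: if w ~ χ²_ν, then σ² = νS²/w has the density in [9], and its mean is νS²/(ν − 2) for ν > 2.

```python
import numpy as np

# Draws from a scaled inverted chi-square distribution with parameters
# (nu, S2): sigma2 = nu * S2 / w with w ~ chi-square(nu).
rng = np.random.default_rng(4)
nu, S2, m = 10.0, 0.25, 200_000      # illustrative hyperparameters
sigma2 = nu * S2 / rng.chisquare(nu, size=m)

expected_mean = nu * S2 / (nu - 2)   # 10 * 0.25 / 8 = 0.3125
```

Larger ν concentrates the prior around S²; the degenerate setting ν = −2, S² = 0 cannot be sampled this way, but, as noted in the text, it reduces the density [9] to a constant.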

Posterior distribution of selection response

Let ψ represent the vector of all parameters associated with the model. Bayes theorem provides a means of deriving the posterior distribution of ψ conditional on the data:

p(ψ | y) = p(y | ψ) p(ψ) / p(y).   [10]

The first term in the numerator of the right-hand side of [10] is the conditional density of the data given the parameters, and the second is the joint prior distribution of the parameters of the model. The denominator in [10] is a normalizing constant (the marginal distribution of the data) that does not depend on ψ. Applying [10], and assuming the set of priors [9] for the variance components, the joint posterior distribution of the parameters is:

p(b, a, σ²_a, σ²_e | y, ν, S) ∝ (σ²_e)^−(n + ν_e + 2)/2 exp{−[(y − Xb − Za)'(y − Xb − Za) + ν_e S²_e] / (2σ²_e)} × (σ²_a)^−(q + ν_a + 2)/2 exp{−[a'A⁻¹a + ν_a S²_a] / (2σ²_a)}.   [12]


The joint posterior under a model assuming the proper set of uniform prior distributions [8] for the variance components is obtained simply by setting ν_i = −2 and S²_i = 0 (i = e, a) in [12]. To make the notation less burdensome, we drop from now onwards the conditioning on ν and S.

Inferences about response to selection can be made working with a function of the marginal posterior distribution of a. The latter is obtained integrating [12] over the remaining parameters:

p(a | y) = ∫ p(a, b, σ²_a, σ²_e | y) db dσ²_a dσ²_e = E[p(a | b, σ²_a, σ²_e, y)],   [13]

where the expectation is taken over the joint posterior distribution of the vector of 'fixed effects' and variance components. This density cannot be written in closed form. In finite samples, the posterior distribution of a need be neither normal nor symmetric.

Response to selection is defined as a linear function of the vector of additive genetic values:

R = K'a,   [14]

where K is an appropriately defined transformation matrix, and R can be a vector (or a scalar) whose elements could be the mean additive genetic values of each generation, contrasts between these means, or alternatively, regression coefficients representing linear and quadratic changes of genetic means with respect to some measure of time, such as generations. By virtue of the central limit theorem, the posterior distribution of R should be approximately normal, even if [13] is not.

Full conditional posterior distributions (the Gibbs sampler)

In order to implement the Gibbs sampling scheme, the full conditional posterior densities of all the parameters in the model must be obtained. These distributions are, in principle, obtained by dividing [12] by the appropriate posterior density function or, equivalently, by regarding all parameters in [12] other than the one of interest as known. For the fixed and random effects, though, it is easier to proceed as follows.


Using results from Lindley and Smith (1972), the conditional distribution of θ = (b', a')' given all other parameters is multivariate normal:

θ | σ²_a, σ²_e, y ~ N(θ̂, C⁻¹σ²_e),

where

C = [X'X  X'Z; Z'X  Z'Z + A⁻¹λ],  λ = σ²_e/σ²_a,

and θ̂ satisfies:

Cθ̂ = [X'y; Z'y],

which are the mixed-model equations of Henderson (1973).
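The mixed-model equations of Henderson (1973) can be set up and solved for a tiny, made-up data set; the records, the variance ratio λ and the assumption of unrelated animals (A = I) below are all illustrative.

```python
import numpy as np

# Henderson's mixed-model equations for y = Xb + Za + e:
#   [ X'X        X'Z         ] [b]   [X'y]
#   [ Z'X   Z'Z + A^{-1}*lam ] [a] = [Z'y],  lam = sigma_e^2 / sigma_a^2.
# Toy setting: 3 unrelated animals, one record each, a single overall mean.
y = np.array([10.0, 12.0, 11.0])    # made-up records
X = np.ones((3, 1))                 # one 'fixed' effect: the overall mean
Z = np.eye(3)                       # each record belongs to one animal
Ainv = np.eye(3)                    # unrelated animals (assumption)
lam = 2.0                           # assumed variance ratio

lhs = np.block([[X.T @ X, X.T @ Z],
                [Z.T @ X, Z.T @ Z + Ainv * lam]])
rhs = np.concatenate([X.T @ y, Z.T @ y])
theta = np.linalg.solve(lhs, rhs)   # theta = (b_hat, a_hat'): BLUE and BLUP
# Here b_hat = 11.0 and the BLUPs are shrunken deviations summing to zero.
```

With known variances this solve gives the point predictions; the Bayesian treatment in the text instead samples around this solution, with C⁻¹σ²_e as the conditional covariance.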

Partition θ as (θ₁', θ₂')'. Using standard multivariate normal theory, it can be shown for any such partition that:

θ₁ | θ₂, σ²_a, σ²_e, y ~ N(θ̂₁, C₁₁⁻¹σ²_e),   [19]

where θ̂₁ is given by:

θ̂₁ = C₁₁⁻¹(r₁ − C₁₂θ₂),   [20]

with r₁ the part of the right-hand side of the mixed-model equations corresponding to θ₁. Expressions [19] and [20] can be computed in matrix or scalar form. Let b_i be a scalar corresponding to the ith element of the vector of 'fixed effects', b_{−i} be b without its ith element, x_i be the ith column vector of X, and X_{−i} be the matrix X with x_i excluded. Using [19], we find that the full conditional distribution of b_i given all other parameters is normal:

b_i | b_{−i}, a, σ²_a, σ²_e, y ~ N(b̂_i, (x_i'x_i)⁻¹σ²_e),   [21]

where b̂_i satisfies:

x_i'x_i b̂_i = x_i'(y − X_{−i}b_{−i} − Za).   [22]

For the 'random effects', again using [19] and letting a_{−i} be a without its ith element, we find that the full conditional distribution of the scalar a_i given all the other parameters is also normal:

a_i | b, a_{−i}, σ²_a, σ²_e, y ~ N(â_i, (z_i'z_i + c^{ii}λ)⁻¹σ²_e),   [23]

where z_i is the ith column of Z, c^{ii} = (Var(a_i | a_{−i}))⁻¹σ²_a is the element in the ith row and column of A⁻¹, and â_i satisfies:

(z_i'z_i + c^{ii}λ)â_i = z_i'(y − Xb − Z_{−i}a_{−i}) − λc^{i,−i}a_{−i}.   [24]

In [24], c^{i,−i} is the row of A⁻¹ corresponding to the ith individual with the ith element excluded.


The full conditional distribution of each of the variance components is readily obtained by dividing [12] by p(b, a, σ²_{−i} | y), where σ²_{−i} is the pair of variance components with σ²_i excluded. Since this last distribution does not depend on σ²_i, this becomes (Wang et al, 1994a):

p(σ²_i | b, a, σ²_{−i}, y) ∝ (σ²_i)^−(ν̃_i/2 + 1) exp[−ν̃_i S̃²_i / (2σ²_i)],  i = a, e,   [25]

which has the form of an inverted chi-square distribution, with parameters:

ν̃_a = q + ν_a,  S̃²_a = (a'A⁻¹a + ν_a S²_a)/ν̃_a;
ν̃_e = n + ν_e,  S̃²_e = [(y − Xb − Za)'(y − Xb − Za) + ν_e S²_e]/ν̃_e.   [26]

When the prior distribution of the variance components is assumed to be uniform, the full conditional distributions have the same form as in [25], except that, in [26] and subsequent formulae, ν_i = −2 and S²_i = 0 (i = e, a).

Generation of random samples from marginal posterior distributionsusing Gibbs sampling

In this section we describe how random samples can be generated indirectly from the joint distribution [12] by sampling from the conditional distributions [21], [23] and [25]. The Gibbs sampler works as follows:

(i) set arbitrary initial values for b, a, σ²_a and σ²_e;
(ii) sample from [21] and update b_i, i = 1, ..., p;
(iii) sample from [23] and update a_i, i = 1, ..., q;
(iv) sample from [25] and update σ²_a;
(v) sample from [25] and update σ²_e;
(vi) repeat (ii) to (v) k (length of chain) times.

As k → ∞, this creates a Markov chain with an equilibrium distribution having [12] as density. If, along a single path of length k, m samples are extracted at intervals of length d, the algorithm is called a single-long-chain algorithm. If, on the other hand, m independent chains are implemented, each of length k, and the kth iterate of each is saved as a sample, then this is known as the short-chain algorithm.
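A concrete, deliberately simplified sketch of steps (i)-(vi): a model y_ij = b + a_i + e_ij with unrelated animals (A = I), repeated records, and flat priors on the variances (ν_i = −2, S²_i = 0), so the variance conditionals reduce to a'a/χ²_{q−2} and e'e/χ²_{n−2}. The data are simulated here, and every setting is illustrative rather than taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(5)
q, reps = 100, 3                       # animals, records per animal
n = q * reps
a_true = rng.normal(0, 1.0, q)         # true sigma_a^2 = sigma_e^2 = 1
animal = np.repeat(np.arange(q), reps)
y = 10.0 + a_true[animal] + rng.normal(0, 1.0, n)

# (i) arbitrary starting values
b, a, sa2, se2 = 0.0, np.zeros(q), 1.0, 1.0
keep_sa2, keep_se2 = [], []
for it in range(2000):
    # (ii) sample b from its full conditional (flat prior on b)
    b = rng.normal((y - a[animal]).mean(), np.sqrt(se2 / n))
    # (iii) sample each a_i; with A = I, only animal i's records enter [24]
    lam = se2 / sa2
    resid_sum = np.bincount(animal, weights=y - b, minlength=q)
    a = rng.normal(resid_sum / (reps + lam), np.sqrt(se2 / (reps + lam)))
    # (iv) sigma_a^2 from a'a / chi-square(q - 2)
    sa2 = (a @ a) / rng.chisquare(q - 2)
    # (v) sigma_e^2 from e'e / chi-square(n - 2)
    e = y - b - a[animal]
    se2 = (e @ e) / rng.chisquare(n - 2)
    # (vi) repeat; discard a burn-in, then save the variance draws
    if it >= 500:
        keep_sa2.append(sa2)
        keep_se2.append(se2)

post_mean_sa2 = float(np.mean(keep_sa2))
post_mean_se2 = float(np.mean(keep_se2))
```

With repeated records the two variances are identifiable, and the posterior means land near the simulated values of 1; with a relationship matrix other than I, step (iii) would also carry the λc^{i,−i}a_{−i} term of [24].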

If the Gibbs sampler has reached convergence, for the m samples (b, a, σ²_a, σ²_e)_1, ..., (b, a, σ²_a, σ²_e)_m we have:

(b, a, σ²_a, σ²_e)_j ~ p(b, a, σ²_a, σ²_e | y),  j = 1, ..., m,

where ~ means 'distributed with marginal posterior density' p(· | y). In any particular sample, we notice that the elements of the vectors b and a, b_i and a_i say, are samples from the univariate marginal distributions p(b_i | y) and p(a_i | y). In order to estimate marginal posterior densities of the variance components, in addition to the above quantities the following sums of squares must be stored for each of the m samples:

a'A⁻¹a  and  (y − Xb − Za)'(y − Xb − Za).

For estimation of selection response, the quantities to be stored depend on the structure of K in [14].

Density estimation

As indicated previously, a marginal density can be estimated, for example, using the normal kernel estimator [2] with the m Gibbs samples from the relevant marginal distribution, or by averaging over conditional densities (equation [3]). Density estimation by the former method is straightforward, by applying [2]. Here, we outline density estimation using [3].

The formulae for variance components and functions thereof were given by Wang et al (1993, 1994b). For the additive genetic variance component, the conditional density estimator is:

p̂(σ²_a | y) = (1/m) Σ_{j=1}^m p(σ²_a | b_j, a_j, σ²_{e,j}, y),   [27]

and for the error variance component:

p̂(σ²_e | y) = (1/m) Σ_{j=1}^m p(σ²_e | b_j, a_j, σ²_{a,j}, y).   [28]

The estimated values of the density are obtained by fixing σ²_a and σ²_e at a number of points and then evaluating [27] and [28] at each point over the m samples. Notice that the realized values of σ²_a and σ²_e obtained in the Gibbs sampler are not used to estimate the marginal posterior densities in [27] and [28].

To estimate the marginal posterior density of heritability h² = σ²_a/(σ²_a + σ²_e), the point of departure is the full conditional distribution of σ²_a. This distribution has σ²_e as a conditioning variable, and therefore σ²_e is treated as a constant. Since the inverse transformation is σ²_a = σ²_e h²/(1 − h²), and the Jacobian of the transformation from σ²_a to h² is J = σ²_e/(1 − h²)², then from [27] we obtain:

p̂(h² | y) = (1/m) Σ_{j=1}^m p(σ²_{e,j} h²/(1 − h²) | b_j, a_j, σ²_{e,j}, y) σ²_{e,j}/(1 − h²)².   [29]
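Draws of h² can also be formed directly, by applying the transformation h² = σ²_a/(σ²_a + σ²_e) to each saved pair of variance draws and summarizing the result; this is the route of applying [2]-style summaries directly to samples of the function. The variance draws below are synthetic stand-ins, not output from the model of the paper.

```python
import numpy as np

# Heritability draws from paired variance draws:
# h2_j = sa2_j / (sa2_j + se2_j) for each saved Gibbs sample j.
rng = np.random.default_rng(6)
m = 10_000
sa2 = 20.0 * 0.25 / rng.chisquare(20.0, size=m)   # placeholder sa2 draws
se2 = 40.0 * 0.75 / rng.chisquare(40.0, size=m)   # placeholder se2 draws

h2 = sa2 / (sa2 + se2)
h2_mean = float(h2.mean())
ci = np.quantile(h2, [0.025, 0.975])    # 95% posterior interval for h2
```

Because every draw of h² lies in (0, 1) by construction, the resulting posterior summaries respect the parameter bounds automatically, with no Jacobian bookkeeping needed.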


The estimation of the marginal posterior density of each additive genetic value follows the same principles:

p̂(a_i | y) = (1/m) Σ_{j=1}^m p(a_i | b_j, a_{−i,j}, σ²_{a,j}, σ²_{e,j}, y),   [30]

where, for the jth sample, the conditional density is the normal [23]:

a_i | b_j, a_{−i,j}, σ²_{a,j}, σ²_{e,j}, y ~ N(â_{i,j}, (z_i'z_i + c^{ii}λ_j)⁻¹σ²_{e,j}),   [31]

with â_{i,j} computed from [24]. Equally spaced points in the effective range of a_i are chosen, and for each point an estimate of its density is obtained by inserting, for each of the m Gibbs samples, the realized values of σ²_a, σ²_e, b and a_{−i} in [30] and [31]. These quantities need to be stored in order to obtain an estimate of the marginal posterior density of the ith additive genetic value. This process is repeated for each of the equally spaced points.

Estimation of the marginal posterior density of response depends on the way it is expressed. In general, R in [14] can be a scalar (the genetic mean of a given generation, or response per time unit) or a vector, whose elements could describe, for example, linear and higher-order terms of changes of additive genetic values with time or with the unit of selection pressure applied. We first derive a general expression for estimation of the marginal posterior density of R and then look at some special cases to illustrate the procedure.
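When draws of the breeding values themselves are saved, the simplest route is to apply the summary directly to samples of the function: compute R for every Gibbs draw of a and then summarize the draws of R. The breeding-value draws below are synthetic placeholders standing in for saved Gibbs output.

```python
import numpy as np

# Total response R = mean(a_final) - mean(a_first), computed once per draw.
rng = np.random.default_rng(7)
m, n1, nf = 5000, 50, 50
a_first = rng.normal(0.0, 0.5, size=(m, n1))   # placeholder generation-0 draws
a_final = rng.normal(1.0, 0.5, size=(m, nf))   # placeholder final-generation draws

R = a_final.mean(axis=1) - a_first.mean(axis=1)   # one response per draw
R_mean, R_sd = float(R.mean()), float(R.std())
ci = np.quantile(R, [0.025, 0.975])               # 95% posterior interval
```

The spread of the R draws is the posterior uncertainty about response, which is exactly what the point-prediction BLUP route discussed in the introduction does not deliver.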

Assume that R contains s elements and we wish to estimate p(R | y) as:

p̂(R | y) = (1/m) Σ_{j=1}^m p(R | b_j, a_{−i,j}, σ²_{a,j}, σ²_{e,j}, y).   [32]

In order to implement [32], the full conditional distribution on the right-hand side is needed. This distribution is obtained by applying the theory of transformations to p(a_i | b, a_{−i}, σ²_a, σ²_e, y), where a_i is a vector of additive genetic values of order s, and a_{−i} is the vector of all additive genetic values with a_i deleted, such that a = (a_i', a_{−i}')'.


The s additive genetic values in ai must be chosen so that the matrix of the transformation from ai to R is non-singular. We can then write [14] as:

so that we have expressed R as the sum of a part which is a function of ai and another which is not. The matrix Ki is non-singular of order s by s and, from [33], the inverse transformation is:

Since the Jacobian of the transformation from ai to R is det(Ki⁻¹), letting Vi = Var(ai|b, a−i, σ², y) and using standard theory, we obtain the result that the conditional posterior distribution of response is normal:

where, using [19] and [20], it can be shown that:

and

In [35] and [36], Ci,i is the s by s block of the inverse of the additive genetic relationship matrix whose rows and columns correspond to the elements in ai, and Ci,−i is the s by (q − s) block associating the elements of ai with those of a−i. With [34] available, the marginal posterior distribution of R can be obtained from [32].

As a simple illustration, consider the estimation of the marginal posterior density of total response to selection, defined as the difference in average additive genetic values between the last generation (f) and the first one:

where afj and a1j are additive genetic values of individuals in the final and first generations, and nf and n1 are the numbers of additive genetic values in the final and first generations, respectively. We will arbitrarily choose af1 and carry out a linear transformation from p(af1|b, a−f1, σ², y) to p(R|b, a−f1, σ², y). We write [37] in the form of [33]:

so that in the notation of [33], we have:


and the marginal posterior density is estimated as follows:
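In sample terms, this estimator can be sketched as follows: for each saved Gibbs draw of the additive genetic values, total response is the difference between the final- and first-generation means, and the resulting draws form a Monte-Carlo sample from the marginal posterior of TR (the variable names are illustrative):

```python
def total_response_draws(final_gen_samples, first_gen_samples):
    """For each Gibbs sample, TR = mean(a, final generation) - mean(a, first
    generation); the draws form a Monte-Carlo sample from p(TR | y)."""
    draws = []
    for a_f, a_1 in zip(final_gen_samples, first_gen_samples):
        draws.append(sum(a_f) / len(a_f) - sum(a_1) / len(a_1))
    return draws

# Two pretend Gibbs samples, each holding the genetic values of both generations.
tr = total_response_draws([[2.0, 4.0], [3.0, 5.0]], [[0.0, 2.0], [1.0, 1.0]])
```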

As a second illustration, we consider the case where response is expressed as the linear regression (through the origin) of additive genetic values on time units. Thus R is a scalar. We assume as before that there are f time units, and the response expressed in the form of [33] and [38] is:

where atj is the additive genetic value of the jth individual at time t and nt is their number. A little manipulation shows that:
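A regression through the origin of generation mean genetic value on generation number reduces to a simple ratio; a sketch (the function name is assumed):

```python
def slope_through_origin(generations, gen_means):
    """Least-squares slope through the origin: b = sum(t * m_t) / sum(t^2)."""
    num = sum(t * m for t, m in zip(generations, gen_means))
    den = sum(t * t for t in generations)
    return num / den

# Perfectly linear generation means give back the underlying slope.
b = slope_through_origin([1, 2, 3], [0.3, 0.6, 0.9])
```

Inserting each Gibbs sample of the additive genetic values into this formula yields draws from the marginal posterior of the slope.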

EXAMPLES

To illustrate, the methodology was used to analyse 2 simulated data sets. Genotypes were sampled using a Gaussian additive genetic model. The phenotypic variance was always 10, and base population heritability was 0.2 and 0.5 in data sets 1 and 2, respectively. A phenotypic record was obtained by summing a fixed effect (30 fixed effects in the complete data set, but these values are of no importance here), an additive genetic effect and a residual term. Both additive genetic and residual effects were assumed to follow normal distributions with null means and variances based on the 2 heritability levels. In the generation of Mendelian sampling effects, parental inbreeding was taken into account.

Five cycles of single trait selection based on a BLUP animal model were practised. Each generation resulted from the mating of 8 males and 20 females, and each female produced 3 offspring with records and 2 additional female offspring without records which were also considered as female replacements. Males were selected across generations (the best 8 each cycle) and females, which bred only once, were selected from the highest scoring predicted breeding values available each generation. The total number of animals at the end of the program was 968, of which 720 had records. The total number of sires and dams in each data set was 42 and 241, respectively. Details of the simulation, including a description of the generation of fixed effects and additive genetic values, can be found in Sorensen (1988).

Two analyses were carried out for each data set: an analysis based on all records, and one using data from generations 0-2 only. The rationale for this is that it is not uncommon in the literature to find reports of selection experiments in progress. Hence, we can illustrate in this manner how inferences are modified as additional information is collected.
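The broad shape of this design can be approximated in a few lines. The sketch below substitutes phenotypic (mass) truncation selection for the BLUP selection actually practised, ignores inbreeding in the Mendelian sampling variance, and omits fixed effects, so it mimics only the overall structure of the simulation:

```python
import random

def simulate_selection(n_gen=5, n_sires=8, n_dams=20, n_prog=3,
                       sigma_a2=2.0, sigma_e2=8.0, seed=1):
    """Simplified sketch of the simulated design: phenotype = a + e with
    sigma_p2 = 10 and h2 = 0.2; truncation (mass) selection each cycle.
    The paper instead selects on BLUP breeding values, includes fixed
    effects and adjusts Mendelian sampling variance for inbreeding."""
    rng = random.Random(seed)
    pop = []
    for _ in range(200):  # base population: (breeding value, phenotype)
        a = rng.gauss(0.0, sigma_a2 ** 0.5)
        pop.append((a, a + rng.gauss(0.0, sigma_e2 ** 0.5)))
    means = []
    for _ in range(n_gen):
        ranked = sorted(pop, key=lambda ind: ind[1], reverse=True)
        sires = ranked[:n_sires]
        dams = ranked[n_sires:n_sires + n_dams]
        progeny = []
        for dam in dams:  # each dam bred once, to a randomly chosen selected sire
            sire = rng.choice(sires)
            for _ in range(n_prog):
                # Mendelian sampling variance sigma_a2 / 2 (no inbreeding)
                a = 0.5 * (sire[0] + dam[0]) + rng.gauss(0.0, (sigma_a2 / 2.0) ** 0.5)
                progeny.append((a, a + rng.gauss(0.0, sigma_e2 ** 0.5)))
        pop = progeny
        means.append(sum(ind[0] for ind in progeny) / len(progeny))
    return means

gen_means = simulate_selection()
```

With 5 cycles of truncation selection, the mean true breeding value is expected to increase from generation to generation.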

For each analysis, marginal posterior densities for the additive genetic variance (σ²a), residual variance (σ²e), phenotypic variance (σ²p) and heritability (h²) were estimated. In addition, marginal posterior densities of 2 expressions for selection response were estimated. These were: 1) the difference between final and initial average additive genetic values, or total response (TR); and 2) the regression, through the origin, of additive genetic values on generation. Unless otherwise stated, the analyses reported below were performed assuming that variance components follow a priori independent uniform distributions, as shown in [8]. The upper bounds of the additive and environmental variances (σ²a and σ²e) were arbitrarily chosen to be 10 and 20, respectively.

The Gibbs sampler was implemented using a single chain of total length 1 205 000, with a warm-up (initial iterations discarded) of length 5 000, and a sub-sampling interval between samples of d = 10. Thus, a total of m = 120 000 samples were saved. Calculations were programmed in Fortran in double precision. Random numbers were generated using IMSL subroutines (IMSL Inc, 1989).
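The chain-management scheme (warm-up plus sub-sampling) amounts to simple slicing; a sketch:

```python
def thin_chain(chain, warm_up=5000, step=10):
    """Discard warm-up draws, then keep every step-th draw: with a chain of
    length 1 205 000, warm-up 5 000 and d = 10, m = 120 000 samples remain."""
    return chain[warm_up::step]

m = len(thin_chain(list(range(1_205_000))))
```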

Plots of the estimated marginal posterior densities of σ²a, σ²e, h², TR and b were obtained with both the average-of-conditionals and the normal kernel density estimators; those for σ²p, ā5 − ā0 and b1 were generated using the normal kernel density estimator only, for technical reasons. The window used for the normal kernel estimator was the range of the effective domain of the parameter, divided by 75. The effective domain covered 99.9% of the density mass. Each of the plots was generated by dividing the effective domain of that variable into 100 evenly spaced intervals. Summary statistics of the distributions, such as the mean, median and variance, were computed by Simpson's integration rules, after further dividing the effective domain into 1 000 evenly spaced intervals using cubic-spline techniques (IMSL Inc, 1989). Modes were located through a grid search.

For illustration, and not as a means of a definitive comparison between methods, a standard classical analysis was carried out. First, REML estimates of variance components were obtained. These REML estimates were then used in lieu of the true values in Henderson's mixed-model equations to obtain estimates of additive genetic values. Finally, estimated genetic responses to selection were computed based on appropriate averages of the estimated additive genetic values. Results of this classical analysis are reported later under the heading ST in figures 5-8. The estimated marginal posterior densities of the variance components and of heritability are shown in figures 1 and 2 for the analyses based on partial (PART) and whole (WHOLE) data, respectively, when base population heritability was 0.2. Corresponding plots for base heritability 0.5 are given in figures 3 and 4.
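The normal kernel density estimator with the window rule used above (effective range divided by 75) can be sketched as:

```python
import math

def kernel_density(samples, grid, window=None):
    """Gaussian kernel density estimate at each grid point; by default the
    window is the sample range divided by 75, as in the text."""
    if window is None:
        window = (max(samples) - min(samples)) / 75.0
    m = len(samples)
    const = m * window * math.sqrt(2.0 * math.pi)
    return [sum(math.exp(-0.5 * ((x - z) / window) ** 2) for z in samples) / const
            for x in grid]
```

Unlike the average-of-conditionals estimator, this needs only the saved draws of the quantity itself, which is why it applies to any derived parameter.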


For the data simulated with h² = 0.2, the posterior distributions reflected considerable uncertainty about the values of the parameters. As expected, the variances of the posterior distributions were smaller in WHOLE (fig 2) than in PART (fig 1). In particular, the PART analysis of σ²a and h² showed highly skewed distributions; the mean, the mode and the median differed from each other. The REML estimates in the PART analysis of the additive genetic variance and heritability were very close to zero. Asymptotic confidence intervals were not computed, but they would probably include a region outside the parameter space. The Bayesian analysis indicated with considerable probability that both parameters are non-zero. When WHOLE was considered (fig 2), the degree of uncertainty was reduced, but point estimates in the Bayesian analysis of the genetic variance and of heritability differed from the REML estimates. For example, the posterior mean, mode and median of heritability were 0.13, 0.08 and 0.12, respectively, in comparison with the REML estimate of 0.07. The lack of symmetry of the posterior distributions of σ²a and h² clearly suggests that the simulated selection experiment does not have a high degree of resolution concerning inferences about genetic parameters. This point would not be as forcefully illustrated in a standard REML analysis, where, at best, one would obtain joint approximate asymptotic confidence intervals for the parameters of interest.


In the case of h² = 0.5 (figs 3 and 4), the distributions were less skewed than for h² = 0.2, although still skewed for PART (fig 3). Both the Bayesian and the REML analyses suggested a moderate to high heritability, but the fact that the posterior distribution of heritability for PART (fig 3) was very skewed indicates that more data should be collected for accurate inferences about heritability. The WHOLE analysis (fig 4) yielded nearly symmetric posterior distributions of heritability with smaller variance, centered at 0.55 (the REML estimate was 0.543). It should be noted that the residual variance was poorly estimated in PART (fig 3), but the Bayesian analysis clearly depicted the extent of uncertainty about this parameter. This also illustrates the fact that a variance component that is small relative to others in the model is harder to estimate, contrary to the common belief in animal breeding that residual variance is always well estimated.

The assessment of response to selection for the population simulated with h² = 0.2 is given in figures 5 and 6 for the PART and WHOLE data analyses, respectively. The true response in the simulated data at a given generation, computed as the average of true genetic values within the generation in question, was TR = 0.643 units and TR = 1.783 units for PART and WHOLE, respectively. The regressions of true response on generation were 0.321 and 0.343 for PART and WHOLE, respectively. The hypothesis of no response to selection cannot be rejected in the PART analysis (fig 5). The classical approach gave an estimated response of zero. However, the Bayesian analysis documents the extent of uncertainty, and assigns considerable posterior probability to the proposition that selection may be effective; the posterior probability of response (eg, b > 0 or TR = ā2 − ā0 > 0) was much larger than that of the complement. The strength of the additional evidence brought up in the WHOLE analysis (fig 6) supports the hypothesis that selection was effective. For example, the difference in genetic mean between generations 0 and 5 was centered around 1.852, though the variance of the posterior distribution was as large as 1.075. The classical analysis gave an estimated value of ST = 1.296 units, but no exact measure of variation can be associated with this estimate. Approximate results could be obtained, for example, using Monte-Carlo simulations, but this was not attempted in the present work.
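The posterior probability that selection was effective is just the fraction of saved draws exceeding zero; a sketch:

```python
def posterior_prob_positive(draws):
    """Monte-Carlo estimate of Pr(response > 0 | y) from Gibbs draws of a
    response measure such as TR or the regression slope b."""
    return sum(1 for d in draws if d > 0) / len(draws)

p = posterior_prob_positive([-0.2, 0.5, 1.1, 0.8])
```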

In the simulation with h² = 0.5, the true response in the simulated data, computed as the average of true genetic values from the relevant generations, was TR = 3.258 units and TR = 7.148 units for the PART and WHOLE data sets, respectively. The associated regressions on generation were 1.629 and 1.436, respectively. The Bayesian and classical analyses indicated response to selection beyond reasonable doubt (figs 7 and 8). Note, however, that the posterior distribution of TR = ā2 − ā0 in PART was skewed (fig 7). In figure 8, all the posterior distributions were nearly symmetric, and the classical and Bayesian analyses were essentially in agreement. In spite of this symmetry, considerable uncertainty remained about the true rate of response to selection. For example, values of ā5 − ā0 ranging from 3 to 11 units have appreciable posterior density (fig 8). In the light of this, results from a similar experiment where realized genetic change is, say, 50% higher or lower than our posterior mean of about 6.7 units could not be interpreted as yielding conflicting evidence.

In order to study the influence of different priors on inferences about heritability and response to selection, part of the above analyses, using the same data, was repeated assuming that variance components followed, a priori, the inverse chi-square distributions [9]. The 'degree of belief' parameters νi and the a priori values of the variance components S²i were assessed by the method of moments from an independently simulated data set with base population heritability of 0.5, consisting of 3 cycles of selection. An REML estimate of heritability from these data was 0.46, with an asymptotic variance of 2.94 × 10⁻². The method of moments, as described in Wang et al (1994b), produced the following estimates: νa = 11; νe = 464; S²a = 4.50; S²e = 5.19. A reanalysis of the PART and WHOLE replicate (heritability = 0.5) yielded the results shown in table I. The figures in the table clearly illustrate how increasing the amount of information in the data modifies the input contributed by the prior. Thus, in the PART data set, the uniform prior and the informative prior (which implies a prior value of heritability of 0.46) lead to clear differences in the expected value of the marginal posterior distribution of heritability and selection response. However, in the WHOLE data set, inferences using either prior are very similar. The figures also illustrate how the variance of the marginal posterior distributions is reduced when an informative prior is used.
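A method-of-moments assessment of this kind can be sketched under the standard scaled inverse chi-square parameterization, which may differ in detail from the one in Wang et al (1994b): matching a prior mean E and variance V gives ν = 4 + 2E²/V and S² = E(ν − 2)/ν.

```python
def inv_chi_square_moments(mean, variance):
    """Method-of-moments fit of a scaled inverse chi-square prior, assuming
    E = nu*S^2/(nu - 2) and Var = 2*nu^2*S^4/((nu - 2)^2*(nu - 4)).
    Solving gives nu = 4 + 2*mean^2/variance and S^2 = mean*(nu - 2)/nu.
    This is a sketch under the standard definition, not necessarily the
    parameterization used in the paper."""
    nu = 4.0 + 2.0 * mean ** 2 / variance
    s2 = mean * (nu - 2.0) / nu
    return nu, s2
```

A tight prior variance relative to the mean yields large ν, ie, a strong 'degree of belief'.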

The examples indicate the power of the Bayesian analysis to reveal uncertainty in response to selection when the information contained in the data about the appropriate parameters is small (eg, the PART analysis at h² = 0.2). Other plots, such as marginal posterior distributions of generation means, are possible and could illustrate the evolution of the drift variance as the experiment progresses.

DISCUSSION

We have presented a way of analysing response to selection in experimental or non-experimental settings from a Bayesian viewpoint. The end-point of the analysis is the construction of a marginal posterior distribution of a measure of selection response which, in this study, was defined as a linear function of additive genetic values. The marginal distribution can be viewed as a weighted average of an infinite number of conditional distributions, where the weighting function is the marginal posterior distribution of the variances and other parameters, which are not necessarily of interest. The approach relies heavily on the result (Gianola and Fernando, 1986) that if the data on which selection was based are included in the analysis, the Bayesian approach accounts for selection automatically, in the sense that the joint posterior, or any marginal posterior distribution, is the same with or without selection. This is a particular case of what has been known as 'ignorable selection' (Little and Rubin, 1987; Im et al, 1989), and it implies that selection can be ignored if the information of any data-based selection is contained in the Bayesian model. Furthermore, inferences pertain to parameters, eg, heritability, of the base population.

As shown with the simulated data, the marginal posterior distributions of variances and functions thereof are often not symmetric. This fact is taken into account when computing the marginal posterior distributions of measures of selection response. In this way, the uncertainty associated with all nuisance parameters in the model is taken into account when drawing inferences about response to selection. This is in marked contrast with the estimation of response obtained using BLUP with an animal model, which assumes the variance ratio known and gives 100% weight to an estimate of this ratio.

The analysis of the simulated data used uniform prior distributions for the fixed effects and variance components. The choice of appropriate prior distributions is a contentious issue in Bayesian inference, especially when these priors are supposed to convey vague initial knowledge. Injudicious choice of noninformative prior distributions can lead to improper posterior distributions, and this may not be easy to recognize, especially when the analysis is based on numerical methods, as in the present work. The subject is still a matter of debate, and an important concept is that of reference priors (Bernardo, 1979; Berger and Bernardo, 1992), which have the property of contributing minimum information to the posterior distribution while the information arising from the likelihood is maximized. In the single-parameter case, reference priors often yield the Jeffreys priors (Jeffreys, 1961). The uniform prior distributions we have chosen for the fixed effects and the variance components represent a state of little prior knowledge, but are clearly not invariant under transformations and must be viewed as simple ad hoc approximations to the appropriate reference prior distributions. We have illustrated, however, that when the amount of information contained in the data is adequate, inferences are affected little by the choice of priors. It is not generally a simple matter to decide when one is in a setting of adequate information, and it may therefore be revealing to carry out analyses with different priors to study how inferences are affected (Berger, 1985). If use of different priors leads to very different results, this indicates that the information in the likelihood is weak and more data ought to be collected in order to draw firmer conclusions.

The richness of the information that can be extracted using the Bayesian approach comes at the cost of computational demands. The demand is in terms of computer time rather than programming complexity. However, an important limitation of iterative simulation methods is that drawing from the appropriate posterior density takes place only in the limit, as the number of draws becomes infinite. It is not easy to check for convergence to the correct distribution, and there is little developed theory indicating how long the Gibbs chain should be. The rate of convergence can be exceedingly slow, especially when random variables are highly correlated, which is the case with animal models. Further, there is often a possibility that the Gibbs sequence may get 'trapped' near zero, which is an absorbing state, in which case one must reinitialize the chain (Zeger and Karim, 1991). Gibbs sampling convergence is an area of active research where a number of partial answers based on pragmatic approaches have been suggested. These include checking for serial correlations, and monitoring the behavior of several series of runs of the chains starting from a wide range of initial values while checking both within- and between-series variation (Gelman and Rubin, 1992). Another possibility, given a Gibbs sample of size m, is to vary the length of the sequence and to overlay plots of the estimated densities to see if these are distinguishable (Gelfand et al, 1990). Convergence can probably be accelerated if correlated random variables (such as additive genetic values in animal models) are blocked together, at the expense of having to sample from multivariate conditional distributions (Smith and Roberts, 1993). In general, though, selection experiments are scarce and expensive, and the cost of the additional computation needed to carry out the Bayesian analysis is often marginal relative to the whole cost.
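The within- versus between-series check of Gelman and Rubin (1992) can be sketched as a potential scale reduction factor computed from several chains started at dispersed initial values; this simplified version omits the degrees-of-freedom correction of the full method:

```python
def gelman_rubin(chains):
    """Simplified potential scale reduction factor: compares between-chain
    and within-chain variance; values near 1 suggest approximate convergence,
    values well above 1 suggest the chains have not mixed."""
    m = len(chains)
    n = len(chains[0])
    means = [sum(c) / n for c in chains]
    grand = sum(means) / m
    b = n * sum((mu - grand) ** 2 for mu in means) / (m - 1)  # between-chain
    w = sum(sum((x - mu) ** 2 for x in c) / (n - 1)           # within-chain
            for c, mu in zip(chains, means)) / m
    var_plus = (n - 1) / n * w + b / n
    return (var_plus / w) ** 0.5
```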

The great appeal of this method is that it yields a Monte-Carlo estimate of a full marginal posterior distribution of a parameter of interest, from which probabilities that the parameter lies between specified values can be easily computed. This is particularly relevant when the asymptotic normality of posterior distributions is difficult to justify, which can often be the case in selection experiments. The pictorial representation of marginal posterior distributions, and the associated possibility of making precise probability statements, provide a very rich inferential framework. Errors incurred when estimating a posterior density by Monte-Carlo methods can, at least in principle, be made arbitrarily small by increasing the length of the Gibbs chain.

The Bayesian analysis can also be useful for studying the design of selection experiments. The literature on experimental design relies heavily on assumptions such as the absence of nuisance parameters and knowledge of base population parameters. Studies on the use of animal models in selection experiments have concentrated on analysis rather than on design issues. This is partly because, in a frequentist setting, it is difficult to compute the sampling variance of the estimator of response, especially when the variances used as priors have been estimated from the data at hand. Using the Bayesian approach, a variety of designs could be studied and their efficiency compared by means of analyses of predictive distributions.

ACKNOWLEDGMENTS

The topic of this research emerged through conversations which one of us (DAS) held with V Ducrocq (INRA). WG Hill (Edinburgh) suggested the study of the influence of different priors on inferences about response to selection, and JM Bernardo (Valencia) suggested parameterizing the priors for fixed effects and variance components as proper uniform distributions. Both made insightful comments on an earlier draft of this paper. Part of this research was financed by the Danish Agricultural and Veterinary Research Council.

REFERENCES

Berger JO (1985) Statistical Decision Theory and Bayesian Analysis. Springer-Verlag

Berger JO, Bernardo JM (1992) Reference priors in a variance components problem. In: Proc of the Indo-USA Workshop on Bayesian Analysis in Statistics and Econometrics. Springer-Verlag

Bernardo JM (1979) Reference posterior distributions for Bayesian inference (with discussion). J R Stat Soc Series B 41, 113-147

Blair HT, Pollak EJ (1984) Estimation of genetic trend in a selected population with and without the use of a control population. J Anim Sci 58, 878-886

Casella G, George EI (1992) Explaining the Gibbs sampler. Am Stat 46, 167-174

Fernando RL, Gianola D (1990) Statistical inferences in populations undergoing selection or nonrandom mating. In: Advances in Statistical Methods for Genetic Improvement of Livestock (D Gianola, K Hammond, eds) Springer-Verlag, 437-453

Gelfand AE, Smith AFM (1990) Sampling-based approaches to calculating marginal densities. J Am Stat Assoc 85, 398-409

Gelfand AE, Hills SE, Racine-Poon A, Smith AFM (1990) Illustration of Bayesian inference in normal data models using Gibbs sampling. J Am Stat Assoc 85, 972-985

Gelfand AE, Smith AFM, Lee TM (1992) Bayesian analysis of constrained parameter and truncated data problems using Gibbs sampling. J Am Stat Assoc 87, 523-532

Gelman A, Rubin DB (1992) A single series from the Gibbs sampler provides a false sense of security. In: Bayesian Statistics IV (JM Bernardo, JO Berger, AP Dawid, AFM Smith, eds) Oxford University Press, 625-631

Geman S, Geman D (1984) Stochastic relaxation, Gibbs distributions and the Bayesian restoration of images. IEEE Transactions on Pattern Analysis and Machine Intelligence 6, 721-741

Geyer CJ (1992) Practical Markov chain Monte Carlo (with discussion). Stat Sci 7, 467-511

Gianola D, Fernando RL (1986) Bayesian methods in animal breeding theory. J Anim Sci 63, 217-244

Gianola D, Foulley JL, Fernando RL (1986) Prediction of breeding values when variances are not known. Genet Sel Evol 18, 485-498

Gianola D, Fernando RL, Im S, Foulley JL (1989) Likelihood estimation of quantitative genetic parameters when selection occurs: models and problems. Genome 31, 768-777

Henderson CR (1973) Sire evaluation and genetic trends. In: Proc Anim Breed and Genet Symp in Honor of Dr JL Lush. ASAS and ADSA, Champaign

Henderson CR (1976) A simple method for computing the inverse of a numerator relationship matrix used in prediction of breeding values. Biometrics 32, 69-83

Im S, Fernando RL, Gianola D (1989) Likelihood inferences in animal breeding under selection: a missing-data theory viewpoint (with discussion). Genet Sel Evol 21, 399-414

IMSL Inc (1989) IMSL STAT/LIBRARY: FORTRAN Subroutines for Statistical Analysis

Jeffreys H (1961) Theory of Probability. Clarendon Press, Oxford (3rd edition)

Jensen J, Wang CS, Sorensen DA, Gianola D (1994) Marginal inferences of variance and covariance components for traits influenced by maternal and direct genetic effects using the Gibbs sampler. Acta Agric Scand (in press)

Kackar RN, Harville DA (1981) Unbiasedness of two-stage estimation and prediction procedures for mixed linear models. Commun Stat Theor Math A10, 1249-1261

Lindley DV, Smith AFM (1972) Bayes estimates for the linear model. J R Stat Soc Series B 34, 1-18

Little RJA, Rubin DB (1987) Statistical Analysis with Missing Data. John Wiley & Sons, Inc, New York

Meyer K, Hill WG (1991) Mixed-model analysis of a selection experiment for food intake in mice. Genet Res 57, 71-81

Quaas RL (1976) Computing the diagonal elements and inverse of a large numerator relationship matrix. Biometrics 32, 949-953

Silverman BW (1986) Density Estimation for Statistics and Data Analysis. Chapman and Hall, London

Smith AFM, Roberts GO (1993) Bayesian computation via the Gibbs sampler and related Markov chain Monte-Carlo methods (with discussion). J R Stat Soc Series B 55, 3-23

Sorensen DA, Kennedy BW (1986) Analysis of selection experiments using mixed-model methodology. J Anim Sci 63, 245-258

Sorensen DA, Johansson K (1992) Estimation of direct and correlated responses to selection using univariate animal models. J Anim Sci 70, 2038-2044

Thompson R (1986) Estimation of realized heritability in a selected population using mixed-model methods. Genet Sel Evol 18, 475-483

Wang CS, Rutledge JJ, Gianola D (1993) Marginal inferences about variance components in a mixed linear model using Gibbs sampling. Genet Sel Evol 25, 41-62

Wang CS, Rutledge JJ, Gianola D (1994a) Bayesian analysis of mixed linear models via Gibbs sampling with an application to litter size in Iberian pigs. Genet Sel Evol 26, 91-115

Wang CS, Gianola D, Sorensen DA, Jensen J, Christensen A, Rutledge JJ (1994b) Response to selection for litter size in Danish Landrace pigs: a Bayesian analysis. Theor Appl Genet (in press)

Zeger SL, Karim MR (1991) Generalized linear models with random effects: a Gibbs sampling approach. J Am Stat Assoc 86, 79-86