Summary: Quantitative trait locus (QTL) / association mapping aims to identify the genomic loci associated with complex traits. From a statistical perspective, multiple linear regression is often used to model, estimate, and test the effects of molecular markers on a trait. With genotype data derived from contemporary genomics techniques, however, the number of markers typically exceeds the number of individuals, so some form of variable selection or parameter regularization is necessary to obtain reliable estimates of the model parameters. In addition, many quantitative traits change over the course of development. Accordingly, a longitudinal study that jointly maps repeated measurements of the phenotype over time may increase the statistical power to identify QTLs compared with single-trait analysis.

In this thesis, a series of Bayesian variable selection/regularization linear methods were developed and applied to analyze quantitative traits measured at either single or multiple time points. The first work provided an overview of the principal frequentist regularization methods for analyzing single traits. The second work also focused on single-trait analysis; there, a variational Bayes (VB) algorithm was derived for estimating the parameters of several Bayesian regularization methods. The VB methods can be run quickly on large data sets, in contrast to classical Markov chain Monte Carlo methods. In the third work, the Bayesian regularization approach was extended to a non-parametric varying-coefficient model for analyzing longitudinal traits. In particular, an efficient VB stepwise algorithm was used for variable selection, so that the method remains fast even on data sets with a large number of time points and/or a large number of markers. The fourth work is an application of the variable selection methods to forest genetics data collected from Northern Sweden.
Among several conifer wood property traits measured at multiple time points, four QTLs located at ...
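The p > n regularization setting described in the summary can be illustrated with a minimal LASSO coordinate-descent sketch on simulated genotype data. Everything here is an illustrative assumption, not the thesis's actual data or method: the marker count, the 0/1/2 genotype coding, the three planted QTL effects, and the penalty value `lam` are all chosen only to show how a sparsity penalty recovers a few markers out of many when individuals are scarce.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 200                                # fewer individuals than markers (p > n)
X = rng.choice([0.0, 1.0, 2.0], size=(n, p))  # biallelic genotype codes (assumed)
beta_true = np.zeros(p)
beta_true[[10, 50, 120]] = [1.5, -1.0, 0.8]   # three simulated QTLs (illustrative)
y = X @ beta_true + rng.normal(0.0, 0.5, n)

# Standardize markers and center the phenotype, as is usual before penalized fits.
X = (X - X.mean(0)) / X.std(0)
y = y - y.mean()

def soft_threshold(z, t):
    """Soft-thresholding operator: the closed-form LASSO coordinate update."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate descent for (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    r = y.copy()                       # running residual y - X @ beta
    col_sq = (X ** 2).sum(0)
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * beta[j]     # partial residual excluding marker j
            z = X[:, j] @ r
            beta[j] = soft_threshold(z, n * lam) / col_sq[j]
            r -= X[:, j] * beta[j]
    return beta

beta_hat = lasso_cd(X, y, lam=0.1)
selected = np.flatnonzero(np.abs(beta_hat) > 1e-6)
print("selected markers:", selected)
```

With a strong enough signal, the penalty zeroes out most of the 200 coefficients while the planted markers survive; in practice `lam` would be chosen by cross-validation rather than fixed.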