
dc.identifier.uri: http://hdl.handle.net/11401/77332
dc.description.sponsorship: This work is sponsored by the Stony Brook University Graduate School in compliance with the requirements for completion of degree.
dc.format: Monograph
dc.format.medium: Electronic Resource
dc.language.iso: en_US
dc.publisher: The Graduate School, Stony Brook University: Stony Brook, NY.
dc.type: Dissertation
dcterms.abstract: High-dimensional datasets are now ubiquitous in biomedical research. Feature selection is an essential step in mining high-dimensional data: it reduces noise, avoids overfitting, and improves the interpretability of statistical models. Over the last few decades, numerous feature selection methods and algorithms have been proposed for various response types, predictor correlation structures, and sparsity requirements; penalized methods, such as the LASSO and its variants, are the most efficient and popular in this area. In addition, genomic features such as gene expression levels are usually connected through an underlying biological network, which is an important supplement to the model for improving performance and interpretability. In this study, we first extend the group LASSO to a network-constrained classification model and develop a modified proximal gradient algorithm for model fitting. In this algorithm, group lasso regularization induces model sparsity, and a network constraint induces smoothness of the coefficients along the underlying network structure. The applicability of the proposed method is verified on both numerical examples and real gene expression data from TCGA. We then address the feature selection problem in models with a Bayesian hierarchical structure. R. Tibshirani, who introduced the LASSO in 1996, also noted that the linear LASSO estimate can be viewed as the mode of a Bayesian posterior with a Laplace prior on the coefficients, which sheds light on feature selection in Bayesian models. Compared to frequentist approaches, Bayesian models cope better with complex hierarchical structures in the data. On one hand, we compare the performance of Laplace, horseshoe, and Gaussian priors in Bayesian linear models through extensive simulations. On the other, we extend the projection predictive feature selection scheme to group-wise selection and benchmark its feature selection performance and prediction accuracy against standard Bayesian methods. All Bayesian posterior parameters are estimated using Hamiltonian Monte Carlo as implemented in Stan.
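The network-constrained group LASSO described in the abstract could be fit by a proximal gradient loop of roughly the following shape: a gradient step on the smooth part (logistic loss plus a Laplacian smoothness penalty), followed by group-wise soft-thresholding. This is an illustrative sketch only, not the dissertation's implementation; the function names, the penalty weights `lam1`/`lam2`, the fixed step size, and the unweighted group penalty are all assumptions.

```python
import numpy as np

def group_soft_threshold(v, t):
    # Proximal operator of t * ||v||_2: shrink the whole group toward
    # zero, setting it exactly to zero when its norm falls below t.
    norm = np.linalg.norm(v)
    if norm <= t:
        return np.zeros_like(v)
    return (1.0 - t / norm) * v

def network_group_lasso_logistic(X, y, groups, L, lam1=0.1, lam2=0.1,
                                 step=0.01, n_iter=500):
    """Sketch of proximal gradient descent for logistic regression with
    a group lasso penalty (lam1) and a network/Laplacian smoothness
    penalty lam2 * beta' L beta. Hypothetical parameter choices."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        # Gradient of the smooth part: mean logistic loss + Laplacian term.
        prob = 1.0 / (1.0 + np.exp(-X @ beta))
        grad = X.T @ (prob - y) / n + 2.0 * lam2 * (L @ beta)
        z = beta - step * grad
        # Non-smooth part: apply the group prox to each block of coefficients.
        for g in groups:
            z[g] = group_soft_threshold(z[g], step * lam1)
        beta = z
    return beta
```

In a real application `L` would be the Laplacian of the underlying biological network; a practical implementation would also add a convergence check and a line search or Lipschitz-based step size rather than a fixed `step`.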
dcterms.available: 2017-09-20T16:52:32Z
dcterms.contributor: Wang, Xuefeng
dcterms.contributor: Kuan, Pei Fen
dcterms.contributor: Zhu, Wei
dcterms.contributor: Yu, Xiaxia
dcterms.creator: Tian, Xinyu
dcterms.dateAccepted: 2017-09-20T16:52:32Z
dcterms.dateSubmitted: 2017-09-20T16:52:32Z
dcterms.description: Department of Applied Mathematics and Statistics
dcterms.extent: 122 pg.
dcterms.format: Monograph
dcterms.format: Application/PDF
dcterms.identifier: http://hdl.handle.net/11401/77332
dcterms.issued: 2017-05-01
dcterms.language: en_US
dcterms.provenance: Made available in DSpace on 2017-09-20T16:52:32Z (GMT). No. of bitstreams: 1 Tian_grad.sunysb_0771E_13316.pdf: 715549 bytes, checksum: cfed7bbd53e1a5e107783f6c72287e68 (MD5) Previous issue date: 1
dcterms.publisher: The Graduate School, Stony Brook University: Stony Brook, NY.
dcterms.subject: Bayesian, feature selection, LASSO, Network constraint, proximal gradient, Stan
dcterms.subject: Statistics
dcterms.title: Group LASSO for Prediction of Clinical Outcomes in Cancer
dcterms.type: Dissertation

