
dc.contributor.advisor    Fishkind, Donniell E.    en_US
dc.creator    Sussman, Daniel Lewis    en_US
dc.date.accessioned    2014-12-23T04:40:05Z
dc.date.available    2014-12-23T04:40:05Z
dc.date.created    2014-05    en_US
dc.date.issued    2014-01-14    en_US
dc.date.submitted    May 2014    en_US
dc.identifier.uri    http://jhir.library.jhu.edu/handle/1774.2/37075
dc.description.abstract    The eigendecomposition of an adjacency matrix provides a way to embed a graph as points in finite-dimensional Euclidean space. This embedding allows the full arsenal of statistical and machine learning methodology for multivariate Euclidean data to be deployed for graph inference. Our work analyzes this embedding, a graph version of principal component analysis, in the context of various random graph models, with a focus on the impact on subsequent inference. For the stochastic blockmodel, with a finite number of blocks of stochastically equivalent vertices, Sussman et al. (2012), Fishkind et al. (2013), and Lyzinski et al. (2013) show that clustering the embedded points using k-means accurately partitions the vertices into the correct blocks, even when the embedding dimension is misspecified or the number of blocks is unknown. For the more general random dot product graph model, an example of a latent position model, Sussman et al. (2013) show that the latent positions are consistently estimated by the embedding, which in turn allows for accurate learning in a supervised vertex classification framework. Tang et al. (2012) extend these results to more general latent position models. Athreya et al. (2013) provide distributional results, akin to a central limit theorem, for the residuals between the estimated and true latent positions, offering the potential for a deeper understanding of these methods. In summary, these papers demonstrate that for a broad class of graph models and inference tasks, adjacency spectral embedding allows for accurate graph inference via standard multivariate methodology.    en_US
dc.format.mimetype    application/pdf    en_US
dc.language    en
dc.publisher    Johns Hopkins University
dc.subject    graph    en_US
dc.subject    spectral decomposition    en_US
dc.subject    embedding    en_US
dc.title    Foundations of Adjacency Spectral Embedding    en_US
dc.type    Thesis    en_US
thesis.degree.discipline    Applied Mathematics & Statistics    en_US
thesis.degree.grantor    Johns Hopkins University    en_US
thesis.degree.grantor    Whiting School of Engineering    en_US
thesis.degree.level    Doctoral    en_US
thesis.degree.name    Ph.D.    en_US
dc.type.material    text    en_US
thesis.degree.department    Applied Mathematics and Statistics    en_US
dc.contributor.committeeMember    Priebe, Carey E.    en_US
dc.contributor.committeeMember    Athreya, Avanti    en_US
dc.contributor.committeeMember    Naiman, Daniel Q.    en_US
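
The following is a minimal, illustrative sketch of the pipeline summarized in the abstract: adjacency spectral embedding of a sampled stochastic blockmodel, followed by k-means clustering of the embedded points. It is not code from the thesis; the library choices (numpy, scikit-learn) and all parameter values are assumptions made purely for illustration.

# Illustrative sketch only; libraries and parameters are assumptions, not from the thesis.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Two-block stochastic blockmodel: n vertices, block edge-probability matrix B.
n = 200
blocks = np.repeat([0, 1], n // 2)
B = np.array([[0.5, 0.2],
              [0.2, 0.4]])
P = B[np.ix_(blocks, blocks)]          # n x n edge-probability matrix

# Sample a symmetric, hollow adjacency matrix A with independent edges.
upper = np.triu(rng.random((n, n)) < P, k=1)
A = (upper | upper.T).astype(float)

# Adjacency spectral embedding: scale the eigenvectors belonging to the d
# largest-magnitude eigenvalues by the square roots of those eigenvalues.
d = 2
evals, evecs = np.linalg.eigh(A)
top = np.argsort(np.abs(evals))[-d:]
X_hat = evecs[:, top] * np.sqrt(np.abs(evals[top]))   # n x d embedding

# Cluster the embedded points; labels should match the blocks up to relabeling.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_hat)
agreement = max(np.mean(labels == blocks), np.mean(labels != blocks))
print(f"block recovery agreement: {agreement:.2f}")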