Description: A First Course in Bayesian Statistical Methods by Peter D. Hoff. This compact, self-contained introduction to the theory and application of Bayesian statistical methods is accessible to those with a basic familiarity with probability, yet allows advanced readers to grasp the principles underlying Bayesian theory and methods.
FORMAT Paperback
LANGUAGE English
CONDITION Brand New
Publisher Description
* A self-contained introduction to probability, exchangeability and Bayes' rule provides a theoretical understanding of the applied material.
* Numerous examples with R-code that can be run "as-is" allow the reader to perform the data analyses themselves.
* The development of Monte Carlo and Markov chain Monte Carlo methods in the context of data analysis examples provides motivation for these computational methods.
Back Cover
This book provides a compact, self-contained introduction to the theory and application of Bayesian statistical methods. The book is accessible to readers having a basic familiarity with probability, yet allows more advanced readers to quickly grasp the principles underlying Bayesian theory and methods. The examples and computer code allow the reader to understand and implement basic Bayesian data analyses using standard statistical models, and to extend the standard models to specialized data analysis situations. The book begins with fundamental notions such as probability, exchangeability and Bayes' rule, and ends with modern topics such as variable selection in regression, generalized linear mixed effects models, and semiparametric copula estimation. Numerous examples from the social, biological and physical sciences show how to implement these methodologies in practice. Monte Carlo summaries of posterior distributions play an important role in Bayesian data analysis.
The open-source R statistical computing environment provides sufficient functionality to make Monte Carlo estimation very easy for a large number of statistical models, and example R-code is provided throughout the text. Much of the example code can be run as is in R, and essentially all of it can be run after downloading the relevant datasets from the companion website for this book. Peter Hoff is an Associate Professor of Statistics and Biostatistics at the University of Washington. He has developed a variety of Bayesian methods for multivariate data, including covariance and copula estimation, cluster analysis, mixture modeling and social network analysis. He is on the editorial board of the Annals of Applied Statistics.
Table of Contents
Introduction and examples.- Belief, probability and exchangeability.- One-parameter models.- Monte Carlo approximation.- The normal model.- Posterior approximation with the Gibbs sampler.- The multivariate normal model.- Group comparisons and hierarchical modeling.- Linear regression.- Nonconjugate priors and Metropolis-Hastings algorithms.- Linear and generalized linear mixed effects models.- Latent variable methods for ordinal data.
Review
From the reviews: "This is an excellent book for its intended audience: statisticians who wish to learn Bayesian methods. Although designed for a statistics audience, it would also be a good book for econometricians who have been trained in frequentist methods, but wish to learn Bayes. In relatively few pages, it takes the reader through a vast amount of material, beginning with deep issues in statistical methodology such as de Finetti's theorem, through the nitty-gritty of Bayesian computation to sophisticated models such as generalized linear mixed effects models and copulas. And it does so in a simple manner, always drawing parallels and contrasts between Bayesian and frequentist methods, so as to allow the reader to see the similarities and differences with clarity."
(Econometrics Journal) "Generally, I think this is an excellent choice for a text for a one-semester Bayesian course. It provides a good overview of the basic tenets of Bayesian thinking for the common one- and two-parameter distributions and gives introductions to Bayesian regression, multivariate-response modeling, hierarchical modeling, and mixed effects models. The book includes an ample collection of exercises for all the chapters. A strength of the book is its good discussion of Gibbs sampling and Metropolis-Hastings algorithms. The author goes beyond a description of the MCMC algorithms and also provides insight into why the algorithms work. … I believe this text would be an excellent choice for my Bayesian class since it seems to cover a good number of introductory topics and give the student a good introduction to the modern computational tools for Bayesian inference with illustrations using R." (Journal of the American Statistical Association, June 2010, Vol. 105, No. 490) "Statisticians and applied scientists. The book is accessible to readers having a basic familiarity with probability theory and a grounding in statistical methods. The author has succeeded in writing an acceptable introduction to the theory and application of Bayesian statistical methods which is modern and covers both the theory and practice. … this book can be useful as a quick introduction to Bayesian methods for self study. In addition, I highly recommend this book as a text for a course in Bayesian statistics." (Lasse Koskinen, International Statistical Review, Vol. 78 (1), 2010) "The book under review covers a balanced choice of topics … presented with a focus on the interplay between Bayesian thinking and the underlying mathematical concepts. … the book by Peter D. Hoff appears to be an excellent choice for a main reading in an introductory course. After studying this text the student can go in a direction of his liking at the graduate level."
(Krzysztof Łatuszyński, Mathematical Reviews, Issue 2011m) "The book is a good introductory treatment of methods of Bayes analysis. It should especially appeal to the reader who has had some statistical courses in estimation and modeling, and wants to understand the Bayesian interpretation of those methods. Also, readers who are primarily interested in modeling data and who are working in areas outside of statistics should find this to be a good reference book. … should appeal to the reader who wants to keep up with modern approaches to data analysis." (Richard P. Heydorn, Technometrics, Vol. 54 (1), February, 2012)
Long Description
This book originated from a set of lecture notes for a one-quarter graduate-level course taught at the University of Washington. The purpose of the course is to familiarize the students with the basic concepts of Bayesian theory and to quickly get them performing their own data analyses using Bayesian computational tools. The audience for this course includes non-statistics graduate students who did well in their department's graduate-level introductory statistics courses and who also have an interest in statistics. Additionally, first- and second-year statistics graduate students have found this course to be a useful introduction to statistical modeling. Like the course, this book is intended to be a self-contained and compact introduction to the main concepts of Bayesian theory and practice. By the end of the text, readers should have the ability to understand and implement the basic tools of Bayesian statistical methods for their own data analysis purposes. The text is not intended as a comprehensive handbook for advanced statistical researchers, although it is hoped that this latter category of readers could use this book as a quick introduction to Bayesian methods and as a preparation for more comprehensive and detailed studies.
Computing
Monte Carlo summaries of posterior distributions play an important role in the way data analyses are presented in this text.
My experience has been that once a student understands the basic idea of posterior sampling, their data analyses quickly become more creative and meaningful, using relevant posterior predictive distributions and interesting functions of parameters. The open-source R statistical computing environment provides sufficient functionality to make Monte Carlo estimation very easy for a large number of statistical models, and example R-code is provided throughout the text. Much of the example code can be run "as is" in R, and essentially all of it can be run after downloading the relevant datasets from the companion website for this book.
Acknowledgments
The presentation of material in this book, and my teaching style in general, have been heavily influenced by the diverse set of students taking CSSS-STAT 564 at the University of Washington. My thanks to them for improving my teaching. I also thank Chris Hoffman, Vladimir Minin, Xiaoyue Niu and Marc Suchard for their extensive comments, suggestions and corrections for this book, and Adrian Raftery for bibliographic suggestions. Finally, I thank my wife Jen for her patience and support. Seattle, WA, Peter Hoff, March 2009
Feature
* Provides a nice introduction to Bayesian statistics with sufficient grounding in the Bayesian framework without being distracted by more esoteric points.
* The material is well-organized, weaving applications, background material and computational discussion throughout the book.
* R examples also illustrate how the approaches work.
Details ISBN 1441928286 Author Peter D. Hoff Publisher Springer-Verlag New York Inc. Series Springer Texts in Statistics Year 2010 ISBN-10 1441928286 ISBN-13 9781441928283 Format Paperback Imprint Springer-Verlag New York Inc. Place of Publication New York, NY Country of Publication United States Short Title 1ST COURSE IN BAYESIAN STATIST Language English Media Book DEWEY 519.5 Pages 271 Illustrations IX, 271 p. Publication Date 2010-11-19 AU Release Date 2010-11-19 NZ Release Date 2010-11-19 US Release Date 2010-11-19 UK Release Date 2010-11-19 Edition Description Softcover reprint of hardcover 1st ed. 2009 Alternative 9780387922997 Audience Professional & Vocational
We've got this: At The Nile, if you're looking for it, we've got it. With fast shipping, low prices, friendly service and well over a million items, you're bound to find what you want at a price you'll love!
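The description above repeatedly emphasizes Monte Carlo summaries of posterior distributions and runnable example code (the book's own examples are in R). As a flavor of that approach, here is a minimal, hypothetical sketch in Python of a Monte Carlo posterior summary for a conjugate beta-binomial model; the data values (7 successes in 20 trials) are invented for illustration and are not from the book.

```python
import random

random.seed(1)

# Hypothetical binomial data: y successes in n trials, with a Beta(1, 1) prior.
y, n = 7, 20
a, b = 1 + y, 1 + (n - y)  # parameters of the conjugate Beta posterior

# Monte Carlo: draw samples from the Beta(a, b) posterior and summarize them.
S = 100_000
theta = [random.betavariate(a, b) for _ in range(S)]

post_mean = sum(theta) / S  # approximates the exact posterior mean a / (a + b)
theta.sort()
ci = (theta[int(0.025 * S)], theta[int(0.975 * S)])  # 95% quantile interval

print(round(post_mean, 3), [round(x, 3) for x in ci])
```

Summarizing a posterior by sorting Monte Carlo draws and reading off quantiles is the basic idea the text builds on; with a conjugate model like this one, the Monte Carlo answer can be checked against the exact posterior mean a/(a+b).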
Price: 109.3 AUD
Location: Melbourne
End Time: 2024-12-06T09:29:40.000Z
Shipping Cost: 9.87 AUD
Product Images
Item Specifics
Restocking fee: No
Return shipping will be paid by: Buyer
Returns Accepted: Returns Accepted
Item must be returned within: 30 Days
ISBN-13: 9781441928283
Book Title: A First Course in Bayesian Statistical Methods
Number of Pages: 271 Pages
Language: English
Publication Name: A First Course in Bayesian Statistical Methods
Publisher: Springer-Verlag New York Inc.
Publication Year: 2010
Subject: Economics, Computer Science, Mathematics, Management
Item Height: 235 mm
Item Weight: 439 g
Type: Textbook
Author: Peter D. Hoff
Subject Area: Data Analysis, Social Research
Item Width: 155 mm
Format: Paperback