Computer age statistical inference : algorithms, evidence, and data science / Bradley Efron, Stanford University, California, Trevor Hastie, Stanford University, California.
| Item type | Current library | Call number | Copy number | Status | Notes | Date due | Barcode |
|---|---|---|---|---|---|---|---|
| | Female Library | QA276.4 .E376 2016 | 1 | Available | STACKS | | 51952000338635 |
| | Main Library | QA276.4 .E376 2016 | 1 | Available | STACKS | | 51952000338628 |
Browsing Main Library shelves:

| Call number | Title |
|---|---|
| QA276.18 .N53 2009 | Statistical methods for business and economics |
| QA276.19 .S74 2009 | Beginning statistics |
| QA276.4 .C53 2006 | Applied statistics and the SAS® programming language |
| QA276.4 .E376 2016 | Computer age statistical inference : algorithms, evidence, and data science |
| QA276.4 .R625 2016 | Data analysis for scientists and engineers |
| QA276.45 .M53 M47 2008 | Basic statistics using Excel for Office XP : for use with statistical techniques in business & economics |
| QA276.45 .M53 O77 2007 | Basic statistics using Excel and MegaStat |
Includes bibliographical references and indexes.
Part I. Classic Statistical Inference:
1. Algorithms and inference
2. Frequentist inference
3. Bayesian inference
4. Fisherian inference and maximum likelihood estimation
5. Parametric models and exponential families

Part II. Early Computer-Age Methods:
6. Empirical Bayes
7. James-Stein estimation and ridge regression
8. Generalized linear models and regression trees
9. Survival analysis and the EM algorithm
10. The jackknife and the bootstrap
11. Bootstrap confidence intervals
12. Cross-validation and Cp estimates of prediction error
13. Objective Bayes inference and Markov chain Monte Carlo
14. Statistical inference and methodology in the postwar era

Part III. Twenty-First Century Topics:
15. Large-scale hypothesis testing and false discovery rates
16. Sparse modeling and the lasso
17. Random forests and boosting
18. Neural networks and deep learning
19. Support-vector machines and kernel methods
20. Inference after model selection
21. Empirical Bayes estimation strategies

Epilogue.
The twenty-first century has seen a breathtaking expansion of statistical methodology, both in scope and in influence. 'Big data', 'data science', and 'machine learning' have become familiar terms in the news, as statistical methods are brought to bear upon the enormous data sets of modern science and commerce. How did we get here? And where are we going? This book takes us on an exhilarating journey through the revolution in data analysis following the introduction of electronic computation in the 1950s. Beginning with classical inferential theories - Bayesian, frequentist, Fisherian - individual chapters take up a series of influential topics: survival analysis, logistic regression, empirical Bayes, the jackknife and bootstrap, random forests, neural networks, Markov chain Monte Carlo, inference after model selection, and dozens more. The distinctly modern approach integrates methodology and algorithms with statistical inference. The book ends with speculation on the future direction of statistics and data science. -- Provided by publisher.