Computer Age Statistical Inference

Publication Information

Bradley Efron, Trevor Hastie / Cambridge University Press / 2016-07-21 / USD 74.99

Synopsis

The twenty-first century has seen a breathtaking expansion of statistical methodology, both in scope and in influence. 'Big data', 'data science', and 'machine learning' have become familiar terms in the news, as statistical methods are brought to bear upon the enormous data sets of modern science and commerce. How did we get here? And where are we going? This book takes us on an exhilarating journey through the revolution in data analysis following the introduction of electronic computation in the 1950s. Beginning with classical inferential theories - Bayesian, frequentist, Fisherian - individual chapters take up a series of influential topics: survival analysis, logistic regression, empirical Bayes, the jackknife and bootstrap, random forests, neural networks, Markov chain Monte Carlo, inference after model selection, and dozens more. The distinctly modern approach integrates methodology and algorithms with statistical inference. The book ends with speculation on the future direction of statistics and data science.

Clarifies both traditional methods and current, popular algorithms (e.g. neural nets, random forests)

Written by two world-leading researchers

Addressed to all fields that work with data

About the Authors

Bradley Efron, Stanford University, California

Bradley Efron is Max H. Stein Professor, Professor of Statistics, and Professor of Biomedical Data Science at Stanford University, California. He has held visiting faculty appointments at Harvard University, Massachusetts, the University of California, Berkeley, and Imperial College of Science, Technology and Medicine, London. Efron has worked extensively on theories of statistical inference, and is the inventor of the bootstrap sampling technique. He received the National Medal of Science in 2005 and the Guy Medal in Gold of the Royal Statistical Society in 2014.
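Since the bootstrap figures prominently here (and in Chapters 10 and 11 below), a minimal sketch of the idea may help; this is illustrative only, not code from the book, and the helper name `bootstrap_se` and the simulated data are our own. The idea: resample the observed data with replacement, recompute the statistic on each resample, and take the spread of those replicates as an estimate of the statistic's standard error.

```python
import numpy as np

def bootstrap_se(data, statistic, n_boot=2000, seed=0):
    """Estimate the standard error of `statistic` via nonparametric bootstrap resampling."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data)
    n = len(data)
    # Draw n_boot resamples of size n with replacement; recompute the statistic on each.
    replicates = np.array([statistic(data[rng.integers(0, n, size=n)])
                           for _ in range(n_boot)])
    # The spread of the bootstrap replicates estimates the statistic's standard error.
    return replicates.std(ddof=1)

# Usage: standard error of the sample mean for simulated data
# (theory gives 2 / sqrt(50) ≈ 0.28, which the bootstrap estimate should approximate).
x = np.random.default_rng(1).normal(loc=5.0, scale=2.0, size=50)
print(bootstrap_se(x, np.mean))
```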

Trevor Hastie, Stanford University, California

Trevor Hastie is John A. Overdeck Professor, Professor of Statistics, and Professor of Biomedical Data Science at Stanford University, California. He is coauthor of Elements of Statistical Learning, a key text in the field of modern data analysis. He is also known for his work on generalized additive models and principal curves, and for his contributions to the R computing environment. Hastie was awarded the Emmanuel and Carol Parzen prize for Statistical Innovation in 2014.

Contents

Part I. Classic Statistical Inference:
1. Algorithms and inference
2. Frequentist inference
3. Bayesian inference
4. Fisherian inference and maximum likelihood estimation
5. Parametric models and exponential families
Part II. Early Computer-Age Methods:
6. Empirical Bayes
7. James–Stein estimation and ridge regression
8. Generalized linear models and regression trees
9. Survival analysis and the EM algorithm
10. The jackknife and the bootstrap
11. Bootstrap confidence intervals
12. Cross-validation and Cp estimates of prediction error
13. Objective Bayes inference and Markov chain Monte Carlo
14. Statistical inference and methodology in the postwar era
Part III. Twenty-First Century Topics:
15. Large-scale hypothesis testing and false discovery rates
16. Sparse modeling and the lasso
17. Random forests and boosting
18. Neural networks and deep learning
19. Support-vector machines and kernel methods
20. Inference after model selection
21. Empirical Bayes estimation strategies
Epilogue
