A Computational Approach to Statistical Learning gives a novel introduction to predictive modeling by focusing on the algorithmic and numeric motivations behind popular statistical methods. The text contains annotated code for over 80 original reference functions, which provide minimal working implementations of common statistical learning algorithms. Every chapter concludes with a fully worked application that illustrates predictive modeling tasks using a real-world dataset.

The text begins with a detailed analysis of linear models and ordinary least squares. Subsequent chapters explore extensions such as ridge regression, generalized linear models, and additive models. The second half focuses on general-purpose algorithms for convex optimization and their application to tasks in statistical learning. Models covered include the elastic net, dense neural networks, convolutional neural networks (CNNs), and spectral clustering. A unifying theme throughout the text is the use of optimization theory in the description of predictive models, with a particular focus on the singular value decomposition (SVD). Through this theme, the computational approach motivates and clarifies the relationships between various predictive models.

Taylor Arnold is an assistant professor of statistics at the University of Richmond. His work at the intersection of computer vision, natural language processing, and digital humanities has been supported by multiple grants from the National Endowment for the Humanities (NEH) and the American Council of Learned Societies (ACLS). His first book, Humanities Data in R, was published in 2015.

Michael Kane is an assistant professor of biostatistics at Yale University. He is the recipient of grants from the National Institutes of Health (NIH), DARPA, and the Bill and Melinda Gates Foundation. His R package bigmemory won the John M. Chambers Statistical Software Award in 2010.
Bryan Lewis is an applied mathematician and author of many popular R packages, including irlba, doRedis, and threejs.
The Handbook is divided into two volumes written by outstanding, internationally renowned scholars in the field. This second volume focuses on foundations and advances in data science, statistical modeling, and machine learning.
This book presents some of the most important modeling and prediction techniques, along with relevant applications.
This self-contained book is geared toward advanced undergraduate and beginning graduate students in the mathematical sciences, engineering, and computer science and can be used as the main text in a semester course.
Answers Book for Introduction to Statistical Analysis: A Modern Computational Approach
Each topic in the book has been chosen to elucidate a general principle, which is explored in a precise formal setting.
This interdisciplinary text presents theoretical and practical results on information-theoretic methods used in statistical learning.