Machine learning : a probabilistic perspective
Murphy, Kevin P.
- ISBN: 9780262018029
- Call Number: Q325.5 .M87 2012
- Main Entry: Murphy, Kevin P., 1970-
- Title: Machine learning : a probabilistic perspective [electronic resource] / Kevin P. Murphy.
- Publisher: Cambridge, MA : MIT Press, 2012.
- Physical Description: xxix, 1067 p. : ill. (some col.) ; 24 cm
- Series: Adaptive computation and machine learning series
- Notes: Includes bibliographical references and indexes
- Subject: Machine learning.
- Subject: Probabilities.
- Cover Page
- Half Title Page
- Title Page
- Copyright Page
- Dedication
- Contents
- Preface
- 1 Introduction
- 1.1 Machine learning: what and why?
- 1.2 Supervised learning
- 1.3 Unsupervised learning
- 1.4 Some basic concepts in machine learning
- 1.4.1 Parametric vs non-parametric models
- 1.4.2 A simple non-parametric classifier: K-nearest neighbors
- 1.4.3 The curse of dimensionality
- 1.4.4 Parametric models for classification and regression
- 1.4.5 Linear regression
- 1.4.6 Logistic regression
- 1.4.7 Overfitting
- 1.4.8 Model selection
- 1.4.9 No free lunch theorem
- 2 Probability
- 3 Generative Models for Discrete Data
- 4 Gaussian Models
- 5 Bayesian Statistics
- 6 Frequentist Statistics
- 7 Linear Regression
- 8 Logistic Regression
- 9 Generalized Linear Models and the Exponential Family
- 10 Directed Graphical Models (Bayes Nets)
- 11 Mixture Models and the EM Algorithm
- 12 Latent Linear Models
- 13 Sparse Linear Models
- 14 Kernels
- 15 Gaussian Processes
- 16 Adaptive Basis Function Models
- 17 Markov and Hidden Markov Models
- 18 State Space Models
- 19 Undirected Graphical Models (Markov Random Fields)
- 19.1 Introduction
- 19.2 Conditional independence properties of UGMs
- 19.3 Parameterization of MRFs
- 19.4 Examples of MRFs
- 19.5 Learning
- 19.5.1 Training maxent models using gradient methods
- 19.5.2 Training partially observed maxent models
- 19.5.3 Approximate methods for computing the MLEs of MRFs
- 19.5.4 Pseudo likelihood
- 19.5.5 Stochastic maximum likelihood
- 19.5.6 Feature induction for maxent models *
- 19.5.7 Iterative proportional fitting (IPF) *
- 19.6 Conditional random fields (CRFs)
- 19.7 Structural SVMs
- 20 Exact Inference for Graphical Models
- 21 Variational Inference
- 22 More Variational Inference
- 23 Monte Carlo Inference
- 24 Markov Chain Monte Carlo (MCMC) Inference
- 24.1 Introduction
- 24.2 Gibbs sampling
- 24.2.1 Basic idea
- 24.2.2 Example: Gibbs sampling for the Ising model
- 24.2.3 Example: Gibbs sampling for inferring the parameters of a GMM
- 24.2.4 Collapsed Gibbs sampling *
- 24.2.5 Gibbs sampling for hierarchical GLMs
- 24.2.6 BUGS and JAGS
- 24.2.7 The Imputation Posterior (IP) algorithm
- 24.2.8 Blocking Gibbs sampling
- 24.3 Metropolis Hastings algorithm
- 24.4 Speed and accuracy of MCMC
- 24.5 Auxiliary variable MCMC *
- 24.6 Annealing methods
- 24.7 Approximating the marginal likelihood
- 25 Clustering
- 26 Graphical Model Structure Learning
- 27 Latent Variable Models for Discrete Data
- 27.1 Introduction
- 27.2 Distributed state LVMs for discrete data
- 27.3 Latent Dirichlet allocation (LDA)
- 27.3.1 Basics
- 27.3.2 Unsupervised discovery of topics
- 27.3.3 Quantitatively evaluating LDA as a language model
- 27.3.4 Fitting using (collapsed) Gibbs sampling
- 27.3.5 Example
- 27.3.6 Fitting using batch variational inference
- 27.3.7 Fitting using online variational inference
- 27.3.8 Determining the number of topics
- 27.4 Extensions of LDA
- 27.5 LVMs for graph-structured data
- 27.6 LVMs for relational data
- 27.7 Restricted Boltzmann machines (RBMs)
- 28 Deep Learning
- 28.1 Introduction
- 28.2 Deep generative models
- 28.3 Deep neural networks
- 28.4 Applications of deep networks
- 28.4.1 Handwritten digit classification using DBNs
- 28.4.2 Data visualization and feature discovery using deep auto-encoders
- 28.4.3 Information retrieval using deep auto-encoders (semantic hashing)
- 28.4.4 Learning audio features using 1d convolutional DBNs
- 28.4.5 Learning image features using 2d convolutional DBNs
- 28.5 Discussion
- Notation
- Bibliography
- Index to Code
- Index to Keywords