[Machine Learning] 8 Logistic regression
Chapter contents
8 Logistic regression
8.1 Introduction
8.2 Model specification
8.3 Model fitting
8.3.1 MLE
8.3.2 Steepest descent
8.3.3 Newton's method
8.3.4 Iteratively reweighted least squares (IRLS)
8.3.5 Quasi-Newton (variable metric) methods
8.3.6 ℓ2 regularization
8.3.7 Multi-class logistic regression
8.4 Bayesian logistic regression
8.4.1 Laplace approximation
8.4.2 Derivation of the BIC
8.4.3 Gaussian approximation for logistic regression
8.4.4 Approximating the posterior predictive
8.4.5 Residual analysis (outlier detection) *
8.5 Online learning and stochastic optimization
8.5.1 Online learning and regret minimization
8.5.2 Stochastic optimization and risk minimization
8.5.3 The LMS algorithm
8.5.4 The perceptron algorithm
8.5.5 A Bayesian view
8.6 Generative vs discriminative classifiers
8.6.1 Pros and cons of each approach
8.6.2 Dealing with missing data
8.6.3 Fisher's linear discriminant analysis (FLDA) *
GitHub download link: https://github.com/916718212/Machine-Learning-A-Probabilistic-Perspective-.git
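The core of the chapter (Sections 8.3.1–8.3.4) is fitting the weights of a binary logistic regression model by maximum likelihood, using Newton's method, which for this model reduces to iteratively reweighted least squares (IRLS). Below is a minimal NumPy sketch of that update, not the book's own code; the small ridge term `l2` and the synthetic-data example are assumptions added only for numerical stability and illustration.

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def fit_logreg_irls(X, y, n_iter=20, l2=1e-6):
    """Fit w for p(y=1|x) = sigmoid(w^T x) by Newton's method / IRLS.

    X  : (N, D) design matrix (include a column of ones for a bias term).
    y  : (N,) labels in {0, 1}.
    l2 : small ridge term for numerical stability (an assumption, not from the text).
    """
    N, D = X.shape
    w = np.zeros(D)
    for _ in range(n_iter):
        mu = sigmoid(X @ w)                           # predicted probabilities
        g = X.T @ (mu - y) + l2 * w                   # gradient of the negative log-likelihood
        s = mu * (1.0 - mu)                           # diagonal of the IRLS weight matrix
        H = X.T @ (X * s[:, None]) + l2 * np.eye(D)   # Hessian
        w = w - np.linalg.solve(H, g)                 # Newton update
    return w

# Tiny usage example on synthetic data (hypothetical, for illustration only)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
X = np.hstack([np.ones((200, 1)), X])                 # prepend bias column
w_true = np.array([-0.5, 2.0, -1.0])
y = (rng.uniform(size=200) < sigmoid(X @ w_true)).astype(float)
print(fit_logreg_irls(X, y))
```

Each iteration solves a weighted least-squares problem with weights mu*(1-mu), which is why the procedure is called iteratively reweighted least squares.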