Linear Algebra and Optimization with Applications to Machine Learning: Volume II, Fundamentals of Optimization Theory with Applications to Machine Learning
Bibliographic Information
ISBN:
9789811216572
Call Number:
CLC Classification:
O15
Chinese Title:
线性代数和最优化及在机器学习中的应用:卷2,最优化理论基础及其在机器学习中的应用
Author:
Jean Gallier
Editor:
Language:
English
Publication Information
Publisher:
WSPC
Place of Publication:
Publication Year:
2020
Edition:
Edition Type:
Original edition
Series Title:
Volume/Issue:
Document Information
Keywords:
Preface:
Abstract:
Description:
Volume 2 applies the linear algebra concepts presented in Volume 1 to optimization problems which frequently occur throughout machine learning. This book blends theory with practice, not only carefully discussing the mathematical underpinnings of each optimization technique but also applying these techniques to linear programming, support vector machines (SVM), principal component analysis (PCA), and ridge regression. Volume 2 begins by discussing preliminary concepts of optimization theory such as metric spaces, derivatives, and the Lagrange multiplier technique for finding extrema of real-valued functions. The focus then shifts to the special case of optimizing a linear function over a region determined by affine constraints, namely linear programming. Highlights include careful derivations and applications of the simplex algorithm, the dual-simplex algorithm, and the primal-dual algorithm. The theoretical heart of this book is the mathematically rigorous presentation of various nonlinear optimization methods, including but not limited to gradient descent, the Karush-Kuhn-Tucker (KKT) conditions, Lagrangian duality, the alternating direction method of multipliers (ADMM), and the kernel method. These methods are carefully applied to hard-margin SVM, soft-margin SVM, kernel PCA, ridge regression, lasso regression, and elastic-net regression. Matlab programs implementing these methods are included.
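As a rough illustration of one method/application pair named in the description above (gradient descent applied to ridge regression), the following Python sketch is offered; it is not taken from the book, whose programs are in Matlab. The synthetic data, penalty strength, step size, and iteration count are all assumptions chosen only so the example runs and can be checked against the closed-form ridge solution.

```python
import numpy as np

# Hypothetical synthetic regression data (assumed for illustration).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                  # design matrix
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])  # assumed "true" weights
y = X @ w_true + 0.1 * rng.normal(size=100)    # noisy targets

lam = 0.1        # ridge penalty strength (assumed)
eta = 0.005      # gradient-descent step size (assumed)
w = np.zeros(5)  # initial iterate

# Minimize f(w) = (1/2)||Xw - y||^2 + (lam/2)||w||^2 by plain gradient descent.
for _ in range(5000):
    grad = X.T @ (X @ w - y) + lam * w
    w -= eta * grad

# Sanity check against the closed-form ridge solution (X^T X + lam*I)^{-1} X^T y.
w_closed = np.linalg.solve(X.T @ X + lam * np.eye(5), X.T @ y)
print(np.allclose(w, w_closed))
```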
Table of Contents:
Full-Text Link:
Intended Audience:
Physical Information
Pages:
Other Information
Original ISBN:
Book Reviews:
Extended Information
Related Attachments