Support Vector Classifiers in scikit-learn
Support Vector Classifiers in scikit-learn (ISBN: 978-0-620-91004-0), by Dr. J.S.C. Prentice, our content manager, is Mathsophical’s first eBook.
Whether you are a data scientist, a student, a mathematician or simply a hobbyist who uses Support Vector Classifiers (SVCs) in scikit-learn, consider the following questions:
- What are the primal and dual problems in SVCs, how are they derived, and how are they related?
- What is the discriminant and why is it important in SVCs?
- What are hard-margin and soft-margin SVCs (the latter known as C-SVC)?
- What role does the parameter C play in C-SVC?
- How can nonseparable datasets be accommodated using nonlinear SVCs?
- What are kernels, such as the RBF kernel, and what is the kernel trick?
- What is the analytical form of a nonlinear kernel, particularly in an infinite-dimensional feature space?
- What is ν-SVC, how does scaling work in ν-SVC, and how is ν-SVC related to C-SVC?
- What is a One-Class Support Vector Machine?
- How do multiclass SVCs work?
- How are the statistical properties of an SVC determined?
- What is the mathematical significance of various input parameters for the SVC algorithms (SVC and NuSVC) in scikit-learn, such as C, nu, kernel, degree, gamma, coef0, probability, tol, class_weight, decision_function_shape?
- How do we choose a value for nu in NuSVC in a multiclass context?
- How do we interpret the outputs of the algorithms, such as dual_coef_, intercept_, probA_, probB_?
- How do we interpret dual_coef_ in a multiclass context?
- What are the functions of the methods of the scikit-learn algorithms, such as decision_function, predict, predict_proba, predict_log_proba, score?
- What is the offset_ attribute of OneClassSVM?
- How do we implement custom kernels and precomputed Gram matrices?
- What are the inputs, outputs and methods of the LinearSVC algorithm?
- How do we use GridSearchCV to find optimal values for hyperparameters?
- Why is data scaling so important for SVCs?
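To make several of the questions above concrete, here is a minimal sketch (not taken from the eBook) of fitting a C-SVC with the RBF kernel in scikit-learn and inspecting the attributes mentioned, such as dual_coef_ and intercept_. The toy dataset is invented for illustration.

```python
import numpy as np
from sklearn.svm import SVC

# Tiny two-class toy dataset (two features per sample), invented for illustration.
X = np.array([[0.0, 0.0], [1.0, 1.0], [1.0, 0.0],
              [3.0, 3.0], [4.0, 4.0], [3.0, 4.0]])
y = np.array([0, 0, 0, 1, 1, 1])

# C-SVC with the RBF kernel; C controls the softness of the margin.
clf = SVC(C=1.0, kernel="rbf", gamma="scale")
clf.fit(X, y)

# dual_coef_ holds the signed dual coefficients (one per support vector),
# and intercept_ is the bias term of the discriminant.
print(clf.support_vectors_.shape)  # one row per support vector
print(clf.dual_coef_.shape)        # (n_classes - 1, n_support_vectors)
print(clf.predict([[0.5, 0.5]]))   # a point near the first cluster
```

The eBook explains the mathematics behind each of these attributes; this snippet only shows where they live in the scikit-learn API.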
All of these questions, and more, are answered in our eBook. We discuss all the necessary mathematics that relates to the scikit-learn software. We also include a few simple analytical examples, to show how the optimization procedure solves the relevant dual problems. A detailed case study regarding the RBF kernel is presented. In addition, useful code is supplied, such as a simple program for implementing an SVC, given a dataset, and plotting the result (if the data is two-dimensional). We provide code for analysing the very important attribute dual_coef_, in both a two-class and multiclass context. Code snippets for GridSearchCV, custom kernels and precomputed Gram matrices are also provided. And, even if you don’t make use of scikit-learn, there is a wealth of knowledge to be gained from the mathematical content alone.
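As a taste of the kind of code involved, the following hedged sketch (our own illustration, not the eBook's program) combines two of the points above: data scaling and hyperparameter search with GridSearchCV. The synthetic dataset and parameter grid are arbitrary choices for demonstration.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic two-class dataset, chosen only for illustration.
X, y = make_classification(n_samples=100, n_features=4, random_state=0)

# Scale inside the pipeline: SVCs are sensitive to feature scales
# because the RBF kernel depends on Euclidean distances between samples.
pipe = Pipeline([("scale", StandardScaler()),
                 ("svc", SVC(kernel="rbf"))])

# An arbitrary small grid over C and gamma for demonstration.
param_grid = {"svc__C": [0.1, 1.0, 10.0],
              "svc__gamma": [0.01, 0.1, 1.0]}
search = GridSearchCV(pipe, param_grid, cv=3)
search.fit(X, y)

print(search.best_params_)  # best (C, gamma) pair found by cross-validation
print(search.best_score_)   # mean cross-validated accuracy of that pair
```

Putting the scaler inside the pipeline ensures it is refit on each cross-validation fold, avoiding leakage from the held-out data.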
The eBook is 262 pages (portrait version; landscape version is 339 pages). It is packed with information and mathematical detail, and contains numerous hyperlinks within the text for easy navigation.
Check out the Table of Contents and some sample pages here. Please consult these sample pages carefully; Mathsophical is unable to provide a refund for an erroneous or undesired purchase (see our Redress policy here).
The product is sold in the form of a downloadable folder, containing the eBook Support Vector Classifiers in scikit-learn by Dr. J.S.C. Prentice, a version of the eBook in landscape format, and two notebooks (.ipynb) containing the various code snippets mentioned above. Click here to buy.