Machine learning. The art and science of algorithms that make sense of data.

*(English)* Zbl 1267.68010
Cambridge: Cambridge University Press (ISBN 978-1-107-09639-4/hbk; 978-1-107-42222-3/pbk; 978-1-139-57541-6/ebook). xvii, 396 p. (2012).

In the first chapter of his book, the author describes some basic ideas of machine learning, such as supervised versus unsupervised learning, the predictive versus descriptive paradigms, and the main families of machine learning models: logical, geometric, and probabilistic. The next two chapters are still devoted to basic ideas: the second chapter explores the instance space, features, classification, the error rate, the ROC curve, etc., all assuming binary classification. In Chapter 3, multi-class classifiers are introduced.
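For the binary setting of Chapter 2, the error rate and a single ROC point (true positive rate against false positive rate) follow directly from the confusion matrix. A minimal sketch, with invented labels and predictions rather than data from the book:

```python
# Hedged sketch: error rate and one ROC point (TPR, FPR) for a binary
# classifier; the labels and predictions below are invented toy data.

def binary_metrics(y_true, y_pred):
    """Return (error_rate, tpr, fpr) for binary labels in {0, 1}."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    error_rate = (fp + fn) / len(y_true)
    tpr = tp / (tp + fn)   # true positive rate: y-axis of the ROC plot
    fpr = fp / (fp + tn)   # false positive rate: x-axis of the ROC plot
    return error_rate, tpr, fpr

y_true = [1, 1, 1, 0, 0, 0, 0, 1]
y_pred = [1, 1, 0, 0, 1, 0, 0, 1]
print(binary_metrics(y_true, y_pred))  # -> (0.25, 0.75, 0.25)
```

Sweeping a score threshold and recording one such (FPR, TPR) pair per threshold traces out the full ROC curve discussed in the chapter.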

The next three chapters use a logical model of machine learning. Chapter 4 discusses concept learning, Chapter 5 sketches how to generate decision trees, and Chapter 6 how to induce rule sets.
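The split-selection step at the heart of decision-tree induction (Chapter 5) can be sketched in a few lines; this is an ID3-style, entropy-based chooser on an invented toy dataset, not code from the book:

```python
# Hedged sketch: entropy-based split selection for decision-tree
# induction (ID3-style); the toy weather data is invented.
import math

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((labels.count(c) / n) * math.log2(labels.count(c) / n)
                for c in set(labels))

def best_split(rows, labels):
    """Return the feature index whose split minimises weighted entropy."""
    best, best_score = None, float("inf")
    for f in range(len(rows[0])):
        score = 0.0
        for v in set(r[f] for r in rows):
            subset = [l for r, l in zip(rows, labels) if r[f] == v]
            score += len(subset) / len(labels) * entropy(subset)
        if score < best_score:
            best, best_score = f, score
    return best

rows = [("sunny", "hot"), ("sunny", "mild"), ("rain", "mild"), ("rain", "hot")]
labels = ["no", "no", "yes", "yes"]
print(best_split(rows, labels))  # -> 0: the first feature separates the classes
```

Recursing on each branch with the remaining features yields the full tree-growing procedure the chapter sketches.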

The following two chapters use a geometric model of machine learning. Chapter 7 discusses linear models, including linear regression and support vector machines. Chapter 8 describes nearest-neighbor classification and the \(k\)-means clustering algorithm.
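The \(k\)-means algorithm of Chapter 8 alternates an assignment step and a centroid-update step. A minimal one-dimensional sketch with \(k = 2\), using invented points and starting centroids:

```python
# Hedged sketch: one-dimensional k-means with k = 2; the points and
# initial centroids are invented for illustration.

def kmeans_1d(points, centroids, iters=10):
    for _ in range(iters):
        # Assignment step: attach each point to its nearest centroid.
        clusters = [[] for _ in centroids]
        for p in points:
            i = min(range(len(centroids)), key=lambda i: abs(p - centroids[i]))
            clusters[i].append(p)
        # Update step: move each centroid to the mean of its cluster.
        centroids = [sum(c) / len(c) if c else m
                     for c, m in zip(clusters, centroids)]
    return centroids

points = [1.0, 1.5, 2.0, 9.0, 9.5, 10.0]
print(kmeans_1d(points, [0.0, 5.0]))  # -> [1.5, 9.5]
```

The same two-step loop generalises to higher dimensions by replacing the absolute difference with Euclidean distance and the scalar mean with a componentwise mean.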

In Chapter 9, probabilistic models such as naive Bayes, Gaussian mixture models, and a compression-based model are discussed. Chapter 10 provides information about the taxonomy of features (categorical, ordinal, and quantitative). Ensemble models of machine learning, such as bagging and boosting, are presented in the next chapter. Finally, the last chapter provides some guidance on how to conduct machine learning experiments.
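The naive Bayes classifier of Chapter 9 multiplies a class prior by per-feature conditional probabilities, assuming the features are independent given the class. A small sketch for categorical features with Laplace smoothing, on an invented toy dataset:

```python
# Hedged sketch: categorical naive Bayes with Laplace smoothing; the
# toy weather data is invented for illustration.
import math
from collections import Counter, defaultdict

def train_nb(rows, labels):
    prior = Counter(labels)              # class counts
    cond = defaultdict(Counter)          # (feature, class) -> value counts
    values = defaultdict(set)            # feature -> distinct values seen
    for r, y in zip(rows, labels):
        for f, v in enumerate(r):
            cond[(f, y)][v] += 1
            values[f].add(v)
    return prior, cond, values

def predict_nb(row, prior, cond, values, alpha=1.0):
    """Return the class maximising the (log) posterior for one row."""
    n = sum(prior.values())
    best, best_score = None, float("-inf")
    for y in prior:
        score = math.log(prior[y] / n)
        for f, v in enumerate(row):
            num = cond[(f, y)][v] + alpha               # Laplace smoothing
            den = prior[y] + alpha * len(values[f])
            score += math.log(num / den)
        if score > best_score:
            best, best_score = y, score
    return best

rows = [("sunny", "hot"), ("sunny", "mild"), ("rain", "mild"), ("rain", "cool")]
labels = ["no", "no", "yes", "yes"]
model = train_nb(rows, labels)
print(predict_nb(("rain", "mild"), *model))  # -> 'yes'
```

Working in log space avoids numerical underflow when many features are multiplied together, which is the standard trick for this model.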

This book is written in an accessible way and would be a good textbook for undergraduate students; however, the lack of exercises may prevent its adoption for some machine learning courses.

Reviewer: Jerzy W. Grzymala-Busse (Lawrence)

##### MSC:

| MSC | Classification |
| --- | --- |
| 68-02 | Research exposition (monographs, survey articles) pertaining to computer science |
| 68T05 | Learning and adaptive systems in artificial intelligence |