Haralambos Marmanis and Dmitry Babenko
5.0 out of 5 stars A soon to be classic Algo book for improving intelligent web applications June 19, 2009
By Michael Mimo

I have always had an interest in AI, machine learning, and data mining, but I found the introductory books too mathematical and focused mostly on solving academic problems rather than real-world industrial problems. So, I was curious to see what this book was about.
I read the book front-to-back (twice!) before writing this review. I started reading the electronic version a couple of months ago and read the paper print again over the weekend. This is the best practical book on machine learning that you can buy today, period. All the examples are written in Java and all the algorithms are explained in plain English. The writing style is superb!
The book was written by one author (Marmanis) while the other (Babenko) contributed the source code, so there are no gaps in the narrative; it is engaging, pleasant, and fluent. The author leads the reader from the very introductory concepts to some fairly advanced topics. Some of the topics are covered in the book and some are left as exercises at the end of each chapter (there is a “To Do” section, which was a wonderful idea!). I did not like some of the figures (they were probably made by the authors rather than an artist), but this was only a minor aesthetic inconvenience.
The book covers four cornerstones of machine learning and intelligence: intelligent search, recommendations, clustering, and classification. It also covers a subject that today you can find only in the academic literature: combination techniques. Combination techniques are very powerful, and although the author presents them in the context of classifiers, it is clear that the same can be done for recommendations, as the BellKor team did for the Netflix Prize.
I work in a financial company and a number of people that I work with have PhD degrees in mathematics and computer science. I found the book so fascinating that I asked them to have a look. They had nothing but praise for this book. The consensus is that everything is explained in the simplest possible way, with clarity but without sacrificing accuracy. As one of them told me, this is a major step forward in teaching AI techniques and introducing the field to millions of developers around the world. Even for experts in the field and experienced software engineers, there are important insights in almost every chapter.
We had tried to write a software library, for a small project, that analyzes log files and assesses IT risk (e.g. probability of intrusion, preemptive alerts on application performance issues, and so on) based on Segaran’s book “Programming Collective Intelligence”. We spent about six weeks trying to map what was in Segaran’s book onto what we wanted to do, but we did not find the depth and clarity that was required. On top of that, Segaran used Python, so the code had to be rewritten, and things didn’t quite work as expected! We are now using the code from Marmanis’ book, and our code analyzes Apache and WebLogic log files in order to assess risk! It just works! We wrote the code in one week! We would not have been able to succeed without reading this book.
Clearly, I am deeply impressed. This is an outstanding book; it was not just useful, it was inspiring! It is a “must have” book for every Java developer.
The content of the book includes:
* the PageRank algorithm; a content-based algorithm similar to PageRank, for which the author coined the term “DocRank” because it applies to Word, PDF, and other documents rather than Web pages; search improvements based on probabilistic methods (Naive Bayes); precision, recall, F1-score, and ROC curves;
* collaborative filtering as well as content-based recommendations;
* k-means, ROCK, and DBSCAN for clustering; the best explanation of the “curse of dimensionality” ever! I finally learned what this mystic term means!
* Bayesian classification; declarative programming (through the Drools rules engine); introduction to neural networks; decision trees
* Comparing and combining classifiers: McNemar’s test; Cochran’s Q test; the F-test; bagging; boosting; general classifier ensembles
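Since PageRank heads the list above, here is a minimal sketch of the power-iteration idea behind it, written in Java to match the book’s examples. To be clear, this is not the book’s code: the class name, the tiny three-page graph, and the damping factor of 0.85 are my own illustrative assumptions.

```java
import java.util.Arrays;

/** Generic power-iteration sketch of PageRank (not the book's implementation). */
public class PageRankSketch {

    // adj[i] lists the pages that page i links to
    public static double[] pageRank(int[][] adj, double damping, int iters) {
        int n = adj.length;
        double[] rank = new double[n];
        Arrays.fill(rank, 1.0 / n);              // start with a uniform distribution

        for (int it = 0; it < iters; it++) {
            double[] next = new double[n];
            Arrays.fill(next, (1.0 - damping) / n);   // "random jump" share
            for (int i = 0; i < n; i++) {
                if (adj[i].length == 0) {
                    // dangling page: spread its rank uniformly over all pages
                    for (int j = 0; j < n; j++) next[j] += damping * rank[i] / n;
                } else {
                    // split page i's rank evenly among the pages it links to
                    for (int j : adj[i]) next[j] += damping * rank[i] / adj[i].length;
                }
            }
            rank = next;
        }
        return rank;
    }

    public static void main(String[] args) {
        // page 0 links to 1 and 2; page 1 links to 2; page 2 links back to 0
        int[][] adj = { {1, 2}, {2}, {0} };
        System.out.println(Arrays.toString(pageRank(adj, 0.85, 50)));
    }
}
```

The key design point is that the ranks always sum to 1, so each iteration just redistributes a fixed amount of “importance” along the links until the values stop changing.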
Buy it, read it, enjoy it, and use it!