**Preamble**

*Artificial Intelligence Engines: An Introduction to the Mathematics of Deep Learning* by Dr James V. Stone, (c) 2019 Sebtel Press. A review of the book and its GitHub repository.

*A book that tries to fill a gap in the literature with a pedagogical approach to the mathematics of deep learning.*

The book is **unique** in *avoiding any showing-off of mathematical complexity*, aiming instead to convey an understanding of how things work from the ground up. Moreover, the book provides pseudo-code that can be used to implement each technique from scratch, along with supporting implementations in a GitHub repository. The author, Dr James V. Stone, a trained cognitive scientist and researcher in mathematical neuroscience, has taken this approach in his other books for many years now, writing for students rather than to impress his peers. One important note is that this is

*not a cookbook* or practice tutorial, but an upper-intermediate-level academic book.

**Building associations and classifying with a network**

The logical and conceptual separation of association and classification tasks is introduced in the initial chapters. It is ideal to start learning from *one association with one connection*, moving to many via a gentle introduction to gradient descent for learning the weights, before going on to two associations and two connections. This reminds me of George Gamow's *One, Two, Three... Infinity* as a pedagogical principle. The perceptron is introduced later to show how classification rules can be generated by a network, and the difficulties it encounters with the XOR problem.
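The "one association with one connection" starting point can be sketched in a few lines. This is my own minimal illustration, not the book's pseudo-code: a single weight `w` is adjusted by gradient descent on the squared error between the network's output and the target.

```python
# A minimal sketch (not the book's pseudo-code): learning one
# association (input x -> target y) with a single weight w, by
# gradient descent on the error E = 0.5 * (w*x - y)**2.
def learn_one_connection(x, y, lr=0.1, steps=100):
    w = 0.0                      # start with no association
    for _ in range(steps):
        output = w * x           # the network's prediction
        error = output - y
        w -= lr * error * x      # gradient step: dE/dw = (w*x - y) * x
    return w

# The learned weight converges towards y/x = 3.0
w = learn_one_connection(x=2.0, y=6.0)
print(round(w, 3))
```

Going from one connection to many replaces the scalar update with the same rule applied per weight, which is exactly the gentle ramp the early chapters take.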

**Backpropagation, Hopfield networks and Boltzmann machines**

A detailed implementation of backpropagation is provided from scratch, with great clarity and without too much cluttering index notation. This is probably the best explanation I have ever encountered. The following chapters introduce Hopfield networks and Boltzmann machines, from the ground up to an applied level. Unfortunately, many modern deep learning books skip these two great models, but Dr Stone's chapters make both implementable by a practitioner. It is very impressive. I am a bit biased towards Hopfield networks, as I see them as an extension of Ising models and their stochastic counterparts, but I have not seen anywhere else such an explanation of how to use Hopfield networks for learning, with a pseudo-code algorithm ready to use in a real task.
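To give a flavour of why Hopfield networks are so implementable, here is my own compact sketch (assumptions mine, not the book's algorithm): patterns of ±1 units are stored with the Hebbian rule, and a corrupted cue is cleaned up by asynchronous updates.

```python
import numpy as np

# A minimal Hopfield network sketch (not the book's pseudo-code):
# store +/-1 patterns via the Hebbian rule, recall by repeatedly
# updating units to align with their local field.
def train(patterns):
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)      # Hebbian learning: co-active units connect
    np.fill_diagonal(W, 0)       # no self-connections
    return W

def recall(W, state, sweeps=5):
    state = state.copy()
    for _ in range(sweeps):
        for i in np.random.permutation(len(state)):  # asynchronous updates
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = train(pattern[None, :])
cue = pattern.copy()
cue[:2] *= -1                    # corrupt two of the eight units
print(np.array_equal(recall(W, cue), pattern))  # recovers the stored pattern
```

The energy-minimising dynamics here are what connect Hopfield networks to Ising models, and making the updates stochastic is the step towards Boltzmann machines.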

**Advanced topics**

Personally, I see the remaining chapters as advanced topics: deep Boltzmann machines, variational autoencoders, GANs, and an introduction to reinforcement learning, with the probable exception of deep backpropagation in Chapter 9. I would say that what is now known as *deep learning* had its inception in the architectures mentioned in Sections 9.1 to 9.7.

**Glossary, basic linear algebra and statistics**

The appendices provide a fantastic conceptual introduction to the jargon and the basics of the main mathematical techniques. Of course, this is not a replacement for fully-fledged linear algebra and statistics books, but it provides immediate, concise explanations.

**Not a cookbook: not an *import tensorflow as tf* book**

One other crucial advantage of this book is that it is definitely not a cookbook. Unfortunately, almost all books related to deep learning are written in a cookbook style. This book is not. However, it is supplemented by a full implementation in a repository supporting each chapter (URL here).

**Conclusion**

This little book achieves so much with a down-to-earth approach, introducing basic concepts with a respectful attitude that assumes the reader is very smart but inexperienced in the field. Whether you are a beginner or an experienced research scientist, this is a must-have book. I still see it as an academic book, and it could be used in an upper-undergraduate class as the main text for an elective such as *"Mathematics of Deep Learning"*.

Enjoy reading and learning from this book. Thank you, Dr Stone, for your efforts in making academic books more accessible.


**Disclosure**: *I received a review copy of the book, but I have bought another copy for a family member.*