
My rating: 5 of 5 stars
This book gives an overview of the math that powers Deep Learning algorithms. I was going to say "high level," but that's not quite right: it goes into great depth while describing the concepts in lay terms. This book will give you an intuition for thinking about large language models (LLMs) like ChatGPT.
I love that the author gives clear explanations of complex topics. The book proceeds in chronological order, and I really enjoyed getting the history and the people behind each discovery.
I have been studying and following AI since it was called "Pattern Recognition" in grad school. I took courses on "Machine Learning" at the University of Washington about 10 years ago, right around the time Deep Nets were starting to gain traction. I was starting to think I was not going to get anything new from this book, but the last chapter covers what's been happening since 2020, and it changed my mind about this field. I don't want to "spoil" it, but I didn't know that Neural Networks are going past the theoretical bias/variance "Goldilocks" limit. Reading about that made me take another look at the tech. Like I said, I've been skeptical about NNs and the "hype" of AI, but reading about this made me want to know more.
Excellent Read
View all my reviews