D**T
An absolute must-buy
Having bought the first two editions of Rothman's seminal Transformers for Natural Language Processing book, I was super-excited to get this third edition. Upon receiving it, the first thing that struck me was that this is a *big*, heavy book, with some 20 chapters and 700-odd pages. No bad thing.

As one might expect, some of the book is the same content as the previous editions (e.g., the opening few chapters on what transformers are and their architecture), but the vast majority of the book appears to be new content. Clearly the technology has moved rapidly since the last release, hence there are chapters on RAG, Vertex AI, and PaLM.

Also all-new are the chapters on transformers for vision (hence the name change of the book to Transformers for Natural Language Processing and Computer Vision). In this new section, Rothman describes some of the most powerful CV transformers, discusses Stable Diffusion, and explains how to use Hugging Face's AutoTrain. This reader was particularly intrigued by the chapter on "generative ideation", in which Rothman postulates the concept of generating ideas and content without human intervention. Fascinating stuff.

In summary, Rothman is clearly an expert in this area. The book is written in an easy-to-read manner, with lots of examples and good explanations of some quite complicated concepts. If you are serious about transformers, then this is simply a must-buy book, written by the foremost authority in the field. Highly recommended.
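As a rough illustration of the kind of vision-transformer usage the new computer-vision chapters cover, the sketch below classifies an image with a ViT checkpoint through the Hugging Face pipeline API. The checkpoint name and image URL are illustrative assumptions, not examples taken from the book.

```python
# Minimal sketch: classifying an image with a vision transformer via the
# Hugging Face pipeline API. The checkpoint and image URL are assumptions
# chosen for illustration, not the book's own example.
from transformers import pipeline

classifier = pipeline("image-classification", model="google/vit-base-patch16-224")
predictions = classifier("http://images.cocodataset.org/val2017/000000039769.jpg")
for p in predictions[:3]:
    print(f"{p['label']}: {p['score']:.3f}")
```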
R**V
Great read - Highly recommended.
A book that is easy to follow along with and understand the concepts. Denis, you have outdone yourself. It is relevant, the topics covered are exciting, and they have certainly made me enthusiastic and knowledgeable about these foundation models. Chapeau to you, sir; I will definitely recommend your book to others.
N**S
Reasonable content but poor phrasing throughout
I'm pleased I bought it, but I found the writing really poorly phrased: it feels as if the editor made no effort to make the content flow like natural English, and it's at a level that actually hinders understanding. Particularly in the early chapters, there are justifications where the whole point feels lost in translation. No doubt this can be remedied in a future edition.
M**A
Good reference for my research
I am working on research that uses transformers for Natural Language Processing and Computer Vision. I find this book excellent as a reference and a source of information. I like it.
S**S
Detailed and approachable book on AI
I really enjoyed the book; I thought it was very thorough and well explained. It covers everything you would want to see in a book on NLP and CV.
D**R
A valuable resource to refresh our knowledge and inspire us to take the next steps
For those who can read, I can definitely say that this new third edition provides a fresh look at both the transformers themselves and the current environment in which they exist. A valuable resource to refresh our knowledge and inspire us to take the next steps.

My personal selection of what I appreciated in this third edition, after about ten days of perusing, reading and note-taking, starts with the emergence of new roles:
* The role of AI professionals
* The future of AI professionals
* What resources should we use?
* Guidelines for decision making
* Chapter 3: Emergent vs. downstream tasks: The Unseen Depths of Transformers
* Chapter 7: The Generative AI Revolution with ChatGPT
* Chapter 12: Towards Syntax-Free Semantic Role Labelling with ChatGPT and GPT-4
* Chapter 16: Beyond Text: Vision Transformers at the Dawn of Revolutionary AI

Rothman writes that this book is for data analysts, data scientists, and machine learning/AI engineers who want to understand how to process and interrogate the increasing amounts of speech and image data. Most of the programs in the book are Colaboratory notebooks; all you need is a free Google Gmail account, and you can run the notebooks on the free Google Colaboratory VM.

Context of my interest in this field: shortly after the public release of ChatGPT in November 2022, Bill Gates described it and other LLMs as "as important as the PC, as important as the Internet". Jensen Huang, CEO of Nvidia, said ChatGPT was "truly one of the greatest things ever done for computing". Geoffrey Hinton, a Turing laureate, said, "I think it's comparable in scale to the industrial revolution or electricity - or maybe the wheel." Perhaps that is why many of us need a qualified, updated context, and I can definitely say that this new third edition gives that qualified context and a fresh look at both the transformers themselves and the current environment in which they exist.

And yet, the term "computer simulation" is far more accurate as an umbrella term than any characterization of machine software ("AI", "LLM", "Generative AI", etc.). Rothman's profile shows that he has been designing and developing computer simulation software for decades in various forms: rule-based systems, expert systems, ML agents, DL agents, the first transformer models, and now trending generative AI for NLP and computer vision. All these algorithms boil down to "computer simulation", no more, no less. They are tools that are here for us to make "simulations" that enhance our abilities, as a scientific calculator does.

Who this book is for: anyone who regularly works with LLMs professionally (e.g. data scientists, machine learning engineers, AI researchers, etc.), or anyone already familiar with natural language processing (NLP) who wants to take a deep dive into transformers. Another reviewer rightly wrote who this book is not for: anyone with little to no knowledge of NLP, machine learning, or Python programming (i.e. the "casual" reader).

This book is dense (in the sense of Clifford Geertz's thick description, which helps us increase our understanding on both a theoretical and a practical level). I still have a lot to think about, and I have to admit that I have not yet fully grasped all the emerging possibilities and food for thought that the book has triggered or will trigger as I re-read and explore the code provided.
V**V
Great book for learning Generative AI and transformers
If you're aspiring to become an expert in NLP or Generative AI, this book is an excellent resource. It provides a clear, step-by-step explanation of NLP models, making complex concepts easy to grasp through practical examples and Python code. Starting with foundational models, the book introduces the architecture of the Transformer, BERT, and RoBERTa, followed by an in-depth exploration of the GPT models behind the Generative AI revolution. The book also delves into image processing and computer vision. Additionally, the questions at the end of each chapter further enhance understanding and engagement with the material.
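To give a flavour of the "foundational models" stage this review describes, here is a small sketch that loads the public BERT and RoBERTa checkpoints with Hugging Face Transformers and prints the architectural numbers the book explains in depth. The checkpoint names are the standard public ones and are assumptions, not code from the book.

```python
# Compare the encoder architectures mentioned above; the checkpoint names
# are the standard public ones and are assumptions, not the book's code.
from transformers import AutoConfig, AutoModel

for name in ["bert-base-uncased", "roberta-base"]:
    config = AutoConfig.from_pretrained(name)
    model = AutoModel.from_pretrained(name)
    n_params = sum(p.numel() for p in model.parameters())
    print(f"{name}: {config.num_hidden_layers} layers, "
          f"{config.num_attention_heads} attention heads, "
          f"hidden size {config.hidden_size}, ~{n_params / 1e6:.0f}M parameters")
```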
V**I
Well-balanced code with just enough theory
The book tries to cover as much detail as possible so that you can hit the ground running with the accompanying code. There are 19 chapters, and it also covers topics which we do not normally find in other books:
-- SRL
-- interpretability
-- AutoTrain
-- vision transformers
-- time complexity

Thanks to the available code and crisp explanations, one can get the overall idea and at the same time try the samples on one's own use cases. It shows an example of transformer pretraining with a good amount of steps and details. It is definitely worth reading.

Overall, if you are a practitioner or a student in ML with some background, and not a noob in NLP, then this book can be helpful to you. It also tries to explore the AGI potential with examples and a comparison of the different available options, and covers the OpenAI-based models in good detail. It also shows a comparison of summarization with T5 alongside ChatGPT, the model which has taken the whole industry by storm. You can keep the book with you and quickly refer to many aspects related to transformers on the go.
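The T5 summarization comparison mentioned in this review can be reproduced on the T5 side in a few lines with the Transformers pipeline API; the ChatGPT side would go through the OpenAI API and is omitted here. The "t5-small" checkpoint and the sample text below are assumptions for illustration, not the book's exact setup.

```python
# Sketch of the T5 half of the summarization comparison; "t5-small" and the
# sample text are illustrative assumptions, not the book's exact setup.
from transformers import pipeline

summarizer = pipeline("summarization", model="t5-small")

text = (
    "Transformers replaced recurrent networks in most NLP pipelines by relying "
    "entirely on attention, which lets them process sequences in parallel and "
    "scale to very large training corpora."
)

result = summarizer(text, max_length=40, min_length=10, do_sample=False)
print(result[0]["summary_text"])
```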
N**O
Unleashing the Power of Transformers: A Journey into NLP and Beyond
Denis Rothman’s latest book, “Transformers for Natural Language Processing and Computer Vision, 3rd Edition”, delves into the captivating world of transformers, the groundbreaking neural network architectures that have revolutionized natural language processing (NLP), text generation, and computer vision. Here are the key highlights from this book:

Platform Exploration: The book guides readers through various transformer models and platforms, helping them understand which ones best suit their needs. It emphasizes problem-solving skills to address model weaknesses effectively.

Hands-On Approach: You’ll learn how to pretrain a Transformer model from scratch. This involves building the dataset, defining data collators, and training the model. If you’re interested in fine-tuning pretrained models such as GPT-4, the book provides step-by-step guides.

NLP Tasks Covered: The book explores a wide range of NLP tasks, including machine translation, speech-to-text, and text-to-image. You also get a glimpse of the future of transformers, including their role in computer vision tasks and code-writing assistance.

In summary, “Transformers for Natural Language Processing and Computer Vision” bridges the gap between cutting-edge AI research and practical applications. Whether you’re a researcher, developer, or enthusiast, this book offers insights into the evolving landscape of transformers and their impact on language understanding and generation.
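The "pretrain from scratch" workflow highlighted in this review (build the dataset, define a data collator, train the model) can be sketched with the Hugging Face Datasets and Transformers libraries. Everything below, including the small RoBERTa-style configuration, the WikiText-2 dataset, and the hyperparameters, is an assumption for illustration rather than the book's exact recipe.

```python
# Minimal masked-language-model pretraining sketch: dataset, data collator,
# training via Trainer. Model size, dataset and hyperparameters are
# illustrative assumptions, not the book's exact setup.
from datasets import load_dataset
from transformers import (AutoTokenizer, DataCollatorForLanguageModeling,
                          RobertaConfig, RobertaForMaskedLM, Trainer,
                          TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
dataset = load_dataset("wikitext", "wikitext-2-raw-v1", split="train")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# Data collator that dynamically masks 15% of tokens for the MLM objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

# Deliberately small configuration so the sketch fits on a free Colab GPU.
config = RobertaConfig(vocab_size=tokenizer.vocab_size, num_hidden_layers=4,
                       hidden_size=256, num_attention_heads=4,
                       intermediate_size=1024)
model = RobertaForMaskedLM(config)

args = TrainingArguments(output_dir="mlm-from-scratch",
                         per_device_train_batch_size=16,
                         num_train_epochs=1, logging_steps=100)
Trainer(model=model, args=args, train_dataset=tokenized,
        data_collator=collator).train()
```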
J**G
Hard to understand
I feel the author has used ChatGPT to write the book, as I have had the same experience with other books on this topic. Many topics are entirely unnecessary, while for other issues he leaves out points that are essential for a thorough understanding. However, I do think he has experience in the subject.