Ebook details

Mastering Transformers. The Journey from BERT to Large Language Models and Stable Diffusion - Second Edition

Savaş Yildirim, Meysam Asgari-Chenaghlu

Ebook
Transformer-based language models such as BERT, T5, GPT, DALL-E, and ChatGPT have dominated NLP research and established a new paradigm. Thanks to their accurate and fast fine-tuning capabilities, transformer-based language models have outperformed traditional machine learning approaches on many challenging natural language understanding (NLU) problems.
Beyond NLP, multimodal learning and generative AI have recently emerged as fast-growing areas showing promising results. Mastering Transformers will help you understand and implement multimodal solutions, including text-to-image generation. Computer vision solutions based on transformers are also explained in the book. You'll get started by understanding various transformer models before learning how to train different autoregressive language models such as GPT and XLNet. The book will also get you up to speed with boosting model performance, as well as tracking model training using the TensorBoard toolkit. In the later chapters, you'll focus on using vision transformers to solve computer vision problems. Finally, you'll discover how to harness the power of transformers to model time series data and make forecasts.
By the end of this transformers book, you'll have an understanding of transformer models and how to use them to solve challenges in NLP and CV.
  • 1. From Bag-of-Words to the Transformer
  • 2. A Hands-On Introduction to the Subject
  • 3. Autoencoding Language Models
  • 4. Autoregressive Language Models
  • 5. Fine-Tuning Language Models for Text Classification
  • 6. Fine-Tuning Language Models for Token Classification
  • 7. Text Representation
  • 8. Boosting Your Model Performance
  • 9. Parameter Efficient Fine-Tuning
  • 10. Zero-Shot and Few-Shot Learning in NLP
  • 11. Explainable AI (XAI) for NLP
  • 12. Working with Efficient Transformers
  • 13. Cross-Lingual Language Modeling
  • 14. Serving Transformer Models
  • 15. Model Tracking and Monitoring
  • 16. Vision Transformers
  • 17. Tabular Transformers
  • 18. Multimodal Transformers
  • Title: Mastering Transformers. The Journey from BERT to Large Language Models and Stable Diffusion - Second Edition
  • Author: Savaş Yildirim, Meysam Asgari-Chenaghlu
  • Original title: Mastering Transformers. The Journey from BERT to Large Language Models and Stable Diffusion - Second Edition
  • ISBN: 9781837631506
  • Publication date: 2024-06-03
  • Format: Ebook
  • Edition identifier: e_3xc7
  • Publisher: Packt Publishing