
Pretrain Vision and Large Language Models in Python

End-to-end techniques for building and deploying foundation models on AWS

Emily Webber (Author)

Book | Softcover
258 pages
2023
Packt Publishing Limited (Publisher)
978-1-80461-825-7 (ISBN)
47.35 incl. VAT
Master the art of training vision and large language models with conceptual fundamentals and industry-expert guidance. Learn about AWS services and design patterns, with relevant coding examples.

Key Features

Learn to develop, train, tune, and apply foundation models with optimized end-to-end pipelines
Explore large-scale distributed training for models and datasets with AWS and SageMaker examples
Evaluate, deploy, and operationalize your custom models with bias detection and pipeline monitoring

Book Description

Foundation models have forever changed machine learning. From BERT to ChatGPT, CLIP to Stable Diffusion, when billions of parameters are combined with large datasets and hundreds to thousands of GPUs, the result is nothing short of record-breaking. The recommendations, advice, and code samples in this book will help you pretrain and fine-tune your own foundation models from scratch on AWS and Amazon SageMaker, while applying them to hundreds of use cases across your organization.

With advice from seasoned AWS and machine learning expert Emily Webber, this book helps you learn everything you need to go from project ideation to dataset preparation, training, evaluation, and deployment for large language, vision, and multimodal models. With step-by-step explanations of essential concepts and practical examples, you’ll go from mastering the concept of pretraining to preparing your dataset and model, configuring your environment, training, fine-tuning, evaluating, deploying, and optimizing your foundation models.
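As a taste of that workflow, the following is a minimal sketch of launching a training job with the SageMaker Python SDK's PyTorch estimator. It is not code from the book: the training script name, IAM role ARN, instance settings, container versions, and hyperparameters are all placeholder assumptions.

import sagemaker
from sagemaker.pytorch import PyTorch

session = sagemaker.Session()

estimator = PyTorch(
    entry_point="train.py",  # hypothetical training script
    role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder IAM role
    instance_count=2,  # number of training instances
    instance_type="ml.p4d.24xlarge",  # GPU instances; choose per your budget
    framework_version="2.0",  # assumed PyTorch container version
    py_version="py310",
    hyperparameters={"epochs": 1, "per_device_batch_size": 8},
    sagemaker_session=session,
)

# Start the job against data staged in S3 (bucket and prefix are placeholders)
estimator.fit({"train": "s3://my-bucket/pretraining-data/"})

The same estimator pattern extends to the distributed, large-scale jobs the book turns to later; mostly the instance fleet and distribution settings change.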

You will learn how to apply scaling laws when distributing your model and dataset over multiple GPUs, remove bias, achieve high throughput, and build deployment pipelines.
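To make the scaling-law idea concrete, here is a small back-of-the-envelope sketch. It is not from the book; it uses two common rules of thumb from the scaling-laws literature, roughly 20 training tokens per parameter (the Chinchilla heuristic) and about 6 FLOPs per parameter per token, and the GPU throughput figures are assumptions.

def estimate_pretraining_budget(n_params: float,
                                peak_gpu_flops: float = 312e12,  # A100-class BF16 peak
                                utilization: float = 0.4):  # assumed sustained efficiency
    """Rough tokens, FLOPs, and GPU-hours needed to pretrain a model."""
    n_tokens = 20 * n_params  # ~20 tokens per parameter (Chinchilla heuristic)
    total_flops = 6 * n_params * n_tokens  # ~6 FLOPs per parameter per token
    sustained = peak_gpu_flops * utilization  # realistic per-GPU throughput
    gpu_hours = total_flops / sustained / 3600
    return n_tokens, total_flops, gpu_hours

# Example: a 7B-parameter model works out to ~1.4e11 tokens and ~13,000 GPU-hours
print(estimate_pretraining_budget(7e9))

Estimates like this drive the practical choices the book covers, such as how many instances to request and how long to expect a job to run.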

By the end of this book, you'll be well equipped to embark on your own project to pretrain and fine-tune the foundation models of the future.

What you will learn

Find the right use cases and datasets for pretraining and fine-tuning
Prepare for large-scale training with custom accelerators and GPUs
Configure environments on AWS and SageMaker to maximize performance
Select hyperparameters based on your model and constraints
Distribute your model and dataset using many types of parallelism (see the sketch after this list)
Avoid pitfalls with job restarts, intermittent health checks, and more
Evaluate your model with quantitative and qualitative insights
Deploy your models with runtime improvements and monitoring pipelines
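As referenced in the parallelism point above, here is a minimal sketch of plain data parallelism with PyTorch's DistributedDataParallel, the simplest of the strategies the book surveys. The model and dataset are toy placeholders; SageMaker's own distributed libraries follow a similar launch-and-wrap pattern.

import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler

def main():
    dist.init_process_group("nccl")  # one process per GPU, launched via torchrun
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    model = torch.nn.Linear(512, 512).cuda()  # toy stand-in for a real model
    model = DDP(model, device_ids=[local_rank])  # gradients sync across ranks

    data = TensorDataset(torch.randn(1024, 512), torch.randn(1024, 512))
    sampler = DistributedSampler(data)  # shard the dataset across ranks
    loader = DataLoader(data, batch_size=32, sampler=sampler)

    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
    for x, y in loader:
        x, y = x.cuda(), y.cuda()
        loss = torch.nn.functional.mse_loss(model(x), y)
        optimizer.zero_grad()
        loss.backward()  # all-reduce of gradients happens here
        optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()  # launch with: torchrun --nproc_per_node=<num_gpus> this_script.py

Other strategies such as tensor and pipeline parallelism trade memory for communication differently; these are the kinds of topics the Distribution Fundamentals chapter addresses.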

Who this book is for

If you're a machine learning researcher or enthusiast who wants to start a foundation modelling project, this book is for you. Applied scientists, data scientists, machine learning engineers, solution architects, product managers, and students will all benefit from this book. Intermediate Python is a must, along with introductory concepts of cloud computing. A strong understanding of deep learning fundamentals is needed; the advanced topics that build on them are explained along the way. The content covers advanced machine learning and cloud techniques in an actionable, easy-to-understand way.

Emily Webber is a Principal Machine Learning Specialist Solutions Architect at Amazon Web Services. She has assisted hundreds of customers on their journey to ML in the cloud, specializing in distributed training for large language and vision models. She mentors Machine Learning Solutions Architects, authors countless feature designs for SageMaker and AWS, and guides the Amazon SageMaker product and engineering teams on machine learning best practices for customers. Emily is widely known in the AWS community for a 16-video YouTube series on SageMaker with 160,000 views, as well as a keynote at O'Reilly AI London 2019 on a novel reinforcement learning approach she developed for public policy.

Table of Contents

An Introduction to Pretraining Foundation Models
Dataset Preparation: Part One
Model Preparation
Containers and Accelerators on the Cloud
Distribution Fundamentals
Dataset Preparation: Part Two, the Data Loader
Finding the Right Hyperparameters
Large-Scale Training on SageMaker
Advanced Training Concepts
Fine-Tuning and Evaluating
Detecting, Mitigating, and Monitoring Bias
How to Deploy Your Model
Prompt Engineering
MLOps for Vision and Language
Future Trends in Pretraining Foundation Models

Publication date 2023
Foreword Andrea Olgiati
Place of publication Birmingham
Language English
Dimensions 191 x 235 mm
Subject area Mathematics / Computer Science > Computer Science > Theory / Studies
ISBN-10 1-80461-825-X / 180461825X
ISBN-13 978-1-80461-825-7 / 9781804618257
Condition New