Hands-On Large Language Models (6th Early Release)
Title: Hands-On Large Language Models: Language Understanding and Generation (6th Early Release)
Author: Jay Alammar, Maarten Grootendorst
Publisher: O’Reilly Media, Inc.
Year: 2024-03-21
Pages: 227
Language: English
Format: epub
Size: 11.0 MB

AI has acquired startling new language capabilities in just the past few years. Driven by the rapid advances in deep learning, language AI systems are able to write and understand text better than ever before. This trend enables the rise of new features, products, and entire industries. With this book, Python developers will learn the practical tools and concepts they need to use these capabilities today.

One of the most common tasks in natural language processing, and in machine learning in general, is classification. The goal is to train a model to assign a label or class to input text. Text classification is used worldwide for a wide range of applications, from sentiment analysis and intent detection to entity extraction and language detection.
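To make the task concrete, here is a minimal sketch of classifying text with an off-the-shelf pretrained model through the Hugging Face transformers pipeline API. The specific model name is an illustrative choice, not one prescribed by the book.

```python
# A minimal classification sketch using a pretrained sentiment model.
# The model name below is an illustrative assumption.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("This book makes language models approachable."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99}]
```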

We can use an LLM to represent the text to be fed into our classifier. The choice of model, however, may not be as straightforward as you might think. Models differ in the languages they can handle, their architecture, size, inference speed, accuracy on certain tasks, and much more.
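One way to put this into practice is to use the LLM purely as a feature extractor: embed each text with a pretrained model, then train a lightweight classifier on the embeddings. A minimal sketch, assuming the sentence-transformers and scikit-learn libraries are available; the embedding model and toy data are illustrative.

```python
# Use a pretrained embedding model to represent text, then train a
# simple classifier on top of those representations.
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # illustrative model choice

train_texts = ["I loved it", "Terrible experience",
               "Absolutely fantastic", "Waste of time"]
train_labels = [1, 0, 1, 0]  # toy data: 1 = positive, 0 = negative

X_train = embedder.encode(train_texts)  # one embedding vector per text
clf = LogisticRegression().fit(X_train, train_labels)

print(clf.predict(embedder.encode(["It was pretty good"])))  # e.g. [1]
```

Because the embedding model is a one-line swap, this setup makes it easy to compare candidate models along the dimensions above (language coverage, size, speed, accuracy).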

BERT is a great underlying architecture for representing text and can be fine-tuned for a number of tasks, including classification. Although there are generative models we could use, like the well-known Generative Pretrained Transformer (GPT) models behind ChatGPT, BERT models often excel when fine-tuned for specific tasks. In contrast, GPT-like models typically excel at a broad variety of tasks. In a sense, it is specialization versus generalization.
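On the specialization side, fine-tuning a BERT-style model for classification with Hugging Face transformers can look roughly like the sketch below; the dataset, checkpoint, and hyperparameters are illustrative assumptions, not the book's exact recipe.

```python
# A hedged sketch of fine-tuning BERT for binary classification.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)

dataset = load_dataset("glue", "sst2")  # illustrative sentiment dataset

def tokenize(batch):
    return tokenizer(batch["sentence"], truncation=True, padding="max_length")

tokenized = dataset.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="bert-clf",
                           num_train_epochs=1,
                           per_device_train_batch_size=16),
    train_dataset=tokenized["train"].select(range(1000)),  # small subset for the sketch
    eval_dataset=tokenized["validation"],
)
trainer.train()
```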
