Bigger isn’t always better. Train and tune highly focused language models optimized for domain-specific tasks.
When you need a language model to respond accurately and quickly about a specific field of knowledge, the sprawling capacity of an LLM may hurt more than it helps. Domain-Specific Small Language Models teaches you to build generative AI models optimized for specific fields.
In Domain-Specific Small Language Models you’ll discover:
• Model sizing best practices
• Open-source libraries, frameworks, utilities, and runtimes
• Fine-tuning techniques for custom datasets
• Hugging Face’s libraries for SLMs
• Running SLMs on commodity hardware
• Model optimization and quantization
Perfect for cost- or hardware-constrained environments, Small Language Models (SLMs) are trained on domain-specific data to deliver high-quality results on specific tasks. In Domain-Specific Small Language Models you’ll develop SLMs that can generate everything from Python code to protein structures and antibody sequences—all on commodity hardware.
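Quantization, one of the topics listed above, is what makes running SLMs on commodity hardware practical. As an illustrative sketch (not taken from the book), the snippet below shows the core idea behind symmetric int8 weight quantization: map float weights to 8-bit integers with a single scale factor, cutting storage to a quarter of float32. Production toolchains add per-channel scales, zero points, and calibration.

```python
def quantize_int8(weights):
    """Map float weights to int8 values using one symmetric scale factor."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]  # each value lands in [-127, 127]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 values."""
    return [v * scale for v in q]

weights = [0.42, -1.27, 0.003, 0.89]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
# q == [42, -127, 0, 89]; approx values are close to the originals,
# with a worst-case rounding error of half the scale (~0.005 here).
```

The same trade-off (small precision loss for a large memory saving) is what lets multi-billion-parameter models fit in laptop RAM.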
About the book
Domain-Specific Small Language Models teaches you how to create language models that deliver the power of LLMs for specific areas of knowledge. You’ll learn to minimize the computational horsepower your models require while maintaining fast response times and high-quality output. You’ll appreciate the clear explanations of complex technical concepts alongside working code samples you can run and replicate on your laptop. Plus, you’ll learn to develop and deliver RAG systems and AI agents that rely solely on SLMs, without the costs of foundation model access.
About the reader
For machine learning engineers familiar with Python.
About the author
Guglielmo Iozzia is a Director of ML/AI and Applied Mathematics at MSD. He studied Electronic and Biomedical Engineering at the University of Bologna and has an extensive background in software and ML/AI engineering applied to real-life use cases across industries such as biotech manufacturing, healthcare, cloud operations, and cybersecurity.
Product Details
ISBN-13: 9781638357889
Publisher: Manning
Publication date: 12/30/2025
Sold by: SIMON & SCHUSTER
Format: eBook
Pages: 300