This book demonstrates that the proposed history selection and modelling approaches can effectively improve the performance of conversational question answering (ConvQA) models across different settings. The proposed models are compared with state-of-the-art approaches on several conversational datasets, yielding new insights into conversational information retrieval. Through a systematic study of structured representations, entity-aware history selection, and open-domain passage retrieval with contrastive learning, the book presents a robust framework for advancing multi-turn QA systems.
It is an essential resource for researchers, practitioners, and graduate students working at the intersection of NLP, dialogue systems, and intelligent information access.

When NLP meets LLM: Neural Approaches to Context-based Conversational Question Answering
Hardcover
Product Details
| ISBN-13 | 9781032970844 |
|---|---|
| Publisher | CRC Press |
| Publication date | 10/16/2025 |
| Pages | 102 |
| Product dimensions | 5.44 (w) x 8.50 (h) x (d) |