Joint Training for Neural Machine Translation

by Yong Cheng

eBook, 1st ed. 2019

$54.99 

Overview

This book presents four approaches to jointly training bidirectional neural machine translation (NMT) models. First, to improve the accuracy of the attention mechanism, it proposes an agreement-based joint training approach that encourages the two complementary models to agree on the word alignment matrices for the same training data. Second, it presents a semi-supervised approach that uses an autoencoder to reconstruct monolingual corpora, so that these corpora can be incorporated into neural machine translation. Third, it introduces a joint training algorithm for pivot-based neural machine translation, which mitigates the data scarcity problem. Lastly, it describes an end-to-end bidirectional NMT model that connects the source-to-target and target-to-source translation models and allows their parameters to interact.
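
As a rough illustration of the first approach, agreement-based joint training can be written as a joint objective over both translation directions plus a penalty on disagreement between their attention-derived word alignment matrices. The notation below (J, lambda, Delta, A) is a sketch under this reading of the overview, not necessarily the book's own formulation:

% Illustrative notation only: a sketch of an agreement-based joint objective,
% not the exact formulation used in the book.
\[
J(\theta_{x \to y}, \theta_{y \to x}) = \sum_{(x, y)} \Big[ \log P(y \mid x; \theta_{x \to y}) + \log P(x \mid y; \theta_{y \to x}) - \lambda \, \Delta\big( A_{x \to y},\, A_{y \to x}^{\top} \big) \Big]
\]

Here the first two terms are the usual log-likelihoods of the source-to-target and target-to-source models on the parallel corpus, \( A_{x \to y} \) and \( A_{y \to x} \) are the word alignment matrices induced by the two attention mechanisms for the sentence pair (one transposed so both map source to target positions), \( \Delta \) measures how far they are from agreeing, and \( \lambda \) trades translation likelihood against agreement.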


Product Details

ISBN-13: 9789813297487
Publisher: Springer-Verlag New York, LLC
Publication date: 08/26/2019
Series: Springer Theses
Sold by: Barnes & Noble
Format: eBook
File size: 3 MB

About the Author

Yong Cheng is currently a software engineer engaged in research at Google. Before joining Google, he worked as a senior researcher at Tencent AI Lab. He obtained his Ph.D. from the Institute for Interdisciplinary Information Sciences (IIIS) at Tsinghua University in 2017. His research interests focus on neural machine translation and natural language processing.

Table of Contents

1. Introduction
2. Neural Machine Translation
3. Agreement-based Joint Training for Bidirectional Attention-based Neural Machine Translation
4. Semi-supervised Learning for Neural Machine Translation
5. Joint Training for Pivot-based Neural Machine Translation
6. Joint Modeling for Bidirectional Neural Machine Translation with Contrastive Learning
7. Related Work
8. Conclusion
