A Comparison of Neural Models for Word Ordering

Authors Eva Hasler, Felix Stahlberg, Bill Byrne, Marcus Tomalin, Adrià de Gispert
Journal/Conference Name WS 2017
Paper Category
Paper Abstract We compare several language models for the word-ordering task and propose a new bag-to-sequence neural model based on attention-based sequence-to-sequence models. We evaluate the model on a large German WMT data set where it significantly outperforms existing models. We also describe a novel search strategy for LM-based word ordering and report results on the English Penn Treebank. Our best model setup outperforms prior work both in terms of speed and quality.
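The abstract describes LM-based word ordering: given a bag of words, search for the permutation that a language model scores highest. The sketch below is not the paper's neural model or its novel search strategy; it is a minimal illustration of the task, using an invented toy bigram scorer and exhaustive search over permutations (feasible only for small bags). All names and scores here are assumptions for demonstration.

```python
import itertools

# Toy bigram scores, invented for illustration (not a trained LM).
BIGRAM = {
    ("the", "cat"): 2.0, ("cat", "sat"): 2.0,
    ("sat", "on"): 2.0, ("on", "the"): 2.0, ("the", "mat"): 2.0,
}

def score(seq):
    """Sum of bigram scores over adjacent pairs; unseen pairs score 0."""
    return sum(BIGRAM.get(pair, 0.0) for pair in zip(seq, seq[1:]))

def order_bag(bag):
    """Return the highest-scoring ordering of the word multiset `bag`.

    Exhaustive search: O(n!), so only usable for tiny bags. The paper
    instead uses neural models and a dedicated search strategy.
    """
    perms = set(itertools.permutations(bag))  # dedupe repeated words
    return list(max(perms, key=score))

# Example: recover a sentence order from its shuffled word bag.
print(order_bag("mat the on sat cat the".split()))
```

The toy scorer makes "the cat sat on the mat" the unique maximum (all five bigrams match), so the search recovers it from any shuffle of the bag; a real system would replace `score` with a neural LM and the brute-force loop with beam search or the paper's proposed strategy.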
Date of publication 2017
Code Programming Language C++
Comment

Copyright Researcher 2022