#acl2019nlp paper on "Beyond BLEU: Training NMT with Semantic Similarity" by Wieting et al.: https://t.co/5N9SBiPyDq
— Graham Neubig (@gneubig) August 6, 2019
I like this because it shows 1) a nice use case for semantic similarity, 2) that we can/should optimize seq2seq models for something other than likelihood or BLEU!
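The training idea the tweet points at, optimizing a translation model for a semantic-similarity reward rather than likelihood or BLEU, is typically done with minimum risk training: sample candidate translations, score each against the reference, and minimize the expected cost under the model's (renormalized) distribution. Below is a minimal PyTorch-style sketch of that loss, not the paper's exact implementation; the function name, the `alpha` sharpness knob, and the assumption that similarity scores (e.g. the paper's SimiLe metric) are computed elsewhere are all illustrative.

```python
import torch
import torch.nn.functional as F

def min_risk_loss(cand_log_probs: torch.Tensor,
                  sim_scores: torch.Tensor,
                  alpha: float = 1.0) -> torch.Tensor:
    """Expected-cost (minimum risk) loss over sampled candidate translations.

    cand_log_probs: (k,) total model log-probability of each of k sampled
                    candidate translations for one source sentence.
    sim_scores:     (k,) semantic similarity of each candidate to the
                    reference, in [0, 1], higher is better.
    alpha:          sharpness of the renormalized candidate distribution
                    (an illustrative hyperparameter).
    """
    # Renormalize probability mass over just the k sampled candidates.
    q = F.softmax(alpha * cand_log_probs, dim=0)
    # Cost is 1 - similarity, so minimizing the expectation pushes
    # probability mass toward candidates the similarity metric prefers.
    cost = 1.0 - sim_scores
    return torch.sum(q * cost)

# Toy usage: 3 sampled candidates with their log-probs and similarity scores.
log_probs = torch.tensor([-4.2, -5.0, -6.1], requires_grad=True)
sims = torch.tensor([0.91, 0.62, 0.40])
loss = min_risk_loss(log_probs, sims)
loss.backward()  # in a real system, gradients flow into the NMT parameters
```

Swapping `sim_scores` for sentence-level BLEU recovers BLEU-based risk training; the point of the paper is that a continuous semantic metric gives a smoother and more meaningful reward signal than BLEU's n-gram matching.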