Moses-support Digest, Vol 113, Issue 74

Send Moses-support mailing list submissions to
moses-support@mit.edu

To subscribe or unsubscribe via the World Wide Web, visit
http://mailman.mit.edu/mailman/listinfo/moses-support
or, via email, send a message with subject or body 'help' to
moses-support-request@mit.edu

You can reach the person managing the list at
moses-support-owner@mit.edu

When replying, please edit your Subject line so it is more specific
than "Re: Contents of Moses-support digest..."


Today's Topics:

1. ** DEADLINE EXTENSION **: Computer Speech and Language
Special Issue on Deep Learning for Machine Translation (Marta Ruiz)
2. Maximum Phrase Table length (Vincent Nguyen)


----------------------------------------------------------------------

Message: 1
Date: Thu, 31 Mar 2016 12:06:31 +0200
From: Marta Ruiz <martaruizcostajussa@gmail.com>
Subject: [Moses-support] ** DEADLINE EXTENSION **: Computer Speech and
Language Special Issue on Deep Learning for Machine Translation
To: moses-support@mit.edu
Message-ID:
<CABEBqHLYcseNE=vDTqvVaoGSoNnCUd+pCxDc1nAC8OoxoLKfpw@mail.gmail.com>
Content-Type: text/plain; charset="utf-8"

*Computer Speech and Language Special Issue on Deep Learning for Machine
Translation
<http://www.journals.elsevier.com/computer-speech-and-language/call-for-papers/special-issue-on-deep-learning-for-machine-translation/>*



Deep Learning has been successfully applied to many areas, including Natural
Language Processing, Speech Recognition and Image Processing. Deep learning
techniques have surprised the entire community, in both academia and
industry, with their power to learn from data.



Recently, deep learning has been introduced to Machine Translation (MT). It
first appeared as features integrated into standard phrase-based or
syntax-based statistical approaches. Deep learning has proven useful in
translation and language modeling as well as in reordering, tuning and
rescoring. Additionally, deep learning has been applied to MT evaluation
and quality estimation.



But the biggest impact on MT came with a new paradigm: Neural MT, which
recently (at the Workshop on Machine Translation 2015) outperformed
state-of-the-art systems. This new approach uses an encoder-decoder
architecture to build a neural system that is capable of translating. With
the new approach, the big new MT challenges lie in how to deal with large
vocabularies, document translation and computational power, among others.



This hot topic is attracting interest from the scientific community, and in
response there have been several related events (e.g. a tutorial[1] and a
winter school[2]). Moreover, the number of publications on this topic at
top conferences such as ACL, NAACL and EMNLP has increased dramatically in
the last three years. This would be the first special issue devoted to the
topic. With this special issue, we aim to offer a compilation of works that
give the reader a global view of how deep learning techniques are applied
to MT and what new challenges they present.



This Special Issue expects high-quality submissions on the following topics
(but not limited to these):

- Including deep learning knowledge in standard MT approaches (statistical,
rule-based, example-based...)

- Neural MT approaches

- MT hybrid techniques using deep learning

- Deep learning challenges in MT: vocabulary limitation, document
translation, computational power

- MT evaluation with deep learning techniques

- MT quality estimation with deep learning techniques

- Using deep learning in spoken language translation



*IMPORTANT DATES*

Submission deadline [EXTENDED]: *18th April 2016*

Notification of rejection/re-submission: 30th July 2016

Notification of final acceptance: 30th October 2016

Expected publication date: 30th January 2017


*GUEST EDITORS*

Marta R. Costa-jussà, Universitat Politècnica de Catalunya, Spain.
marta.ruiz@upc.edu

Alexandre Allauzen, Centre National de la Recherche Scientifique, France.
allauzen@limsi.fr

Loïc Barrault, Université du Maine, France.
loic.barrault@lium.univ-lemans.fr

Kyunghyun Cho, New York University, USA. kyunghyun.cho@nyu.edu

Holger Schwenk, Facebook, USA. schwenk@fb.com



------------------------------

[1] http://naacl.org/naacl-hlt-2015/tutorial-deep-learning.html

[2] http://dl4mt.computing.dcu.ie/



<http://www.costa-jussa.com>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.mit.edu/mailman/private/moses-support/attachments/20160331/6ae2fb94/attachment-0001.html

------------------------------

Message: 2
Date: Thu, 31 Mar 2016 14:58:35 +0200
From: Vincent Nguyen <vnguyen@neuf.fr>
Subject: [Moses-support] Maximum Phrase Table length
To: moses-support <moses-support@mit.edu>
Message-ID: <56FD1EFB.4020001@neuf.fr>
Content-Type: text/plain; charset=utf-8; format=flowed

Hello,

Can someone shed some light on this (found in the documentation)?

Maximum Phrase Length

The maximum length of phrases is limited to 7 words. The maximum phrase
length impacts the size of the phrase translation table, so shorter
limits may be desirable, if phrase table size is an issue. Previous
experiments have shown that performance increases only slightly when
including phrases of more than 3 words.

Summary

--max-phrase-length -- maximum length of phrases entered into
phrase table (default 7)
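As a sketch of how that option is applied in practice: the limit is set at
training time when the phrase table is extracted, by passing
`--max-phrase-length` to `train-model.perl`. The corpus paths, language
pair and LM file below are placeholders, not values from this thread:

```shell
#!/bin/sh
# Illustrative Moses training invocation; all paths are hypothetical.
# Lowering --max-phrase-length from the default 7 to 3 shrinks the
# extracted phrase table, usually at a small cost in translation quality.
MOSES=$HOME/mosesdecoder

$MOSES/scripts/training/train-model.perl \
    --root-dir train \
    --corpus corpus/train --f fr --e en \
    --alignment grow-diag-final-and \
    --reordering msd-bidirectional-fe \
    --lm 0:3:$PWD/lm/train.blm.en:8 \
    --max-phrase-length 3 \
    --external-bin-dir $MOSES/tools
```

The table-size/quality trade-off the documentation mentions is why a
smaller value can be attractive on disk- or memory-constrained systems.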


If there is no major improvement above 3, why is the default 7, and is
there a benchmark somewhere ?


Thanks
Vincent.


------------------------------

_______________________________________________
Moses-support mailing list
Moses-support@mit.edu
http://mailman.mit.edu/mailman/listinfo/moses-support


End of Moses-support Digest, Vol 113, Issue 74
**********************************************
