Send Moses-support mailing list submissions to
moses-support@mit.edu
To subscribe or unsubscribe via the World Wide Web, visit
http://mailman.mit.edu/mailman/listinfo/moses-support
or, via email, send a message with subject or body 'help' to
moses-support-request@mit.edu
You can reach the person managing the list at
moses-support-owner@mit.edu
When replying, please edit your Subject line so it is more specific
than "Re: Contents of Moses-support digest..."
Today's Topics:
1. Workshop on Neural Machine Translation Shared Task on
Efficient NMT (Graham Neubig)
2. Call for Proposals: Low-resource Neural Machine Translation
(Juan Miguel Pino)
----------------------------------------------------------------------
Message: 1
Date: Fri, 9 Mar 2018 20:14:26 -0500
From: Graham Neubig <gneubig@cs.cmu.edu>
Subject: [Moses-support] Workshop on Neural Machine Translation Shared
Task on Efficient NMT
To: "<moses-support@mit.edu>" <moses-support@mit.edu>
Message-ID:
<CADkjOCOMiPpBBXWT6yR7yYDmqibr2fs0DCpwZoHGK8R=-H+h7Q@mail.gmail.com>
Content-Type: text/plain; charset="UTF-8"
Hi All,
(apologies for cross-posting, but I imagine that many people on this
list are also interested in NMT)
I'd like to note that there will be a shared task on efficient Neural
Machine Translation at WNMT2018 (co-located with ACL2018).
https://sites.google.com/site/wnmt18/shared-task
If you're interested in making memory or compute efficient NMT models,
we'd love to have you participate! More details below.
Also, we're still looking for research paper submissions to the
workshop, so we'd encourage people to join that as well.
Graham
-----------------
Basic Idea
The basic idea of this task (inspired by the small NMT task at the
Workshop on Asian Translation) is that for NMT, test-time efficiency
matters as well as accuracy.
Efficiency can include a number of concepts:
* Memory Efficiency: We would like to have small models. Evaluation
measures include:
** Size on disk of the model
** Number of parameters of the model
** Size in memory of the full program
* Computational Efficiency: We would like to have fast models.
Evaluation measures include:
** Time to decode the test set in a single CPU thread
** Time to decode the test set on a single GPU
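As a rough illustration of how a participant might instrument these measures, here is a minimal Python sketch. The helper names and the shape-based parameter count are assumptions for illustration, not part of the shared-task tooling; any real submission would measure its own model with its own toolkit.

```python
import os
import time

def model_size_on_disk(path):
    """Size of the serialized model file in bytes (one memory measure)."""
    return os.path.getsize(path)

def count_parameters(parameter_shapes):
    """Total parameter count, given the shape of each weight tensor."""
    total = 0
    for shape in parameter_shapes:
        n = 1
        for dim in shape:
            n *= dim
        total += n
    return total

def time_decoding(decode_fn, test_set):
    """Wall-clock seconds to decode the whole test set (the compute measure)."""
    start = time.perf_counter()
    for sentence in test_set:
        decode_fn(sentence)
    return time.perf_counter() - start
```

Pinning the process to one CPU thread (or one GPU) before calling time_decoding is left to the participant's environment, e.g. via OMP_NUM_THREADS=1.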
Tracks
The goal of the task will be to find systems that are both accurate
and efficient. In other words, we want to find systems on the Pareto
frontier of efficiency and accuracy. Participants can submit any system
that they like, and any system on this Pareto frontier will be
considered advantageous. However, we will particularly highlight
systems that satisfy one of two categories:
* Efficiency track: We will have a track where models that perform
at least as well as the baseline compete to be the most efficient
implementation. Here, the winner will be the system that achieves the
baseline BLEU score with the highest efficiency, whether memory or
computational.
* Accuracy track: We will have a track where models that are at least
as efficient as the baseline compete to improve the BLEU score. Here,
the winner will be the system that improves accuracy the most
without a decrease in efficiency.
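The Pareto-frontier idea behind both tracks can be sketched in a few lines. This is an illustrative dominance check over (BLEU, efficiency) pairs where higher is better on both axes; the task's official scoring is defined on the shared-task page, not here.

```python
def pareto_frontier(systems):
    """Return the systems not dominated by any other system.

    Each system is a (bleu, efficiency) pair, higher being better on
    both axes. A system is dominated if some other system is at least
    as good on both axes and strictly better on at least one.
    """
    frontier = []
    for i, (b1, e1) in enumerate(systems):
        dominated = any(
            b2 >= b1 and e2 >= e1 and (b2 > b1 or e2 > e1)
            for j, (b2, e2) in enumerate(systems)
            if j != i
        )
        if not dominated:
            frontier.append((b1, e1))
    return frontier
```

For example, a system that is both slower and less accurate than some other submission is dominated and falls off the frontier, while a fast-but-weaker and a slow-but-stronger system can both remain on it.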
------------------------------
Message: 2
Date: Sat, 10 Mar 2018 07:50:55 +0000
From: Juan Miguel Pino <juancarabina@fb.com>
Subject: [Moses-support] Call for Proposals: Low-resource Neural
Machine Translation
To: "moses-support@mit.edu" <moses-support@mit.edu>
Message-ID: <1B48A2EF-A85B-450A-B238-A7D276B13F3B@fb.com>
Content-Type: text/plain; charset="utf-8"
[Apologies for cross-posting]
Facebook is pleased to invite the academic community to respond to this call for research proposals on low-resource neural machine translation. Awards will be made in amounts ranging from $20,000 to $40,000 per proposal for projects up to one year in duration, beginning in June 2018. Research topics should be relevant to low-resource neural machine translation, including, but not limited to:
- Unsupervised neural machine translation for low-resource language pairs
- Comparable corpora mining for low-resource language pairs
- Exploiting monolingual resources for low-resource language pairs
Applications close April 18, 2018, 11:59 PST. More details and instructions on how to apply are available at https://research.fb.com/programs/research-awards/proposals/low-resource-neural-machine-translation.
Please email fguzman@fb.com or juancarabina@fb.com with further questions.
------------------------------
_______________________________________________
Moses-support mailing list
Moses-support@mit.edu
http://mailman.mit.edu/mailman/listinfo/moses-support
End of Moses-support Digest, Vol 137, Issue 3
*********************************************