Send Moses-support mailing list submissions to
moses-support@mit.edu
To subscribe or unsubscribe via the World Wide Web, visit
http://mailman.mit.edu/mailman/listinfo/moses-support
or, via email, send a message with subject or body 'help' to
moses-support-request@mit.edu
You can reach the person managing the list at
moses-support-owner@mit.edu
When replying, please edit your Subject line so it is more specific
than "Re: Contents of Moses-support digest..."
Today's Topics:
1. Tuning and test sets for all EU language pairs (Lane Schwartz)
2. Re: integration of efmaral word alignment in Moses
pipeline/EMS (Jorg Tiedemann)
3. Conversion of phrase model to PhraseDictionaryCompact
(Shubham Khandelwal)
4. RNN-based features in Moses (Shafagh Fadaee)
5. Re: Conversion of phrase model to PhraseDictionaryCompact
(Shubham Khandelwal)
----------------------------------------------------------------------
Message: 1
Date: Wed, 7 Dec 2016 15:01:32 -0600
From: Lane Schwartz <dowobeha@gmail.com>
Subject: [Moses-support] Tuning and test sets for all EU language
pairs
To: "moses-support@mit.edu" <moses-support@mit.edu>
Message-ID:
<CABv3vZnP47REmc43=umO=25EvrKeuXaOcRu7V36pEdWRhGNRaQ@mail.gmail.com>
Content-Type: text/plain; charset="utf-8"
For those of you who have worked on building systems for the full set of
Europarl languages, did you use tuning and devtest sets that were
consistent and parallel across all of the language pairs?
Thanks,
Lane
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.mit.edu/mailman/private/moses-support/attachments/20161207/a7349b87/attachment-0001.html
------------------------------
Message: 2
Date: Thu, 8 Dec 2016 00:06:37 +0200
From: Jorg Tiedemann <tiedeman@gmail.com>
Subject: Re: [Moses-support] integration of efmaral word alignment in
Moses pipeline/EMS
To: Matt Post <post@cs.jhu.edu>
Cc: moses-support <moses-support@mit.edu>
Message-ID: <45E6F6F8-45AC-4A2B-8D5A-E5CA392451BA@gmail.com>
Content-Type: text/plain; charset=utf-8
>
> From the GitHub pages it appears that eflomal supersedes efmaral
this is true, I guess
> ... is there any purpose, therefore, in using efmaral?
It is a bit faster (but consumes more memory).
eflomal does not support the fast_align input format at the moment, but that can easily be fixed if necessary.
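(Editorial aside, not part of the original thread: the fast_align input format mentioned here is one sentence pair per line, with source and target tokens separated by " ||| ". A minimal Python sketch for producing it from two parallel sentence lists; the sample sentences are illustrative only.)

```python
# Build fast_align input ("source tokens ||| target tokens", one pair
# per line) from two parallel lists of sentences. The sample data below
# is illustrative only.

def to_fast_align(src_lines, tgt_lines):
    """Yield fast_align-formatted lines from parallel sentence lists."""
    if len(src_lines) != len(tgt_lines):
        raise ValueError("parallel inputs must have the same number of lines")
    for src, tgt in zip(src_lines, tgt_lines):
        yield f"{src.strip()} ||| {tgt.strip()}"

for line in to_fast_align(["das Haus", "ein Buch"],
                          ["the house", "a book"]):
    print(line)
```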
> Also, the linked PBML paper has no mention of eflomal ... how does it perform in downstream BLEU tasks? Is it comparable to what you reported in Table 4?
eflomal did not exist at the time of writing that paper. We haven't done the BLEU evaluations, but I expect no big difference. Actually, eflomal can be run with parameter settings that make it equivalent to efmaral.
So, eflomal should be the way to go. It wouldn't be too hard to support both, I guess, but I don't dare to touch the experiment.perl monster.
Jörg
> matt
>
>
>> On Dec 7, 2016, at 2:50 AM, Jorg Tiedemann <tiedeman@gmail.com> wrote:
>>
>>
>> efmaral and eflomal are efficient Markov chain word aligners using Gibbs sampling that can be used to replace GIZA++/fast_align in the typical Moses training pipelines:
>>
>> https://github.com/robertostling/efmaral
>> https://github.com/robertostling/eflomal
>>
>> Would anyone be interested in adding support in the Moses pipelines and experiment.perl?
>> Input and output formats are compatible with fast_align and Moses formats.
>>
>> The tools could also be mentioned at statmt.org/moses
>>
>> All the best,
>> Jörg
>>
>> ---------------------------------
>> Jörg Tiedemann
>> Department of Modern Languages
>> University of Helsinki
>> http://blogs.helsinki.fi/language-technology/
>> ---------------------------------
>>
>>
>>
>>
>>
>>
>> _______________________________________________
>> Moses-support mailing list
>> Moses-support@mit.edu
>> http://mailman.mit.edu/mailman/listinfo/moses-support
>
>
> _______________________________________________
> Moses-support mailing list
> Moses-support@mit.edu
> http://mailman.mit.edu/mailman/listinfo/moses-support
------------------------------
Message: 3
Date: Thu, 8 Dec 2016 09:04:13 +0530
From: Shubham Khandelwal <skhlnmiit@gmail.com>
Subject: [Moses-support] Conversion of phrase model to
PhraseDictionaryCompact
To: moses-support@mit.edu
Message-ID:
<CAHweNTs+fDzLvrMLKOjFMz+6mFT6coYYDKzNh1UZVZRY1ZuDOw@mail.gmail.com>
Content-Type: text/plain; charset="utf-8"
Hello,
I have just downloaded phrase-table.2.gz (18GB) de-en model
and phrase-table.3.gz (22GB) fr-en model from the available pre-made
models.
Now, I am converting them to PhraseDictionaryCompact using the following
command (for example):
~/mosesdecoder/bin/processPhraseTableMin -threads all -in
~/model/phrase-table.3.gz -nscores 4 -out binarised-model/phrase-table
But during pass 1/3, it gave the following segmentation fault:
Pass 1/3: Creating hash function for rank assignment
Segmentation fault (core dumped)
I found an almost identical issue in this thread:
http://comments.gmane.org/gmane.comp.nlp.moses.user/13033
However, I have provided the existing binarised-model folder in the
command. Also, I have write access to /tmp, but it still gave a
segmentation fault.
Can you please tell me what could be wrong here?
Thanking You.
Regards,
Shubham
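(Editorial aside, not part of the original thread: one common cause of crashes in processPhraseTableMin is an -nscores value that does not match the number of scores in the table's third field. The standard Moses phrase-table line layout is "src ||| tgt ||| scores ||| alignment ||| counts"; a short Python check of a sample line, with illustrative data:)

```python
# Count the score fields in a Moses phrase-table line, to verify the
# value passed to processPhraseTableMin's -nscores flag. The standard
# line layout is: src ||| tgt ||| scores ||| alignment ||| counts

def count_scores(line):
    """Return the number of scores in a phrase-table line."""
    fields = line.split("|||")
    if len(fields) < 3:
        raise ValueError("not a phrase-table line")
    return len(fields[2].split())

sample = "das Haus ||| the house ||| 0.5 0.2 0.4 0.1 ||| 0-0 1-1 ||| 10 12 8"
print(count_scores(sample))  # 4, matching -nscores 4
```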
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.mit.edu/mailman/private/moses-support/attachments/20161207/d54da2b1/attachment-0001.html
------------------------------
Message: 4
Date: Thu, 8 Dec 2016 12:04:50 +0330
From: Shafagh Fadaee <shafagh@gmail.com>
Subject: [Moses-support] RNN-based features in Moses
To: moses-support@mit.edu
Message-ID:
<CAKqZtrSRSX-Hc_6_GUQUcoip=bpCw8eJP__b4nnVRoR6oV2g4w@mail.gmail.com>
Content-Type: text/plain; charset="utf-8"
Dear Moses Community,
I'm searching for RNN-based features added to the Moses decoder or
used for reranking Moses' n-best lists.
It seems Moses has some LM features based on feedforward neural
networks, but I haven't found any implementation of recurrent neural
networks in Moses.
Apparently there are many efforts in this field, but I have trouble
finding released code to use as a guideline in my work.
I would be happy to hear your suggestions.
Best regards,
Hakimeh Fadaei
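(Editorial aside, not part of the original thread: even without an in-decoder feature, an RNN LM can be applied by reranking the n-best list. Moses n-best lines have the form "id ||| hypothesis ||| feature scores ||| total". A minimal Python sketch; rnn_score and its weight are stand-ins, to be replaced by a real RNN language-model scorer:)

```python
# Rerank a Moses n-best list with an external sentence-level scorer.
# The n-best line format is: "id ||| hypothesis ||| features ||| total".
# rnn_score below is a placeholder for a real RNN LM.

def parse_nbest(lines):
    """Parse n-best lines into (sent_id, hypothesis, decoder_total) tuples."""
    entries = []
    for line in lines:
        sent_id, hyp, _feats, total = [f.strip() for f in line.split("|||")]
        entries.append((int(sent_id), hyp, float(total)))
    return entries

def rerank(entries, scorer, weight=1.0):
    """Sort hypotheses by decoder score plus a weighted external score."""
    return sorted(entries,
                  key=lambda e: e[2] + weight * scorer(e[1]),
                  reverse=True)

def rnn_score(hypothesis):
    # Placeholder scorer that favours shorter hypotheses; a real one
    # would return an RNN language-model log-probability.
    return -0.1 * len(hypothesis.split())

nbest = [
    "0 ||| the house is small ||| 0.1 0.2 ||| -2.5",
    "0 ||| the house is very small ||| 0.1 0.3 ||| -2.4",
]
best = rerank(parse_nbest(nbest), rnn_score, weight=2.0)[0]
print(best[1])  # the external score flips the ranking to the shorter hypothesis
```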
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.mit.edu/mailman/private/moses-support/attachments/20161208/645fc39c/attachment-0001.html
------------------------------
Message: 5
Date: Thu, 8 Dec 2016 21:22:41 +0530
From: Shubham Khandelwal <skhlnmiit@gmail.com>
Subject: Re: [Moses-support] Conversion of phrase model to
PhraseDictionaryCompact
To: moses-support@mit.edu
Message-ID:
<CAHweNTtd6JhExoeYWnH=E1or2wRfw6h6gtSkRg+3M_wxQ7iq3Q@mail.gmail.com>
Content-Type: text/plain; charset="utf-8"
Hello,
This is just a reminder about my previous email.
Thanking You.
Regards,
Shubham
On Thu, Dec 8, 2016 at 9:04 AM, Shubham Khandelwal <skhlnmiit@gmail.com>
wrote:
> Hello,
>
> I have just downloaded phrase-table.2.gz (18GB) de-en model
> and phrase-table.3.gz (22GB) fr-en model from the available pre-made
> models.
> Now, I am converting them to PhraseDictionaryCompact using the following
> command (for example):
>
> ~/mosesdecoder/bin/processPhraseTableMin -threads all -in
> ~/model/phrase-table.3.gz -nscores 4 -out binarised-model/phrase-table
>
> But during pass 1/3, it gave the following segmentation fault:
>
> Pass 1/3: Creating hash function for rank assignment
> Segmentation fault (core dumped)
>
> I found an almost identical issue in this thread:
> http://comments.gmane.org/gmane.comp.nlp.moses.user/13033
> However, I have provided the existing binarised-model folder in the
> command. Also, I have write access to /tmp, but it still gave a
> segmentation fault.
>
> Can you please tell me what could be wrong here?
>
> Thanking You.
>
> Regards,
> Shubham
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.mit.edu/mailman/private/moses-support/attachments/20161208/e186744f/attachment.html
------------------------------
_______________________________________________
Moses-support mailing list
Moses-support@mit.edu
http://mailman.mit.edu/mailman/listinfo/moses-support
End of Moses-support Digest, Vol 122, Issue 12
**********************************************