Send Moses-support mailing list submissions to
moses-support@mit.edu
To subscribe or unsubscribe via the World Wide Web, visit
http://mailman.mit.edu/mailman/listinfo/moses-support
or, via email, send a message with subject or body 'help' to
moses-support-request@mit.edu
You can reach the person managing the list at
moses-support-owner@mit.edu
When replying, please edit your Subject line so it is more specific
than "Re: Contents of Moses-support digest..."
Today's Topics:
1. Low NCE log-likelihood on Bilingual Neural LM (jian zhang)
2. Re: Low NCE log-likelihood on Bilingual Neural LM (Rico Sennrich)
3. Re: Performance issues using Moses Server with Moses 3
(Barry Haddow)
----------------------------------------------------------------------
Message: 1
Date: Tue, 21 Jul 2015 13:15:39 +0100
From: jian zhang <zhangj@computing.dcu.ie>
Subject: [Moses-support] Low NCE log-likelihood on Bilingual Neural LM
To: moses-support@mit.edu
Message-ID:
<CALA=z0DcBLfh3QB1gQvr2Po1mNy0+_ipU_zMu9WAkcAm9WPsHw@mail.gmail.com>
Content-Type: text/plain; charset="utf-8"
Hi all,
I am running experiments on Bilingual Neural LM.
For extract_training.py, I set
--prune-target-vocab 10000 --prune-source-vocab 10000 --target-context 5
--source-context 4
For train_nplm.py, I set
--ngram-size 14 --output-embedding 512 --input-embedding 192 --hidden 512
--e 5
I use 2 million parallel sentence pairs for training. The implementation is
from https://github.com/rsennrich/nplm
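As a sanity check on these sizes (assuming the n-gram is the target
context plus a symmetric source window of 2 * source-context + 1 words
around the affiliated source word), the settings are at least mutually
consistent:

    # assumed window layout; adjust if your setup differs
    target_context = 5
    source_context = 4
    ngram_size = target_context + (2 * source_context + 1)
    print(ngram_size)   # 14, matching --ngram-size above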
From the training log generated by train_nplm.py, the first 2 epochs show:
Number of training minibatches: 52853
Epoch 1
Current learning rate: 1
Training minibatches: 10000...20000...30000...40000...50000...done.
Training NCE log-likelihood: -1.64277e+08
Writing model
Epoch 2
Current learning rate: 1
Training minibatches: 10000...20000...30000...40000...50000...done.
Training NCE log-likelihood: -1.38122e+08
Writing model
The NCE log-likelihood looks suspiciously low. Did I set any parameters
incorrectly?
Regards,
Jian
--
Jian Zhang
Centre for Next Generation Localisation (CNGL)
<http://www.cngl.ie/index.html>
Dublin City University <http://www.dcu.ie/>
------------------------------
Message: 2
Date: Tue, 21 Jul 2015 13:38:35 +0100
From: Rico Sennrich <rico.sennrich@gmx.ch>
Subject: Re: [Moses-support] Low NCE log-likelihood on Bilingual
Neural LM
To: moses-support@mit.edu
Message-ID: <55AE3D4B.3090104@gmx.ch>
Content-Type: text/plain; charset="windows-1252"
Hello Jian,
NPLM reports the log-likelihood of the whole training set, so the
number is plausible. Assuming a minibatch size of 1000, your training
set perplexity is exp(1.38122e+08 / (52853 * 1000)) = 13.64.
You probably want to measure perplexity on a held-out development set
though, with softmax normalisation instead of NCE.
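In code, the back-of-the-envelope calculation is (the minibatch size of
1000 is an assumption, as above):

    import math

    nce_log_likelihood = -1.38122e+08   # whole training set, epoch 2
    num_minibatches = 52853
    minibatch_size = 1000               # assumed; check your training settings

    num_tokens = num_minibatches * minibatch_size
    print(math.exp(-nce_log_likelihood / num_tokens))   # ~13.64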
best wishes,
Rico
On 21.07.2015 13:15, jian zhang wrote:
> The NCE log-likelihood looks suspiciously low. Did I set any
> parameters incorrectly?
------------------------------
Message: 3
Date: Tue, 21 Jul 2015 15:07:17 +0100
From: Barry Haddow <bhaddow@staffmail.ed.ac.uk>
Subject: Re: [Moses-support] Performance issues using Moses Server
with Moses 3
To: Oren <mooshified@gmail.com>
Cc: "moses-support@mit.edu" <moses-support@mit.edu>
Message-ID: <55AE5215.4060003@staffmail.ed.ac.uk>
Content-Type: text/plain; charset="utf-8"
On 21/07/15 14:51, Oren wrote:
> I am using the in-memory mode, with about 50GB of RAM. (No swap
> issues as far as I can tell.) Could that cause issues?
Yes, swapping would definitely cause issues - was that your question?
>
> I looked at the commit you linked to, but it doesn't seem to be
> something configurable beyond the -threads switch. Am I missing something?
The commit enables you to set the maximum number of connections to be
the same as the maximum number of threads.
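If you want to see how the latency is distributed, a rough sketch like
the one below can reproduce the load pattern (host, port and the test
sentence are placeholders; it assumes the standard XML-RPC translate
call exposed by mosesserver):

    import time
    import xmlrpc.client
    from concurrent.futures import ThreadPoolExecutor

    URL = "http://localhost:8080/RPC2"   # placeholder endpoint
    SENTENCE = "das ist ein haus"        # placeholder test sentence

    def timed_request(_):
        proxy = xmlrpc.client.ServerProxy(URL)   # one client per worker
        start = time.time()
        proxy.translate({"text": SENTENCE})      # mosesserver's translate RPC
        return time.time() - start

    # fire 100 concurrent requests and look at the latency spread
    with ThreadPoolExecutor(max_workers=50) as pool:
        latencies = sorted(pool.map(timed_request, range(100)))

    print("median %.2fs, 95th percentile %.2fs"
          % (latencies[50], latencies[95]))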
>
> On Tuesday, July 21, 2015, Barry Haddow <bhaddow@staffmail.ed.ac.uk
> <mailto:bhaddow@staffmail.ed.ac.uk>> wrote:
>
> Hi Oren
>
> Does your host have 18 threads available? It could also be that
> xmlrpc-c is limiting the number of connections - this can now be
> configured:
> https://github.com/moses-smt/mosesdecoder/commit/b3baade7f022edbcea2969679a40616683f63523
>
> Slowdowns in Moses are often caused by disk access bottlenecks.
> You can use --minphr-memory and --minlexr-memory to make sure that
> the phrase and reordering tables are loaded into memory, rather
> than being accessed on demand. Make sure your host has enough RAM
> and is not swapping. As I mentioned before, there are various ways
> to make your models smaller
> (http://www.statmt.org/moses/?n=Advanced.RuleTables), which can
> make a big difference to speed depending on your setup.
>
> cheers - Barry
>
> On 21/07/15 09:30, Oren wrote:
>> Hi Barry,
>>
>> Thanks for the quick response.
>>
>> I added the switch "-threads 18" to the command used to start the
>> Moses server. The slowness issue persists, but in a different form:
>> most requests return right away, even under heavy load, but some
>> (about 5%) take far longer - about 15-20 seconds.
>>
>> Perhaps there are other relevant switches?
>>
>> Thanks again.
>>
>> On Monday, July 20, 2015, Barry Haddow
>> <bhaddow@staffmail.ed.ac.uk> wrote:
>>
>> Hi Oren
>>
>> The threading model is different: in v1, the server created a
>> new thread for every request, whereas v3 uses a thread pool. Try
>> increasing the number of threads.
>>
>> Also, make sure you use the compact phrase table and KenLM, as
>> they are normally faster, and pre-pruning your phrase table
>> can help.
>>
>> cheers - Barry
>>
>> On 20/07/15 12:01, Oren wrote:
>>> Hi all,
>>>
>>> We are in the process of migrating from Moses 1 to Moses 3.
>>> We have noticed a significant slowdown when sending many
>>> requests at once to Moses Server. The first request will
>>> actually finish about 25% faster than a single request using
>>> Moses 1, but as more requests accumulate there is a marked
>>> slowdown, until requests take 5 times longer or more.
>>>
>>> Is this a known issue? Is it specific to Moses Server? What
>>> can we do about it?
>>>
>>> Thanks!
>>>
>>> Oren.
>>>
>>>
>>> _______________________________________________
>>> Moses-support mailing list
>>> Moses-support@mit.edu
>>> http://mailman.mit.edu/mailman/listinfo/moses-support
>>
>>
>>
>> _______________________________________________
>> Moses-support mailing list
>> Moses-support@mit.edu
>> http://mailman.mit.edu/mailman/listinfo/moses-support
>
------------------------------
_______________________________________________
Moses-support mailing list
Moses-support@mit.edu
http://mailman.mit.edu/mailman/listinfo/moses-support
End of Moses-support Digest, Vol 105, Issue 41
**********************************************