Moses-support Digest, Vol 125, Issue 21

Send Moses-support mailing list submissions to
moses-support@mit.edu

To subscribe or unsubscribe via the World Wide Web, visit
http://mailman.mit.edu/mailman/listinfo/moses-support
or, via email, send a message with subject or body 'help' to
moses-support-request@mit.edu

You can reach the person managing the list at
moses-support-owner@mit.edu

When replying, please edit your Subject line so it is more specific
than "Re: Contents of Moses-support digest..."


Today's Topics:

1. Exception: std::bad_alloc on Testing step (Sơn Nguyễn Văn)
2. Please Help Me ! (Tran Anh)


----------------------------------------------------------------------

Message: 1
Date: Sun, 12 Mar 2017 19:28:11 +0100
From: Sơn Nguyễn Văn <bnvanson@gmail.com>
Subject: [Moses-support] Exception: std::bad_alloc on Testing step
To: moses-support@mit.edu
Message-ID:
<CA+TRtYXbLD7GbN2wedA2awxY_x6r2XJmmFcxS=u_Wf-L64_z0A@mail.gmail.com>
Content-Type: text/plain; charset="utf-8"

Hello,

I installed Moses using the virtual machine image ubuntu 14.04 32-bit.ova (
http://www.statmt.org/moses/RELEASE-3.0/vm/). The VM has 2 GB of memory.

When I run the "Testing" step (from the Baseline System) with the command:

~/mosesdecoder/bin/moses -f ~/working/mert-work/moses.ini

This is the result:

hieu@hieu-VirtualBox:~$ /home/hieu/workspace/github/mosesdecoder/bin/moses -f /home/hieu/working/mert-work/moses.ini
Defined parameters (per moses.ini or switch):
  config: /home/hieu/working/mert-work/moses.ini
line=WordPenalty
FeatureFunction: WordPenalty0 start: 0 end: 0
Exception: moses/ScoreComponentCollection.cpp:250 in void Moses::ScoreComponentCollection::Assign(const Moses::FeatureFunction*, const std::vector<float>&) threw util::Exception'. Feature function WordPenalty0 specified 1 dense scores or weights. Actually has 0

Using this command:

/home/hieu/workspace/github/mosesdecoder/bin/moses -f /home/hieu/working/mert-work/moses.ini \
    -i /home/hieu/working/a > /home/hieu/working/aa.txt

Result:

hieu@hieu-VirtualBox:~/working$ /home/hieu/workspace/github/mosesdecoder/bin/moses -f /home/hieu/working/mert-work/moses.ini \
> -i /home/hieu/working/a.txt > /home/hieu/working/aa.txt
Defined parameters (per moses.ini or switch):
  config: /home/hieu/working/mert-work/moses.ini
  distortion-limit: 6
  feature: UnknownWordPenalty
    WordPenalty
    PhrasePenalty
    PhraseDictionaryMemory name=TranslationModel0 num-features=4 path=/home/hieu/working/train/model/phrase-table.gz input-factor=0 output-factor=0
    LexicalReordering name=LexicalReordering0 num-features=6 type=wbe-msd-bidirectional-fe-allff input-factor=0 output-factor=0 path=/home/hieu/working/train/model/reordering-table.wbe-msd-bidirectional-fe.gz
    Distortion
    KENLM lazyken=0 name=LM0 factor=0 path=/home/hieu/lm/news-commentary-v8.fr-en.blm.en order=3
  input-factors: 0
  input-file: /home/hieu/working/a
  mapping: 0 T 0
  threads: 8
  weight: LexicalReordering0= 0.0723751 0.00375534 0.0524019 0.0411615 0.0241884 0.0165493
    Distortion0= 0.0625359
    LM0= 0.0700801
    WordPenalty0= -0.27279
    PhrasePenalty0= 0.163331
    TranslationModel0= 0.00903105 0.0843863 0.0633766 0.0640378
    UnknownWordPenalty0= 1
line=UnknownWordPenalty
FeatureFunction: UnknownWordPenalty0 start: 0 end: 0
line=WordPenalty
FeatureFunction: WordPenalty0 start: 1 end: 1
line=PhrasePenalty
FeatureFunction: PhrasePenalty0 start: 2 end: 2
line=PhraseDictionaryMemory name=TranslationModel0 num-features=4 path=/home/hieu/working/train/model/phrase-table.gz input-factor=0 output-factor=0
FeatureFunction: TranslationModel0 start: 3 end: 6
line=LexicalReordering name=LexicalReordering0 num-features=6 type=wbe-msd-bidirectional-fe-allff input-factor=0 output-factor=0 path=/home/hieu/working/train/model/reordering-table.wbe-msd-bidirectional-fe.gz
FeatureFunction: LexicalReordering0 start: 7 end: 12
Initializing LexicalReordering..
line=Distortion
FeatureFunction: Distortion0 start: 13 end: 13
line=KENLM lazyken=0 name=LM0 factor=0 path=/home/hieu/lm/news-commentary-v8.fr-en.blm.en order=3
FeatureFunction: LM0 start: 14 end: 14
Loading UnknownWordPenalty0
Loading WordPenalty0
Loading PhrasePenalty0
Loading LexicalReordering0
Loading table into memory...done.
Loading Distortion0
Loading LM0
Loading TranslationModel0
Start loading text phrase table. Moses format : [85.074] seconds
Reading /home/hieu/working/train/model/phrase-table.gz
----5---10---15---20---25---30---35---40---45---50---55---60---65---70---75---80---85---90---95--100
****************************************************************************************************
Exception: std::bad_alloc

Could you please help me understand what the problem is here and how I can solve it?

Thank you and Best regards,
Binh
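
A common way to fit decoding into a 2 GB VM is to load less of the model into RAM: filter the model down to the test input with scripts/training/filter-model-given-input.pl, or binarize the phrase table with processPhraseTableMin and load it through PhraseDictionaryCompact instead of PhraseDictionaryMemory. A minimal sketch of the changed moses.ini feature line (paths illustrative, assuming the table was binarized beforehand):

```ini
# In [feature], replace the in-memory phrase table with the compact one.
# Assumes the table was first binarized, e.g.:
#   processPhraseTableMin -in phrase-table.gz -out phrase-table -nscores 4
# which writes phrase-table.minphr next to the input.
PhraseDictionaryCompact name=TranslationModel0 num-features=4 path=/home/hieu/working/train/model/phrase-table input-factor=0 output-factor=0
```

The compact table is memory-mapped rather than read fully into memory, which is usually enough to avoid std::bad_alloc on small VMs.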
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.mit.edu/mailman/private/moses-support/attachments/20170312/88154654/attachment-0001.html

------------------------------

Message: 2
Date: Mon, 13 Mar 2017 07:44:12 +0800
From: Tran Anh <anhuni1006@gmail.com>
Subject: [Moses-support] Please Help Me !
To: moses-support@mit.edu
Message-ID:
<CA+QuFCTYX=LY+6r9xx2f2aNZA4W9kGbPf+o0e+oHxBu0SgpU6w@mail.gmail.com>
Content-Type: text/plain; charset="utf-8"

I have done experiments with a factored model. Tuning and testing are done
with source text annotated with the same factors as during training.
The target text is clean, without factors.

I found that my factored model (BLEU score = 22.2) scores higher than the
baseline (BLEU = 21.11, no factors).
The training command has translation-factor and generation-factor steps:
(... --translation-factors 0-0+1-1+2-2 --generation-factors 2,3-0 ...).

This is the moses.ini file (training is finished, but not yet tuned):

#########################
### MOSES CONFIG FILE ###
#########################

# input factors
[input-factors]
0
1
2

# mapping steps
[mapping]
0 T 0
0 T 1
0 T 2

[distortion-limit]
6

# feature functions
[feature]
UnknownWordPenalty
WordPenalty
PhrasePenalty
PhraseDictionaryMemory name=TranslationModel0 num-features=4
path=/home/yychen/55factor-hz4new-VC/train2-ge3/train/model/phrase-table.0-0.gz
input-factor=0 output-factor=0
PhraseDictionaryMemory name=TranslationModel1 num-features=4
path=/home/yychen/55factor-hz4new-VC/train2-ge3/train/model/phrase-table.1-1.gz
input-factor=1 output-factor=1
PhraseDictionaryMemory name=TranslationModel2 num-features=4
path=/home/yychen/55factor-hz4new-VC/train2-ge3/train/model/phrase-table.2-2.gz
input-factor=2 output-factor=2
Generation name=GenerationModel0 num-features=2
path=/home/yychen/55factor-hz4new-VC/train2-ge3/train/model/generation.2,3-0.gz
input-factor=2,3 output-factor=0
LexicalReordering name=LexicalReordering0 num-features=6
type=wbe-msd-bidirectional-fe-allff input-factor=0 output-factor=0
path=/home/yychen/55factor-hz4new-VC/train2-ge3/train/model/reordering-table.0-0.wbe-msd-bidirectional-fe.gz
Distortion
KENLM lazyken=0 name=LM0 factor=0 path=/home/yychen/55factor-hz4new-VC/train2-ge3/vi-ch.lm.ch order=3

# dense weights for feature functions
[weight]
# The default weights are NOT optimized for translation quality.
# You MUST tune the weights.
# Documentation for tuning is here: http://www.statmt.org/moses/?n=FactoredTraining.Tuning

UnknownWordPenalty0= 1
WordPenalty0= -1
PhrasePenalty0= 0.2
TranslationModel0= 0.2 0.2 0.2 0.2
TranslationModel1= 0.2 0.2 0.2 0.2
TranslationModel2= 0.2 0.2 0.2 0.2
GenerationModel0= 0.3 0
LexicalReordering0= 0.3 0.3 0.3 0.3 0.3 0.3
Distortion0= 0.3
LM0= 0.5


This is the moses.ini file (tuning is finished):

# MERT optimized configuration
# decoder /opt/moses/bin/moses
# BLEU 0.200847 on dev /home/yychen/55factor-hz4new-VC/tun2-ge3/vi.tun4-new
# We were before running iteration 4
# finished 2017-01-08 (Sun) 19:51:49 CST
#########################
### MOSES CONFIG FILE ###
#########################

# input factors
[input-factors]
0
1
2

# mapping steps
[mapping]
0 T 0



#[decoding-graph-backoff]
#0
#1

[distortion-limit]
6

# feature functions
[feature]
UnknownWordPenalty
WordPenalty
PhrasePenalty
PhraseDictionaryMemory name=TranslationModel0 num-features=4
path=/home/yychen/55factor-hz4new-VC/train2-ge3/train/model/phrase-table.0-0.gz
input-factor=0 output-factor=0
PhraseDictionaryMemory name=TranslationModel1 num-features=4
path=/home/yychen/55factor-hz4new-VC/train2-ge3/train/model/phrase-table.1-1.gz
input-factor=1 output-factor=1
PhraseDictionaryMemory name=TranslationModel2 num-features=4
path=/home/yychen/55factor-hz4new-VC/train2-ge3/train/model/phrase-table.2-2.gz
input-factor=2 output-factor=2
Generation name=GenerationModel0 num-features=2
path=/home/yychen/55factor-hz4new-VC/train2-ge3/train/model/generation.2,3-0.gz
input-factor=2,3 output-factor=0
LexicalReordering name=LexicalReordering0 num-features=6
type=wbe-msd-bidirectional-fe-allff input-factor=0 output-factor=0
path=/home/yychen/55factor-hz4new-VC/train2-ge3/train/model/reordering-table.0-0.wbe-msd-bidirectional-fe.gz
Distortion
KENLM lazyken=0 name=LM0 factor=0 path=/home/yychen/55factor-hz4new-VC/train2-ge3/vi-ch.lm.ch order=3

# dense weights for feature functions
[weight]

LexicalReordering0= 0.0421305 0.0145905 0.0421305 0.0419472 0.0571605
0.110762
Distortion0= 0.0357908
LM0= 0.0702177
WordPenalty0= -0.140435
PhrasePenalty0= 0.037449
TranslationModel0= 0.00820789 0.0280871 0.117941 -0.00550954
TranslationModel1= 0.0280871 0.0273782 -0.0150248 0.0280871
TranslationModel2= 0.0453928 0.00576192 0.0280871 0.0276907
GenerationModel0= 0.0421305 0
UnknownWordPenalty0= 1

Here, I want to try to translate " ? ? ? " from the source language to the
target language using the n-best list.
I also want to demonstrate why that result is better than the BASELINE,
using the n-best list (% moses -f moses.ini -n-best-list listfile2 < in).
When the tuning process finished, I tried to translate some source
sentences into target sentences. But the parameters of TranslationModel0
(mapping 0-0) changed, while the parameters of TranslationModel1,
TranslationModel2 and GenerationModel0 are 0 0 0 0. The translation
results are as follows (here, n = 2):



0 ||| ? ? ? ||| LexicalReordering0= -1.60944 0 0 0 0 0 Distortion0= 0 LM0= -15.2278 LM1= -699.809 WordPenalty0= -3 PhrasePenalty0= 1 TranslationModel0= -1.38629 -2.20651 0 -2.21554 TranslationModel1= 0 0 0 0 TranslationModel2= 0 0 0 0 GenerationModel0= 0 0 ||| -0.589076
0 ||| ? ? ? ||| LexicalReordering0= -1.86048 0 0 -0.510826 0 0 Distortion0= 0 LM0= -15.2278 LM1= -699.809 WordPenalty0= -3 PhrasePenalty0= 2 TranslationModel0= -2.86909 -2.20651 -0.09912 -2.21554 TranslationModel1= 0 0 0 0 TranslationModel2= 0 0 0 0 GenerationModel0= 0 0 ||| -0.727864
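
For inspecting such entries programmatically, here is a minimal sketch of a parser for the `id ||| hypothesis ||| feature scores ||| total` n-best format (the feature names are taken from the excerpt above; the helper name is my own):

```python
def parse_nbest_line(line):
    """Parse one Moses n-best line: 'id ||| hypothesis ||| features ||| total'."""
    sent_id, hyp, feats_str, total = [f.strip() for f in line.split("|||")]
    feats, current = {}, None
    for tok in feats_str.split():
        if tok.endswith("="):            # a feature-name label such as 'LM0='
            current = tok[:-1]
            feats[current] = []
        else:                            # a score belonging to the last label
            feats[current].append(float(tok))
    return int(sent_id), hyp, feats, float(total)

line = ("0 ||| a b c ||| LM0= -15.2278 "
        "TranslationModel0= -1.38629 -2.20651 0 -2.21554 ||| -0.589076")
sent_id, hyp, feats, total = parse_nbest_line(line)
```

With a parser like this it is easy to confirm which feature functions only ever emit zero vectors across the whole n-best list.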

I want to compare my factored model with the baseline at every translation
step in SMT, to explain why my model is good.
So I want to ask you:

1. Can you explain why those parameters are 0 0 0 0?
2. Are the factors I added to the factored model useful or not?
3. How can I get the parameters of TranslationModel1, TranslationModel2 and
GenerationModel0 in the translation result (n-best) to be different from
0 0 0 0?
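
For reference, only the steps listed in [mapping] are applied during decoding, which is consistent with the zero scores: the tuned moses.ini above keeps only one step. A sketch of the two decoding paths (the 0 G 0 generation step is my assumption about the intended path; it appears in neither file above):

```ini
# Tuned moses.ini: only this step runs, so only TranslationModel0 can
# contribute; TranslationModel1/2 and GenerationModel0 stay at 0.
[mapping]
0 T 0

# A path exercising all models would also list the other steps:
# 0 T 1
# 0 T 2
# 0 G 0
```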

I am waiting for your reply!
Thank you so much!
With best regards,

Tran Anh,
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.mit.edu/mailman/private/moses-support/attachments/20170312/3cea9230/attachment.html

------------------------------

_______________________________________________
Moses-support mailing list
Moses-support@mit.edu
http://mailman.mit.edu/mailman/listinfo/moses-support


End of Moses-support Digest, Vol 125, Issue 21
**********************************************
