Moses-support Digest, Vol 111, Issue 73

Send Moses-support mailing list submissions to
moses-support@mit.edu

To subscribe or unsubscribe via the World Wide Web, visit
http://mailman.mit.edu/mailman/listinfo/moses-support
or, via email, send a message with subject or body 'help' to
moses-support-request@mit.edu

You can reach the person managing the list at
moses-support-owner@mit.edu

When replying, please edit your Subject line so it is more specific
than "Re: Contents of Moses-support digest..."


Today's Topics:

1. Re: OSM and lmplz are both using -T as a parameter directive
which causes error (Ergun Bicici)
2. how to give text file as input to moses without saving its
content (Apurva Joshi)
3. 2nd CFP: Computer Speech and Language Special Issue on Deep
Learning for Machine Translation (Marta Ruiz)


----------------------------------------------------------------------

Message: 1
Date: Sun, 24 Jan 2016 19:17:32 +0100
From: Ergun Bicici <ergunbicici@yahoo.com>
Subject: Re: [Moses-support] OSM and lmplz are both using -T as a
parameter directive which causes error
To: Ergun Bicici <ergunbicici@yahoo.com>
Cc: moses-support@mit.edu
Message-ID:
<CAB59qTNXJzsSEOL2UBcL=qrZrfpgS4xO91zX_RNAcswL04GSDw@mail.gmail.com>
Content-Type: text/plain; charset="utf-8"

This works OK (without an additional -T directive to lmplz):

mosesdecoder/scripts/OSM/OSM-Train.perl --corpus-f SMT_de-en/training/corpus.1.de --corpus-e SMT_de-en/training/corpus.1.en --alignment SMT_de-en/model/aligned.1.grow-diag-final-and --order 4 --out-dir SMT_de-en/model/OSM.1 --moses-src-dir mosesdecoder --input-extension de --output-extension en -lmplz 'mosesdecoder/bin/lmplz -S 40%'
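
For reference, here is a minimal sketch of the same call with the lmplz options kept in a shell variable. It assumes (consistent with the "Executing:" line quoted below) that OSM-Train.perl appends its own -T temp prefix when it invokes lmplz, which is why the string passed through -lmplz should not contain a -T of its own:

# lmplz options handed to OSM-Train.perl; deliberately no -T here, because
# OSM-Train.perl adds its own -T <out-dir> temp prefix when it calls lmplz.
LMPLZ_OPTS='mosesdecoder/bin/lmplz -S 40%'

mosesdecoder/scripts/OSM/OSM-Train.perl \
    --corpus-f SMT_de-en/training/corpus.1.de \
    --corpus-e SMT_de-en/training/corpus.1.en \
    --alignment SMT_de-en/model/aligned.1.grow-diag-final-and \
    --order 4 \
    --out-dir SMT_de-en/model/OSM.1 \
    --moses-src-dir mosesdecoder \
    --input-extension de \
    --output-extension en \
    -lmplz "$LMPLZ_OPTS"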

Ergun

On Sun, Jan 24, 2016 at 3:50 PM, Ergun Bicici <ergunbicici@yahoo.com> wrote:

>
> Dear Moses Support,
>
> An OSM training script call like the following:
>
> mosesdecoder/scripts/OSM/OSM-Train.perl --corpus-f SMT_de-en/training/corpus.1.de --corpus-e SMT_de-en/training/corpus.1.en --alignment SMT_de-en/model/aligned.1.grow-diag-final-and --order 4 --out-dir SMT_de-en/model/OSM.1 --moses-src-dir mosesdecoder --input-extension de --output-extension en -lmplz 'mosesdecoder/bin/lmplz -S 40% -T SMT_de-en/model/tmp'
>
> calls lmplz like the following:
>
> Executing: mosesdecoder/bin/lmplz -S 40% -T SMT_de-en/model/tmp -T SMT_de-en/model/OSM.1 --order 4 --text SMT_de-en/model/OSM.1//opCorpus --arpa SMT_de-en/model/OSM.1//operationLM --prune 0 0 1
>
> causing the following error:
>
> option '--temp_prefix' cannot be specified more than once
>
> This works OK:
>
> mosesdecoder/scripts/OSM/OSM-Train.perl --corpus-f SMT_de-en/training/corpus.1.de --corpus-e SMT_de-en/training/corpus.1.en --alignment SMT_de-en/model/aligned.1.grow-diag-final-and --order 4 --out-dir SMT_de-en/model/OSM.1 --moses-src-dir mosesdecoder --input-extension de --output-extension en -lmplz 'mosesdecoder/bin/lmplz -S 40% -T SMT_de-en/model/tmp'
>
> Regards,
> Ergun
>
>

------------------------------

Message: 2
Date: Mon, 25 Jan 2016 16:30:52 +0530
From: Apurva Joshi <apurvajoshi1992@gmail.com>
Subject: [Moses-support] how to give text file as input to moses
without saving its content
To: moses-support@mit.edu
Message-ID:
<CAMfCXbWBV9NyK5mVs-vLrQ7wLyQAy_VRywvc-d3+uDs4b+XSnQ@mail.gmail.com>
Content-Type: text/plain; charset="utf-8"

Hello all,

I have built an MT system for the English-Hindi language pair, and I am now creating a website for it.

My shell script is:

cd /home/techmahindra/only_MT

cat /home/techmahindra/input.txt > only_moses_sentence.txt

/home/techmahindra/mosesdecoder-RELEASE-3.0/bin/moses -f /home/techmahindra/working_coep/train/model/moses.ini -i /home/techmahindra/only_MT/only_moses_sentence.txt > /home/techmahindra/hindi_god/festival-hi-0.1/test2.hi


My website structure is:

textbox for English input text >> "convert" button (pressing it runs the script above) >> textbox for Hindi output text


Now, when I type a new English sentence in the first textbox and press the "convert" button, my script runs: the content of the textbox is first copied to ~/input.txt, and then the process continues as per the script. The problem is that every time I have to save input.txt manually before the moses command will accept it. I want the conversion to happen automatically after pressing "convert", but at the moment I first have to save input.txt, and only then do I get the output.

So, in short, I have a file-reloading problem. Please tell me how to solve this: is there a Linux command that would let me give the text to moses without having to save the file manually first?
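
For reference, a minimal sketch of one way to avoid the intermediate file, assuming the Moses decoder reads sentences from standard input when no -i option is given (the paths are the ones from the script above; ENGLISH_TEXT is a hypothetical shell variable holding the textbox contents):

#!/bin/bash
# Hypothetical variable holding the English text received from the web form.
ENGLISH_TEXT="this is a test sentence"

# Pipe the text straight into the decoder: moses reads from STDIN when no
# -i option is supplied, so no intermediate input.txt needs to be saved.
echo "$ENGLISH_TEXT" | /home/techmahindra/mosesdecoder-RELEASE-3.0/bin/moses \
    -f /home/techmahindra/working_coep/train/model/moses.ini \
    > /home/techmahindra/hindi_god/festival-hi-0.1/test2.hi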

Please reply to apurvajoshi1992@gmail.com.

------------------------------

Message: 3
Date: Mon, 25 Jan 2016 12:32:24 +0100
From: Marta Ruiz <martaruizcostajussa@gmail.com>
Subject: [Moses-support] 2nd CFP: Computer Speech and Language Special
Issue on Deep Learning for Machine Translation
To: moses-support@mit.edu
Message-ID:
<CABEBqHKssL=XOkYhwgWUn6j4Y0Dxa1trroW4aSc0eDTyCJQPVg@mail.gmail.com>
Content-Type: text/plain; charset="utf-8"

Computer Speech and Language Special Issue on Deep Learning for Machine Translation
<http://www.journals.elsevier.com/computer-speech-and-language/call-for-papers/special-issue-on-deep-learning-for-machine-translation/>



Deep learning has been successfully applied to many areas, including Natural Language Processing, Speech Recognition and Image Processing. Deep learning techniques have surprised the entire community, both academia and industry, with their power to learn from data.



Recently, deep learning has been introduced to Machine Translation (MT). It first appeared as a kind of feature integrated into standard phrase-based or syntax-based statistical approaches. Deep learning has been shown to be useful in translation and language modeling, as well as in reordering, tuning and rescoring. Additionally, deep learning has been applied to MT evaluation and quality estimation.



But the biggest impact on MT has come with the proposal of a new paradigm: Neural MT, which only recently (at the Workshop on Machine Translation 2015) outperformed state-of-the-art systems. This new approach uses an encoder-decoder architecture to build a neural system that is capable of translating. With the new approach, the big new MT challenges lie in how to deal with large vocabularies, document translation and computational power, among others.



This hot topic is attracting increasing interest from the scientific community, and in response there have been several related events (e.g. a tutorial[1] and a winter school[2]). Moreover, the number of publications on this topic at top conferences such as ACL, NAACL and EMNLP has increased dramatically in the last three years. This would be the first special issue devoted to the topic. With this special issue, we intend to offer a compilation of works that gives the reader a global view of how deep learning techniques are applied to MT and what new challenges they pose.



This Special Issue invites high-quality submissions on the following topics (but not limited to these):

- Including deep learning knowledge in standard MT approaches (statistical, rule-based, example-based, ...)
- Neural MT approaches
- MT hybrid techniques using deep learning
- Deep learning challenges in MT: vocabulary limitation, document translation, computational power
- MT evaluation with deep learning techniques
- MT quality estimation with deep learning techniques
- Using deep learning in spoken language translation



IMPORTANT DATES

Submission deadline: 30th March 2016

Notification of rejection/re-submission: 30th July 2016

Notification of final acceptance: 30th October 2016

Expected publication date: 30th January 2017


GUEST EDITORS

Marta R. Costa-jussà, Universitat Politècnica de Catalunya, Spain.
marta.ruiz@upc.edu

Alexandre Allauzen, Centre National de la Recherche Scientifique, France.
allauzen@limsi.fr

Loïc Barrault, Université du Maine, France.
loic.barrault@lium.univ-lemans.fr

Kyunghyun Cho, New York University, USA. kyunghyun.cho@nyu.edu

Holger Schwenk, Facebook, USA. schwenk@fb.com



------------------------------

[1] http://naacl.org/naacl-hlt-2015/tutorial-deep-learning.html

[2] http://dl4mt.computing.dcu.ie/

------------------------------

_______________________________________________
Moses-support mailing list
Moses-support@mit.edu
http://mailman.mit.edu/mailman/listinfo/moses-support


End of Moses-support Digest, Vol 111, Issue 73
**********************************************
