Send Moses-support mailing list submissions to
moses-support@mit.edu
To subscribe or unsubscribe via the World Wide Web, visit
http://mailman.mit.edu/mailman/listinfo/moses-support
or, via email, send a message with subject or body 'help' to
moses-support-request@mit.edu
You can reach the person managing the list at
moses-support-owner@mit.edu
When replying, please edit your Subject line so it is more specific
than "Re: Contents of Moses-support digest..."
Today's Topics:
1. Re: lmplz deadlocks only when compiled with tcmalloc
(Jean-François Beaulac)
2. Performance issues in 2.1.1 (Mike Ladwig)
3. PhraseDictionaryFuzzyMatch (Jon Olds)
4. Re: Performance issues in 2.1.1 (Hieu Hoang)
----------------------------------------------------------------------
Message: 1
Date: Wed, 25 Jun 2014 14:40:26 -0400
From: Jean-François Beaulac <JBeaulac@versacom.ca>
Subject: Re: [Moses-support] lmplz deadlocks only when compiled with tcmalloc
To: "moses-support@mit.edu" <moses-support@mit.edu>
Message-ID:
<11D00BC163ADCE4EAD47F5D28A6019410291EFCD9224@EXC-MB.linguistiques.ca>
Content-Type: text/plain; charset="iso-8859-1"
Hi,
I just pulled and tested; the problem seems to be fixed, and I no longer observe any deadlocks.
Thanks,
Jf
-----Original Message-----
From: moses-support-bounces@mit.edu [mailto:moses-support-bounces@mit.edu] On Behalf Of Kenneth Heafield
Sent: 22 June 2014 08:55
To: moses-support@mit.edu
Subject: Re: [Moses-support] lmplz deadlocks only when compiled with tcmalloc
Hi,
When you pull and recompile, does it still deadlock?
Kenneth
On 06/02/14 07:59, Jean-François Beaulac wrote:
> Hi,
>
>
>
> When I compile moses with TCMalloc, invoking lmplz without limiting
> the memory used for sorting results in what appears to be a deadlock.
> If I compile using the --without-tcmalloc switch, the problem goes away.
>
>
>
> If I run it with:
>
> bin/lmplz -T /tmp -o 3 < bin/testcorpus > testcorpus.out
>
> I get a deadlock
>
>
>
> If I run it with the -S option, I can get it to work:
>
> bin/lmplz -S 50% -T /tmp -o 3 < bin/testcorpus > testcorpus.out
>
>
>
> In my setup, if I go above 54% it always deadlocks.
>
>
>
> I compiled it with the latest gperftools version and Boost 1.55; the
> machine I run it on has 40 GB of physical RAM.
>
>
>
> Here are the backtraces for all threads when it locks up:
>
>
>
>
>
> Thread 6 (Thread 0x7ffff61dd700 (LWP 32333)):
>
> #0 0x00007ffff6d95000 in sem_wait () from /lib64/libpthread.so.0
> #1 0x0000000000431039 in boost::interprocess::ipcdetail::semaphore_wait (handle=0x1d08282e0) at /opt/boost-1.55/include/boost/interprocess/sync/posix/semaphore_wrapper.hpp:157
> #2 0x0000000000431118 in boost::interprocess::ipcdetail::posix_semaphore::wait (this=0x1d08282e0) at /opt/boost-1.55/include/boost/interprocess/sync/posix/semaphore.hpp:45
> #3 0x000000000043116e in boost::interprocess::interprocess_semaphore::wait (this=0x1d08282e0) at /opt/boost-1.55/include/boost/interprocess/sync/interprocess_semaphore.hpp:128
> #4 0x00000000004311a3 in util::WaitSemaphore (on=...) at ./util/pcqueue.hh:16
> #5 0x00000000004323b4 in util::PCQueue<util::stream::Block>::Consume (this=0x1d08282c0, out=...) at ./util/pcqueue.hh:59
> #6 0x000000000042f392 in util::stream::Link::Init (this=0x7ffff61dcd70, position=...) at /store/disk1/jbeaulac/moses-src/RELEASE-2.1.1.git/util/stream/chain.cc:115
> #7 0x000000000042f419 in util::stream::Link::Link (this=0x7ffff61dcd70, position=...) at /store/disk1/jbeaulac/moses-src/RELEASE-2.1.1.git/util/stream/chain.cc:119
> #8 0x000000000042e74a in util::stream::Recycler::Run (this=0x1d082c5f8, position=...) at /store/disk1/jbeaulac/moses-src/RELEASE-2.1.1.git/util/stream/chain.cc:31
> #9 0x0000000000435d76 in util::stream::Thread::operator()<util::stream::ChainPosition, util::stream::Recycler> (this=0xfa01c0, position=..., worker=...) at ./util/stream/chain.hh:54
> #10 0x0000000000435c55 in boost::_bi::list2<boost::_bi::value<util::stream::ChainPosition>, boost::_bi::value<util::stream::Recycler> >::operator()<boost::reference_wrapper<util::stream::Thread>, boost::_bi::list0> (this=0x1d082c5c0, f=..., a=...) at /opt/boost-1.55/include/boost/bind/bind.hpp:313
> #11 0x0000000000435a07 in boost::_bi::bind_t<void, boost::reference_wrapper<util::stream::Thread>, boost::_bi::list2<boost::_bi::value<util::stream::ChainPosition>, boost::_bi::value<util::stream::Recycler> > >::operator() (this=0x1d082c5b8) at /opt/boost-1.55/include/boost/bind/bind_template.hpp:20
> #12 0x00000000004357d4 in boost::detail::thread_data<boost::_bi::bind_t<void, boost::reference_wrapper<util::stream::Thread>, boost::_bi::list2<boost::_bi::value<util::stream::ChainPosition>, boost::_bi::value<util::stream::Recycler> > > >::run (this=0x1d082c400) at /opt/boost-1.55/include/boost/thread/detail/thread.hpp:117
> #13 0x00000000004e584a in thread_proxy ()
> #14 0x00007ffff6d8f0a2 in start_thread () from /lib64/libpthread.so.0
> #15 0x00007ffff6ac4b5d in clone () from /lib64/libc.so.6
>
> Thread 1 (Thread 0x7ffff7fd8740 (LWP 32323)):
>
> #0 0x00007ffff6d9304f in pthread_cond_wait@@GLIBC_2.3.2 () from /lib64/libpthread.so.0
> #1 0x00000000004e87eb in boost::condition_variable::wait(boost::unique_lock<boost::mutex>&) ()
> #2 0x00000000004e6276 in boost::thread::join_noexcept() ()
> #3 0x0000000000430a7b in boost::thread::join (this=0xfa01c0) at /opt/boost-1.55/include/boost/thread/detail/thread.hpp:756
> #4 0x000000000042e6b1 in util::stream::Thread::~Thread (this=0xfa01c0, __in_chrg=<optimized out>) at /store/disk1/jbeaulac/moses-src/RELEASE-2.1.1.git/util/stream/chain.cc:22
> #5 0x00000000004353c2 in boost::checked_delete<util::stream::Thread const> (x=0xfa01c0) at /opt/boost-1.55/include/boost/checked_delete.hpp:34
> #6 0x00000000004350aa in boost::delete_clone<util::stream::Thread> (r=0xfa01c0) at /opt/boost-1.55/include/boost/ptr_container/clone_allocator.hpp:56
> #7 0x0000000000434bb0 in boost::heap_clone_allocator::deallocate_clone<util::stream::Thread> (r=0xfa01c0) at /opt/boost-1.55/include/boost/ptr_container/clone_allocator.hpp:74
> #8 0x000000000043452e in boost::ptr_container_detail::reversible_ptr_container<boost::ptr_container_detail::sequence_config<util::stream::Thread, std::vector<void*, std::allocator<void*> > >, boost::heap_clone_allocator>::null_clone_allocator<false>::deallocate_clone (x=0xfa01c0) at /opt/boost-1.55/include/boost/ptr_container/detail/reversible_ptr_container.hpp:126
> #9 0x0000000000434c6b in boost::ptr_container_detail::reversible_ptr_container<boost::ptr_container_detail::sequence_config<util::stream::Thread, std::vector<void*, std::allocator<void*> > >, boost::heap_clone_allocator>::null_policy_deallocate_clone (x=0xfa01c0) at /opt/boost-1.55/include/boost/ptr_container/detail/reversible_ptr_container.hpp:276
> #10 0x0000000000434676 in boost::ptr_container_detail::reversible_ptr_container<boost::ptr_container_detail::sequence_config<util::stream::Thread, std::vector<void*, std::allocator<void*> > >, boost::heap_clone_allocator>::remove<boost::void_ptr_iterator<__gnu_cxx::__normal_iterator<void**, std::vector<void*, std::allocator<void*> > >, util::stream::Thread> > (this=0xfc4048, i=...) at /opt/boost-1.55/include/boost/ptr_container/detail/reversible_ptr_container.hpp:250
> #11 0x0000000000433e39 in boost::ptr_container_detail::reversible_ptr_container<boost::ptr_container_detail::sequence_config<util::stream::Thread, std::vector<void*, std::allocator<void*> > >, boost::heap_clone_allocator>::remove<boost::void_ptr_iterator<__gnu_cxx::__normal_iterator<void**, std::vector<void*, std::allocator<void*> > >, util::stream::Thread> > (this=0xfc4048, first=..., last=...) at /opt/boost-1.55/include/boost/ptr_container/detail/reversible_ptr_container.hpp:257
> #12 0x0000000000433171 in boost::ptr_container_detail::reversible_ptr_container<boost::ptr_container_detail::sequence_config<util::stream::Thread, std::vector<void*, std::allocator<void*> > >, boost::heap_clone_allocator>::remove_all (this=0xfc4048) at /opt/boost-1.55/include/boost/ptr_container/detail/reversible_ptr_container.hpp:218
> #13 0x000000000043219a in boost::ptr_container_detail::reversible_ptr_container<boost::ptr_container_detail::sequence_config<util::stream::Thread, std::vector<void*, std::allocator<void*> > >, boost::heap_clone_allocator>::clear (this=0xfc4048) at /opt/boost-1.55/include/boost/ptr_container/detail/reversible_ptr_container.hpp:601
> #14 0x000000000042ee4a in util::stream::Chain::Wait (this=0xfc4000, release_memory=true) at /store/disk1/jbeaulac/moses-src/RELEASE-2.1.1.git/util/stream/chain.cc:68
> #15 0x000000000042ebd0 in util::stream::Chain::~Chain (this=0xfc4000, __in_chrg=<optimized out>) at /store/disk1/jbeaulac/moses-src/RELEASE-2.1.1.git/util/stream/chain.cc:47
> #16 0x00000000004829bc in lm::builder::FixedArray<util::stream::Chain>::clear (this=0x7fffffffdb60) at ./lm/builder/multi_stream.hh:64
> #17 0x00000000004815e3 in lm::builder::FixedArray<util::stream::Chain>::~FixedArray (this=0x7fffffffdb60, __in_chrg=<optimized out>) at ./lm/builder/multi_stream.hh:40
> #18 0x0000000000481369 in lm::builder::Chains::~Chains (this=0x7fffffffdb60, __in_chrg=<optimized out>) at ./lm/builder/multi_stream.hh:92
> #19 0x000000000047f387 in lm::builder::(anonymous namespace)::Master::~Master (this=0x7fffffffdad0, __in_chrg=<optimized out>) at /store/disk1/jbeaulac/moses-src/RELEASE-2.1.1.git/lm/builder/pipeline.cc:33
> #20 0x000000000047fc0c in lm::builder::Pipeline (config=..., text_file=0, out_arpa=1) at /store/disk1/jbeaulac/moses-src/RELEASE-2.1.1.git/lm/builder/pipeline.cc:318
> #21 0x00000000004a2bec in main (argc=5, argv=0x7fffffffe2c8) at /store/disk1/jbeaulac/moses-src/RELEASE-2.1.1.git/lm/builder/lmplz_main.cc:109
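[Editor's note: the trace above shows the consumer thread blocked in util::PCQueue::Consume on an interprocess semaphore while the main thread waits in boost::thread::join during Chain teardown, i.e. a classic producer-consumer hang. For readers unfamiliar with the pattern, here is a minimal, hypothetical Python sketch of such a semaphore-guarded bounded queue; it is not KenLM's code, and the names PCQueue/produce/consume are made up for illustration:]

```python
import threading

class PCQueue:
    """Bounded producer-consumer queue guarded by two counting semaphores,
    mirroring the pattern seen in util/pcqueue.hh (simplified sketch)."""

    def __init__(self, size):
        self.slots = threading.Semaphore(size)   # free slots, producers wait here
        self.ready = threading.Semaphore(0)      # filled slots, consumers wait here
        self.lock = threading.Lock()
        self.items = []

    def produce(self, item):
        self.slots.acquire()          # block until a slot is free
        with self.lock:
            self.items.append(item)
        self.ready.release()          # wake one waiting consumer

    def consume(self):
        self.ready.acquire()          # blocks forever if no producer ever posts
        with self.lock:
            item = self.items.pop(0)
        self.slots.release()          # hand the slot back to producers
        return item

q = PCQueue(2)
q.produce("block-a")
q.produce("block-b")
print(q.consume())  # prints "block-a"
```

If the producer side stalls (for example inside a misbehaving allocator) the consumer sleeps in `ready.acquire()` indefinitely, and any thread joining the consumer hangs with it, which matches the two stacks above.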
>
>
>
> *---*
>
> *Jean-François Beaulac*
>
> Programmer Analyst
>
> Versacom inc.
> 6th floor
> 1501, avenue McGill College
> Montréal (Québec) H3A 3M8
>
> 514-394-7142
> www.versacom.ca <http://www.versacom.ca>
>
>
>
>
>
> _______________________________________________
> Moses-support mailing list
> Moses-support@mit.edu
> http://mailman.mit.edu/mailman/listinfo/moses-support
>
_______________________________________________
Moses-support mailing list
Moses-support@mit.edu
http://mailman.mit.edu/mailman/listinfo/moses-support
------------------------------
Message: 2
Date: Wed, 25 Jun 2014 16:25:21 -0400
From: Mike Ladwig <mdladwig@gmail.com>
Subject: [Moses-support] Performance issues in 2.1.1
To: moses-support@mit.edu
Message-ID:
<CAB3VaD119i=Lx7zNq4Pn66cJfyNM-y5WyXN=hOO4-xbccy9sOw@mail.gmail.com>
Content-Type: text/plain; charset="utf-8"
I'm trying to move from the Moses 1.x release to 2.x, but have encountered
a large performance regression. On my workstation (Scientific Linux 6.5), using
the same spa-eng data to build two systems, I get performance roughly 3x
slower on release 2.1.1.
I started by comparing mosesserver between the 1.x and 2.x releases. After
discovering the "single phrase cache per thread" issue, I rewrote
mosesserver to use a thread pool, but that only gained 10-20%.
Thinking I might not really have fixed mosesserver, I compared
unmodified moses-cmd speed between releases. The values below are words per
minute for a 2,000-line, 48k-word file.
           1T      4T      8T
Rel 1:   4850   16492   19500
Rel 2:   1742    5324    6559
Any suggestions?
Regards,
mike.
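[Editor's note: the "single phrase cache per thread" rewrite Mike describes can be illustrated with a small, hypothetical sketch; this is not Moses code, and the names translate/tls are invented. The idea is that a pool worker's cache lives as long as the worker thread, not a single request:]

```python
import threading
from concurrent.futures import ThreadPoolExecutor

tls = threading.local()  # one cache per pool worker, reused across requests

def translate(line):
    # stand-in for decoding; upper-casing just marks the work as done
    if not hasattr(tls, "cache"):
        tls.cache = {}
    if line not in tls.cache:
        tls.cache[line] = line.upper()   # cache hit on repeated input
    return tls.cache[line]

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(translate, ["hola", "mundo", "hola"]))

print(results)  # prints "['HOLA', 'MUNDO', 'HOLA']"
```

With a thread-per-request server the cache would be rebuilt on every call; with a pool, the third lookup above can hit a warm cache, which is the effect the rewrite was after.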
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.mit.edu/mailman/private/moses-support/attachments/20140625/3a911d4c/attachment-0001.htm
------------------------------
Message: 3
Date: Thu, 26 Jun 2014 12:48:22 +0100
From: Jon Olds <joft_uk@yahoo.co.uk>
Subject: [Moses-support] PhraseDictionaryFuzzyMatch
To: moses-support <moses-support@mit.edu>
Message-ID: <53AC0886.8050002@yahoo.co.uk>
Content-Type: text/plain; charset=ISO-8859-1; format=flowed
Hi,
I would like to test out the PhraseDictionaryFuzzyMatch feature in Moses.
Do I need to do something special when compiling Moses, as this feature
does not appear to be available in moses_chart as things stand? (see
output of moses_chart with no arguments below).
Also, can it be used with Mosesserver (in theory)?
Cheers,
Jon
Available feature functions:
KENLM IRSTLM SkeletonStatelessFF SyntaxRHS NieceTerminal
MaxSpanFreeNonTermSource RuleScope SetSourcePhrase ReferenceComparison
HyperParameterAsWeight SoftMatchingFeature CoveredReferenceFeature
ConstrainedDecoding OpSequenceModel UnknownWordPenalty ExternalFeature
SkeletonStatefulFF PhraseDictionaryALSuffixArray
PhraseDictionaryMultiModelCounts PhraseDictionaryMultiModel
PhraseDictionaryMemory PhraseDictionaryDynSuffixArray SpanLength
CountNonTerms InputFeature PhrasePenalty WordPenalty Distortion
Generation TargetNgramFeature TreeStructureFeature TargetBigramFeature
PhraseLengthFeature LexicalReordering SourceGHKMTreeInputMatchFeature
PhraseBoundaryFeature BleuScoreFeature SkeletonLM ControlRecombination
TargetWordInsertionFeature WordTranslationFeature
PhraseDictionaryTransliteration SourceWordDeletionFeature SkeletonPT
PhraseDictionaryBinary GlobalLexicalModel PhraseDictionaryOnDisk
PhrasePairFeature PhraseDictionaryScope3
------------------------------
Message: 4
Date: Thu, 26 Jun 2014 09:19:20 -0400
From: Hieu Hoang <Hieu.Hoang@ed.ac.uk>
Subject: Re: [Moses-support] Performance issues in 2.1.1
To: Mike Ladwig <mdladwig@gmail.com>
Cc: moses-support <moses-support@mit.edu>
Message-ID:
<CAEKMkbizA+X1wuPVA5LY5pdujQsZykGi9MXMd+H8K-8J8ASmSA@mail.gmail.com>
Content-Type: text/plain; charset="utf-8"
Is the performance OK with the command-line version? Can you push your
thread-pool changes to a new branch?
I'll take a look at it when I get the chance.
On 25 June 2014 16:25, Mike Ladwig <mdladwig@gmail.com> wrote:
> I'm trying to move from the Moses 1.x release to 2.x, but have encountered
> a large performance regression. On my workstation (Scientific Linux 6.5), using
> the same spa-eng data to build two systems, I get performance roughly 3x
> slower on release 2.1.1.
>
> I started by comparing mosesserver between the 1.x and 2.x releases. After
> discovering the "single phrase cache per thread" issue, I rewrote
> mosesserver to use a thread pool, but that only gained 10-20%.
>
> Thinking I might not really have fixed mosesserver, I compared
> unmodified moses-cmd speed between releases. The values below are words per
> minute for a 2,000-line, 48k-word file.
>
>            1T      4T      8T
> Rel 1:   4850   16492   19500
> Rel 2:   1742    5324    6559
>
> Any suggestions?
>
> Regards,
> mike.
>
> _______________________________________________
> Moses-support mailing list
> Moses-support@mit.edu
> http://mailman.mit.edu/mailman/listinfo/moses-support
>
>
--
Hieu Hoang
Research Associate
University of Edinburgh
http://www.hoang.co.uk/hieu
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.mit.edu/mailman/private/moses-support/attachments/20140626/fd7838f4/attachment.htm
------------------------------
_______________________________________________
Moses-support mailing list
Moses-support@mit.edu
http://mailman.mit.edu/mailman/listinfo/moses-support
End of Moses-support Digest, Vol 92, Issue 45
*********************************************