
Marian


Marian is an efficient Neural Machine Translation framework written in pure C++ with minimal dependencies. Named in honour of Marian Rejewski, a Polish mathematician and cryptologist.


Main features:

  • Fast multi-GPU training and translation (see the sketch below)
  • Compatible with Nematus and DL4MT
  • Efficient pure C++ implementation
  • Permissive open source license (MIT)
  • more details...
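
As a rough illustration of multi-GPU training and translation from the command line, the commands below are a minimal sketch: the binary locations, data paths, and model directory are placeholders, and the full set of options is documented on the website.

    # Train a Transformer model on GPUs 0-3 (placeholder paths; adjust to your data)
    ./build/marian --type transformer \
        --train-sets corpus.src corpus.trg \
        --vocabs vocab.src.yml vocab.trg.yml \
        --model model/model.npz \
        --devices 0 1 2 3

    # Translate with the trained model on GPU 0
    ./build/marian-decoder -m model/model.npz \
        -v vocab.src.yml vocab.trg.yml \
        --devices 0 < input.src > output.trg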

If you use this, please cite:

Marcin Junczys-Dowmunt, Roman Grundkiewicz, Tomasz Dwojak, Hieu Hoang, Kenneth Heafield, Tom Neckermann, Frank Seide, Ulrich Germann, Alham Fikri Aji, Nikolay Bogoychev, André F. T. Martins, Alexandra Birch (2018). Marian: Fast Neural Machine Translation in C++ (http://www.aclweb.org/anthology/P18-4020)

@InProceedings{mariannmt,
    title     = {Marian: Fast Neural Machine Translation in {C++}},
    author    = {Junczys-Dowmunt, Marcin and Grundkiewicz, Roman and
                 Dwojak, Tomasz and Hoang, Hieu and Heafield, Kenneth and
                 Neckermann, Tom and Seide, Frank and Germann, Ulrich and
                 Fikri Aji, Alham and Bogoychev, Nikolay and
                 Martins, Andr\'{e} F. T. and Birch, Alexandra},
    booktitle = {Proceedings of ACL 2018, System Demonstrations},
    pages     = {116--121},
    publisher = {Association for Computational Linguistics},
    year      = {2018},
    month     = {July},
    address   = {Melbourne, Australia},
    url       = {http://www.aclweb.org/anthology/P18-4020}
}

Amun

The handwritten decoder for RNN models compatible with Marian and Nematus has been superseded by the Marian decoder. The code is available in a separate repository: https://github.com/marian-nmt/amun

Website

More information is available at https://marian-nmt.github.io

Acknowledgements

The development of Marian received funding from the European Union's Horizon 2020 Research and Innovation Programme under grant agreements 688139 (SUMMA; 2016-2019), 645487 (Modern MT; 2015-2017), 644333 (TraMOOC; 2015-2017), and 644402 (HimL; 2015-2017), as well as from the Amazon Academic Research Awards program and the World Intellectual Property Organization. It is also based upon work supported in part by the Office of the Director of National Intelligence (ODNI), Intelligence Advanced Research Projects Activity (IARPA), via contract #FA8650-17-C-9117.

This software contains source code provided by NVIDIA Corporation.
