| | nematus | OpenNMT-py |
|---|---|---|
| Mentions | 1 | 6 |
| Stars | 796 | 6,574 |
| Growth | 0.0% | 0.9% |
| Activity | 0.0 | 8.7 |
| Latest commit | over 1 year ago | 13 days ago |
| Language | Python | Python |
| License | BSD 3-clause "New" or "Revised" License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
nematus
- Can I connect Docker to Google Cloud's GPU?
Sorry if the title is not so clear; I am still very new to Docker and Linux/bash scripts. For a project, I want to run a data science model that I found on GitHub (specifically this one: https://github.com/EdinburghNLP/nematus). However, given the huge amount of data that I want to train on, I will need to run this model on a GPU (and the only GPU I have access to at the moment is the one provided by Google Cloud).
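To answer the question above concretely: yes, this works once the VM's NVIDIA driver and the NVIDIA Container Toolkit are installed on the Google Cloud instance. A minimal sketch, assuming Docker 19.03+ on a GPU-equipped VM (the image name, mount paths, and the final training command are illustrative placeholders, not Nematus's documented invocation):

```shell
# Verify the host driver sees the GPU (run on the VM itself, outside Docker)
nvidia-smi

# Sanity check: run a CUDA base image with all host GPUs attached.
# The --gpus flag requires Docker 19.03+ and the NVIDIA Container Toolkit.
docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi

# Mount your training data and launch training inside the container.
# "my-nematus-image" and the train command are placeholders; see the
# Nematus README for the actual script and flags.
docker run --rm --gpus all \
  -v "$PWD/data:/workspace/data" \
  my-nematus-image \
  python train.py
```

The key point is that GPU access is granted per-container via `--gpus`; nothing Nematus-specific is needed on the Docker side.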
OpenNMT-py
- Making a custom Google Translate equivalent / web translation filter for my conlang?
I already tried this with OpenNMT.
- Cutting edge language translation models
fairseq and OpenNMT are very good starting points if you want to train your NMT model from scratch.
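As a concrete starting point for training from scratch, OpenNMT-py drives everything from a YAML config. The sketch below follows the shape of the OpenNMT-py quickstart; all paths, names, and sizes are placeholders, and you should check the current docs for exact field names:

```yaml
# toy_config.yaml -- minimal OpenNMT-py training config (illustrative)
save_data: run/example           # where prepared vocab/shards are written
src_vocab: run/example.vocab.src
tgt_vocab: run/example.vocab.tgt

data:
  corpus_1:
    path_src: data/train.src     # one sentence per line, source language
    path_tgt: data/train.tgt     # parallel target sentences
  valid:
    path_src: data/valid.src
    path_tgt: data/valid.tgt

save_model: run/model            # checkpoint prefix
train_steps: 10000
valid_steps: 1000
world_size: 1                    # single machine
gpu_ranks: [0]                   # train on GPU 0
```

With a config like this, the quickstart flow is to build the vocabulary first (`onmt_build_vocab -config toy_config.yaml -n_sample 10000`) and then train (`onmt_train -config toy_config.yaml`).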
- How Telegram Messenger circumvents Google Translate's API
- WEBNLG challenge 2017 on Google Colab error
It looks like this uses the version of OpenNMT implemented in Torch, which has been deprecated. You will be much better off using the PyTorch implementation of OpenNMT or the transformers library. In fact, I would recommend taking a look at the GEM benchmark, since it also uses the WebNLG dataset. Here is a tutorial to get started; you can change the dataset here to WebNLG instead of CommonGen.
- Help with Neural Machine Translation
Umm... open-nmt. This is a library maintained since 2016 for NMT.
- OOP concepts for PyTorch
However, you do not need to use much OOP when training models with PyTorch. Most of the time it is just inheriting a class and overriding functions. You might need more advanced stuff if you were writing a framework on top of it, something like ONMT.
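The inherit-and-override pattern described above can be sketched in plain Python. Note that `Module` here is a simplified stand-in for PyTorch's `nn.Module`, written so the example runs without torch installed; it is not the real class:

```python
class Module:
    """Simplified stand-in for torch.nn.Module: the framework owns the
    control flow (__call__) and delegates to forward(), which subclasses
    override with their own computation."""

    def __call__(self, x):
        return self.forward(x)

    def forward(self, x):
        raise NotImplementedError("subclasses must override forward()")


class ScaleShift(Module):
    """A 'model' defined the PyTorch way: inherit from the base class,
    store parameters in __init__, and override forward()."""

    def __init__(self, scale, shift):
        self.scale = scale
        self.shift = shift

    def forward(self, x):
        return self.scale * x + self.shift


model = ScaleShift(scale=2.0, shift=1.0)
print(model(3.0))  # the base class routes the call to our forward(): 7.0
```

This is essentially all the OOP most PyTorch training code needs: the framework calls your overridden method at the right time, and everything else is ordinary imperative code.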
What are some alternatives?
OpenSeq2Seq - Toolkit for efficient experimentation with Speech Recognition, Text2Speech and NLP
transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
bitextor - Bitextor generates translation memories from multilingual websites
pytorch-tutorial - PyTorch Tutorial for Deep Learning Researchers
NeMo - A scalable generative AI framework built for researchers and developers working on Large Language Models, Multimodal, and Speech AI (Automatic Speech Recognition and Text-to-Speech)
tensor2tensor - Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research.
stopes - A library for preparing data for machine translation research (monolingual preprocessing, bitext mining, etc.) built by the FAIR NLLB team.
Transformer-Models-from-Scratch - implementing various transformer models for various tasks
Opus-MT - Open neural machine translation models and web services
OpenNMT - Open Source Neural Machine Translation in Torch (deprecated)
LibreTranslate - Free and Open Source Machine Translation API. Self-hosted, offline capable and easy to setup.
espnet - End-to-End Speech Processing Toolkit