| | marian | sequifier |
|---|---|---|
| Mentions | 3 | 3 |
| Stars | 1,170 | 5 |
| Growth | 1.5% | - |
| Activity | 0.0 | 6.3 |
| Latest commit | 8 months ago | 1 day ago |
| Language | C++ | Python |
| License | GNU General Public License v3.0 or later | BSD 3-clause "New" or "Revised" License |
The number of mentions indicates the total number of mentions that we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
marian
Posts with mentions or reviews of marian.
We have used some of these posts to build our list of alternatives
and similar projects. The last one was on 2023-02-01.
- [P] A CLI tool for easy transformer sequence classifier training and inference
As a reference, I forked https://github.com/marian-nmt/marian privately to support sequence tagging tasks. With a positional loss mask, it can also support sequence classification.
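The positional loss mask mentioned above can be illustrated with a minimal numpy sketch (illustrative only, not marian's actual implementation): cross-entropy is computed at every sequence position, and the mask zeroes out all positions except those that carry a supervision signal, e.g. only the final position for sequence classification.

```python
import numpy as np

def masked_cross_entropy(logits, targets, loss_mask):
    """Cross-entropy averaged over masked positions only.

    logits:    (seq_len, n_classes) raw scores
    targets:   (seq_len,) integer class labels
    loss_mask: (seq_len,) 1.0 where the loss counts, 0.0 elsewhere
    """
    # numerically stable log-softmax over the class axis
    shifted = logits - logits.max(axis=-1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))
    nll = -log_probs[np.arange(len(targets)), targets]  # per-position loss
    return (nll * loss_mask).sum() / loss_mask.sum()

# Sequence classification: supervise only the final position.
logits = np.array([[2.0, 0.0], [0.0, 3.0], [4.0, 0.0]])
targets = np.array([0, 1, 0])
mask = np.array([0.0, 0.0, 1.0])
loss = masked_cross_entropy(logits, targets, mask)
```

With an all-ones mask the same function recovers ordinary per-token (sequence tagging) loss, which is what makes a single forked codebase able to serve both tasks.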
- Hello I’m looking for an app or website to translate accurately from English to Welsh
It's powered by the same underlying engine as Bing Translate, but with specific enhancements for the Welsh language by Bangor University's language technology experts. https://marian-nmt.github.io/
- [D] Deep Learning Framework for C++.
sequifier
Posts with mentions or reviews of sequifier.
We have used some of these posts to build our list of alternatives
and similar projects. The last one was on 2023-02-15.
- Transformer: When do we use encoder-only, decoder-only and encoder-decoder models?
Yes I’ve also used the encoder layer and wondered whether I should switch to the decoder layer, given that next token prediction is usually done that way. I’m building a CLI tool for small transformer models in case you’re interested: https://github.com/0xideas/sequifier
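The encoder-vs-decoder question in that thread largely comes down to the attention mask: decoder layers apply a causal mask so that position i can attend only to positions ≤ i, which is what makes next-token prediction well-posed, while encoder layers attend bidirectionally. A minimal numpy sketch of that difference (illustrative, not sequifier's code):

```python
import numpy as np

def attention_weights(scores, causal=False):
    """Softmax over raw attention scores, optionally with a causal mask.

    scores: (seq_len, seq_len) query-key scores
    causal: if True, position i may attend only to positions <= i
    """
    if causal:
        seq_len = scores.shape[0]
        # mask out the strict upper triangle (future positions)
        future = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
        scores = np.where(future, -np.inf, scores)
    shifted = scores - scores.max(axis=-1, keepdims=True)
    weights = np.exp(shifted)  # exp(-inf) = 0, masked positions drop out
    return weights / weights.sum(axis=-1, keepdims=True)

scores = np.zeros((3, 3))
enc = attention_weights(scores)               # encoder: each row uniform over all 3
dec = attention_weights(scores, causal=True)  # decoder: row 0 attends only to itself
```

Switching from an encoder to a decoder layer for next-token prediction is essentially switching `causal` on, so the model cannot peek at the token it is asked to predict.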
- Discrete sequence modelling with transformers
The context is that I have developed a CLI interface to train discrete sequence classification transformer models that can be used either to predict the next token/state/object, or to predict a class based on a sequence of tokens/states/objects. It's called sequifier (for sequence classifier).
- [P] A CLI tool for easy transformer sequence classifier training and inference
The project is called "sequifier" and can be found here: https://github.com/0xideas/sequifier
What are some alternatives?
When comparing marian and sequifier you can also consider the following projects:
flashlight - A C++ standalone library for machine learning
mdspan - Reference implementation of mdspan targeting C++23
marian-dev - Fast Neural Machine Translation in C++ - development repository
mmaction2 - OpenMMLab's Next Generation Video Understanding Toolbox and Benchmark
deepdetect - Deep Learning API and Server in C++14 with support for Caffe, PyTorch, TensorRT, Dlib, NCNN, TensorFlow, XGBoost and TSNE
ArrayFire - ArrayFire: a general purpose GPU library.
Pytorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration