zero_to_gpt VS apertium

Compare zero_to_gpt and apertium and see how they differ.

zero_to_gpt

Go from no deep learning knowledge to implementing GPT. (by VikParuchuri)

apertium

Core tools (driver script, transfer, tagger, formatters) for the FOSS RBMT system Apertium (by apertium)
              zero_to_gpt                               apertium
Mentions      3                                         5
Stars         800                                       85
Growth        -                                         -
Activity      6.3                                       5.6
Last commit   10 months ago                             4 days ago
Language      Jupyter Notebook                          C++
License       GNU General Public License v3.0 or later  GNU General Public License v3.0 only
Mentions - the total number of mentions we have tracked, plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity - a relative measure of how actively a project is being developed; recent commits are weighted more heavily than older ones. For example, an activity of 9.0 places a project among the top 10% of the most actively developed projects we track.
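The activity number is relative and recency-weighted, so it cannot be recomputed from the table alone. As a purely illustrative sketch (not the site's actual formula), a recency-weighted score over a repository's own commit history could look like this, where the 30-day decay constant is an arbitrary choice:

    # Toy recency-weighted activity score: each commit from the past year
    # contributes exp(-age_in_days / 30), so newer commits count for more.
    # Run inside a local git checkout of either project.
    git log --since='1 year ago' --format='%ct' |
    awk -v now="$(date +%s)" '
      { age_days = (now - $1) / 86400; score += exp(-age_days / 30) }
      END { printf "activity score: %.2f\n", score }
    '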

zero_to_gpt

Posts with mentions or reviews of zero_to_gpt. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-11-19.
  • Deep Learning Course
    2 projects | news.ycombinator.com | 19 Nov 2023
    The deep learning book is a great choice, as many have mentioned.

    I've been making a course that has a little less theory and a little more application here - https://github.com/VikParuchuri/zero_to_gpt . The videos are all optional (they cover the same content as the text).

  • Ask HN: Resources to brush up from 'Intro to ML' to current LLMs/generative AI?
    1 project | news.ycombinator.com | 18 Nov 2023
    I've been putting a course together that teaches deep learning from the ground up - https://github.com/VikParuchuri/zero_to_gpt . It includes theory and code, and tries to strike a balance between the two.

    It focuses on text models over image models (rnn, transformer, etc).

    It's not 100% finished, but has enough to get you very far.

  • Ask HN: Tell us about your project that's not done yet but you want feedback on
    68 projects | news.ycombinator.com | 16 Aug 2023
    I'm in the process of creating a deep learning course called Zero to GPT - https://github.com/VikParuchuri/zero_to_gpt .

    It teaches you everything you need to train your own LLM, including the basics of deep learning and linear algebra. You learn the theory and the application, so you have a strong grounding in what you're doing. It includes written explanations, diagrams, and videos.

    I'm up to transformers now - only a few more lessons to go. It's been fun to write, but balancing time spent training models with writing the course has been hard. Hopefully I will get the time to finish it soon.

apertium

Posts with mentions or reviews of apertium. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-08-16.
  • Ask HN: Tell us about your project that's not done yet but you want feedback on
    68 projects | news.ycombinator.com | 16 Aug 2023
    This is very cool, looking forward to it! I've been doing the same thing with Spanish Wikipedia articles for a while, using a few lines of Bash + regex. I was using Apertium for it: https://apertium.org/ It's definitely worse than most ML-based solutions, but it works reliably and fast, and you can run it entirely offline. With Spanish translations, the main problem I was facing was a lack of vocabulary, so I created https://github.com/phil294/apertium-eng-spa-wiktionary which roughly doubles the number of recognized words, albeit with wonky grammar.
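
    A minimal sketch of that kind of offline pipeline, assuming the apertium command-line wrapper and the spa-eng pair are installed locally; the input file name and the wiki-markup-stripping sed expressions below are illustrative, not the regexes from the post:

        # Strip [[link|label]] and {{template}} markup, then translate offline.
        sed -e 's/\[\[\([^]|]*\)|\?[^]]*\]\]/\1/g' -e 's/{{[^}]*}}//g' article.wiki \
          | apertium spa-eng \
          > article.en.txt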
  • Show HN: Unlimited machine translation API for $200 / Month
    1 project | news.ycombinator.com | 1 Jun 2022
    I used to keep track of the state of machine translation some years back.

    I think the way you measure the success of an automated translation is edit distance, i.e. how many manual edits you need to make to a translated text before you reach some acceptable state. I suppose it's somewhat subjective, but it is possible to construct a benchmark and allow for multiple correct results.

    The best resources I knew back then were:

    VISL's CG-3 self-reported a competitively low edit distance compared to Google Translate: https://visl.sdu.dk/constraint_grammar.html -- the abstraction unfortunately requires a rather deep knowledge of any one particular language's grammar. It is a convincing argument that in order to beat Google Translate, you want less fuzzy machine learning and more structural analysis. But you also need a PhD in computational linguistics and deep knowledge of each language.

    Apertium has an open-source pipeline: https://apertium.org/ -- seems to be much more like an open-source approach with a quality similar to Google Translate (although I don't know if it's better or worse; probably slightly worse in most cases, and with a slightly lower coverage).
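
    The edit-distance framing above can be approximated cheaply. As a rough, illustrative proxy (word-level insertions and deletions counted with diff, not a true Levenshtein or TER computation; the file names are placeholders):

        # Count word-level insertions/deletions between MT output and a human
        # reference by putting one word per line and diffing the two streams.
        mt=mt_output.txt
        ref=reference.txt
        edits=$(diff <(tr -s '[:space:]' '\n' < "$mt") \
                     <(tr -s '[:space:]' '\n' < "$ref") | grep -c '^[<>]')
        echo "word-level edit operations: $edits (reference: $(wc -w < "$ref") words)"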

  • Translating several languages into CV Creole
    1 project | /r/CapeVerde | 13 Dec 2021
    For context, I have been contributing CV Creole data to Unicode's CLDR and MediaWiki for a number of years now, but both are mostly manual work. I once considered setting up an Apertium language pair between CV Creole and Portuguese, given the grammatical similarities, but never got around to it.
  • "Lingva" Google Translate but without the tracking
    4 projects | /r/privacytoolsIO | 1 Sep 2021
    Lingva is awesome. Also don't forget to check out LibreTranslate and Apertium. They are open source. Apertium can even translate web pages (you need to enter the URL).
  • How I installed Apertium on CentOS 7
    6 projects | dev.to | 10 Jun 2021
    #!/bin/bash
    set -x
    mkdir -p apertium-src && \
    mkdir -p $MTDIR

    cd apertium-src && \
    wget http://ftp.tsukuba.wide.ad.jp/software/gcc/releases/gcc-8.5.0/gcc-8.5.0.tar.gz -O - \
      | gzip -dc \
      | tar -xf - && \
    cd gcc-8.5.0 && \
    ./configure --prefix=$MTDIR --disable-multilib && \
    make -j $(nproc) && \
    make install && \
    cd .. || exit 1

    cd apertium-src && \
    wget https://github.com/unicode-org/icu/releases/download/release-69-1/icu4c-69_1-src.tgz -O - \
      | gzip -dc \
      | tar -xf - \
      && cd icu/source \
      && CC=gcc CXX=g++ ./configure --prefix=$MTDIR \
      && CC=gcc CXX=g++ make -j $(nproc) \
      && CC=gcc CXX=g++ make install \
      && cd ../.. \
      || exit 1

    cd apertium-src && \
    svn checkout http://beta.visl.sdu.dk/svn/visl/tools/vislcg3/trunk vislcg3 && \
    cd vislcg3 && ./get-boost.sh \
      && ./cmake.sh -DCMAKE_INSTALL_PREFIX=$MTDIR \
         -DICU_INCLUDE_DIR=$MTDIR/include \
         -DICU_LIBRARY=$MTDIR/lib/libicuuc.so \
         -DICU_IO_LIBRARY=$MTDIR/lib/libicuio.so \
         -DICU_I18N_LIBRARY=$MTDIR/lib/libicui18n.so \
      && make -j$(nproc) && \
    make install && cd .. || exit 1

    cd apertium-src && \
    git clone https://github.com/apertium/lttoolbox && \
    cd lttoolbox && ./autogen.sh --prefix=$MTDIR && make -j $(nproc) && make install && cd ../.. || exit 1

    cd apertium-src && \
    git clone https://github.com/apertium/apertium && \
    cd apertium && ./autogen.sh --prefix=$MTDIR && make -j $(nproc) && make install && cd ../.. || exit 1

    cd apertium-src && \
    git clone https://github.com/apertium/apertium-lex-tools && \
    cd apertium-lex-tools && ./autogen.sh --prefix=$MTDIR && make -j $(nproc) && make install && cd ../.. || exit 1

    cd apertium-src && \
    git clone https://github.com/apertium/apertium-tha && \
    cd apertium-tha && ./autogen.sh --prefix=$MTDIR && make && make install && cd ../.. || exit 1

    cd apertium-src && \
    git clone https://github.com/apertium/apertium-tha-eng && \
    cd apertium-tha-eng && ./autogen.sh --prefix=$MTDIR && make && make install && cd .. && \
    cd .. || exit 1
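
    The script expects $MTDIR to be set to the desired install prefix before it runs. Once everything has built and $MTDIR/bin is on PATH, a quick smoke test of the Thai-English pair might look like this (assuming the pair installed a tha-eng mode; the sample string is just an illustration):

        echo 'สวัสดี' | apertium tha-eng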

What are some alternatives?

When comparing zero_to_gpt and apertium you can also consider the following projects:

mit-deep-learning-book-pdf - MIT Deep Learning Book in PDF format (complete and parts) by Ian Goodfellow, Yoshua Bengio and Aaron Courville

lingva-translate - Alternative front-end for Google Translate

logseq - A local-first, non-linear, outliner notebook for organizing and sharing your personal knowledge base. Use it to organize your todo list, to write your journals, or to record your unique life.

icu - The home of the ICU project source code.

url2epub - Create ePub files from URLs

LibreTranslate - Free and Open Source Machine Translation API. Self-hosted, offline capable, and easy to set up.

paisa - Paisa – Personal Finance Manager. https://paisa.fyi demo: https://demo.paisa.fyi

apertium-tha-eng - Apertium translation pair for Thai and English

pls - `pls` is a prettier and powerful `ls(1)` for the pros.

lttoolbox - Finite state compiler, processor and helper tools used by apertium

divedb - This is the source repository for the DiveDB site

feature-express