cortex vs Tribuo

Compare cortex vs Tribuo and see how they differ.

cortex

Drop-in, local AI alternative to the OpenAI stack. Multi-engine (llama.cpp, TensorRT-LLM). Powers 👋 Jan (by janhq)

                 cortex                                    Tribuo
Mentions         8                                         15
Stars            1,661                                     1,230
Growth           14.2%                                     0.9%
Activity         9.8                                       4.8
Latest commit    about 10 hours ago                        9 days ago
Language         C++                                       Java
License          GNU Affero General Public License v3.0    Apache 2.0
Mentions - the total number of mentions of a project that we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub.
Growth - month-over-month growth in stars.
Activity - a relative number indicating how actively a project is being developed; recent commits carry more weight than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.

cortex

Posts with mentions or reviews of cortex. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-05-05.
  • Introducing Jan
    4 projects | dev.to | 5 May 2024
    Jan incorporates a lightweight, built-in inference server called Nitro. Nitro supports both the llama.cpp and NVIDIA TensorRT-LLM engines, which means many open LLMs in the GGUF format are supported. Jan's Model Hub is designed for easy installation of pre-configured models, but it also allows you to install virtually any model from Hugging Face or even your own.
  • Ollama Python and JavaScript Libraries
    17 projects | news.ycombinator.com | 24 Jan 2024
    I'd like to see a comparison to nitro https://github.com/janhq/nitro which has been fantastic for running a local LLM.
  • FLaNK Weekly 08 Jan 2024
    41 projects | dev.to | 8 Jan 2024
  • Nitro: A fast, lightweight 3MB inference server with OpenAI-Compatible API
    9 projects | news.ycombinator.com | 5 Jan 2024
    Look... I appreciate a cool project, but this is probably not a good idea.

    > Built on top of the cutting-edge inference library llama.cpp, modified to be production ready.

    It's not. It's literally just llama.cpp -> https://github.com/janhq/nitro/blob/main/.gitmodules

    Llama.cpp makes no pretense at being a robust, safe, network-ready library; it's a high-performance library.

    You've made no changes to llama.cpp here; you're just calling the llama.cpp API directly from your drogon app.

    Hm.

    ...

    Look... that's interesting, but, honestly, I know there's this wave of "C++ is back!" stuff going on, but building network applications in C++ is very tricky to do right, and while this is cool, I'm not sure 'llama.cpp is in C++ because it needs to be fast' is a good reason to go 'so let's build a network server in C++ too!'.

    I mean, I guess you could argue that since llama.cpp is a C++ application, it's fair for them to offer their own server example with an openai compatible API (which you can read about here: https://github.com/ggerganov/llama.cpp/issues/4216, https://github.com/ggerganov/llama.cpp/blob/master/examples/...).

    ...but a production ready application?

    I wrote a Rust binding to llama.cpp, and my conclusion was that llama.cpp is pretty bleeding-edge software. Bluntly, you should process-isolate it from anything you really care about if you want to avoid undefined behavior after long-running inference sequences, because it updates very often and often breaks. Those breaks are usually UB. It does not have a 'stable' version.

    Furthermore, when you run large models and run out of memory, C++ applications are notoriously unreliable in their 'handle OOM' behaviour.

    Soo.... I know there's something fun here, but really... unless you had a really, really compelling reason to need to write your server software in C++ (and I see no compelling reason here), I'm curious why you would?

    It seems enormously risky.

    The quality of this code is 'fun', not 'production ready'.

  • Apple Silicon Llama 7B running in docker?
    5 projects | /r/LocalLLaMA | 7 Dec 2023
  • Is there any LLM that can be installed with out python
    2 projects | /r/LocalLLaMA | 5 Dec 2023
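
Since the server in the "Introducing Jan" post above exposes an OpenAI-compatible API, any plain HTTP client can drive it without an SDK. Below is a minimal Java sketch; the host, port, endpoint path, and model name are assumptions following the OpenAI convention, not values taken from the Nitro documentation.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class LocalChatClient {
        public static void main(String[] args) throws Exception {
            // Request body in the OpenAI chat-completions format; the model name is a
            // placeholder for whatever GGUF model the local server has loaded.
            String body = """
                {"model": "local-model",
                 "messages": [{"role": "user", "content": "Say hello in one sentence."}]}
                """;

            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://localhost:3928/v1/chat/completions")) // assumed host, port, and path
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(body))
                    .build();

            HttpResponse<String> response =
                    HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.body());
        }
    }

Because the wire format matches OpenAI's, existing OpenAI client libraries can usually be pointed at the local base URL instead of hand-rolling requests like this.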
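
The long Nitro comment above recommends process-isolating llama.cpp from anything you really care about, since it updates often and failures tend to be undefined behaviour. One way to follow that advice from the JVM is a small supervisor that keeps the native server in a child process and restarts it when it dies; the binary name, flags, and port below are placeholders, not confirmed against any particular build.

    // Supervisor sketch: keeps the native inference server in its own OS process and
    // restarts it if it exits, so crashes or OOM kills in the C++ code never reach this JVM.
    public class LlamaServerSupervisor {
        public static void main(String[] args) throws Exception {
            while (true) {
                Process server = new ProcessBuilder(
                        "./server",          // placeholder path to a llama.cpp-style server binary
                        "-m", "model.gguf",  // placeholder GGUF model file
                        "--port", "8080")    // placeholder port
                        .inheritIO()
                        .start();

                int exitCode = server.waitFor();   // block until the child exits (crash, OOM kill, etc.)
                System.err.println("inference server exited with code " + exitCode + ", restarting...");
                Thread.sleep(2_000);               // brief backoff before restarting
            }
        }
    }

If the child wedges or leaks, only this loop notices; callers simply see failed HTTP requests they can retry once the server is back, which is exactly the failure boundary the comment is asking for.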

Tribuo

Posts with mentions or reviews of Tribuo. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-01-08.
  • FLaNK Weekly 08 Jan 2024
    41 projects | dev.to | 8 Jan 2024
  • Is deeplearning4j a good choice?
    2 projects | /r/java | 11 Mar 2023
    It seems to have been picked up by Eclipse, and there are also Oracle Labs' Tribuo and Deep Java Library. All seem active, but I don't know much about any of them. I agree it's probably best to follow the community and use a more popular tool like PyTorch.
  • Stochastic gradient descent written in SQL
    3 projects | news.ycombinator.com | 7 Mar 2023
    We built model & data provenance into our open source ML library, though it's admittedly not the W3C PROV standard. There were a few gaps in it until we built an automated reproducibility system on top of it, but now it's pretty solid for all the algorithms we implement. Unfortunately some of the things we wrap (notably TensorFlow) aren't reproducible enough due to some unfixed bugs. There's an overview of the provenance system in this reprise of the JavaOne talk I gave here https://www.youtube.com/watch?v=GXOMjq2OS_c. The library is on GitHub - https://github.com/oracle/tribuo.
  • Just want to vent a bit
    3 projects | /r/ProgrammingLanguages | 3 Dec 2022
    Although it may be a bit more work, you can do both machine learning and AI in Java. If you are doing deep learning, you can use DeepJavaLibrary (I do work on this one at Amazon). If you are looking for other ML algorithms, I have seen Smile, Tribuo, or some built around Spark.
  • Anybody here using Java for machine learning?
    11 projects | /r/java | 13 Sep 2022
    We've been developing Tribuo on GitHub for two years now, MS are very actively developing ONNX Runtime (and the Java layer is fairly thin, wrapped over the same C API they use for Node.js and C#), and things like XGBoost and LibSVM have been around for many years; their Java bits are developed in-tree with the rest of the code, so they're updated along with it. Amazon have a team of people working on DJL, though you'd have to ask them what their plans are.
  • Java engineer wants to be a researcher
    1 project | /r/java | 16 Jul 2022
    FWIW, Oracle actually did release a Java ML library - https://github.com/oracle/tribuo.
  • txtai 3.4 released - Build AI-powered semantic search applications in Java
    4 projects | /r/java | 9 Oct 2021
    Tribuo (tribuo.org, github.com/oracle/tribuo). ONNX export support is there for two models at the moment in main; there's a PR for factorization machines which supports ONNX export, and we plan to add another couple of models and maybe ensembles before the upcoming release. Plus I need to write a tutorial on how it all works, but you can check the tests in the meantime.
  • Hottest topics for research for JAVA software engineers
    1 project | /r/java | 18 Aug 2021
    You can do ML & data science in Java (full disclosure: I help run TensorFlow-Java, I maintain ONNX Runtime's Java interface, and I'm the lead developer on Oracle Labs' Java ML library Tribuo, so I'm pretty biased). It tends not to be as favoured in research, though I've published academic ML papers which used Java implementations. People do deploy ML models quite a bit in Java in industry.
  • John Snow Labs Spark-NLP 3.1.0: Over 2600+ new models and pipelines in 200+ languages, new DistilBERT, RoBERTa, and XLM-RoBERTa transformers, support for external Transformers, and lots more!
    3 projects | /r/java | 8 Jun 2021
    It might be worth having a look at the ONNX Runtime Java API in addition to TF-Java, it'll let you deploy the rest of the HuggingFace pytorch models that don't have TF equivalents. I built the Java API a few years ago, and it's now a supported part of the ONNX Runtime project. We use it in Tribuo to provide one of our text feature embedding classes (BERTFeatureExtractor).
  • If it gets better w age, will java become compatible for machine learning and data science?
    7 projects | /r/java | 20 May 2021
    The IJava notebook kernel works pretty well for data science on top of Java. We use it in Tribuo to write all our tutorials, and if you've got the jar file in the right folder everything is runnable. For example, this is our intro classification tutorial - https://github.com/oracle/tribuo/blob/main/tutorials/irises-tribuo-v4.ipynb.
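
The provenance system described in the "Stochastic gradient descent written in SQL" post above is attached to every trained Tribuo model rather than being an opt-in feature. Here is a minimal sketch loosely following the public irises classification tutorial linked in the last post; the CSV file name and the "species" response column are placeholders.

    import java.nio.file.Paths;

    import com.oracle.labs.mlrg.olcut.provenance.ProvenanceUtil;
    import org.tribuo.Model;
    import org.tribuo.MutableDataset;
    import org.tribuo.classification.Label;
    import org.tribuo.classification.LabelFactory;
    import org.tribuo.classification.sgd.linear.LogisticRegressionTrainer;
    import org.tribuo.data.csv.CSVLoader;

    public class ProvenanceDemo {
        public static void main(String[] args) throws Exception {
            // Load a labelled CSV; "irises.csv" and "species" are placeholder names.
            var loader = new CSVLoader<>(new LabelFactory());
            var source = loader.loadDataSource(Paths.get("irises.csv"), "species");
            var dataset = new MutableDataset<>(source);

            // Train a simple logistic regression classifier.
            Model<Label> model = new LogisticRegressionTrainer().train(dataset);

            // Every Tribuo model carries provenance describing the data source and
            // trainer configuration that produced it; print it in readable form.
            System.out.println(ProvenanceUtil.formattedProvenanceString(model.getProvenance()));
        }
    }

This provenance object is what the automated reproducibility tooling mentioned in that post is built on top of.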
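
Several of the posts above lean on the ONNX Runtime Java API, both for scoring exported Tribuo models and for the BERTFeatureExtractor text embedder. Scoring an ONNX file from Java is only a few calls; in this hedged sketch the model file name, the input name "input", and the [1, 4] feature shape are assumptions about a particular exported model, not anything fixed by the API.

    import java.util.Arrays;
    import java.util.Map;

    import ai.onnxruntime.OnnxTensor;
    import ai.onnxruntime.OrtEnvironment;
    import ai.onnxruntime.OrtSession;

    public class OnnxScoringDemo {
        public static void main(String[] args) throws Exception {
            OrtEnvironment env = OrtEnvironment.getEnvironment();
            try (OrtSession session = env.createSession("model.onnx", new OrtSession.SessionOptions())) {
                // One example with four float features; the [1, 4] shape is a placeholder.
                float[][] features = {{5.1f, 3.5f, 1.4f, 0.2f}};
                try (OnnxTensor input = OnnxTensor.createTensor(env, features);
                     OrtSession.Result result = session.run(Map.of("input", input))) {
                    // Assumes the first output is a 2-D float tensor; the real layout
                    // depends on how the model was exported.
                    float[][] scores = (float[][]) result.get(0).getValue();
                    System.out.println(Arrays.deepToString(scores));
                }
            }
        }
    }

Because the Java layer is a thin wrapper over the same C API used by the Node.js and C# bindings (as noted in the post above), the call pattern here mirrors what those bindings expose.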

What are some alternatives?

When comparing cortex and Tribuo you can also consider the following projects:

ollama - Get up and running with Llama 3, Mistral, Gemma, and other large language models.

Deep Java Library (DJL) - An Engine-Agnostic Deep Learning Framework in Java

bionic-gpt - BionicGPT is an on-premise replacement for ChatGPT, offering the advantages of Generative AI while maintaining strict data confidentiality

Deeplearning4j - Suite of tools for deploying and training deep learning models using the JVM. Highlights include model import for Keras, TensorFlow, and ONNX/PyTorch, a modular and tiny C++ library for running math code, and a Java-based math library on top of the core C++ library. Also includes SameDiff: a PyTorch/TensorFlow-like library for running deep learning using automatic differentiation.

csvlens - Command line csv viewer

oj! Algorithms - oj! Algorithms

nnl - A low-latency and high-performance inference engine for large models on low-memory GPU platforms.

spark-nlp - State of the Art Natural Language Processing

hyperfine - A command-line benchmarking tool

txtai - 💡 All-in-one open-source embeddings database for semantic search, LLM orchestration and language model workflows

java - Java bindings for TensorFlow

grobid - Machine learning software for extracting information from scholarly documents