| | jan | koboldcpp |
|---|---|---|
| Mentions | 20 | 180 |
| Stars | 19,848 | 4,187 |
| Growth | 11.1% | - |
| Activity | 10.0 | 10.0 |
| Latest commit | 7 days ago | 4 days ago |
| Language | TypeScript | C++ |
| License | GNU Affero General Public License v3.0 | GNU Affero General Public License v3.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
jan
- Jan – Turn your computer into an AI computer
- Devoxx Genie Plugin: an Update
I focused on supporting Ollama, GPT4All, and LMStudio, all of which run smoothly on a Mac computer. Many of these tools are user-friendly wrappers around Llama.cpp, allowing easy model downloads and providing a REST interface to query the available models. Last week, I also added "👋 Jan" support because HuggingFace has endorsed this provider out-of-the-box.
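These local wrappers typically expose an OpenAI-compatible REST interface, so listing the available models is an ordinary HTTP call plus a bit of response parsing. A minimal TypeScript sketch, assuming the common `GET /v1/models` response shape; the localhost port and the model names shown are illustrative assumptions, not values taken from this page:

```typescript
// Shape of an OpenAI-compatible `GET /v1/models` response.
interface ModelList {
  object: string;
  data: { id: string; object: string }[];
}

// Extract just the model identifiers from a response body.
function modelIds(response: ModelList): string[] {
  return response.data.map((m) => m.id);
}

// In practice you would fetch this from the wrapper's local server,
// e.g. (port is an assumption; Jan and similar tools document theirs):
//   const res = await fetch("http://localhost:1337/v1/models");
//   const list = (await res.json()) as ModelList;

// Illustrative sample payload (hypothetical model ids):
const sample: ModelList = {
  object: "list",
  data: [
    { id: "mistral-7b-instruct", object: "model" },
    { id: "llama-2-7b-chat", object: "model" },
  ],
};

console.log(modelIds(sample)); // ["mistral-7b-instruct", "llama-2-7b-chat"]
```

Because the interface mirrors OpenAI's, the same client code can usually be pointed at any of these wrappers just by changing the base URL.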
- Ask HN: Which LLMs can run locally on most consumer computers
seconded - IMHO Jan has the cleanest UI and most straightforward setup out of all LLM frontends available now.
https://jan.ai/
https://github.com/janhq/jan
- Introducing Jan
As we continue this blog series, let's explore a fully open-source alternative to LM Studio - Jan, a project from Southeast Asia.
- AI enthusiasm - episode #2
Jan.ai is a 100% local alternative to ChatGPT: you can download LLMs and run them directly from within the application, or even prompt them and retrieve their responses via the API.
- Ask HN: What is the current (Apr. 2024) gold standard of running an LLM locally?
- Show HN: I made an app to use local AI as daily driver
It would be cool to have the option to use the OpenAI API as well in the same interface. http://jan.ai does this, so that's what I'm using at the moment.
- Jan – Bringing AI to Your Desktop
- FLaNK 15 Jan 2024
- Why the M2 is more advanced than it seemed
Was it this? I haven't tried it yet but it does look nice.
https://jan.ai/
koboldcpp
- Any Online Communities on Local/Home AI?
- Koboldcpp-1.62.1 adds support for Command-R+
- Show HN: I made an app to use local AI as daily driver
- Easiest way to show my model to my mom?
FYI this is the easiest way to host on the horde: https://github.com/LostRuins/koboldcpp
- IT Veteran... why am I struggling with all of this?
- What do you use to run your models?
- ByteDance AI researcher suggests that open source model more powerful than Gemini to be released soon
- i need some help guys
- [Guide] How to install KoboldAI on Android via Termux (Update 04-12-2023)
For more information on Koboldcpp, see this guide: https://github.com/LostRuins/koboldcpp/wiki
- SillyTavern 1.10.10 has been released
Out of curiosity, is there a specific reason for this? The most popular fork, KoboldCpp, is in active development, was the first to adopt the Min P sampler, and even distinguishes itself with the context shift feature. Just wondering what this means for the future. Thanks!
What are some alternatives?
unstructured - Open source libraries and APIs to build custom preprocessing pipelines for labeling, training, or production machine learning pipelines.
chainlit - Build Conversational AI in minutes ⚡️
text-generation-webui - A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.
obsidian-local-llm - Obsidian Local LLM is a plugin for Obsidian that provides access to a powerful neural network, allowing users to generate text in a wide range of styles and formats using a local LLM.
KoboldAI - KoboldAI is generative AI software optimized for fictional use, but capable of much more!
ollama - Get up and running with Llama 3, Mistral, Gemma, and other large language models.
TavernAI - Atmospheric adventure chat for AI language models (KoboldAI, NovelAI, Pygmalion, OpenAI chatgpt, gpt-4)
FLaNK-VectorDB - NiFi and Vector Databases
modelfusion-llamacpp-nextjs-starter - Starter examples for using Next.js and the Vercel AI SDK with Llama.cpp and ModelFusion.
SillyTavern - LLM Frontend for Power Users. [Moved to: https://github.com/SillyTavern/SillyTavern]