| | jan | slint |
|---|---|---|
| Mentions | 20 | 138 |
| Stars | 19,848 | 15,610 |
| Growth | 11.1% | 3.8% |
| Activity | 10.0 | 9.9 |
| Last commit | 7 days ago | 3 days ago |
| Language | TypeScript | Rust |
| License | GNU Affero General Public License v3.0 | GNU General Public License v3.0 or Slint Royalty-Free |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
jan
- Jan – Turn your computer into an AI computer
-
Devoxx Genie Plugin : an Update
I focused on supporting Ollama, GPT4All, and LMStudio, all of which run smoothly on a Mac computer. Many of these tools are user-friendly wrappers around Llama.cpp, allowing easy model downloads and providing a REST interface to query the available models. Last week, I also added "Jan" support because HuggingFace has endorsed this provider out-of-the-box.
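The REST interface mentioned above can be sketched as follows. This is a hedged example, not taken from any of the tools' documentation: Llama.cpp wrappers such as Jan, Ollama, and LM Studio commonly expose an OpenAI-style endpoint for listing downloaded models, and the base URL below is an assumption you would adjust for whichever tool you run.

```typescript
// Hypothetical sketch of querying an OpenAI-style /v1/models endpoint;
// the base URL (and port) are assumptions, not a documented default.

type ModelList = { data: { id: string }[] };

// Extract the model ids from an OpenAI-style model-list response body.
function modelIds(body: ModelList): string[] {
  return body.data.map((m) => m.id);
}

// List the models the local server has available.
async function listLocalModels(
  baseUrl = "http://localhost:1337/v1",
): Promise<string[]> {
  const res = await fetch(`${baseUrl}/models`);
  return modelIds((await res.json()) as ModelList);
}
```

Because the wire format follows OpenAI's, the same helper works unchanged against any of these local servers.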
-
Ask HN: Which LLMs can run locally on most consumer computers
seconded - IMHO Jan has the cleanest UI and most straightforward setup out of all LLM frontends available now.
https://jan.ai/
https://github.com/janhq/jan
-
Introducing Jan
As we continue this blog series, let's explore a fully open-source alternative to LM Studio - Jan, a project from Southeast Asia.
-
AI enthusiasm - episode #2
Jan.ai is a 100% local alternative to ChatGPT: you can download LLMs and run them directly from within the application, or even prompt them and retrieve their responses via API.
- Ask HN: What is the current (Apr. 2024) gold standard of running an LLM locally?
-
Show HN: I made an app to use local AI as daily driver
It would be cool to have the option to use the OpenAI API as well in the same interface. http://jan.ai does this, so that's what I'm using at the moment.
- Jan – Bringing AI to Your Desktop
- FLaNK 15 Jan 2024
-
Why the M2 is more advanced than it seemed
Was it this? I haven't tried it yet, but it does look nice.
https://jan.ai/
slint
-
Ask HN: Why would you ever use C++ for a new project over Rust?
Did you get a chance to check https://slint.dev?
Disclaimer: I work for Slint
-
Deno in 2023
Currently, we do it by using binaries through napi-rs so we can bring up a window using the platform-native API, and then we do some hacks to merge the event loops.
But if Deno supported bringing up a window directly, we could just ship wasm instead of a native binary for all platforms, and I hope the event-loop integration would be simplified too.
Although we'd also need more APIs than just showing a window (mouse and keyboard input, accessibility, popup windows, system tray, ...)
[1] https://slint.dev
-
Slint GUI Toolkit
Rich Text content is not yet implemented. This is tracked in https://github.com/slint-ui/slint/issues/2723
Thanks for reporting the broken link. Fixed in https://github.com/slint-ui/slint/commit/9200480b532f49007d2...
-
slint VS rinf - a user suggested alternative
2 projects | 24 Jan 2024
-
A 2024 Plea for Lean Software
With Slint (https://slint.dev) we're trying to make a lightweight toolkit that doesn't use HTML/CSS, and that you can program from low-level languages such as C++ or Rust as well as from higher-level languages such as JavaScript; we want to extend to Python too.
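The multi-language angle works because the UI itself is declared in Slint's own markup, which is shared across the C++, Rust, and JavaScript APIs. The component below is a minimal illustrative sketch, not taken from any real project:

```slint
// A minimal UI declared in Slint's markup language; the same .slint file
// can be consumed from a C++, Rust, or JavaScript application.
export component HelloWindow inherits Window {
    width: 240px;
    height: 80px;
    Text {
        text: "Hello from Slint";
        horizontal-alignment: center;
        vertical-alignment: center;
    }
}
```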
-
Immediate Mode GUI Programming
I haven't. I was just searching for a GUI library that was Bevy-compatible and slint isn't at the moment: https://github.com/slint-ui/slint/discussions/940
Sorry!
-
Why the M2 is more advanced than it seemed
Trying to do that with Slint: https://slint.dev
- 9 years of Apple text editor solo dev
-
The Linux graphics stack in a nutshell, part 1
You can do that with Slint (https://slint.dev) and its linuxkms backend. No need for a xorg server or wayland compositor, just run the application made with Slint from the init script.
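As a rough sketch of the setup described above: Slint selects its backend via the `SLINT_BACKEND` environment variable, so an init script only needs to set it before launching the application. The paths and application name below are hypothetical:

```sh
# Hypothetical init-script fragment: with the linuxkms backend the Slint
# app renders directly through KMS/DRM, so no X server or Wayland
# compositor needs to be running.
export SLINT_BACKEND=linuxkms
exec /usr/bin/my-slint-app   # hypothetical application binary
```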
- Qt 6.6 and 6.7 Make QML Faster Than Ever: A New Benchmark and Analysis
What are some alternatives?
unstructured - Open source libraries and APIs to build custom preprocessing pipelines for labeling, training, or production machine learning pipelines.
tauri - Build smaller, faster, and more secure desktop applications with a web frontend.
chainlit - Build Conversational AI in minutes
iced - A cross-platform GUI library for Rust, inspired by Elm
obsidian-local-llm - Obsidian Local LLM is a plugin for Obsidian that provides access to a powerful neural network, allowing users to generate text in a wide range of styles and formats using a local LLM.
egui - egui: an easy-to-use immediate mode GUI in Rust that runs on both web and native
ollama - Get up and running with Llama 3, Mistral, Gemma, and other large language models.
lvgl - Embedded graphics library to create beautiful UIs for any MCU, MPU and display type.
FLaNK-VectorDB - NiFi and Vector Databases
dioxus - Fullstack GUI library for web, desktop, mobile, and more.
modelfusion-llamacpp-nextjs-starter - Starter examples for using Next.js and the Vercel AI SDK with Llama.cpp and ModelFusion.
cxx-qt - Safe interop between Rust and Qt