| | jan | slint |
|---|---|---|
| Mentions | 16 | 138 |
| Stars | 18,549 | 15,289 |
| Stars growth (MoM) | 21.8% | 4.2% |
| Activity | 10.0 | 9.9 |
| Latest commit | 4 days ago | 7 days ago |
| Language | TypeScript | Rust |
| License | GNU Affero General Public License v3.0 | GNU General Public License v3.0 or Slint Royalty-Free |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
jan
- Introducing Jan
  As we continue this blog series, let's explore a fully open-source alternative to LM Studio: Jan, a project from Southeast Asia.
- AI enthusiasm - episode #2🚀
  Jan.ai is a 100% local alternative to ChatGPT: you can download LLMs and run them directly from within the application, or even prompt them and retrieve their responses via an API.
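The comment above mentions prompting Jan's local models through an API. A minimal sketch of what that could look like, assuming Jan's local server is enabled and reachable at the default `http://localhost:1337/v1` with an OpenAI-compatible chat-completions endpoint (the port, path, and model id below are assumptions; check the app's local API server settings for your actual values):

```python
import json
from urllib import request

# Assumed default address of Jan's local OpenAI-compatible server.
API_URL = "http://localhost:1337/v1/chat/completions"


def build_payload(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def ask(model: str, prompt: str) -> str:
    """Send a prompt to the local server and return the reply text."""
    body = json.dumps(build_payload(model, prompt)).encode()
    req = request.Request(
        API_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]


# Example (requires Jan running with a model loaded; model id is hypothetical):
# print(ask("mistral-ins-7b-q4", "Say hello"))
```

Because the server speaks the OpenAI wire format, the same request body works against either the local endpoint or the hosted OpenAI API, which is what makes the "same interface" workflow mentioned below possible.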
- Ask HN: What is the current (Apr. 2024) gold standard of running an LLM locally?
- Show HN: I made an app to use local AI as daily driver
  It would be cool to have the option to use the OpenAI API as well in the same interface. http://jan.ai does this, so that's what I'm using at the moment.
- Jan – Bringing AI to Your Desktop
- FLaNK 15 Jan 2024
- Why the M2 is more advanced than it seemed
  Was it this? I haven’t tried it yet but it does look nice.
  https://jan.ai/
- Jan is an open source alternative to ChatGPT that runs 100% offline
- Open-Source ChatGPT Alternative Jan
- Run LLMs Locally with an OpenAI API
slint
- Ask HN: Why would you ever use C++ for a new project over Rust?
  Did you get a chance to check https://slint.dev?
  Disclaimer: I work for Slint
- Deno in 2023
  Currently, we do it by using binaries through napi-rs so we can bring up a window using the platform-native API, and then we do some hacks to merge the event loops.
  But if Deno supports bringing up a window directly, we can ship wasm instead of native binaries for all platforms. I also hope event-loop integration will be simplified.
  Although we'd also need more APIs than just showing a window (mouse and keyboard input, accessibility, popup windows, system tray, ...)
  [1] https://slint.dev
- Slint GUI Toolkit
  Rich text content is not yet implemented. This is tracked in https://github.com/slint-ui/slint/issues/2723
  Thanks for reporting the broken link. Fixed in https://github.com/slint-ui/slint/commit/9200480b532f49007d2...
- slint VS rinf - a user suggested alternative
  2 projects | 24 Jan 2024
- A 2024 Plea for Lean Software
  With Slint (https://slint.dev) we're trying to make a lightweight toolkit that doesn't use HTML/CSS, and that you can program either from low-level languages such as C++ or Rust, or from higher-level languages such as JavaScript; we want to extend to Python too.
- Immediate Mode GUI Programming
  I haven't. I was just searching for a GUI library that was Bevy-compatible, and Slint isn't at the moment: https://github.com/slint-ui/slint/discussions/940
  Sorry!
- Why the M2 is more advanced than it seemed
  Trying to do that with Slint: https://slint.dev
- 9 years of Apple text editor solo dev
- The Linux graphics stack in a nutshell, part 1
  You can do that with Slint (https://slint.dev) and its linuxkms backend. No need for an Xorg server or Wayland compositor; just run the application made with Slint from the init script.
- Qt 6.6 and 6.7 Make QML Faster Than Ever: A New Benchmark and Analysis
What are some alternatives?
unstructured - Open source libraries and APIs to build custom preprocessing pipelines for labeling, training, or production machine learning pipelines.
tauri - Build smaller, faster, and more secure desktop applications with a web frontend.
chainlit - Build Conversational AI in minutes ⚡️
iced - A cross-platform GUI library for Rust, inspired by Elm
FLaNK-VectorDB - NiFi and Vector Databases
egui - An easy-to-use immediate mode GUI in Rust that runs on both web and native
obsidian-local-llm - Obsidian Local LLM is a plugin for Obsidian that provides access to a powerful neural network, allowing users to generate text in a wide range of styles and formats using a local LLM.
lvgl - Embedded graphics library to create beautiful UIs for any MCU, MPU and display type.
modelfusion-llamacpp-nextjs-starter - Starter examples for using Next.js and the Vercel AI SDK with Llama.cpp and ModelFusion.
dioxus - Fullstack GUI library for web, desktop, mobile, and more.
VOLlama - An accessible chat client for Ollama
cxx-qt - Safe interop between Rust and Qt