async-openai
Rust library for OpenAI (by 64bit)
tiktoken-rs
Ready-made tokenizer library for working with GPT and tiktoken (by zurawiki)
| | async-openai | tiktoken-rs |
|---|---|---|
| Mentions | 3 | 1 |
| Stars | 966 | 206 |
| Growth | - | - |
| Activity | 8.4 | 7.2 |
| Latest commit | 9 days ago | 5 days ago |
| Language | Rust | Rust |
| License | MIT License | MIT License |
The number of mentions indicates the total number of mentions we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
async-openai
Posts with mentions or reviews of async-openai.
We have used some of these posts to build our list of alternatives
and similar projects. The last one was on 2023-09-29.
- Proper type for axum SSE stream

I am trying to stream a response from the OpenAI API as an SSE with axum. I have combined the following examples from the async-openai and axum repos to produce the code below. I've used iterators in Rust but have not used streams, so I have no idea how to reconcile the types here and don't know where to start to solve the problem. A solution or any pointers would be greatly appreciated.

https://github.com/tokio-rs/axum/tree/axum-v0.6.20/examples/sse
https://github.com/64bit/async-openai/tree/main/examples/chat-stream

```rust
async fn sse_handler(
    TypedHeader(user_agent): TypedHeader<headers::UserAgent>,
) -> Sse<impl Stream<Item = Result<Event, Infallible>>> {
    println!("`{}` connected", user_agent.as_str());
    // …
```
- Show HN: Async Rust Library for OpenAI
- Async Rust library for OpenAI

Hello fellow Rustaceans, I've been writing Rust this year and I'm excited to share my second crate, https://github.com/64bit/async-openai - Rust bindings for the OpenAI API. My first crate was almost 10 months ago, and I have come a long way in my Rust journey since.
tiktoken-rs
Posts with mentions or reviews of tiktoken-rs.
We have used some of these posts to build our list of alternatives
and similar projects.
What are some alternatives?
When comparing async-openai and tiktoken-rs you can also consider the following projects:
libopenai - A Rust client for OpenAI's API
opentau - Using Large Language Models for Gradual Type Inference
wordle-rs - Wordle in Rust
bytepiece-rs - The Bytepiece Tokenizer Implemented in Rust.
rust - Empowering everyone to build reliable and efficient software.
nlpo3 - Thai Natural Language Processing library in Rust, with Python and Node bindings.
gt - Using the OpenAI GPT model, one can conveniently access language translation from the command line.
rusty - AI-powered CLI tool to help you remember bash commands.
smartgpt - A program that provides LLMs with the ability to complete complex tasks using plugins.