openai-chat vs async-openai

| | openai-chat | async-openai |
|---|---|---|
| Mentions | 1 | 3 |
| Stars | 3 | 955 |
| Growth | - | - |
| Activity | 8.7 | 8.4 |
| Last commit | 3 months ago | 1 day ago |
| Language | Rust | Rust |
| License | - | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Proper type for axum SSE stream
I am trying to stream a response from the OpenAI API as an SSE with axum. I have combined the following examples from the async-openai and axum repos to produce the code below. I've used iterators in Rust but have not used streams, so I have no idea how to reconcile the types here and don't know where to start to solve the problem. A solution or any pointers would be greatly appreciated.

https://github.com/tokio-rs/axum/tree/axum-v0.6.20/examples/sse

https://github.com/64bit/async-openai/tree/main/examples/chat-stream

```rust
async fn sse_handler(
    TypedHeader(user_agent): TypedHeader<headers::UserAgent>,
) -> Sse<impl Stream<Item = Result<Event, Infallible>>> {
    println!("`{}` connected", user_agent.as_str());
```
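The usual way to reconcile these types is to map each item of async-openai's response stream into an axum `Event` with `futures::StreamExt::map` and hand the resulting stream to `Sse::new`. The framing itself is simple to see in isolation; below is a self-contained, std-only sketch where `ChatChunk` and `sse_frame` are hypothetical stand-ins for the real crate types (axum's `Event::default().data(...)` produces equivalent `data: ...` frames on the wire), and a plain iterator stands in for the async `Stream`:

```rust
// Hypothetical stand-in for one streamed chat-completion chunk.
struct ChatChunk {
    content: String,
}

// Format a chunk as a Server-Sent Events frame — the same wire framing
// axum's `Event::default().data(...)` emits for you.
fn sse_frame(chunk: &ChatChunk) -> String {
    format!("data: {}\n\n", chunk.content)
}

fn main() {
    // With the real crates this map would be over a `Stream`, e.g.
    // `Sse::new(stream.map(|chunk| Ok(Event::default().data(...))))`;
    // here we map over a plain iterator to show the shape of the transform.
    let chunks = vec![
        ChatChunk { content: "Hello".into() },
        ChatChunk { content: " world".into() },
    ];
    let frames: Vec<String> = chunks.iter().map(sse_frame).collect();
    assert_eq!(frames[0], "data: Hello\n\n");
    for f in &frames {
        print!("{f}");
    }
}
```

The key point is that the handler never collects the tokens: each chunk is transformed into an `Event` as it arrives, so axum can flush frames to the client incrementally.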
Show HN: Async Rust Library for OpenAI
Hello fellow Rustaceans, I've been writing Rust this year and am excited to share my second crate, https://github.com/64bit/async-openai - Rust bindings for the OpenAI API. My first crate was almost 10 months ago, and I have come far in my Rust journey.
What are some alternatives?
libopenai - A Rust client for OpenAI's API
tiktoken-rs - Ready-made tokenizer library for working with GPT and tiktoken
wordle-rs - Wordle in Rust
opentau - Using Large Language Models for Gradual Type Inference
rust - Empowering everyone to build reliable and efficient software.
gt - A command-line tool for language translation built on the OpenAI GPT model.
smartgpt - A program that provides LLMs with the ability to complete complex tasks using plugins.
axum - Ergonomic and modular web framework built with Tokio, Tower, and Hyper
llm-chain - `llm-chain` is a powerful Rust crate for building chains in large language models, allowing you to summarise text and complete complex tasks
openai-client - OpenAI Dive is an unofficial async Rust library that allows you to interact with the OpenAI API.