llm-mistral Alternatives
Similar projects and alternatives to llm-mistral
-
petals
🌸 Run LLMs at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading
llm-mistral reviews and mentions
-
Mistral Large
Feature request for the Mistral API maintainers: the https://api.mistral.ai/v1/models API endpoint returns all of the language models and mistral-embed as well, but there's currently nothing in the JSON to help distinguish the embedding model from the others: https://github.com/simonw/llm-mistral/issues/5#issuecomment-...
It would be useful if the response indicated which models are embedding models.
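Until such a flag exists, a client has to fall back on a name heuristic. A minimal sketch, assuming the endpoint returns an OpenAI-style "list" object (the sample payload below is illustrative, not the live response):

```python
import json

# Hypothetical sample of the /v1/models response shape; the real
# endpoint's fields may differ.
sample = json.loads("""
{"object": "list",
 "data": [{"id": "mistral-tiny", "object": "model"},
          {"id": "mistral-embed", "object": "model"}]}
""")

# Name heuristic: treat any model whose id contains "embed" as an
# embedding model, since the JSON carries no explicit type field.
embedding_models = [m["id"] for m in sample["data"] if "embed" in m["id"]]
language_models = [m["id"] for m in sample["data"] if "embed" not in m["id"]]
```

This is exactly the kind of guesswork an explicit `type` field in the response would make unnecessary.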
-
Many options for running Mistral models in your terminal using LLM
I was hoping I could run my LLM CLI tool against Ollama via their localhost API, but it looks like they don't offer an OpenAI-compatible endpoint yet.
If they add that it will work out of the box: https://llm.datasette.io/en/stable/other-models.html#openai-...
Otherwise someone would need to write a plugin for it, which would probably be pretty simple - I imagine it would look a bit like the llm-mistral plugin but adapted for the Ollama API design: https://github.com/simonw/llm-mistral/blob/main/llm_mistral....
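The core of such a plugin would be parsing Ollama's streaming output, which (at the time of writing) is newline-delimited JSON where each line carries a "response" fragment and a "done" flag. A minimal sketch of that parsing step, under that assumption; the function name is hypothetical:

```python
import json

def collect_ollama_stream(lines):
    """Assemble the full completion text from Ollama's /api/generate
    streaming output: one JSON object per line, each with a partial
    "response" string, until a line reports "done": true."""
    parts = []
    for line in lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)
```

An LLM plugin's model class would wrap this kind of loop in its `execute` method, yielding each fragment as it arrives rather than collecting them.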
- llm-mistral
Stats
simonw/llm-mistral is an open source project licensed under the Apache License 2.0, an OSI-approved license.
The primary programming language of llm-mistral is Python.