Not sure when they implemented this, but ollama now has a JSON mode [0]. Not function calling, but one of the simpler ways to get JSON in a local LLM. I'm using it with `knoopx/hermes-2-pro-mistral:7b-q8_0` and it's worked well for me so far.
0 - https://github.com/ollama/ollama/blob/main/docs/api.md#json-...
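For reference, JSON mode is enabled by adding `"format": "json"` to the request body of Ollama's generate endpoint. Here's a minimal sketch of what that request looks like, assuming a local Ollama server on the default port (11434); the helper function and prompt are illustrative, not part of Ollama's API:

```python
import json

def build_json_mode_request(model: str, prompt: str) -> dict:
    """Build a request body for Ollama's /api/generate with JSON mode on.

    "format": "json" constrains the model's output to valid JSON; the docs
    also recommend instructing the model in the prompt to respond with JSON.
    """
    return {
        "model": model,
        "prompt": prompt + "\nRespond using JSON.",
        "format": "json",   # the JSON-mode switch
        "stream": False,    # return one response object instead of a stream
    }

body = build_json_mode_request(
    "knoopx/hermes-2-pro-mistral:7b-q8_0",
    "List three primary colors under the key 'colors'.",
)

# To actually send it (requires a running Ollama server):
#   import urllib.request
#   req = urllib.request.Request(
#       "http://localhost:11434/api/generate",
#       data=json.dumps(body).encode(),
#       headers={"Content-Type": "application/json"},
#   )
#   resp = json.loads(urllib.request.urlopen(req).read())
#   parsed = json.loads(resp["response"])  # response text is parseable JSON
```

The response's `response` field is a string that should parse cleanly with `json.loads`, which is what makes this one of the simpler options compared to constrained-decoding libraries.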
No benchmarks, just my anecdotal experience trying to get local LLMs to respond with JSON. The method above works for my use case nearly 100% of the time. Other things I've tried (e.g. `outlines` [0]) are really slow or don't work at all. Would love to hear what others have tried!
0 - https://github.com/outlines-dev/outlines
Ah yes. Have you tried out instructor [0] or Guidance [1]?
[0]: https://github.com/jxnl/instructor/
[1]: https://github.com/guidance-ai/guidance/tree/main
No fine-tuning. Looks like he's using raw model capabilities with a simple prompt. Repo: https://github.com/parea-ai/tool-use-benchmark