Guidance Alternatives
Similar projects and alternatives to guidance
- ollama: Get up and running with Llama 3.3, DeepSeek-R1, Phi-4, Gemma 3, Mistral Small 3.1 and other large language models.
- autogen: A programming framework for agentic AI 🤖. PyPI: autogen-agentchat; Discord: https://aka.ms/autogen-discord; Office Hour: https://aka.ms/autogen-officehour
- NeMo-Guardrails: An open-source toolkit for easily adding programmable guardrails to LLM-based conversational systems.
- Segment-Everything-Everywhere-All-At-Once: [NeurIPS 2023] Official implementation of the paper "Segment Everything Everywhere All at Once"
guidance discussion
guidance reviews and mentions
- smartfunc: Turn Docstrings into LLM-Functions
I use something similar to this decorator (more or less a thin wrapper around instructor) and have looked a little bit at the codegen + cache route. It gets more interesting with the addition of tool calls, but I've found JSON outputs create quality degradation and reliability issues. My next experiment on that thread is to either use guidance (https://github.com/guidance-ai/guidance) or reimplement some of their heuristics to try to get tool calling without 100% reliance on JSON.
- Structured Outputs with Ollama
Hi, just wanted to say how much I appreciate your work.
I'm curious if you have considered implementing Microsoft's Guidance (https://github.com/guidance-ai/guidance)? Their approach offers significant speed improvements, which I understand can sometimes be a shortcoming of GBNF (e.g. https://github.com/ggerganov/llama.cpp/issues/4218).
- StructuredRAG: JSON Response Formatting with Large Language Models
Interesting paper, but their reason for dismissing constrained decoding methods seems to be that they want to academically study the in-context setting.
For practitioners, using a framework like Guidance, which forces the model to write valid JSON as it generates text, solves this trivially (https://github.com/guidance-ai/guidance).
For JSON in particular, these frameworks have functions that take in JSON Schemas or Pydantic schemas: https://guidance.readthedocs.io/en/latest/generated/guidance...
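The contract these schema-driven helpers offer is roughly: hand over a schema, get back an object guaranteed to conform. A stdlib-only sketch of that contract follows; the stub generator and the `minimal_validate` helper are hypothetical stand-ins for a real grammar-constrained LLM call and a full JSON Schema validator, not Guidance's actual API.

```python
import json

def minimal_validate(obj, schema):
    """Tiny subset of JSON Schema validation: type + required keys only."""
    if schema.get("type") == "object":
        if not isinstance(obj, dict):
            return False
        if any(key not in obj for key in schema.get("required", [])):
            return False
        return all(
            minimal_validate(obj[key], sub)
            for key, sub in schema.get("properties", {}).items()
            if key in obj
        )
    if schema.get("type") == "string":
        return isinstance(obj, str)
    if schema.get("type") == "integer":
        return isinstance(obj, int)
    return True  # other schema types not handled in this sketch

def generate_structured(prompt, schema, generator):
    """The contract: schema-guided decoding always yields conforming output."""
    obj = json.loads(generator(prompt, schema))
    assert minimal_validate(obj, schema), "constrained decoder broke its contract"
    return obj

# Stub standing in for a grammar-constrained LLM call.
def stub_generator(prompt, schema):
    return '{"city": "Paris", "population": 2100000}'

schema = {
    "type": "object",
    "required": ["city", "population"],
    "properties": {"city": {"type": "string"}, "population": {"type": "integer"}},
}
result = generate_structured("Describe Paris as JSON.", schema, stub_generator)
```

With a real backend, `stub_generator` would be the point where the schema is compiled into a grammar that masks token sampling; the caller's code stays the same.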
- Introducing Structured Outputs in the API
I was impressed by Microsoft's AICI, where the idea is that a WASM program can choose the next tokens. Relatedly, their Guidance[1] framework can use CFGs and programs for local inference, and can even speed it up with context-aware token filling. I hope this implies API-based LLMs may be moving in a similar direction.
[1] https://github.com/guidance-ai/guidance
- What We Learned from a Year of Building with LLMs
Via APIs, yes. But if you have direct access to the model you can use libraries like https://github.com/guidance-ai/guidance to manipulate the output structure directly.
- Anthropic's Haiku Beats GPT-4 Turbo in Tool Use
[1]: https://github.com/guidance-ai/guidance/tree/main
- Show HN: Prompts as (WASM) Programs
> The most obvious usage of this is forcing a model to output valid JSON
Isn't this something that Outlines [0], Guidance [1] and others [2] already solve much more elegantly?
0. https://github.com/outlines-dev/outlines
1. https://github.com/guidance-ai/guidance
2. https://github.com/sgl-project/sglang
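The mechanism these libraries share can be sketched in plain Python. This is a toy illustration, not how Outlines or Guidance are actually implemented (they compile grammars into token masks applied to the model's real logits): at each decoding step, any candidate token that would make the output an invalid prefix of an allowed result is masked out, so even an adversarial "model" cannot produce malformed output.

```python
import json

# Toy character-level vocabulary standing in for an LLM tokenizer.
VOCAB = list('{}":abcdefghijklmnopqrstuvwxyz0123456789, ')

def is_valid_prefix(text, options):
    """True if `text` can still be completed into an allowed output."""
    return any(opt.startswith(text) for opt in options)

def fake_model_scores(prefix):
    """Stand-in for model logits: a fixed preference per token."""
    return {tok: (2.0 if tok.isalpha() else 1.0) for tok in VOCAB}

def constrained_decode(options, max_len=64):
    out = ""
    for _ in range(max_len):
        if out in options:
            return out
        # The mask: drop every token that would break validity.
        allowed = [t for t in VOCAB if is_valid_prefix(out + t, options)]
        if not allowed:
            raise RuntimeError("no valid continuation")
        scores = fake_model_scores(out)
        out += max(allowed, key=lambda t: scores[t])
    raise RuntimeError("max length exceeded")

# Whatever the "model" prefers, the output is always one of these.
options = ['{"answer": "yes"}', '{"answer": "no"}', '{"answer": "maybe"}']
result = constrained_decode(options)
assert result in options and json.loads(result)
```

Real implementations generalize the `options` list to a regular expression, CFG, or JSON Schema, precompute which token IDs are legal in each grammar state, and zero out the rest of the logits before sampling.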
- Show HN: Fructose, LLM calls as strongly typed functions
- LiteLlama-460M-1T has 460M parameters trained with 1T tokens
Or combine it with something like llama.cpp's grammar support or Microsoft's guidance-ai[0] (which I prefer), which would allow adding some ReAct-style prompting and external tools. As others have mentioned, instruct tuning would help too.
[0] https://github.com/guidance-ai/guidance
- Forcing AI to Follow a Specific Answer Pattern Using GBNF Grammar
Stats
guidance-ai/guidance is an open source project licensed under the MIT License, an OSI-approved license.
The primary programming language of guidance is Jupyter Notebook.