guidance

A guidance language for controlling large language models. (by guidance-ai)

Guidance Alternatives

Similar projects and alternatives to guidance

  1. llama.cpp

    LLM inference in C/C++

  2. ollama

    Get up and running with Llama 3.3, DeepSeek-R1, Phi-4, Gemma 3, Mistral Small 3.1 and other large language models.

  3. langchain

    71 guidance VS langchain

    🦜🔗 Build context-aware reasoning applications

  4. outlines

    43 guidance VS outlines

    Structured Text Generation

  5. autogen

    49 guidance VS autogen

    A programming framework for agentic AI 🤖 PyPi: autogen-agentchat Discord: https://aka.ms/autogen-discord Office Hour: https://aka.ms/autogen-officehour

  6. lmql

    34 guidance VS lmql

    A language for constraint-guided and efficient LLM programming.

  7. instructor

    26 guidance VS instructor

    Structured outputs for LLMs

  8. jsonformer

    25 guidance VS jsonformer

    A Bulletproof Way to Generate Structured JSON from Language Models

  9. marvin

    17 guidance VS marvin

    ✨ AI agents that spark joy

  10. magentic

    15 guidance VS magentic

    Seamlessly integrate LLMs as Python functions

  11. clownfish

    Constrained Decoding for LLMs against JSON Schema

  12. semantic-kernel

    Integrate cutting-edge LLM technology quickly and easily into your apps

  13. aici

    8 guidance VS aici

    AICI: Prompts as (Wasm) Programs

  14. NeMo-Guardrails

    NeMo Guardrails is an open-source toolkit for easily adding programmable guardrails to LLM-based conversational systems.

  15. ad-llama

    7 guidance VS ad-llama

    Structured inference with Llama 2 in your browser

  16. lida

    6 guidance VS lida

    Automatic Generation of Visualizations and Infographics using Large Language Models

  17. Segment-Everything-Everywhere-All-At-Once

    [NeurIPS 2023] Official implementation of the paper "Segment Everything Everywhere All at Once"

  18. langchainrb

    Build LLM-powered applications in Ruby

NOTE: The number of mentions on this list reflects mentions in common posts plus user-suggested alternatives, so a higher count indicates a better or more similar guidance alternative.

guidance reviews and mentions

Posts with mentions or reviews of guidance. We have used some of these posts to build our list of alternatives and similar projects. The most recent mention was on 2025-04-08.
  • smartfunc: Turn Docstrings into LLM-Functions
    5 projects | news.ycombinator.com | 8 Apr 2025
    I use something similar to this decorator (more or less a thin wrapper around instructor) and have looked a little bit at the codegen + cache route. It gets more interesting with the addition of tool calls, but I've found JSON outputs create quality degradation and reliability issues. My next experiment on that thread is to either use guidance (https://github.com/guidance-ai/guidance) or reimplement some of their heuristics to try to get tool calling without 100% reliance on JSON.
  • Structured Outputs with Ollama
    7 projects | news.ycombinator.com | 6 Dec 2024
    Hi, just wanted to say how much I appreciate your work.

    I'm curious if you have considered implementing Microsoft's Guidance (https://github.com/guidance-ai/guidance)? Their approach offers significant speed improvements; speed, as I understand it, can sometimes be a shortcoming of GBNF (e.g. https://github.com/ggerganov/llama.cpp/issues/4218).

  • StructuredRAG: JSON Response Formatting with Large Language Models
    2 projects | news.ycombinator.com | 22 Aug 2024
    Interesting paper, but their reason for dismissing constrained decoding methods seems to be that they want to academically study the in-context setting.

    For practitioners, using a framework like Guidance, which forces the model to write valid JSON as it generates text, solves this trivially (https://github.com/guidance-ai/guidance).

    For JSON in particular, these frameworks have functions that take in JSON schemas or Pydantic schemas: https://guidance.readthedocs.io/en/latest/generated/guidance... (a sketch of this follows after this list).

  • Introducing Structured Outputs in the API
    10 projects | news.ycombinator.com | 6 Aug 2024
    I was impressed by Microsoft’s AICI, where the idea is that a WASM program can choose the next tokens. Relatedly, their Guidance[1] framework can use CFGs and programs for local inference and even speed it up with context-aware token filling. I hope this implies API-based LLMs may be moving in a similar direction.

    [1] https://github.com/guidance-ai/guidance

  • What We Learned from a Year of Building with LLMs
    2 projects | news.ycombinator.com | 29 May 2024
    Via APIs, yes. But if you have direct access to the model, you can use libraries like https://github.com/guidance-ai/guidance to manipulate the output structure directly (see the sketches after this list).
  • Anthropic's Haiku Beats GPT-4 Turbo in Tool Use
    5 projects | news.ycombinator.com | 8 Apr 2024
    [1]: https://github.com/guidance-ai/guidance/tree/main
  • Show HN: Prompts as (WASM) Programs
    9 projects | news.ycombinator.com | 11 Mar 2024
    > The most obvious usage of this is forcing a model to output valid JSON

    Isn't this something that Outlines [0], Guidance [1] and others [2] already solve much more elegantly?

    0. https://github.com/outlines-dev/outlines

    1. https://github.com/guidance-ai/guidance

    2. https://github.com/sgl-project/sglang

  • Show HN: Fructose, LLM calls as strongly typed functions
    10 projects | news.ycombinator.com | 6 Mar 2024
  • LiteLlama-460M-1T has 460M parameters trained with 1T tokens
    1 project | news.ycombinator.com | 7 Jan 2024
    Or combine it with something like llama.cpp's grammar support or Microsoft's guidance-ai[0] (which I prefer), which would allow adding some ReAct-style prompting and external tools. As others have mentioned, instruct tuning would help too.

    [0] https://github.com/guidance-ai/guidance

  • Forcing AI to Follow a Specific Answer Pattern Using GBNF Grammar
    2 projects | /r/LocalLLaMA | 10 Dec 2023
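
The StructuredRAG comment above notes that these frameworks expose functions that take JSON or Pydantic schemas. Below is a minimal sketch of that path with guidance, assuming its json helper accepts a Pydantic model as the linked docs suggest; the model path and schema fields are hypothetical, and exact names may differ between guidance versions.

    # Sketch: constrained JSON generation with guidance + Pydantic.
    # Assumption: guidance's `json` helper accepts a Pydantic model, as the
    # docs linked above suggest; the GGUF path and fields are hypothetical.
    from pydantic import BaseModel
    from guidance import models, json as gen_json

    class Answer(BaseModel):
        rating: int
        summary: str

    # A locally loaded model is needed so the library can mask token
    # logits during decoding (the "forcing valid JSON" part).
    lm = models.LlamaCpp("path/to/model.gguf")

    lm += "Review: 'Great tool, a bit slow to install.'\n"
    lm += "Extract a structured answer as JSON:\n"
    lm += gen_json(name="answer", schema=Answer)

    print(lm["answer"])  # generated text constrained to the Answer schema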
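
Similarly, the "What We Learned" and LiteLlama comments above mention using guidance with direct model access for grammar-constrained, ReAct-style prompting. The sketch below illustrates that idea with guidance's select and gen primitives; the tool names, prompt wording, and model path are invented for illustration, and the exact API may vary by version.

    # Sketch: ReAct-flavoured prompting with guidance's select/gen primitives.
    # Tool names, prompt wording, and the model path are illustrative only.
    from guidance import models, gen, select

    lm = models.LlamaCpp("path/to/model.gguf")  # hypothetical local model

    lm += "Question: What is 17 * 24?\n"
    lm += "Thought: "
    lm += gen(name="thought", stop="\n")
    # Constrain the next step to a fixed set of tool names.
    lm += "\nAction: "
    lm += select(["calculator", "search", "finish"], name="action")
    # Constrain the tool argument to arithmetic characters via a regex.
    lm += "\nAction input: "
    lm += gen(name="args", regex=r"[0-9+*/() -]+", stop="\n")

    print(lm["action"], lm["args"])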

Stats

Basic guidance repo stats
Mentions: 29
Stars: 20,206
Activity: 9.4
Last commit: 5 days ago
