Show HN: Magentic – Use LLMs as simple Python functions

This page summarizes the projects mentioned and recommended in the original post on news.ycombinator.com

  • magentic

    Seamlessly integrate LLMs as Python functions

  • 1) The OpenAI API will be queried each time a "prompt-function" is called in Python code. If you provide the `functions` argument in order to use function-calling, magentic will not execute the function the LLM has chosen; instead it returns a `FunctionCall` instance which you can validate before calling.
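The validate-before-calling pattern above can be sketched as follows. This is a minimal stand-in class, not magentic's own `FunctionCall`, and `delete_file` is a hypothetical function the LLM might choose:

```python
from dataclasses import dataclass, field
from typing import Any, Callable

@dataclass
class FunctionCall:
    """Stand-in for magentic's FunctionCall: holds the function the
    LLM chose plus its arguments, and executes nothing until called."""
    function: Callable[..., Any]
    arguments: dict[str, Any] = field(default_factory=dict)

    def __call__(self) -> Any:
        return self.function(**self.arguments)

def delete_file(path: str) -> str:
    # Hypothetical side-effecting function registered with the LLM.
    return f"deleted {path}"

# Pretend the LLM picked delete_file; validate before executing.
call = FunctionCall(delete_file, {"path": "/tmp/scratch.txt"})
result = None
if call.function is delete_file and call.arguments["path"].startswith("/tmp/"):
    result = call()  # only now does the function actually run
```

The point of the indirection is that side effects stay under the caller's control: nothing happens between the LLM's choice and the explicit `call()`.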

    2) I haven't measured the additional latency, but it should be negligible compared to the LLM's generation speed. And since magentic makes it easy to use streaming and async functions, you might achieve much faster generation overall - see the Async section in the README. Token usage should also be a negligible change from calling the OpenAI API directly: the only "prompting" magentic currently does is in naming the functions sent to OpenAI; all other input tokens are written by the user. A user switching from explicitly defining the output schema in the prompt to using function-calling via magentic might actually save a few tokens.
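The speedup from async comes from overlapping the per-request latency. A rough sketch, with a stub coroutine standing in for a real async prompt-function (a real one would await an OpenAI chat-completion request):

```python
import asyncio

async def describe(city: str) -> str:
    """Stub for an async prompt-function; the sleep simulates
    network + generation latency of an LLM call."""
    await asyncio.sleep(0.01)
    return f"{city}: a city"

async def main() -> list[str]:
    # Awaiting the calls concurrently overlaps their latency,
    # so total wall time is ~one call, not the sum of all calls.
    return await asyncio.gather(*(describe(c) for c in ["Paris", "Tokyo"]))

results = asyncio.run(main())
```

With N independent prompts, `asyncio.gather` makes the wall-clock cost roughly that of the slowest single request rather than the sum.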

    3) Output is not deterministic, even with `temperature=0`, but since we're working with Python functions one option is to just add the `@cache` decorator. This saves you tokens and time when calling the same prompt-function with the same inputs.
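A minimal sketch of that, using `functools.cache` on a stub rather than a real prompt-function (the counter shows the second identical call never reaches the "API"):

```python
from functools import cache

api_calls = 0

@cache
def summarize(text: str) -> str:
    """Stub for a prompt-function; a real one would hit the OpenAI API.
    The counter tracks how many times the body actually runs."""
    global api_calls
    api_calls += 1
    return text.upper()

summarize("hello world")
summarize("hello world")  # served from the cache, no second call
```

Note that `@cache` requires hashable arguments and memoizes for the life of the process; `functools.lru_cache(maxsize=...)` bounds memory if the input space is large.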

    ---

    1) https://github.com/jackmpcollins/magentic#usage

  • marvin

    ✨ Build AI interfaces that spark joy

  • Yes, similar ideas. Marvin [asks the LLM to mimic the Python function](https://github.com/PrefectHQ/marvin/blob/f37ad5b15e2e77dd998...), whereas in magentic the function signature just represents the inputs/outputs of the prompt-template/LLM, so the LLM “doesn’t know” that it is pretending to be a Python function - you specify all the prompts.

  • lmql

    A language for constraint-guided and efficient LLM programming.

  • This is also similar in spirit to LMQL

    https://github.com/eth-sri/lmql

  • outlines

    Structured Text Generation

  • Right now it works only with OpenAI chat models (gpt-3.5-turbo, gpt-4), but if there's interest I plan to extend it to have several backends. These would probably each be an existing library that implements structured output generation, like https://github.com/outlines-dev/outlines or https://github.com/guidance-ai/guidance. If you have ideas about how this should be done, let me know - a GitHub issue would be great to make it visible to others.

  • guidance

    A guidance language for controlling large language models.

  • cria

    OpenAI compatible API for serving LLAMA-2 model

  • openai-functools

    openai-functools: Simplified Generation of OpenAI Functions JSON Metadata for OpenAI Function Calling

  • Very cool! At first the title reminded me of a project my colleague and I are working on called OpenAI-Functools [1], but your concept is quite the opposite: combining LLMs into your code rather seamlessly instead of the other way around. Quite cool, and interesting examples :)

    I’ll definitely try to apply it in one of my pet projects. Good stuff

    [1] https://github.com/Jakob-98/openai-functools

  • vanna

    🤖 Chat with your SQL database 📊. Accurate Text-to-SQL Generation via LLMs using RAG 🔄.

  • Nice! I’m going to try it out and possibly integrate it into my Python package: https://vanna.ai

  • LocalAI

    :robot: The free, open-source OpenAI alternative. Self-hosted, community-driven, and local-first. Drop-in replacement for OpenAI running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers, and many more model architectures. It can generate text, audio, video, and images, and supports voice cloning.

  • There's also LocalAI[0] which allows the use of local LLMs with an OpenAI compatible API.

    [0] https://github.com/go-skynet/LocalAI

  • antiscope

    experimental model-based language structures

  • See also: `antiscope`, an experiment in subjunctive programming

    https://github.com/MillionConcepts/antiscope
