| | aipl | llm |
|---|---|---|
| Mentions | 4 | 23 |
| Stars | 119 | 2,991 |
| Growth | - | - |
| Activity | 9.2 | 9.4 |
| Last commit | 6 months ago | 4 days ago |
| Language | Python | Python |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity - a relative number indicating how actively a project is being developed; recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects that we are tracking.
aipl
-
Ask HN: Tell us about your project that's not done yet but you want feedback on
AIPL is an "Array-Inspired Pipeline Language", a tiny DSL in Python to make it easier to explore and experiment with AI pipelines.
https://github.com/saulpw/aipl
When you want to run some prompts through an LLM over a dataset, with some preprocessing and/or chaining prompts together, AIPL makes it much easier than writing a Python script.
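For flavor, an AIPL script reads as a linear pipeline of `!` operators with prompt text inline. The sketch below is illustrative only — the operator names and parameters are assumptions and may not match the current repo:

```
!fetch-url
!extract-text
!split maxsize=4000
!format
Summarize the following text:

{input}
!llm
!join
!print
```

Each operator consumes the output of the previous one, so the whole chain stays readable top to bottom.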
-
The Problem with LangChain
Yes! This is why I started working on AIPL. The scripts are much more like recipes: linear, contained in a single file, and self-evident even to people who don't know the language. For instance, here's a multi-level summarizer of a webpage: https://github.com/saulpw/aipl/blob/develop/examples/summari...
The goal is to capture all the knowledge that langchain has in consistent Lego bricks that you can combine and parameterize with your prompts, without langchain's complexity and boilerplate, and without having to learn all the Python libraries and their APIs. It's perfect for prototypes and experiments (like a notebook, as you suggest), and if you find something that really works, you can hand off a single text file to an engineer who can make it work in a production environment.
-
Langchain Is Pointless
I agree, and that's why I've been working on AIPL[0]. Our first v0.1 release should be in the next few days. https://github.com/saulpw/aipl
It's basically just a simple scripting language with array semantics and inline prompt construction, and you can drop into Python any time you like.
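The "array semantics" idea — each step maps over a whole table of rows rather than a single value, so a prompt step fans out across the dataset without an explicit loop — can be sketched in plain Python. This is an illustration of the concept only, not AIPL's actual implementation; all function names here are made up:

```python
# Illustrative sketch of array-style pipeline semantics (not AIPL itself).

def split(rows, sep):
    # one row in -> many rows out
    return [part for row in rows for part in row.split(sep)]

def template(rows, fmt):
    # build one prompt per row (plain string formatting stands in
    # for inline prompt construction)
    return [fmt.format(input=row) for row in rows]

def join(rows, sep):
    # many rows in -> one row out
    return [sep.join(rows)]

pipeline = [
    (split, {"sep": "\n\n"}),
    (template, {"fmt": "Summarize: {input}"}),
    (join, {"sep": "\n"}),
]

def run(pipeline, rows):
    # every operator maps over the full set of rows
    for op, kwargs in pipeline:
        rows = op(rows, **kwargs)
    return rows

result = run(pipeline, ["first para\n\nsecond para"])
```

In a real pipeline the `template` step would feed an LLM call, but the shape of the data flow is the same.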
-
Re-implementing LangChain in 100 lines of code
I also was underwhelmed by langchain, and started implementing my own "AIPL" (Array-Inspired Pipeline Language) which turns these "chains" into straightforward, linear scripts. It's very early days but already it feels like the right direction for experimenting with this stuff. (I'm looking for collaborators if anyone is interested!)
https://github.com/saulpw/aipl
llm
- FLaNK AI - April 22, 2024
-
Show HN: I made a tool to clean and convert any webpage to Markdown
That's a great use case. You might be able to do this if you've got copy and paste available on the command line, with https://github.com/simonw/llm in between: an alias like pdfwtf translating to "paste | llm command | copy".
-
Command R+: A Scalable LLM Built for Business
I added support for this model to my LLM CLI tool via a new plugin: https://github.com/simonw/llm-command-r
So now you can do this:
pipx install llm
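The comment stops at installing the CLI itself. The remaining steps would look roughly like this — the `command-r-plus` model ID and `cohere` key name are assumptions here, so check them against the plugin's README:

```shell
pipx install llm
llm install llm-command-r
llm keys set cohere
llm -m command-r-plus 'Write a haiku about pipelines'
```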
-
The Next Generation of Claude (Claude 3)
If you're willing to use the CLI, Simon Willison's llm library[0] should do the trick.
[0] https://github.com/simonw/llm
- Show HN: I made an app to use local AI as daily driver
-
Localllm lets you develop gen AI apps on local CPUs
I'm not thrilled about https://github.com/GoogleCloudPlatform/localllm/blob/main/ll... calling their Python package "llm" and installing "llm" as a CLI command, when my similar https://llm.datasette.io/ project has that namespace reserved on PyPI already: https://pypi.org/project/llm/
- FLaNK 15 Jan 2024
- Show HN: Simple Script for Enhanced LLM Interaction in Vim
-
Bash One-Liners for LLMs
I've been gleefully exploring the intersection of LLMs and CLI utilities for a few months now - they are such a great fit for each other! The Unix philosophy of piping things together is a perfect fit for how LLMs work.
I've mostly been exploring this with my https://llm.datasette.io/ CLI tool, but I have a few other one-off tools as well: https://github.com/simonw/blip-caption and https://github.com/simonw/ospeak
I'm puzzled that more people aren't loudly exploring this space (LLM+CLI) - it's really fun.
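A couple of illustrative one-liners in that spirit, assuming `llm` is installed and an API key is configured (the prompts are made up; `-s` passes a system prompt):

```shell
# summarize a man page
man tar | llm -s 'Summarize the most useful options'

# ask about the contents of a log file
cat error.log | llm 'What is the likely cause of this error?'
```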
-
Semantic Kernel
Seems nice if you're using C# or Java. It also supports Python, but for that Simon's llm library is nice because he designed it as both a library and a command-line tool: https://github.com/simonw/llm
What are some alternatives?
modelfusion - The TypeScript library for building AI applications.
ollama - Get up and running with Llama 3, Mistral, Gemma, and other large language models.
hamilton - Hamilton helps data scientists and engineers define testable, modular, self-documenting dataflows that encode lineage and metadata. Runs and scales everywhere Python does.
langroid - Harness LLMs with Multi-Agent Programming
multi-gpt - A Clojure interface into the GPT API with advanced tools like conversational memory, task management, and more
exllama - A more memory-efficient rewrite of the HF transformers implementation of Llama for use with quantized weights.
haystack - LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.
llm-gpt4all - Plugin for LLM adding support for the GPT4All collection of models
jehuty - Fluent API to interact with chat based GPT model
llm-api - Fully typed & consistent chat APIs for OpenAI, Anthropic, Groq, and Azure's chat models for browser, edge, and node environments.
llm-replicate - LLM plugin for models hosted on Replicate