| | last_layer | prompttools |
|---|---|---|
| Mentions | 2 | 4 |
| Stars | 87 | 2,500 |
| Growth | - | 2.7% |
| Activity | 7.6 | 9.4 |
| Latest Commit | 6 days ago | about 1 month ago |
| Language | Python | Python |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
last_layer
prompttools
Did GPT-4 really get worse? We built an evaluation framework so you can find out
Here's an example where we compare a few versions of GPT-4 against a locally run Llama 2 model: https://github.com/hegelai/prompttools/blob/main/examples/notebooks/GPT4vsLlama2.ipynb
- Experiment with HuggingFace, OpenAI, and other models using prompttools
- Prompttools: An AGPL-3.0 library for prompt testing and experimentation
prompttools: an open source Python package for prompt engineers
I wanted to share a project I've been working on that I thought might be relevant to you all, prompttools! It's an open source library with tools for testing prompts, creating CI/CD, and running experiments across models and configurations. It uses notebooks and code so it'll be most helpful for folks approaching prompt engineering from a software background.
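The core idea described above is running the same prompts across several models or configurations and comparing the outputs side by side. The sketch below illustrates that experiment loop in plain Python; `mock_complete` is a hypothetical stand-in for a real model call (OpenAI, a local Llama 2, etc.), not the prompttools API itself, which handles this wiring for you.

```python
from itertools import product

# Hypothetical stand-in for a real model call; in prompttools the
# library dispatches to OpenAI, HuggingFace, local models, and so on.
def mock_complete(model: str, prompt: str) -> str:
    return f"[{model}] response to: {prompt}"

def run_experiment(models, prompts):
    """Run every (model, prompt) combination and collect the results."""
    rows = []
    for model, prompt in product(models, prompts):
        rows.append({
            "model": model,
            "prompt": prompt,
            "response": mock_complete(model, prompt),
        })
    return rows

results = run_experiment(
    models=["gpt-4-0314", "gpt-4-0613", "llama-2-7b"],
    prompts=["Is 17077 prime?", "Summarize the plot of Hamlet."],
)
print(len(results))  # 3 models x 2 prompts = 6 result rows
```

In a notebook, a table like `results` is what you would then score or eyeball to decide whether one model version really got worse than another.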
What are some alternatives?
spacy-llm - 🦙 Integrating LLMs into structured NLP pipelines
cerche - Experimental search engine for conversational AI such as parl.ai, large language models such as OpenAI GPT3, and humans (maybe).
graph-of-thoughts - Official Implementation of "Graph of Thoughts: Solving Elaborate Problems with Large Language Models"
ChatGPT-API-Python - Building a Chatbot in Python using OpenAI's Official ChatGPT API
DPL - [NeurIPS 2023] Multi-fidelity hyperparameter optimization with deep power laws that achieves state-of-the-art results across diverse benchmarks.
FlexGen - Running large language models like OPT-175B/GPT-3 on a single GPU. Focusing on high-throughput generation. [Moved to: https://github.com/FMInference/FlexGen]
transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.