Top 23 Python GPT Projects
-
MetaGPT
🌟 The Multi-Agent Framework: First AI Software Company, Towards Natural Language Programming
-
RWKV-LM
RWKV is an RNN with transformer-level LLM performance. It can be directly trained like a GPT (parallelizable). So it's combining the best of RNN and transformer - great performance, fast inference, saves VRAM, fast training, "infinite" ctx_len, and free sentence embedding.
-
h2ogpt
Private chat with local GPT with document, images, video, etc. 100% private, Apache 2.0. Supports oLLaMa, Mixtral, llama.cpp, and more. Demo: https://gpt.h2o.ai/ https://codellama.h2o.ai/
-
petals
🌸 Run LLMs at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading
-
promptflow
Build high-quality LLM apps - from prototyping and testing to production deployment and monitoring.
-
VALL-E-X
An open source implementation of Microsoft's VALL-E X zero-shot TTS model. Demo is available in https://plachtaa.github.io
-
awesome-pretrained-chinese-nlp-models
Awesome Pretrained Chinese NLP Models: a collection of high-quality Chinese pretrained models, large models, multimodal models, and large language models
Project mention: gpt4-openai-api VS gpt4free - a user suggested alternative | libhunt.com/r/gpt4-openai-api | 2024-01-04
I can't install
Project mention: How to build your Developer Portfolio with MindsDB: The symbiotic relationship between developers and Opensource in 2024. | dev.to | 2024-05-23
Developers can check for issues to fix on MindsDB's GitHub Issues Page. The issues are marked with labels that indicate what you can work on, which you can find here. Fixing bugs shows that you are a problem solver capable of resolving issues. Companies value this capability highly because it directly affects product quality and user experience.
Project mention: AI leaderboards are no longer useful. It's time to switch to Pareto curves | news.ycombinator.com | 2024-04-30I guess the root cause of my claim is that OpenAI won't tell us whether or not GPT-3.5 is an MoE model, and I assumed it wasn't. Since GPT-3.5 is clearly nondeterministic at temp=0, I believed the nondeterminism was due to FPU stuff, and this effect was amplified with GPT-4's MoE. But if GPT-3.5 is also MoE then that's just wrong.
What makes this especially tricky is that small models are truly 100% deterministic at temp=0 because the relative likelihoods are too coarse for FPU issues to be a factor. I had thought 3.5 was big enough that some of its token probabilities were too fine-grained for the FPU. But that's probably wrong.
On the other hand, it's not just GPT, there are currently floating-point difficulties in vllm which significantly affect the determinism of any model run on it: https://github.com/vllm-project/vllm/issues/966 Note that a suggested fix is upcasting to float32. So it's possible that GPT-3.5 is using an especially low-precision float and introducing nondeterminism by saving money on compute costs.
Sadly I do not have the money[1] to actually run a test to falsify any of this. It seems like this would be a good little research project.
[1] Or the time, or the motivation :) But this stuff is expensive.
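The "FPU stuff" explanation above rests on a basic property worth seeing concretely: floating-point addition is not associative, so the same numbers reduced in a different order can differ in the last bits, and a near-tie can then flip a greedy (temp=0) choice. A minimal, purely illustrative sketch; the token names and scores here are made up, not taken from any real model:

```python
# Floating-point addition is not associative: summing the same numbers in a
# different order gives (slightly) different results. In an LLM, reduction
# order can change with hardware, batch size, or kernel choice, so two
# near-tied logits can make even greedy (temp=0) decoding nondeterministic.

a, b, c = 0.1, 0.2, 0.3
left = (a + b) + c    # 0.6000000000000001
right = a + (b + c)   # 0.6

def greedy_pick(logits):
    """temp=0 decoding: take the highest-scoring token."""
    return max(logits, key=logits.get)

# Hypothetical two-token vocabulary with a near-tie; only the summation
# order behind token "a"'s score differs between the two "runs".
run1 = {"b": 0.6, "a": left}
run2 = {"b": 0.6, "a": right}
print(greedy_pick(run1), greedy_pick(run2))  # a b
```

The same effect scaled up (thousands of additions per logit, different reduction trees per batch shape) is one plausible source of the temp=0 nondeterminism discussed above.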
https://github.com/BlinkDL/RWKV-LM#rwkv-discord-httpsdiscord... lists a number of implementations of various versions of RWKV.
https://github.com/BlinkDL/RWKV-LM#rwkv-parallelizable-rnn-w... :
> RWKV: Parallelizable RNN with Transformer-level LLM Performance (pronounced as "RwaKuv", from 4 major params: R W K V)
> RWKV is an RNN with Transformer-level LLM performance, which can also be directly trained like a GPT transformer (parallelizable). And it's 100% attention-free. You only need the hidden state at position t to compute the state at position t+1. You can use the "GPT" mode to quickly compute the hidden state for the "RNN" mode.
> So it's combining the best of RNN and transformer - great performance, fast inference, saves VRAM, fast training, "infinite" ctx_len, and free sentence embedding (using the final hidden state).
> "Our latest version is RWKV-6."
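The dual "GPT mode / RNN mode" property quoted above can be illustrated with a toy linear recurrence. This is a simplified stand-in for the idea, not the actual RWKV equations: sequential evaluation needs only the previous state, while the closed-form sum lets every position be computed independently (and hence in parallel):

```python
# Toy recurrence h_t = decay * h_{t-1} + x_t, evaluated two ways.
# "RNN mode" steps through time with O(1) state; "GPT mode" expands each
# h_t as a weighted sum over the inputs, so every t is independent and
# parallelizable. (Illustrative only; RWKV's real update is more involved.)

def rnn_mode(xs, decay=0.9):
    h, out = 0.0, []
    for x in xs:
        h = decay * h + x  # only the previous hidden state is needed
        out.append(h)
    return out

def gpt_mode(xs, decay=0.9):
    # h_t = sum_{i<=t} decay^(t-i) * x_i  -- each t computed directly
    return [sum(decay ** (t - i) * x for i, x in enumerate(xs[: t + 1]))
            for t in range(len(xs))]

xs = [1.0, 2.0, 3.0, 4.0]
print(rnn_mode(xs))
print(gpt_mode(xs))  # same values (up to float rounding)
```

This mirrors the quoted claim: the "GPT" formulation can precompute the hidden state that the "RNN" formulation then continues from.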
Project mention: "[D]" Using data from Alpaca for a commercial version of a Open LLM | /r/MachineLearning | 2023-07-02
Project mention: Multi AI Agent Systems Using OpenAI's New GPT-4o Model | news.ycombinator.com | 2024-05-17
Things like [petals](https://github.com/bigscience-workshop/petals) exist: distributed computing over willing participants. Right now corporate cash is being rammed into the space, so why not snap it up while you can; but the moment it dries up, projects like petals will see more of the love they deserve.
I envision a future where crypto-style booms happen over tokens useful for purchasing priority computational time, which is earned by providing said computational time. This way researchers can daisy-chain their independent smaller rigs together into something with gargantuan capabilities.
Project mention: A suite of tools designed to streamline the development cycle of LLM-based apps | news.ycombinator.com | 2024-04-12
Project mention: Ask HN: What are the drawbacks of caching LLM responses? | news.ycombinator.com | 2024-03-15
Just found this: https://github.com/zilliztech/GPTCache which seems to address this idea/issue.
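For context, the core idea behind caching LLM responses can be sketched in a few lines. This is a hypothetical exact-match cache, not GPTCache's actual API; GPTCache's main addition is semantic (embedding-based) matching, which is also where the drawbacks asked about tend to come from, since near-miss prompts can return stale or subtly wrong answers:

```python
# Minimal sketch of LLM response caching (hypothetical, not GPTCache's API):
# memoize responses keyed by a hash of the exact prompt text.
import hashlib

class PromptCache:
    def __init__(self):
        self._store = {}

    def _key(self, prompt: str) -> str:
        return hashlib.sha256(prompt.encode("utf-8")).hexdigest()

    def get_or_call(self, prompt: str, llm_call):
        key = self._key(prompt)
        if key not in self._store:       # miss: pay for one real API call
            self._store[key] = llm_call(prompt)
        return self._store[key]          # hit: free and instant

calls = []
def fake_llm(prompt):                    # stand-in for a real API client
    calls.append(prompt)
    return f"answer to: {prompt}"

cache = PromptCache()
cache.get_or_call("What is RWKV?", fake_llm)
cache.get_or_call("What is RWKV?", fake_llm)
print(len(calls))  # 1 -- the repeat request never hit the "API"
```

Swapping the exact-hash key for an embedding similarity lookup turns this into semantic caching, trading extra hit rate for the risk of serving a cached answer to a question that only looks the same.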
Extract from awesome-open-gpt
I've never tried it myself, but Prefect does have something like this with their Marvin AI library for Python.
https://github.com/PrefectHQ/marvin?tab=readme-ov-file#-buil...
Project mention: AI Search That Understands the Way Your Customer's Think | news.ycombinator.com | 2024-05-28
Python GPT related posts
-
Open-source SDK for adding code interpreters to AI apps
-
Elia, a snappy, keyboard-centric terminal UI for interacting with LLMs
-
Multi AI Agent Systems using OpenAI's new GPT-4o Model
-
I'm puzzled how anyone trusts ChatGPT for code
-
Agents of Change: Navigating the Rise of AI Agents in 2024
-
Open-source SDK for adding custom code interpreters to AI apps
-
AI leaderboards are no longer useful. It's time to switch to Pareto curves
Index
What are some of the best open-source GPT projects in Python? This list will help you:
# | Project | Stars |
---|---|---|
1 | gpt4free | 58,313 |
2 | MetaGPT | 40,206 |
3 | MindsDB | 21,531 |
4 | LLaMA-Factory | 22,989 |
5 | vllm | 20,017 |
6 | best-of-ml-python | 15,702 |
7 | DocsGPT | 14,282 |
8 | RWKV-LM | 11,798 |
9 | dolly | 10,790 |
10 | h2ogpt | 10,801 |
11 | awesome-chatgpt-zh | 10,088 |
12 | AudioGPT | 9,826 |
13 | petals | 8,763 |
14 | promptflow | 8,369 |
15 | text-generation-inference | 8,098 |
16 | VALL-E-X | 7,269 |
17 | GPTCache | 6,550 |
18 | awesome-open-gpt | 5,203 |
19 | marvin | 4,844 |
20 | TaskingAI | 5,226 |
21 | awesome-pretrained-chinese-nlp-models | 4,306 |
22 | marqo | 4,219 |
23 | Baichuan2 | 3,972 |