| | hamilton | minGPT |
|---|---|---|
| Mentions | 21 | 35 |
| Stars | 1,321 | 18,875 |
| Growth | 3.7% | - |
| Activity | 9.8 | 0.0 |
| Latest commit | 6 days ago | 7 days ago |
| Language | Jupyter Notebook | Python |
| License | GNU General Public License v3.0 or later | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
hamilton
- Show HN: Hamilton's UI – observability, lineage, and catalog for data pipelines
-
Building an Email Assistant Application with Burr
Note that this uses simple OpenAI calls — you can replace this with Langchain, LlamaIndex, Hamilton (or something else) if you prefer more abstraction, and delegate to whatever LLM you like to use. And you should probably use something a little more concrete (e.g. Instructor) to guarantee output shape.
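A minimal stdlib sketch of the "guarantee output shape" point (the `EmailDraft` fields and `parse_reply` helper are hypothetical, not Burr's or Instructor's API): validate the model's raw JSON reply before the rest of the application touches it.

```python
import json
from dataclasses import dataclass, fields

@dataclass
class EmailDraft:
    subject: str
    body: str

def parse_reply(raw: str) -> EmailDraft:
    """Check that the model's raw JSON reply has exactly the fields
    EmailDraft expects; raise instead of passing junk downstream."""
    data = json.loads(raw)
    expected = {f.name for f in fields(EmailDraft)}
    if set(data) != expected:
        raise ValueError(f"expected keys {expected}, got {set(data)}")
    return EmailDraft(**data)

draft = parse_reply('{"subject": "Re: demo", "body": "Sounds good!"}')
```

Libraries like Instructor do essentially this validation (plus retries) for you against a Pydantic model.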
-
Using IPython Jupyter Magic commands to improve the notebook experience
In this post, we’ll show how your team can turn any utility function(s) into reusable IPython Jupyter magics for a better notebook experience. As an example, we’ll use Hamilton, my open source library, to motivate the creation of a magic that facilitates better development ergonomics for using it. You needn’t know what Hamilton is to understand this post.
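As a hedged sketch of the pattern the post describes (the names here are illustrative, and this uses IPython's standard custom-magic mechanism rather than Hamilton's actual magic): any plain utility function can be wrapped in a line magic so it reads naturally inside a notebook cell.

```python
from IPython.core.magic import Magics, magics_class, line_magic

def summarize(obj):
    """Plain utility function: usable anywhere, magic or not."""
    return f"{type(obj).__name__} with {len(obj)} items"

@magics_class
class MyMagics(Magics):
    @line_magic
    def summarize_var(self, line):
        # Usage in a notebook:  %summarize_var my_list
        # Looks the variable name up in the notebook's namespace.
        return summarize(self.shell.user_ns[line.strip()])

# In a running notebook you would register the class with:
#   get_ipython().register_magics(MyMagics)
```

Keeping the real logic in `summarize` means the magic is just a thin namespace-lookup shim, so the utility stays testable outside Jupyter.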
-
FastUI: Build Better UIs Faster
We built an app with it -- https://blog.dagworks.io/p/building-a-lightweight-experiment. You can see the code here https://github.com/DAGWorks-Inc/hamilton/blob/main/hamilton/....
Usually we've been prototyping with Streamlit, but found it at times to be clunky. FastUI still has rough edges, but we made it work for our lightweight app.
- Show HN: On Garbage Collection and Memory Optimization in Hamilton
-
Facebook Prophet: library for generating forecasts from any time series data
Isn't this library old news? Is there anything noteworthy they've added that makes it worth another spin?
[Disclaimer: I'm a maintainer of Hamilton.] Otherwise, FYI: Prophet gels well with https://github.com/DAGWorks-Inc/hamilton for setting up your features and dataset for fitting & prediction.
- Show HN: Declarative Spark Transformations with Hamilton
-
Langchain Is Pointless
I had been hearing these pains from Langchain users for quite a while. Suffice it to say, I think:
1. too many layers of OO abstractions are a liability in production contexts. I'm biased, but a more functional approach is a better way to model what's going on. It's easier to test, wrap a function with concerns, and therefore reason about.
2. as fast as the field is moving, the layers of abstractions actually hurt your ability to customize without really diving into the details of the framework, or requiring you to step outside it -- in which case, why use it?
Otherwise, I definitely love the small amount of code you need to write to get an LLM application up with Langchain. However, you read code more often than you write it, so this brevity is a trade-off. Would you rather reduce your time debugging a production outage, or your time building the application? There's no right answer other than "it depends".
To that end - we've come up with a post showing how one might use Hamilton (https://github.com/dagWorks-Inc/hamilton) to easily create a workflow to ingest data into a vector database that I think has a great production story. https://open.substack.com/pub/dagworks/p/building-a-maintain...
Note: Hamilton can cover your MLOps as well as LLMOps needs; you'll invariably be connecting LLM applications with traditional data/ML pipelines because LLMs don't solve everything -- but that's a post for another day.
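A minimal sketch of the functional style the comment above advocates (the function names and the toy resolver are illustrative, not Hamilton's actual API): each pipeline step is a plain function, and its parameter names declare which other steps' outputs it consumes, so every step can be unit-tested in isolation.

```python
import inspect

# Each step is a plain function; its parameter names name its inputs.
def raw_text(document: str) -> str:
    return document.strip()

def tokens(raw_text: str) -> list:
    return raw_text.split()

def token_count(tokens: list) -> int:
    return len(tokens)

STEPS = {f.__name__: f for f in (raw_text, tokens, token_count)}

def resolve(name, inputs, cache=None):
    """Toy dependency resolver: compute `name` by recursively
    resolving each parameter from STEPS or the provided inputs."""
    cache = {} if cache is None else cache
    if name in cache:
        return cache[name]
    if name in inputs:
        return inputs[name]
    fn = STEPS[name]
    params = inspect.signature(fn).parameters
    kwargs = {p: resolve(p, inputs, cache) for p in params}
    cache[name] = fn(**kwargs)
    return cache[name]

result = resolve("token_count", {"document": "  hello hamilton world  "})
```

Hamilton itself wires functions together by parameter name in roughly this spirit (via its `driver.Driver`); the resolver above is purely to illustrate why the functional approach is easy to test and reason about.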
-
Free access to beta product I'm building that I'd love feedback on
This is me. I maintain Hamilton, an open source library that people doing time-series/ML work love to use. I'm building a paid product around it at DAGWorks, and I'm after feedback on our current version. Can I entice anyone to:
-
IPyflow: Reactive Python Notebooks in Jupyter(Lab)
From a nuts and bolts perspective, I've been thinking of building some reactivity on top of https://github.com/dagworks-inc/hamilton (author here) that could get at this. (If you have a use case that could be documented, I'd appreciate it.)
minGPT
- FLaNK AI Weekly for 29 April 2024
-
Ask HN: Daily practices for building AI/ML skills?
minGPT (Karpathy): https://github.com/karpathy/minGPT
Next, some foundational textbooks for general ML and deep learning:
-
[D] What are some examples of being clever with batching for training efficiency?
Language Model novice here. I was going through the README section of minGPT and read this line.
-
LLM Visualization: 3D interactive model of a GPT-style LLM network running inference.
The first network displayed with working weights is a tiny such network, which sorts a small list of the letters A, B, and C. This is the demo example model from Andrej Karpathy's minGPT implementation.
- LLM Visualization
- Learn Machine Learning
-
Facebook Prophet: library for generating forecasts from any time series data
Tried it once. Its promise is to take the dataset's seasonal trend into account, which makes sense for Facebook's original use case.
We ran it on such a dataset and found out that directly using https://github.com/karpathy/minGPT consistently gives a better result. So we ended up using the output of Prophet as an input feature to a neural network, but the result was not improved in any significant way.
-
Tokenization of numerical series
Sure. I'm trying to regenerate a bunch of complex numbers based on their absolute values. I'm trying to embed these absolute values and then, using a GPT model (probably minGPT), recover the original complex numbers. There is a certain connection between these complex numbers and their order that I'm not yet able to explain. I'm hoping that, through training, the model will recognize certain sequences of these absolute values and match them with the desired complex counterparts.
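One common way to feed continuous magnitudes to a GPT-style model is to quantize them into a discrete vocabulary. A hypothetical uniform-binning sketch (none of this comes from the thread; real setups often use quantile or learned binning instead):

```python
def make_tokenizer(values, n_bins=16):
    """Build encode/decode functions that map continuous values
    into n_bins discrete token ids by uniform binning."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins or 1.0
    def encode(x):
        # Clamp into [0, n_bins - 1] so the max value stays in range.
        return min(int((x - lo) / width), n_bins - 1)
    def decode(token):
        # Map a token back to its bin centre (lossy by construction).
        return lo + (token + 0.5) * width
    return encode, decode

# Magnitudes of some complex numbers: 5.0, 1.0, 2.0, 13.0
mags = [abs(z) for z in [3 + 4j, 1 + 0j, 0 + 2j, 5 + 12j]]
encode, decode = make_tokenizer(mags, n_bins=4)
token_ids = [encode(m) for m in mags]
```

The decode step is lossy, which is why bin count (vocabulary size) trades reconstruction precision against sequence-model difficulty.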
-
Anyone know of any articles on training a LLM from scratch on a single GPU?
minGPT (https://github.com/karpathy/minGPT)
-
Understanding LLMs(to the best of our knowledge)
Check out minGPT and nanoGPT from Karpathy; he puts out some of the best machine learning tutorials and teaching content.
What are some alternatives?
dagster - An orchestration platform for the development, production, and observation of data assets.
nanoGPT - The simplest, fastest repository for training/finetuning medium-sized GPTs.
haystack - LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.
gpt-2 - Code for the paper "Language Models are Unsupervised Multitask Learners"
tree-of-thought-llm - [NeurIPS 2023] Tree of Thoughts: Deliberate Problem Solving with Large Language Models
simpletransformers - Transformers for Information Retrieval, Text Classification, NER, QA, Language Modelling, Language Generation, T5, Multi-Modal, and Conversational AI
snowpark-python - Snowflake Snowpark Python API
Pytorch-Simple-Transformer - A simple transformer implementation without difficult syntax and extra bells and whistles.
aipl - Array-Inspired Pipeline Language
nn-zero-to-hero - Neural Networks: Zero to Hero
vscode-reactive-jupyter - A simple Reactive Python Extension for Visual Studio Code
huggingface_hub - The official Python client for the Huggingface Hub.