| | llmo | deeplake |
|---|---|---|
| Mentions | 3 | 13 |
| Stars | 40 | 7,751 |
| Growth | - | 1.6% |
| Activity | 6.3 | 9.8 |
| Latest commit | 12 months ago | 7 days ago |
| Language | Python | Python |
| License | Apache License 2.0 | Mozilla Public License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
llmo
-
Six tips for better coding with ChatGPT
Aider is such an awesome project! I didn't know about it until I read this comment. I also wanted a way to provide my code as context from within the terminal without having to copy and paste back and forth. The tool I wrote (llmo) seems pretty similar to yours, although it uses the Textual library and Rich.
https://github.com/knowsuchagency/llmo
I'm really excited to try out aider, thanks for making it!
-
Show HN: LLMO – An LLM pair programmer in your terminal
Hello HN!
LLMO (Elmo) is an AI pair programming tool I created that's become an indispensable part of my workflow.
https://github.com/knowsuchagency/llmo
LLMO is designed to meet you where you are – your terminal. It provides real-time, interactive programming assistance. With its "staging area" feature, you can keep files in the context window and update the AI about your ongoing coding tasks without the hassle of copying and pasting every time you make changes to your code.
Key features include:
- Interactive Chat: Get real-time programming assistance directly in your terminal.
- Staging Area: No need to copy and paste updates. Simply add your files to the AI's context.
- Model Customization: Choose the OpenAI model that fits your needs.
- Personality: By default, Elmo loves to make bodybuilding references. You can turn this off through a CLI flag or environment variable.
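The staging-area idea is simple: keep a set of files that gets re-read and rendered into the model's context on every request, so edits in your editor are picked up automatically. A toy sketch of that idea (hypothetical class and method names, not llmo's actual internals):

```python
from pathlib import Path

class StagingArea:
    """Toy model of a staging area: track files and re-read their
    current contents into the prompt context for each request."""

    def __init__(self):
        self.files = set()

    def add(self, path):
        self.files.add(Path(path))

    def remove(self, path):
        self.files.discard(Path(path))

    def render_context(self):
        # Re-reading on every call means recent edits show up automatically,
        # so there is no copy-and-paste step between the editor and the chat.
        parts = []
        for p in sorted(self.files):
            parts.append(f"# file: {p}\n{p.read_text()}")
        return "\n\n".join(parts)
```

Because `render_context` re-reads the files each time, the AI always sees the latest version of your code without any manual syncing.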
The recommended way to install LLMO is via `pipx install llmo` https://pypa.github.io/pipx/
As a side note, LLMO uses Textual, which runs the terminal in application mode, meaning you can't simply copy content as you normally would. In iTerm2, you can hold down the `option` key to select text. Refer to your own terminal's documentation for more information.
I hope you find LLMO as useful as I have!
- LLMO – An LLM pair programmer in your terminal
deeplake
- FLaNK AI Weekly 25 March 2025
-
Qdrant, the Vector Search Database, raised $28M in a Series A round
I think Activeloop (YC) is too: https://github.com/activeloopai/deeplake/
-
[P] I built a Chatbot to talk with any Github Repo. 🪄
This repository contains two Python scripts that demonstrate how to create a chatbot using Streamlit, OpenAI GPT-3.5-turbo, and Activeloop's Deep Lake. The chatbot searches a dataset stored in Deep Lake to find relevant information and generates responses based on the user's input.
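Under the hood this is the standard retrieval-augmented pattern: embed the repo's files, store the vectors, and at question time rank the stored chunks by similarity to the query before handing the top hits to the model. A stdlib-only sketch of the ranking step (toy vectors and documents, not Deep Lake's actual implementation):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy (text, embedding) pairs standing in for an embedded repo
# stored in a vector database like Deep Lake.
docs = [
    ("def parse_config(path): ...", [0.9, 0.1, 0.0]),
    ("class HttpClient: ...",       [0.1, 0.8, 0.2]),
    ("README: installation steps",  [0.0, 0.2, 0.9]),
]

def retrieve(query_vec, k=1):
    """Return the k document texts most similar to the query vector."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, d[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

print(retrieve([0.85, 0.15, 0.05]))  # → ['def parse_config(path): ...']
```

In the real chatbot, the retrieved chunks are then packed into the GPT-3.5-turbo prompt along with the user's question, which is what grounds the model's answer in the repo's contents.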
-
[P] Chat With Any GitHub Repo - Code Understanding with @LangChainAI & @activeloopai
Deep Lake GitHub
- [P] A 'ChatGPT Interface' to Explore Your ML Datasets -> app.activeloop.ai
-
Build ChatGPT for Financial Documents with LangChain + Deep Lake
As the world generates increasingly vast amounts of financial data, the need for advanced tools to analyze and make sense of it has never been greater. This is where LangChain and Deep Lake come in, offering a powerful combination of technology for building a question-answering tool on top of financial data. After participating in a LangChain hackathon last week, I created a way to use Deep Lake, the data lake for deep learning (a package my team and I are building), with LangChain. I put together a guide of sorts on how you can approach building your own question-answering tools with LangChain and Deep Lake as the data store.
-
Launch HN: Activeloop (YC S18) – Data lake for deep learning
Re: HF - we know them and admire their work (primarily, until very recently, focused on NLP, while we focus mostly on CV). As mentioned in the post, a large part of Deep Lake, including the Python-based dataloader and dataset format, is open source as well - https://github.com/activeloopai/deeplake.
Likewise, we curate a list of large open source datasets here -> https://datasets.activeloop.ai/docs/ml/, but our main thing isn't aggregating datasets (focus for HF datasets), but rather providing people with a way to manage their data efficiently. That being said, all of the 125+ public datasets we have are available in seconds with one line of code. :)
We haven't benchmarked against HF datasets in a while, but Deep Lake's dataloader is much, much faster in third-party benchmarks (see https://arxiv.org/pdf/2209.13705; for an older version that was much slower than what we have now, see https://pasteboard.co/la3DmCUR2iFb.png). HF under the hood uses Git-LFS (to the best of my knowledge) and is not opinionated on formats, so LAION just dumps Parquet files on their storage.
While your setup would work for a few TBs, scaling to PBs would be tricky, including maintaining your own infrastructure. And yep, as you said, NAS/NFS wouldn't be able to handle the scale either (especially writes with 1k workers). I am also slightly curious about your use of mmap'd files with compressed image/video data, since zero-copy won't happen unless you decompress inside the GPU ;) but would love to learn more from you! Re: pricing, thanks for the feedback; storage is one component and is custom-priced for PB-scale workloads.
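For readers who want to run this kind of dataloader comparison themselves, throughput is usually measured by timing how long it takes to drain a fixed number of batches from each loader. A minimal, generic sketch (not the harness from the paper linked above):

```python
import time

def benchmark(loader, n_batches):
    """Return seconds taken to pull n_batches from an iterable dataloader.
    Iteration time includes fetch, decode, and collate work, which is
    what actually matters for keeping GPUs fed."""
    it = iter(loader)
    start = time.perf_counter()
    for _ in range(n_batches):
        next(it)
    return time.perf_counter() - start

# Stand-in loader: any iterable of batches works here
# (e.g. a Deep Lake or HF datasets dataloader in practice).
toy_loader = (list(range(32)) for _ in range(100))
elapsed = benchmark(toy_loader, 50)
```

Comparing two loaders this way on identical data and batch sizes gives a rough but fair wall-clock comparison; for rigorous numbers you would also warm caches and average over several runs.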
-
[P] Launching Deep Lake: the data lake for deep learning applications - https://activeloop.ai/
Deep Lake is fresh off the "press", so we would really appreciate your feedback here or in our community, or a star on GitHub. If you're interested in learning more, you can read the Deep Lake academic paper or the whitepaper (which talks more about our vision!).
-
Researchers at Activeloop AI Introduce ‘Deep Lake,’ an Open-Source Lakehouse for Deep Learning Applications
Check out the paper and GitHub.
GitHub: https://github.com/activeloopai/deeplake
What are some alternatives?
embedchain - Personalizing LLM Responses
lance - Modern columnar data format for ML and LLMs implemented in Rust. Convert from Parquet in 2 lines of code for 100x faster random access, vector index, and data versioning. Compatible with Pandas, DuckDB, Polars, and PyArrow, with more integrations coming.
easy-chat - A ChatGPT UI for young readers, written by ChatGPT
auto-maple - Artificial intelligence software for MapleStory that uses various machine learning and computer vision techniques to navigate challenging in-game environments
bloop - bloop is a fast code search engine written in Rust.
tensorstore - Library for reading and writing large multi-dimensional arrays.
promptflow - Build high-quality LLM apps - from prototyping, testing to production deployment and monitoring.
langchain - ⚡ Building applications with LLMs through composability ⚡ [Moved to: https://github.com/langchain-ai/langchain]
barfi - Python Flow Based Programming environment that provides a graphical programming environment.
super-image - Image super resolution models for PyTorch.
GPflow - Gaussian processes in TensorFlow
cascade - Lightweight and modular MLOps library targeted at small teams or individuals