# files-to-prompt alternatives

Similar projects and alternatives to files-to-prompt
- **cline**: Autonomous coding agent right in your IDE, capable of creating/editing files, executing commands, using the browser, and more, with your permission every step of the way.
- **CodeRabbit**: AI code reviews for developers. Offers PR summaries, code walkthroughs, 1-click suggestions, and AST-based analysis across all major languages.
- **repopack** (discontinued): Packs your entire repository into a single, AI-friendly file for feeding a codebase to LLMs such as Claude, ChatGPT, and Gemini. [Moved to: https://github.com/yamadashy/repomix]
- **1filellm** (discontinued): Specify a GitHub or local repo, GitHub pull request, arXiv or Sci-Hub paper, YouTube transcript, or documentation URL, and scrape it into a text file and the clipboard for easier LLM ingestion. [Moved to: https://github.com/jimmc414/onefilellm]
- **swarm**: Educational framework exploring ergonomic, lightweight multi-agent orchestration. Managed by the OpenAI Solution team.
- **repo2file**: Dump selected files from your repo into a single file for easy use in LLMs (Claude, OpenAI, etc.).
- **gitingest**: Replace "hub" with "ingest" in any GitHub URL to get a prompt-friendly extract of a codebase.
- **code2prompt**: A CLI tool to convert your codebase into a single LLM prompt, with source tree, prompt templating, and token counting.
- **ingest**: Parse files (e.g. code repos) and websites to the clipboard or a file for ingestion by AI/LLMs.
- **shell-tooling**: A fun and nerdy collection of bash aliases and scripts to make your workflow smoother than butter on a hot pancake.
- **repomix**: 📦 Repomix (formerly Repopack) packs your entire repository into a single, AI-friendly file. Perfect for feeding your codebase to LLMs such as Claude, ChatGPT, DeepSeek, Perplexity, Gemini, Gemma, Llama, and Grok.
- **your-source-to-prompt.html**: Quickly and securely turn your code projects into LLM prompts, all locally on your own machine.
- **jsonltui**: A fast TUI application (with an optional web UI) to visually navigate and inspect JSON and JSONL data. Easily localize parse errors in large JSONL files. Made with LLM fine-tuning workflows in mind.
- **claude-artifact-runner**: A template project for easily converting Claude AI's Artifacts into React applications, ready to run out of the box or to extend as needed.
## files-to-prompt discussion

files-to-prompt reviews and mentions
- **How I use LLMs as a staff engineer**

  I'm in your boat with having to write a significant number of English documents. I always write them myself, and have ChatGPT analyze them as well. I just had a thought: I wonder if I could paste in technical documentation and code to validate my documentation? I'll have to try that later.

  Copilot is used for simple boilerplate code, and also for autocomplete. It's often a starting point for unit tests (but a thorough review is needed; you can't just accept it, as I've seen it misinterpret code). I started experimenting with RA.Aid (https://github.com/ai-christianson/RA.Aid) after seeing a post on it here today. The multi-step actions are very promising. I'm about to try files-to-prompt (https://github.com/simonw/files-to-prompt), mentioned elsewhere in the thread.

  For now, LLMs are a level-up in tooling, but not a replacement for developers (at least not yet).
- **Yek: Serialize your code repo (or part of it) to feed into any LLM**

  I have to add https://github.com/simonw/files-to-prompt as a marker guide.

  I think "or part of it" is key here. For packaging a codebase, I'll select a collection of files using rg/fzf and then concatenate them into a markdown document, with `#` headers for the paths and fenced code blocks (tagged with the filetype) for the contents.

  The selection of files is key to letting the LLM focus on what is important for the immediate task. I'll also give it the full file list and have the LLM request files as needed.
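The packing step described in that comment (one `#` header per path, contents in a filetype-tagged fence) is simple to script. A minimal sketch in Python; the function name, output filename, and example paths are illustrative, not part of any of the tools mentioned:

```python
import pathlib


def pack_files(paths, out="context.md"):
    """Concatenate the given files into one markdown document:
    a '# <path>' header plus a fenced code block (tagged with the
    file extension) for each file."""
    parts = []
    for p in map(pathlib.Path, paths):
        lang = p.suffix.lstrip(".")  # e.g. "py" for main.py
        parts.append(f"# {p}\n\n```{lang}\n{p.read_text()}\n```\n")
    pathlib.Path(out).write_text("\n".join(parts))
    return out
```

In practice the list of paths would come from an interactive picker such as `rg --files | fzf --multi`, as the comment describes.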
- **Things we learned about LLMs in 2024**

  I use my https://github.com/simonw/files-to-prompt tool like this:

      files-to-prompt . -e py -e md -c | pbcopy
- **Show HN: Source to Prompt: turn your code into an LLM prompt with more features**

  Your Source to Prompt: turn your code into an LLM prompt, but with way more features!

  I just made this useful tool as a single HTML file that lets you easily turn your coding projects into targeted single text files for use in LLM prompts for AI-aided development. Unlike the many other existing competing projects, to name just a few:

  1. [files-to-prompt](https://github.com/simonw/files-to-prompt)
- **Show HN: Replace "hub" with "ingest" in GitHub URLs for a prompt-friendly extract**

  If I understand correctly, this sounds like https://github.com/simonw/files-to-prompt/. It's quite useful, with some filtering options (hidden files, gitignore, extensions) and support for Claude-style tags.
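The "Claude-style tags" mentioned here refer to wrapping each file in an XML-ish document structure, which Anthropic's prompt guidance recommends for multi-document prompts. A rough sketch of the idea; the function name and exact tag layout are illustrative (see the files-to-prompt README for its actual `-c` output):

```python
def to_claude_xml(files):
    """Wrap (path, contents) pairs in a <documents>/<document>
    structure similar to what Anthropic recommends for
    multi-document prompts."""
    chunks = ["<documents>"]
    for i, (path, text) in enumerate(files, 1):
        chunks.append(
            f'<document index="{i}">\n'
            f"<source>{path}</source>\n"
            f"<document_contents>\n{text}\n</document_contents>\n"
            "</document>"
        )
    chunks.append("</documents>")
    return "\n".join(chunks)
```

Indexed tags like these make it easy for the model (and the prompt author) to refer back to a specific file by name or number.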
- **Everything I built with Claude Artifacts this week**

  Yes, I paste stuff in from larger projects all the time. I'm very selective about what I give them. For example, if I'm working on a Django project I'll paste in just the Django ORM models for the part of the codebase I'm working on; that's enough for it to spit out forms, views, and templates, and it doesn't need to know about other parts of the codebase.

  Another trick I sometimes use is Claude Projects, which lets you paste up to 200,000 tokens into persistent context for a model. That's enough to fit a lot of code, so I've occasionally dumped my entire codebase in there (using my https://github.com/simonw/files-to-prompt/ tool), or selected pieces that are important, like the model and URL definitions.
- **Show HN: Vomitorium – all of your project in 1 text file**

  Similar: https://github.com/simonw/files-to-prompt
- **Show HN: Dump entire Git repos into a single file for LLM prompts**

  Until I saw this post, I wasn't aware of any of those. What makes his better? Since you're asking, I tried these and here's my verdict:

  - [files-to-prompt](https://github.com/simonw/files-to-prompt) (from the GOAT simonw)
- **Anthropic Introduces Claude Projects**

  My https://github.com/simonw/files-to-prompt tool might also be useful here, for turning a bunch of different files into a single file to upload.
## Stats

simonw/files-to-prompt is an open source project licensed under the Apache License 2.0, an OSI-approved license. Its primary programming language is Python.