llm-local

By PeterHedman

Llm-local Alternatives

Similar projects and alternatives to llm-local

  • askai

Command-line interface for OpenAI ChatGPT (by yudax42)

  • private-gpt

    Interact with your documents using the power of GPT, 100% privately, no data leaks

  • LocalAI

    The free, open-source OpenAI alternative. Self-hosted, community-driven, and local-first. A drop-in replacement for OpenAI that runs on consumer-grade hardware; no GPU required. Runs gguf, transformers, diffusers, and many more model architectures, and can generate text, audio, video, and images, with voice-cloning capabilities.

  • localGPT

    Chat with your documents on your local device using GPT models. No data leaves your device and 100% private.

  • awesome-gpt

    🏆 An awe-inspiring collection of resources, encompassing a wide range of tools, documents, resources, applications, and use cases related to ChatGPT.

NOTE: The number of mentions on this list reflects mentions in common posts plus user-suggested alternatives, so a higher number suggests a better llm-local alternative or higher similarity.

llm-local reviews and mentions

Posts with mentions or reviews of llm-local. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-07-23.
  • Ask HN: How do I train a custom LLM/ChatGPT on my own documents?
    8 projects | news.ycombinator.com | 23 Jul 2023
    I've used https://github.com/PromtEngineer/localGPT for this and thought it was nice, so I packaged it in a Docker container for easy use.

    docker run -itd --gpus all -p ${PORT}:5111 --name llm-local-wizardlm-7b obald/llm-launcher:0.0.2

    Just open localhost:<port> in the browser, upload your docs, then ask questions in the GUI.

    Really nice for easy lookup of rules in board games and such, as it provides the relevant text from the docs in addition to the answer to your query.

    https://gitlab.com/PeterHedman/llm-local
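A note on the port mapping in the command above: `-p ${PORT}:5111` binds a host port of your choosing to port 5111, the port the app listens on inside the container, so the GUI URL depends on the host side. A minimal sketch of how the mapping is built (the host port 8080 is an arbitrary example, not from the original post):

```shell
PORT=8080                     # host port of your choice (assumption: 8080 is free)
MAPPING="${PORT}:5111"        # host:container pair, as passed to docker's -p flag

# Print the resulting launch command and the URL to open afterwards.
echo "docker run -itd --gpus all -p ${MAPPING} --name llm-local-wizardlm-7b obald/llm-launcher:0.0.2"
echo "GUI at http://localhost:${PORT}"
```

Note that `${PORT}` (variable expansion) is required here; `$(PORT)` would instead try to execute a command named `PORT`.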


