DistiLlama: Chrome Extension to Summarize Web Pages Using locally running LLMs

This page summarizes the projects mentioned and recommended in the original post on /r/LocalLLaMA

  • DistiLlama

    Chrome Extension to Summarize or Chat with Web Pages/Local Documents Using locally running LLMs. Keep all of your data and conversations private. 🔐

  • https://github.com/shreyaskarnik/DistiLlama (feedback, suggestions, and PRs are welcome)

  • ollama

    Get up and running with Llama 3, Mistral, Gemma, and other large language models.

  • I saw that it uses Ollama, but Ollama is not available on Windows yet.
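The extension works by talking to a locally running Ollama server. A minimal sketch of what such a client call looks like, assuming Ollama's default port (11434) and its `/api/generate` endpoint; the model name and prompt wording below are illustrative, not taken from DistiLlama's source:

```python
import json
import urllib.request

def build_payload(text, model="llama2"):
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {
        "model": model,
        "prompt": f"Summarize the following page:\n\n{text}",
        "stream": False,  # request a single JSON response instead of a stream
    }

def summarize(text, model="llama2", host="http://localhost:11434"):
    """Send page text to a local Ollama server and return its summary.

    Requires a running Ollama instance (e.g. `ollama run llama2` first).
    """
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(build_payload(text, model)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Because everything goes through `localhost`, the page text and the generated summary never leave the machine, which is the privacy point the project emphasizes.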

NOTE: The number of mentions on this list indicates mentions on common posts plus user-suggested alternatives. Hence, a higher number means a more popular project.

Suggest a related project

Related posts

  • open-webui VS LibreChat - a user suggested alternative

    2 projects | 29 Feb 2024
  • Apple Silicon Llama 7B running in docker?

    5 projects | /r/LocalLLaMA | 7 Dec 2023
  • Ask HN: How to structure Rust, Axum, and SQLx for clean architecture?

    2 projects | news.ycombinator.com | 7 May 2024
  • Jack Dorsey says that he's not on the Bluesky board anymore

    1 project | news.ycombinator.com | 8 May 2024
  • Run Large and Small Language Models locally with ollama

    2 projects | dev.to | 7 May 2024