Getting hands-on with local LLMs using Ollama

This page summarizes the projects mentioned and recommended in the original post on dev.to

  • ollama

    Get up and running with Llama 3, Mistral, Gemma, and other large language models.

  • In a nutshell, Ollama maintains its own library of models that users can download to their computer and interact with through a simple command line. Alternatively, Ollama runs a local inference server, by default on port 11434, which can be accessed through its HTTP API or via libraries such as LangChain.

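The server interaction described above can be sketched in a few lines of Python. This is a minimal illustration, assuming the standard Ollama endpoint (`/api/generate` on port 11434) and a model such as `llama3` already pulled locally; the model name and prompt are placeholders.

```python
import json
from urllib import request

# Ollama's HTTP API accepts a JSON body at /api/generate.
# "stream": False asks for a single JSON response instead of a token stream.
payload = {
    "model": "llama3",               # placeholder: any model pulled locally
    "prompt": "Why is the sky blue?",
    "stream": False,
}
body = json.dumps(payload).encode("utf-8")

req = request.Request(
    "http://localhost:11434/api/generate",  # Ollama's default port
    data=body,
    headers={"Content-Type": "application/json"},
)

# Sending the request requires a running Ollama server; uncomment to try it:
# with request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

Libraries like LangChain wrap this same API, so the raw request is mainly useful for quick tests or for integrating Ollama into tooling without extra dependencies.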
NOTE: The number of mentions on this list reflects mentions in common posts plus user-suggested alternatives; a higher number therefore indicates a more popular project.

Related posts

  • Practical use Cases of AI and Java

    1 project | dev.to | 6 May 2024
  • Anomaly Detection with FiftyOne and Anomalib

    4 projects | dev.to | 6 May 2024
  • Introducing Jan

    4 projects | dev.to | 5 May 2024
  • LocalAI: Self-hosted OpenAI alternative reaches 2.14.0

    1 project | news.ycombinator.com | 3 May 2024
  • Ollama v0.1.33

    1 project | news.ycombinator.com | 3 May 2024