cannot for the life of me compile libllama.dll

This page summarizes the projects mentioned and recommended in the original post on /r/LocalLLaMA

  • LLamaSharp

    A C#/.NET library to run LLM models (🦙LLaMA/LLaVA) on your local device efficiently.

  • I searched through GitHub and nothing recent comes up. I wanted to run the model through the C# wrapper linked above (LLamaSharp), which requires compiling llama.cpp and copying the libllama DLL into the C# project files. When I build llama.cpp with OpenBLAS, everything looks fine in the command line, and as the instructions suggest I make sure to set -DBUILD_SHARED_LIBS=ON in CMake. However, the output in the Visual Studio Developer Command Prompt ignores the setup for libllama.dll in CMakeLists.txt entirely; the only DLL that gets built is llama.dll. I know this is a fairly technical question, but does anyone know how to fix this? (A build sketch follows below.)

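For anyone hitting the same wall, here is a minimal sketch of the kind of build described above, assuming a 2023-era llama.cpp checkout (the BLAS flag names are version-dependent and may differ in newer releases):

    rem run from the llama.cpp source directory in a Developer Command Prompt
    mkdir build
    cd build
    rem configure with shared libraries enabled (and, optionally, OpenBLAS)
    cmake .. -DBUILD_SHARED_LIBS=ON -DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS
    rem build the Release configuration; the DLL typically lands under bin\Release
    cmake --build . --config Release

One thing worth noting: CMake does not add a lib prefix to shared libraries on Windows, so the llama target produces llama.dll rather than libllama.dll. If the wrapper expects the lib-prefixed name, renaming the copied file (or pointing the wrapper at llama.dll directly, where it allows that) is a common workaround.
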
NOTE: The number of mentions on this list indicates mentions on common posts plus user-suggested alternatives. Hence, a higher number means a more popular project.

Related posts

  • Unleash the Power of Video-LLaMA: Revolutionizing Language Models with Video and Audio Understanding!

    4 projects | dev.to | 12 Jun 2023
  • Show HN: I Remade the Fake Google Gemini Demo, Except Using GPT-4 and It's Real

    4 projects | news.ycombinator.com | 10 Dec 2023
  • Llamafile lets you distribute and run LLMs with a single file

    12 projects | news.ycombinator.com | 29 Nov 2023
  • GitHub - jackmpcollins/magentic: Seamlessly integrate LLMs as Python functions

    1 project | /r/Python | 7 Nov 2023
  • AI — weekly megathread!

    2 projects | /r/artificial | 15 Oct 2023