CSS llama Projects
Project mention: Ask HN: What are the capabilities of consumer grade hardware to work with LLMs? | news.ycombinator.com | 2023-08-03

I agree; I've definitely seen far more information about running image-synthesis models like Stable Diffusion locally than about running LLMs. It's counterintuitive to me that Stable Diffusion takes less RAM than an LLM, especially considering it still needs the word vectors. Goes to show I know nothing.

I guess it comes down to the requirement of a very high-end GPU (or several), which makes running an LLM locally impractical for most people versus just running it in Colab or something.

Though there are some efforts:
https://github.com/cocktailpeanut/dalai
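As a rough sketch of what using dalai looks like in practice: its README (at the URL above) describes installing a LLaMA model and launching a local web UI via npx. The available model sizes, commands, and default port may have changed since, so treat this as illustrative rather than authoritative:

```shell
# Download and set up the 7B LLaMA model through dalai
# (this fetches several GB of model weights)
npx dalai llama install 7B

# Start the local web UI for prompting the model
# (by default it serves on http://localhost:3000)
npx dalai serve
```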
NOTE:
The open source projects on this list are ordered by number of GitHub stars.
The number of mentions indicates how often a repo was mentioned in the last 12 months,
or since we started tracking (Dec 2020).
CSS llama related posts
- Ask HN: What are the capabilities of consumer grade hardware to work with LLMs?
- How can I run a large language model locally?
- meirl
- FreedomGPT: AI with no censorship
- Newbie, installed dalai with llama locally, trying to make sense of responses
- Deep Learning Newbie Question
- Printing Dalai full output in Node.Js?
A note from our sponsor - SaaSHub
www.saashub.com | 28 Apr 2024
Index
| # | Project | Stars |
|---|---------|-------|
| 1 | dalai | 13,044 |