GPT-J 6B locally on my computer

This page summarizes the projects mentioned and recommended in the original post on /r/KoboldAI.

  • Basic-UI-for-GPT-J-6B-with-low-vram

    A repository to run GPT-J-6B on low-VRAM machines (4.2 GB of VRAM minimum for a 2000-token context, 3.5 GB for a 1000-token context). Loading the model requires 12 GB of free RAM. A hedged loading sketch follows the list below.

  • I found this yesterday. Is it somehow possible to use it with KoboldAI to run GPT-J on weaker graphics cards?

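The requirements above (about 4 GB of VRAM but 12 GB of free RAM) suggest the repository keeps most of the model's weights in system RAM and streams them to the GPU as needed. As a rough illustration only, not the repository's actual code, the sketch below loads GPT-J-6B in half precision with the Hugging Face transformers library and uses accelerate's device_map to offload whatever doesn't fit in VRAM to CPU RAM; the checkpoint name, generation settings, and offload folder are assumptions.

```python
# Sketch only: load GPT-J-6B in half precision and let accelerate's
# device_map spread the layers across GPU VRAM, CPU RAM, and disk.
# This approximates the low-VRAM idea; it is not the repository's code.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "EleutherAI/gpt-j-6b"  # public Hugging Face checkpoint (assumed)

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16,   # ~12 GB of weights instead of ~24 GB in fp32
    device_map="auto",           # place what fits on the GPU, the rest in CPU RAM
    offload_folder="offload",    # assumed scratch directory if RAM is also tight
)

prompt = "You are standing in an open field west of a white house."
# With device_map="auto" the embedding layer normally lands on the GPU,
# so the input ids go to "cuda"; accelerate moves activations between devices.
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
output = model.generate(**inputs, max_new_tokens=60, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Half precision puts the 6-billion-parameter weights at roughly 12 GB, which lines up with the RAM figure quoted above; the device_map offload here is a generic mechanism and only approximates whatever layer-streaming scheme the repository itself uses.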

Related posts

  • How to run this service with a local GPU?

    1 project | /r/PygmalionAI | 27 Jan 2023
  • Tesla M40 and GPT-J-6B

    1 project | /r/KoboldAI | 8 Aug 2021
  • How is any of this even possible?

    1 project | /r/GPT3 | 21 Jul 2021
  • WebGPT: GPT Model on the Browser with WebGPU

    1 project | news.ycombinator.com | 1 Apr 2024
  • WebGPT: Run GPT model on the browser with WebGPU

    1 project | news.ycombinator.com | 12 Aug 2023