New GPT-3 model: text-davinci-003

This page summarizes the projects mentioned and recommended in the original post on news.ycombinator.com

  • arkose

    GPT-enhanced document editor

  • In case it interests anyone, I built a document editor + GPT implementation and updated it to use text-davinci-003: https://github.com/typpo/arkose/

    If you don't have an OpenAI API key, I've set up a quick demo here for people to give it a try, at least until I hit my billing cap (normally users would supply their own API key): https://arkose.pages.dev/
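
    For anyone curious what a text-davinci-003 call looks like, here is a minimal sketch using the completions endpoint of the openai Python library (pre-1.0 interface); the prompt and sampling parameters below are placeholders, not arkose's actual code, and you supply your own API key.

        # Minimal sketch: one completion request against text-davinci-003.
        # Assumes `pip install openai` (pre-1.0) and an API key in OPENAI_API_KEY;
        # the prompt and parameters are placeholders, not arkose's implementation.
        import os
        import openai

        openai.api_key = os.environ["OPENAI_API_KEY"]

        response = openai.Completion.create(
            model="text-davinci-003",
            prompt="Continue the following document:\n\n<your text here>",
            max_tokens=128,
            temperature=0.7,
        )

        print(response["choices"][0]["text"].strip())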

  • InvokeAI

    InvokeAI is a leading creative engine for Stable Diffusion models, empowering professionals, artists, and enthusiasts to generate and create visual media using the latest AI-driven technologies. The solution offers an industry-leading WebUI, supports terminal use through a CLI, and serves as the foundation for multiple commercial products.

  • No, I hadn't. Thanks, I'll check it out! Of course, the open-source nature of SD is really nice, as the community[0] keeps pumping out usability improvements and features, like the Pokemon-trained model[1], which is an absolute blast for my kid!

    [0]: https://github.com/invoke-ai/InvokeAI

    [1]: https://huggingface.co/justinpinkney/pokemon-stable-diffusio...

  • hn_summary

    Summarizes top stories from Hacker News using a large language model and posts them to a Telegram channel.
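
    As a rough idea of how such a pipeline fits together, here is a hedged sketch (not hn_summary's actual code): it pulls top story IDs from the public Hacker News Firebase API, asks a completion model for a summary, and posts the result through the Telegram Bot API. The model name, prompt, and environment variable names are assumptions.

        # Sketch of an HN-to-Telegram summary pipeline (not the project's real code).
        # Assumes `requests` and `openai` (pre-1.0) are installed, and that
        # OPENAI_API_KEY, TELEGRAM_BOT_TOKEN, and TELEGRAM_CHAT_ID are set.
        import os
        import requests
        import openai

        openai.api_key = os.environ["OPENAI_API_KEY"]
        HN_API = "https://hacker-news.firebaseio.com/v0"

        def top_stories(n=3):
            # Fetch the IDs of the current top stories, then their metadata.
            ids = requests.get(f"{HN_API}/topstories.json").json()[:n]
            return [requests.get(f"{HN_API}/item/{i}.json").json() for i in ids]

        def summarize(story):
            # Ask the model for a short summary based on title and URL only.
            prompt = (f"Summarize this Hacker News story in one short paragraph:\n"
                      f"Title: {story['title']}\nURL: {story.get('url', '')}\n")
            resp = openai.Completion.create(model="text-davinci-003",
                                            prompt=prompt, max_tokens=120)
            return resp["choices"][0]["text"].strip()

        def post_to_telegram(text):
            token = os.environ["TELEGRAM_BOT_TOKEN"]
            chat_id = os.environ["TELEGRAM_CHAT_ID"]
            requests.post(f"https://api.telegram.org/bot{token}/sendMessage",
                          data={"chat_id": chat_id, "text": text})

        for story in top_stories():
            post_to_telegram(f"{story['title']}\n\n{summarize(story)}")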

  • KoboldAI-Client

  • The biggest barrier to this is the hardware requirements. I saw an estimate on r/machinelearning that, based on the parameter count, GPT-3 needs around 350GB of VRAM. Maybe you could cut that in half, or even to one-eighth if someone figures out some crazy quantization scheme, but it's still firmly outside the realm of consumer hardware right now.

    Stuff like KoboldAI can let you run smaller models on your own hardware, though (https://github.com/KoboldAI/KoboldAI-Client).
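
    The 350GB figure matches simple back-of-the-envelope math: roughly 175 billion parameters at 2 bytes each in fp16 is about 350GB for the weights alone, and 8-bit or 4-bit quantization divides that accordingly. A toy calculation (ignoring activations, KV cache, and framework overhead):

        # Back-of-the-envelope VRAM estimate for the weights of a 175B-parameter
        # model at different precisions (ignores activations and overhead).
        PARAMS = 175e9  # GPT-3's reported parameter count

        for name, bytes_per_param in [("fp16", 2), ("int8", 1), ("int4", 0.5)]:
            print(f"{name}: ~{PARAMS * bytes_per_param / 1e9:.0f} GB")

        # fp16: ~350 GB, int8: ~175 GB, int4: ~88 GB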

NOTE: The number of mentions on this list reflects mentions in common posts plus user-suggested alternatives; a higher number therefore means a more popular project.
