Running Vicuna 7B on my Personal Website w/WebGPU

This page summarizes the projects mentioned and recommended in the original post on /r/LocalLLaMA

  • web-llm

    Bringing large-language models and chat to web browsers. Everything runs inside the browser with no server support.

    I've integrated WebLLM (https://mlc.ai/web-llm/) into the AI Chat app on my personal website (https://dustinbrett.com/). It runs all inference locally in the browser.

  • daedalOS

    Desktop environment in the browser



NOTE: The number of mentions on this list reflects mentions in common posts plus user-suggested alternatives; a higher number indicates a more popular project.

