Running Vicuna 7B on my Personal Website w/WebGPU

This page summarizes the projects mentioned and recommended in the original post on /r/LocalLLaMA.

  • web-llm

    Bringing large-language models and chat to web browsers. Everything runs inside the browser with no server support.

    I've integrated WebLLM (https://mlc.ai/web-llm/) into my AI Chat app on my personal website (https://dustinbrett.com/). It runs all inference locally in the browser.

    A minimal usage sketch for WebLLM follows this list.

  • daedalOS

    Desktop environment in the browser

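For readers curious what wiring WebLLM into a page involves, here is a minimal sketch. It assumes the OpenAI-style chat API exposed by recent @mlc-ai/web-llm releases; the model ID and progress handling are illustrative examples, not details taken from the post.

```typescript
// Minimal WebLLM sketch (assumption: a recent @mlc-ai/web-llm release
// with the OpenAI-compatible chat API). Inference runs entirely in the
// browser via WebGPU; no server is involved.
import { CreateMLCEngine } from "@mlc-ai/web-llm";

async function main() {
  // Downloads and compiles the model in the browser on first use.
  // The model ID below is illustrative; pick one from WebLLM's
  // prebuilt model list.
  const engine = await CreateMLCEngine("Llama-3.1-8B-Instruct-q4f32_1-MLC", {
    initProgressCallback: (report) => console.log(report.text),
  });

  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Say hello from the browser." }],
  });
  console.log(reply.choices[0]?.message.content);
}

main();
```

Model weights are cached by the browser after the first load, which is why a setup like the one described in the post can serve chat from a static personal site with no inference backend.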

NOTE: The number of mentions on this list reflects mentions in common posts plus user-suggested alternatives. A higher number therefore indicates a more popular project.
