dolly-v2-12b

This page summarizes the projects mentioned and recommended in the original post on /r/LocalLLM

  1. dolly

    Databricks’ Dolly, a large language model trained on the Databricks Machine Learning Platform.

    Databricks’ dolly-v2-12b is an instruction-following large language model trained on the Databricks machine learning platform and licensed for commercial use (CC-BY-SA). Based on EleutherAI’s pythia-12b, Dolly is fine-tuned on databricks-dolly-15k, a corpus of ~15k instruction/response records generated by Databricks employees across the capability domains from the InstructGPT paper: brainstorming, classification, closed QA, generation, information extraction, open QA, and summarization. dolly-v2-12b is not a state-of-the-art model, but it exhibits surprisingly high-quality instruction-following behavior not characteristic of the foundation model on which it is based.

  2. DALLE-mtf

    OpenAI's DALL-E for large-scale training in mesh-tensorflow.

  3. dbt-databricks

    A dbt adapter for Databricks.
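Since dolly-v2-12b is an instruction-following model, prompts are typically wrapped in the `### Instruction:` / `### Response:` template the model was fine-tuned with. Below is a minimal sketch of that template; the commented-out generation call assumes the Hugging Face `transformers` `pipeline` API and is illustrative only (running it requires downloading the 12B-parameter checkpoint):

```python
def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in Dolly's instruction-following template."""
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n### Response:\n"
    )

# In practice the prompt is passed to a text-generation pipeline, e.g.:
#   from transformers import pipeline
#   generate = pipeline("text-generation", model="databricks/dolly-v2-12b",
#                       trust_remote_code=True, device_map="auto")
#   print(generate(build_prompt("Explain fine-tuning."))[0]["generated_text"])

prompt = build_prompt("Classify this review as positive or negative.")
print(prompt.splitlines()[2])  # → "### Instruction:"
```

The template matters: prompting the model free-form, without the response marker, tends to fall back to the base pythia-12b completion behavior rather than instruction following.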

NOTE: The number of mentions on this list reflects mentions in common posts plus user-suggested alternatives, so a higher number indicates a more popular project.

Suggest a related project

Related posts

  • The open source learning curve for AI researchers

    1 project | news.ycombinator.com | 20 Jul 2023
  • EleutherAI: Empowering Open-Source Artificial Intelligence Research

    1 project | news.ycombinator.com | 11 Jul 2023
  • Seeking advice on fine-tuning Pythia for semantic search in a non-English language

    1 project | /r/learnmachinelearning | 23 May 2023
  • Does anyone want to collaborate to make anti-capitalist AI?

    1 project | /r/antiwork | 17 May 2023
  • ChatGPT is bonkers.

    1 project | /r/Praise_AI_Overlords | 21 Apr 2023
