neptune-client vs lora

| | neptune-client | lora |
|---|---|---|
| Mentions | 24 | 83 |
| Stars | 536 | 6,616 |
| Growth | 5.6% | - |
| Activity | 9.7 | 0.0 |
| Last commit | 7 days ago | about 1 month ago |
| Language | Python | Jupyter Notebook |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
neptune-client
-
Show HN: A gallery of dev tool marketing examples
Hi I am Jakub. I run marketing at a dev tool startup https://neptune.ai/ and I share learnings on dev tool marketing on my blog https://www.developermarkepear.com/.
Whenever I'd start a new marketing project, I found myself going over a list of 20+ companies I knew had done something well, to "copy-paste" their approach as a baseline (think Tailscale, DigitalOcean, Vercel, Algolia, CircleCI, Supabase, PostHog, Auth0).
So for the past year and a half, I've been screenshotting examples of how companies that are good at dev marketing do things like pricing, landing page design, ads, videos, and blog conversion ideas. And for each example, I added a note on why I thought it was good.
Now, it is ~140 examples organized by tags so you can browse all or get stuff for a particular topic.
Hope it is helpful to some dev tool founders and marketers in here.
wdyt?
Also, I am always looking for new companies/marketing ideas to add to this, so if you'd like to share good examples I'd really appreciate it.
-
How to structure/manage a machine learning experiment? (medical imaging)
There are a lot of tools out there for experiment tracking (eg neptune.ai), but I'm really not sure whether that sort of thing is over the top for what I need to do.
-
How to grow a developer blog to 3M annual visitors? with Jakub Czakon (Neptune.ai)
Welcome to another episode of The Developer-led Podcast, where we dive into the strategies modern companies use to build and grow their developer tools. In this exciting episode, we're joined by Jakub Czakon, the CMO at Neptune.ai, a startup that assists developers in efficiently managing their machine-learning model data. Jakub is renowned not only for his role at Neptune.ai but also for his developer marketing endeavors, including the influential newsletter Developer Markepear and a thriving developer marketing Slack community.
-
[D] Is there any all in one deep learning platform or software
Tbh I have done a pretty good search on this topic and I couldn't find any. I thought maybe the community could help me find one; if people like you (who work at neptune.ai) have the same opinion, then it is what it is :). Anyway, thank you for the suggestions you gave, probably gonna use those.
-
New Data Scientist, want to get into MLOps, where to start?
To get started with MLOps, you will need to have some foundational skills in Python, SQL, mathematics, and machine learning algorithms and libraries. You will also need to learn about databases, model deployment, continuous integration, continuous delivery, continuous monitoring, and other best practices of MLOps. You can find some useful resources for each of these topics in the following blogs on neptune.ai (disclosure: I work for Neptune):
-
Does a fully sentient (Or at least as sentient as you and me) AI with free will have a soul?
1. arxiv.org
2. apro-software.com
3. en.wikipedia.org
4. neptune.ai
-
[D] The hype around Mojo lang
Other companies followed the same route to promote their paid product, e.g. Plotly -> Dash, PyTorch Lightning -> Lightning AI, run.ai, neptune.ai. It's actually a fair strategy, but some people may fear the conflict of interest, especially when the tools require some time investment and it looks like a serious vendor lock-in. Investing some time to learn a tool is not such a big deal, but once an entire team has adopted a workflow, it can be tough to go back.
-
[P] New Open Source Framework and No-Code GUI for Fine-Tuning LLMs: H2O LLM Studio
Track and compare your model performance visually. In addition, Neptune integration can be used.
-
[D] New features and current problems with ml infrastructure?
I am working on a startup, I was wondering what people think are some gaps in current machine learning infrastructure solutions like WandB, or Neptune.ai.
- All your ML model metadata in a single place
lora
-
You can now train a 70B language model at home
The diffusion U-Net has an "extended" version nowadays that applies to the ResNet blocks as well as the cross-attention: https://github.com/cloneofsimo/lora
-
How it feels right now
Absolutely. But that doesn't matter because you only have to train it at scale, once. There are papers released already that show it's possible to update weights in small sections. You won't have to wait for the next monolithic LLM to drop to get up to date information. It will start to learn in bits and pieces.
-
LoRA tuning in julia
No, it's a deep learning thing
-
What does Lora mean?
Low Rank Adaptation of Large Language Models.
-
[D] An ELI5 explanation for LoRA - Low-Rank Adaptation.
Recently, I have seen the LoRA technique (Low-Rank Adaptation of Large Language Models) as a popular method for fine-tuning LLMs and other models.
-
Combining LoRA, Retro, and Large Language Models for Efficient Knowledge Retrieval and Retention
Enter LoRA, a method proposed for adapting pre-trained models to specific tasks[2]. By freezing pre-trained model weights and injecting trainable rank decomposition matrices into the transformer architecture, LoRA can reduce the number of trainable parameters and the GPU memory requirement, making the adaptation of LLMs for downstream tasks more feasible.
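The parameter savings described above can be sketched with a quick back-of-the-envelope calculation (a hypothetical illustration, not code from the post; the dimension d = k = 12288 is an assumption, roughly matching a GPT-3-sized attention projection):

```python
# LoRA freezes the pre-trained d x k weight matrix W and learns two small
# rank-decomposition matrices instead: B (d x r) and A (r x k), so the
# weight update is delta_W = B @ A with rank r << min(d, k).

def full_finetune_params(d, k):
    # full fine-tuning: every entry of W is trainable
    return d * k

def lora_params(d, k, r):
    # LoRA: only B (d x r) and A (r x k) are trainable; W stays frozen
    return d * r + r * k

d = k = 12288  # assumed projection size, on the order of GPT-3's
full = full_finetune_params(d, k)   # 150,994,944 trainable parameters
lora = lora_params(d, k, r=8)       #     196,608 trainable parameters
print(full, lora, full / lora)      # prints: 150994944 196608 768.0
```

At rank 8, LoRA trains roughly 768x fewer parameters for this single matrix, which is why the GPU memory requirement drops so sharply.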
-
100K Context Windows
Open-source LLM projects have largely solved this using Low-Rank Adaptation of Large Language Models (LoRA): https://arxiv.org/abs/2106.09685
Apparently an RTX 4090 running overnight is sufficient to produce a fine-tuned model that can spit out new Harry Potter stories, or whatever...
-
President Biden meets with AI CEOs at the White House amid ethical criticism
Alpaca was trained for $600 ($100 for the smaller model) and offers outputs competitive with ChatGPT. https://arxiv.org/abs/2106.09685
- LoRA: Low-Rank Adaptation of Large Language Models
What are some alternatives?
MLflow - Open source platform for the machine learning lifecycle
stable-diffusion-webui - Stable Diffusion web UI
Serpent.AI - Game Agent Framework. Helping you create AIs / Bots that learn to play any game you own!
LyCORIS - Lora beYond Conventional methods, Other Rank adaptation Implementations for Stable diffusion.
Caffe - Caffe: a fast open framework for deep learning.
sd_dreambooth_extension
mxnet - Lightweight, Portable, Flexible Distributed/Mobile Deep Learning with Dynamic, Mutation-aware Dataflow Dep Scheduler; for Python, R, Julia, Scala, Go, Javascript and more
kohya-trainer - Adapted from https://note.com/kohya_ss/n/nbf7ce8d80f29 for easier cloning
Pytorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration
ControlNet - Let us control diffusion models!
Porcupine - On-device wake word detection powered by deep learning
sd-webui-additional-networks