| | lamby | mrsk |
|---|---|---|
| Mentions | 11 | 26 |
| Stars | 580 | 6,294 |
| Growth | 0.3% | - |
| Activity | 5.9 | 9.4 |
| Latest commit | 3 months ago | 8 months ago |
| Language | Ruby | TeX |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
lamby
-
Understanding AWS Lambda Proactive Initialization
AWS Serverless Hero Ken Collins maintains a very popular Rails-on-Lambda package. After some discussion, he added the capability to track proactive initializations and came to a similar conclusion; in his case, after a 3-day test using Ruby with a custom runtime, 80% of initializations were proactive:
-
💔 Goodbye Cold Starts ❤️ Hello Proactive Initialization
This means that monitoring with CloudWatch is only half the picture. But how much is your application actually benefiting from proactive inits? Since Lamby v5.1.0, you can find out easily using CloudWatch Metrics. To turn metrics on, enable the config like so:
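The excerpt cuts off before the snippet; as a sketch of what enabling it might look like (the `cold_start_metrics` setting name is our best reading of the Lamby v5.1.0 announcement — check the Lamby docs for the authoritative option):

```ruby
# app.rb (or an initializer) - hypothetical sketch; verify the exact
# configuration key against the Lamby v5.1.0 documentation.
Lamby.config.cold_start_metrics = true
```

Once enabled, Lamby publishes init data to CloudWatch Metrics, where cold vs. proactive initializations can be graphed side by side.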
-
The Elusive Lambda Console; A Specification Proposal.
After years of smashing Cloud & Rails together, I've come up with an idea. Better than an idea: a working specification! One where we Rails & Lambda enthusiasts can once again "console into" our "servers" and execute CLI tasks like migrations, or interact via our beloved IRB friend, the Rails console. Today I would like to present the Lambda Console project, an open specification proposal for any AWS Lambda runtime to adopt.
-
The Next Generation of Serverless Is Happening
> Does this mean you have a cron job just pinging the serverless function every 3 minutes? I'm curious how much this adds on to your costs. It means that the whole "don't pay for non-usage" thing is not quite true, but maybe it's still significantly cheaper than running an EC2 instance or whatnot. I'm curious about the cost calculation here.
Yes. Specifically, it kicks off a Lambda function that does a parallel GET to our website at a special endpoint with a 100ms "wait" and a basic DB call. This keeps the Lambda process alive/in-memory.
Keeping a function alive costs ~125ms (100ms wait + 25ms full function roundtrip) every 3 minutes, roughly ~0.041% of 1x CPU time. Our website server costs are tiny, and lower still for Staging and UAT. The benefit: we can scale to 1000x (the AWS limit) servers at the speed of our cold start time.
> Another thing I'm curious about, since you have a container-based deployment, did you compare with Fargate?
Yes, we use Fargate for our core product, which was built in Rails before containers could be deployed on Lambda. Rails works fine on Lambda[0], but the transition cost wasn't worth it for us. Fargate is great, but as you point out, it is expensive if your application isn't a user-heavy one like ours. To be highly available, we always have a minimum of 2 instances online, but we're a B2B application, so our overnight usage (10pm-6am) is zero. That leaves 2 machines just sitting there, which is why I love Lambda over Fargate.
Also, scaling Fargate machines is slow if you get a traffic spike.
[0] https://github.com/rails-lambda/lamby
-
Using Tailscale on Lambda for a Live Development Proxy
If you are curious to learn more about how Rails & Lambda work together, check out our Lamby project. The Lambda Containers architecture works so well with Rails because our framework distills everything from HTTP, jobs, events, & WebSocket connections down to an amazing Docker CMD contract. The proxy layer in the architecture above was easy to build and connect up to our single delegate function, Lamby.cmd, shown below.
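The delegate function itself is not reproduced in this excerpt; a minimal sketch of what a typical Lamby entry point looks like, based on the project's Quick Start (file and variable names here are illustrative, not the author's exact code):

```ruby
# app.rb - the single Lambda entry point the image's CMD points at,
# e.g. CMD ["app.handler"] in the Dockerfile.
require_relative 'config/boot'
require 'lamby'
require_relative 'config/application'
require_relative 'config/environment'

$app = Rack::Builder.new { run Rails.application }.to_app

def handler(event:, context:)
  # Lamby converts the incoming Lambda event into a Rack request
  # and hands it to the booted Rails application.
  Lamby.handler $app, event, context
end
```

Everything else (jobs, events, WebSockets) funnels through the same single function, which is what makes the Docker CMD contract so clean.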
-
Ruby on Rails on Lambda on Arm64/Graviton2!
Today I am happy to announce that Lamby (Simple Rails & AWS Lambda Integration using Rack) now demonstrates just how easy it is to use multi-platform arm64 images on AWS Lambda. If this sounds interesting to you, jump right into our Quick Start guide and deploy a new Rails 7 on Ruby 3.2 Ubuntu image to see it for yourself.
-
Rails on Docker · Fly
(I have not actually used this myself.) The folks over at CustomInk maintain Lamby, a project for running Rails in a quickly-bootable Lambda environment. Might be worth checking out if you otherwise enjoy working with Rails: https://lamby.custominktech.com
- Ruby on Jets: Like Rails but Serverless
- Rails on Lambda with Lamby v4
-
How to make your RoR app infinitely scale?
In any case, you can try out https://github.com/customink/lamby, a gem that lets you run a RoR app natively on AWS Lambda.
mrsk
-
Deploy Anycable with MRSK
Here we'll deploy Anycable with MRSK.
-
Fly.io Postgres cluster went down for 3 days, no word from them about it
Honestly these days I am leaning towards this approach: https://github.com/mrsked/mrsk/
It's all just docker.
-
The Curse of Scalable Technology
Did you consider MRSK[1], k3s[2], or dokku[3]? They are all significantly simpler to operate than Kubernetes, curious to hear your take.
[1] https://github.com/mrsked/mrsk
-
How to cache MRSK deployments in CI
https://github.com/mrsked/mrsk/pull/159 Closed PR about --cache-to option in MRSK
-
Thoughts on MSRK?
Yes, that thing with the setup is misleading in the docs. I'll make a PR now. There's this issue about it: https://github.com/mrsked/mrsk/issues/301
-
Rails Foundation announces first-ever conference!
Good or bad, DHH is making noise and people know about Rails. Just look here: https://github.com/mrsked/mrsk
-
MRSK vs. Fly.io
I don't think there's a writeup out there, but mrsk just uses docker under the hood. So, if you have a CMD in your Dockerfile, it will use that.
If you have an image that can run multiple things — say, a Rails app that serves web traffic by default but can also run job workers with the right command — you can provide that cmd in the mrsk config. You can see this in the jobs role in the example: https://github.com/mrsked/mrsk#using-different-roles-for-ser....
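As a rough sketch of what that roles section looks like in MRSK's `config/deploy.yml` (hostnames, image name, and the `bin/jobs` command below are placeholders, not values from the thread):

```yaml
service: my-app
image: user/my-app

servers:
  web:
    - 192.168.0.1
  job:
    hosts:
      - 192.168.0.2
    cmd: bin/jobs   # overrides the image's default CMD for this role
```

The `web` role uses the image's CMD as-is, while the `job` role runs the same image with a different command — one image, multiple roles.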
-
Looking to use Docker & Docker Compose in production and need advice.
You may want to check out MRSK if you are going to be using Docker Compose in production on a single VPS: https://github.com/mrsked/mrsk
-
Deploying with MRSK
"MRSK basically is Capistrano for Containers, without the need to carefully prepare servers in advance" https://github.com/mrsked/mrsk
-
Need some advice on how to deploy images to our vending machines
https://github.com/mrsked/mrsk might be interesting to you.
What are some alternatives?
jets - Ruby on Jets [Moved to: https://github.com/rubyonjets/jets]
awesome-compose - Awesome Docker Compose samples
socksify-ruby - Redirect any TCP connection initiated by a Ruby script through a SOCKS5 proxy
Dokku - A docker-powered PaaS that helps you build and manage the lifecycle of applications
bgems - Binary rubygems
kubero - A free and self-hosted Heroku PaaS alternative for Kubernetes that implements GitOps
buildah - A tool that facilitates building OCI images.
docker-phoenix-example - A production ready example Phoenix app that's using Docker and Docker Compose.
dockerfiles - Various Dockerfiles I use on the desktop and on servers.
deploy - Ansible role to deploy scripting applications like PHP, Python, Ruby, etc. in a capistrano style
runner-images - GitHub Actions runner images
docker-flask-example - A production ready example Flask app that's using Docker and Docker Compose.