tuna
llrt
| | tuna | llrt |
|---|---|---|
| Mentions | 4 | 10 |
| Stars | 1,263 | 7,582 |
| Growth | - | 6.7% |
| Activity | 0.0 | 9.6 |
| Last commit | about 2 months ago | 4 days ago |
| Language | Python | JavaScript |
| License | GNU General Public License v3.0 only | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
tuna
- Is AWS Lambda Cold Start Still an Issue?
Every minor detail matters and adds to the total import time as part of the cold start. We need to optimize our code and imports. If you use Python, you can analyze your code with a tool like Tuna and optimize your libraries (perhaps replace slower ones) and your imports.
- Make Python Run Faster
- Scanning Function calls in a script - is there a tool?
- Creating a Python CLI with Go(lang)-comparable startup times
I started by examining the output of `python -X importtime -m gefyra 2> import.log` just to check the imports. There is an awesome tool for analyzing Python imports: tuna (see: https://github.com/nschloe/tuna). tuna analyzes the import times from the log. Run it like so: `tuna import.log`. It opens a browser window and visualizes the import times. With that I was able to manually move all imports into the functions in which they are needed (and bring in some other optimizations). This greatly violates PEP 8 (https://peps.python.org/pep-0008/#imports) but leads to very fast startup times.
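The deferred-import trick described above can be sketched as follows; this is a minimal illustration, with `json` standing in for a genuinely heavy dependency:

```python
# Deferred-import pattern: move an import from module level into the
# function that needs it, so its cost is paid on first call rather
# than at interpreter startup (where `python -X importtime` and tuna
# would report it as part of module import time).

def dump_report(data):
    import json  # deferred: stands in for a heavy dependency
    return json.dumps(data, indent=2)

if __name__ == "__main__":
    print(dump_report({"cold_start_ms": 120}))
```

The trade-off is exactly the PEP 8 violation the author mentions: imports are no longer grouped at the top of the file, and the first call to the function absorbs the import cost instead.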
llrt
- Unlocking Next-Gen Serverless Performance: A Deep Dive into AWS LLRT
```dockerfile
FROM --platform=arm64 busybox
WORKDIR /var/task/
COPY app.mjs ./
ADD https://github.com/awslabs/llrt/releases/latest/download/llrt-container-arm64 /usr/bin/llrt
RUN chmod +x /usr/bin/llrt
ENV LAMBDA_HANDLER "app.handler"
CMD [ "llrt" ]
```
- Is AWS Lambda Cold Start Still an Issue?
Let’s get the simplest use case out of the way: cases where cold starts are so fast that they’re not an issue for you. That’s usually the case for functions that use runtimes such as C++, Go, Rust, and LLRT. However, you must still follow each runtime’s best practices and optimizations to keep the cold-start impact low.
- JavaScript News, Updates, and Tutorials: February 2024 Edition
But compared to other runtimes, LLRT does not perform well on large data processing, Monte Carlo simulations, or other heavily iterative tasks. The AWS team says it is best suited to smaller serverless functions dedicated to tasks such as data transformation, real-time processing, AWS service integrations, authorization, and validation. Visit the project’s GitHub repository to learn more.
- FLaNK Stack 26 February 2024
- People Matter more than Technology when Building Serverless Applications
And lastly, lean into your cloud vendor. Stop trying to build a better mousetrap. Advances in technology are happening all the time. The speed of AWS Lambda has been rapidly improving over the past couple of years with the launch of things like SnapStart and LLRT.
- Hono v4.0.0
- LLRT: A low-latency JavaScript runtime from AWS
It seems they just added a mention of QuickJS, I assume based on your feedback:
https://github.com/awslabs/llrt/commit/054aefc4d8486f738ed3a...
Props to them on the quick fix!
What are some alternatives?
SnakeViz - An in-browser Python profile viewer
winterjs - Winter is coming... ❄️
Altair - Declarative statistical visualization library for Python
h3 - ⚡️ Minimal H(TTP) framework built for high performance and portability
ggplot - ggplot port for python
hono - Web Framework built on Web Standards
seaborn - Statistical data visualization in Python
hermes - A JavaScript engine optimized for running React Native.
vincent
pljs - PLJS - JavaScript Language Plugin for PostgreSQL
Apache Superset - Apache Superset is a Data Visualization and Data Exploration Platform [Moved to: https://github.com/apache/superset]
workerd - The JavaScript / Wasm runtime that powers Cloudflare Workers