starter-workflows vs PostgreSQL

| | starter-workflows | PostgreSQL |
|---|---|---|
| Mentions | 311 | 517 |
| Stars | 10,608 | 18,412 |
| Growth | 1.3% | 1.2% |
| Activity | 7.4 | 10.0 |
| Last commit | 21 days ago | about 9 hours ago |
| Language | TypeScript | C |
| License | GNU General Public License v3.0 or later | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
starter-workflows
Is the LLMs.txt a waste of TIME?
-
10 AI Tools That Took My SaaS Website from Zero to Launch!
8. Code Management & CI/CD (GitHub Actions)
-
How to Set Up a CI/CD Pipeline Using GitHub Actions
GitHub Actions is GitHub’s own automation platform. Instead of installing a separate CI/CD system, you define workflows inside your repository. These workflows are written in YAML and specify jobs that run on GitHub-hosted virtual machines (like Ubuntu, Windows, or macOS).
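As a minimal sketch of what such a workflow can look like (the `ci.yml` file name, the Node.js toolchain, and the `npm` commands below are assumptions, not taken from the article; adapt them to your stack):

```yaml
# .github/workflows/ci.yml -- hypothetical example
name: CI

on:
  push:
    branches: [main]
  pull_request:

jobs:
  test:
    runs-on: ubuntu-latest          # GitHub-hosted VM; windows-latest / macos-latest also available
    steps:
      - uses: actions/checkout@v4   # fetch the repository
      - uses: actions/setup-node@v4 # install a Node.js toolchain (assumed stack)
        with:
          node-version: 20
      - run: npm ci                 # install dependencies
      - run: npm test               # run the test suite
```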
-
Modern CI Is Too Complex and Misdirected
The most concerning part about modern CI to me is how most of it is running on GitHub Actions, and how GitHub itself has been deprioritizing GitHub Actions maintenance and improvements over AI features.
Seriously, take a look at their pinned repo: https://github.com/actions/starter-workflows
> Thank you for your interest in this GitHub repo, however, right now we are not taking contributions.
>
> We continue to focus our resources on strategic areas that help our customers be successful while making developers' lives easier. While GitHub Actions remains a key part of this vision, we are allocating resources towards other areas of Actions and are not taking contributions to this repository at this time. The GitHub public roadmap is the best place to follow along for any updates on features we’re working on and what stage they’re in.
-
10 DevOps Tasks I’ve Stopped Doing Manually (Kudos to 'This' CLI Agent)
Rather than manually writing complex CI/CD YAML or pipeline scripts, I simply describe what I need and let Forge draft it. For example, I once fed Forge a legacy GitHub Actions workflow and asked it to explain each step. In seconds it “parsed the config and output a human-readable summary of each job”. That meant I quickly understood a tricky build pipeline without poring through docs. Similarly, you can prompt Forge to generate or modify your pipeline config: e.g. “create a Jenkinsfile that runs tests and deploys to staging.” It will scaffold the boilerplate so you can tweak the details. This keeps our delivery pipeline airtight and saves hours of YAML debugging.
-
5 Tools That Helped Me Catch 70% More Bugs in the Codebase [Important!]
3. CI/CD Pipelines & Automated Tests
-
Adding CI/CD Integration to My Cloud Resume Challenge
For this project, I used GitHub Actions as my CI/CD tool due to its seamless integration with GitHub repositories and support for AWS.
-
DevOps in 2025: the future is automated, git-ified, and kinda scary but fun.
GitHub Actions for CI/CD pipelines
-
What tools can help streamline cloud deployment processes?
If your code lives on GitHub (which it probably does), GitHub Actions should be your go-to for CI/CD.
-
Getting started with FrankenPHP, Laravel and Docker
My base target is used for development, while my production target is used for production builds. I'm using a GitHub Actions workflow to check out my code, install dependencies without development dependencies, and build my application. When that's done, I build the Docker image and push it to my container registry.
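The article doesn't reproduce the workflow itself; a rough sketch of that checkout / install / build / push sequence could look like the following (the registry, image name, Composer commands, and `production` build target are assumptions):

```yaml
# .github/workflows/deploy.yml -- hypothetical sketch
name: Build and push image

on:
  push:
    branches: [main]

permissions:
  contents: read
  packages: write   # needed to push to GitHub Container Registry

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4                                 # check out the code
      - run: composer install --no-dev --optimize-autoloader      # install deps without dev dependencies
      - run: docker build --target production -t ghcr.io/example/app:latest .   # build the production stage
      - run: |
          echo "${{ secrets.GITHUB_TOKEN }}" | docker login ghcr.io -u "${{ github.actor }}" --password-stdin
          docker push ghcr.io/example/app:latest                  # send the image to the registry
```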
PostgreSQL
-
How to Securely Connect to Medusa.js Production Database on AWS?
You're minding your own business, managing AWS infrastructure for a client with a pretty standard e-commerce setup: a Medusa.js backend, a Next.js storefront, and most importantly for this story, a PostgreSQL RDS instance safely stashed away in a private subnet where nothing from the outside world can touch it. Exactly how the AWS gods intended.
-
Is Your Fraud Screening Process Ignoring Local Patterns?
Your Database: This is your system's memory. It can be a fast in-memory store like Redis for temporary data (perfect for velocity checks) or a persistent relational database like PostgreSQL for long-term data (like blacklists).
-
High Availability Postgres
```bash
#!/bin/bash
set -ex

IMG=postgres/test-0.0.1
IMG_ID=`docker images ${IMG} -q`
PG_TAG=REL_18_BETA1

if [ "${IMG_ID}" = "" ]; then
    if [ ! -d "postgres-${PG_TAG}" ]; then
        wget https://github.com/postgres/postgres/archive/refs/tags/${PG_TAG}.tar.gz && tar -xzf ${PG_TAG}.tar.gz
    fi

    ID=991
    USR=postgres
    USR_HOME=/home/postgres

    cat > Dockerfile << EOF
FROM ubuntu:latest
RUN groupadd -g ${ID} ${USR} && useradd -r -u ${ID} -g ${USR} ${USR}
ADD postgres-${PG_TAG} ${USR_HOME}
WORKDIR ${USR_HOME}
RUN chown -R ${USR}:${USR} ${USR_HOME}
RUN apt-get update && apt-get install -y g++ zlib1g-dev make curl tar gzip perl liblz4-dev libreadline-dev flex bison libicu-dev liburing-dev
RUN apt-get install --reinstall -y pkg-config && ./configure --with-liburing --enable-debug --with-lz4 && make -j4 && make all && make install
RUN echo "export PATH=/usr/local/pgsql/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin" >> /etc/bash.bashrc && \
    chown -R ${USR}:${USR} /usr/local/pgsql
USER ${USR}
EOF

    docker build -t ${IMG}:latest .
    rm Dockerfile
    rm -rf postgres-${PG_TAG} ${PG_TAG}.tar.gz
else
    echo "Image ${IMG} already exists with ID ${IMG_ID}"
fi
```
-
Why did I build a transparent, account-free, open-source URL shortener?
Choosing the database was straightforward: PostgreSQL. I have the most experience with it, it's very easy to spin up, it's widely trusted, and it's open-source (I guess the last two go hand in hand).
-
Left to Right Programming: Programs Should Be Valid as They Are Typed
I was explaining why it is the way that it is. If you'd like your own version of a parser, here's Postgres' [0]. Personally, I really like SQL's syntax and find that it makes sense when reading it.
[0]: https://github.com/postgres/postgres/tree/master/src/backend...
-
Turning PostgreSQL into GraphQL: Lessons from the Field
If you’ve never tried PostgreSQL before, the official site is a great starting point: PostgreSQL: The world’s most advanced open source database.
-
pg_dphyp: teach PostgreSQL to JOIN tables in a different way
```c
/* https://github.com/postgres/postgres/blob/144ad723a4484927266a316d1c9550d56745ff67/src/backend/optimizer/path/costsize.c#L3375 */
void
final_cost_nestloop(PlannerInfo *root, NestPath *path,
                    JoinCostWorkspace *workspace,
                    JoinPathExtraData *extra)
{
    /* ... */
    if (path->jpath.path.param_info)
        path->jpath.path.rows = path->jpath.path.param_info->ppi_rows;
    else
        path->jpath.path.rows = path->jpath.path.parent->rows;
    /* ... */
}

/* https://github.com/postgres/postgres/blob/144ad723a4484927266a316d1c9550d56745ff67/src/backend/optimizer/path/costsize.c#L3873 */
void
final_cost_mergejoin(PlannerInfo *root, MergePath *path,
                     JoinCostWorkspace *workspace,
                     JoinPathExtraData *extra)
{
    /* ... */
    if (path->jpath.path.param_info)
        path->jpath.path.rows = path->jpath.path.param_info->ppi_rows;
    else
        path->jpath.path.rows = path->jpath.path.parent->rows;
    /* ... */
}

/* https://github.com/postgres/postgres/blob/144ad723a4484927266a316d1c9550d56745ff67/src/backend/optimizer/path/costsize.c#L4305 */
void
final_cost_hashjoin(PlannerInfo *root, HashPath *path,
                    JoinCostWorkspace *workspace,
                    JoinPathExtraData *extra)
{
    /* ... */
    if (path->jpath.path.param_info)
        path->jpath.path.rows = path->jpath.path.param_info->ppi_rows;
    else
        path->jpath.path.rows = path->jpath.path.parent->rows;
    /* ... */
}
```
-
NestJS Multi-tenancy API Key Authorization
PostgreSQL as database
-
Strategies for Fast Lexers
> As introduced in the previous chapters, all identifiers are hashed, thus we can also hash the known keywords at startup and make comparing them very fast.
One trick that postgres uses [1][2] is perfect hashing [3]. Since you know in advance what your keywords are, you can design a hash function such that for each keyword w_i in the list of n keywords W, h(w_i) = i. That essentially means there are no collisions, and the memory requirement is O(n). (A toy sketch follows after the links below.)
[1] https://github.com/postgres/postgres/blob/master/src/tools/P...
[2] https://github.com/postgres/postgres/blob/master/src/tools/g...
[3] https://en.wikipedia.org/wiki/Perfect_hash_function
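To make the idea concrete, here is a toy sketch in C; it's an assumed illustration, not the hash that postgres actually generates with the PerfectHash.pm tooling linked above. Because the keyword set is fixed at build time, you can pick a mapping with no collisions, so a lookup is one hash plus a single string compare.

```c
/* Toy perfect-hash keyword lookup (hypothetical example).
 * These four keywords happen to be distinguished by their first letter alone,
 * so "hashing" degenerates to one table lookup. Real generators (like
 * postgres's PerfectHash.pm) search for multipliers/seeds that achieve the
 * same collision-free property for hundreds of keywords. */
#include <stdio.h>
#include <string.h>

static const char *const keywords[] = { "delete", "insert", "select", "update" };
#define NUM_KEYWORDS 4

/* first_char_to_index['s'] == 2, etc.; -1 means "cannot be a keyword" */
static int first_char_to_index[256];

static void build_table(void)
{
    for (int c = 0; c < 256; c++)
        first_char_to_index[c] = -1;
    for (int i = 0; i < NUM_KEYWORDS; i++)
        first_char_to_index[(unsigned char) keywords[i][0]] = i;
}

/* Returns the keyword's index, or -1 if ident is not a keyword. */
static int keyword_lookup(const char *ident)
{
    int i = first_char_to_index[(unsigned char) ident[0]];
    /* One hash plus one string compare: no probing, no collision chains. */
    if (i >= 0 && strcmp(keywords[i], ident) == 0)
        return i;
    return -1;
}

int main(void)
{
    build_table();
    printf("%d\n", keyword_lookup("select"));   /* prints 2 */
    printf("%d\n", keyword_lookup("selects"));  /* prints -1 */
    return 0;
}
```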
-
Create ER Diagrams for PostgreSQL with a Free Design Tool
Understanding a database starts with understanding its structure. For PostgreSQL users, one of the most effective ways to visualize and manage your schema is an Entity-Relationship Diagram (ERD). Whether you're working with a large legacy database or starting something new, an ER diagram shows how your tables are connected and how your data is organized.
What are some alternatives?
pages-gem - A simple Ruby Gem to bootstrap dependencies for setting up and maintaining a local Jekyll environment in sync with GitHub Pages
ClickHouse - ClickHouse® is a real-time analytics database management system
CppCon2020 - Slides and other materials from CppCon 2020
MySQL - MySQL Server, the world's most popular open source database, and MySQL Cluster, a real-time, open source transactional database.
vision_blender - A Blender addon for generating synthetic ground truth data for Computer Vision applications
Firebird - FB/Java plugin for Firebird