Skytrax-Data-Warehouse vs dbd
| | Skytrax-Data-Warehouse | dbd |
|---|---|---|
| Mentions | 1 | 4 |
| Stars | 126 | 55 |
| Growth | - | - |
| Activity | 0.0 | 0.0 |
| Latest commit | almost 4 years ago | about 2 years ago |
| Language | Python | Python |
| License | MIT License | BSD 3-clause "New" or "Revised" License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Skytrax-Data-Warehouse
Open source contributions for a Data Engineer?
I'm always open to contributions to my project (Skytrax Data Warehouse). If you are into data stuff, check out my work on YouTube as well (One Developer Pirate); I mostly make data-oriented videos. These days I'm making a SQL course from a data analysis perspective that is expected to release next week.
dbd
Easy loading Kaggle dataset to a database
I've created two examples of how to use the dbd tool to load Kaggle dataset data files (csv, json, xls, parquet) into your Postgres, MySQL, or SQLite database. Basically, you don't have to create any tables or run any SQL INSERT or COPY statements; everything is automated. You just reference the datasets and files with a URL and execute a 'dbd run' command. The examples are here. Perhaps you'll find them useful. Let me know what you think!
Easy loading dataset files to a database
I've created two examples of how to use the [dbd](https://github.com/zsvoboda/dbd) tool to load Kaggle dataset data files (csv, json, xls, parquet) to your Postgres, MySQL, or SQLite database.
What are some alternatives?
ethereum-etl - Python scripts for ETL (extract, transform and load) jobs for Ethereum blocks, transactions, ERC20 / ERC721 tokens, transfers, receipts, logs, contracts, internal transactions. Data is available in Google BigQuery https://goo.gl/oY5BCQ
jaydebeapi - The JayDeBeApi module allows you to connect from Python code to databases using Java JDBC. It provides a Python DB-API v2.0 interface to that database.
sqlfluff - A modular SQL linter and auto-formatter with support for multiple dialects and templated code.
dbt-spotify-analytics - Containerized end-to-end analytics of Spotify data using Python, dbt, Postgres, and Metabase
airflow-api-tests - A collection of pytest tests for the Apache Airflow 2.0 stable REST APIs. I have another repo where you can set up Airflow locally and play around with these. I am used to RestAssured, but am trying out pytest here.
dagster - An orchestration platform for the development, production, and observation of data assets.
DataGristle - Tough and flexible tools for data analysis, transformation, validation and movement.
airbyte - The leading data integration platform for ETL / ELT data pipelines from APIs, databases & files to data warehouses, data lakes & data lakehouses. Both self-hosted and Cloud-hosted.
meltano
pgsync - Postgres to Elasticsearch/OpenSearch sync
paaster - Paaster is a secure and user-friendly pastebin application that prioritizes privacy and simplicity. With end-to-end encryption and paste history, Paaster ensures that your pasted code remains confidential and accessible.
convtools-ita - convtools is a Python library to declaratively define conversions for processing collections, doing complex aggregations and joins.