| | odbc2parquet | sql-spark-connector |
|---|---|---|
| Mentions | 5 | 1 |
| Stars | 206 | 263 |
| Growth | - | -0.4% |
| Activity | 9.3 | 2.7 |
| Last commit | 1 day ago | 7 months ago |
| Language | Rust | Scala |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
odbc2parquet
- Postgres and Parquet in the Data Lake
- MySQL table data to direct parquet output: "I found a GitHub project (odbc2parquet) that can export a table, or the output of a query, to parquet."
- Parquet best practices: "Is this a one-time task? Maybe check out odbc2parquet: https://github.com/pacman82/odbc2parquet"
- Thoughts on Using Airbyte to read/write to S3?: "I tried writing parquet to S3 with Airbyte a few months ago and gave up. It was extremely slow for small tables and would not work at all for larger tables. I wound up using https://github.com/pacman82/odbc2parquet plus the AWS CLI."
- Extract data from ERP systems to Snowflake - Which tools (besides Airbyte)?: "Yes, I have been tinkering with odbc2parquet (https://github.com/pacman82/odbc2parquet) and storing the output in a variant column. For dependency/workflow management, maybe Prefect."
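The odbc2parquet-plus-AWS-CLI workflow mentioned above can be sketched roughly as follows. The DSN, query, file, and bucket names are placeholders, and exact flags may vary between odbc2parquet versions, so check `odbc2parquet query --help` for your install:

```shell
# Export a query result to a local parquet file via an ODBC data source.
# "MyDsn", the query, and the output file name are placeholders.
odbc2parquet query --connection-string "DSN=MyDsn" out.parquet \
  "SELECT * FROM my_table"

# Upload the resulting file to S3 with the AWS CLI (bucket/key are placeholders).
aws s3 cp out.parquet s3://my-bucket/exports/out.parquet
```

Because the tool streams over ODBC in batches, this tends to handle large tables better than row-at-a-time connectors, which matches the Airbyte experience quoted above.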
sql-spark-connector
- Parquet best practices: "I don't know whether you have a Spark cluster available, but you can try it this way: install the SQL Server Spark connector (https://github.com/microsoft/sql-spark-connector) on the cluster, query SQL Server, and save the results as parquet files. I used it before; it was pretty fast."
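That read-then-save workflow looks roughly like this in Scala. The server, database, table, credentials, and output path are all placeholders, and the connector JAR must already be on the cluster's classpath; treat this as a sketch rather than a tested job:

```scala
import org.apache.spark.sql.SparkSession

object SqlServerToParquet {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("sqlserver-to-parquet")
      .getOrCreate()

    // Read from SQL Server through the sql-spark-connector data source.
    // Connection details below are placeholders for your own environment.
    val df = spark.read
      .format("com.microsoft.sqlserver.jdbc.spark")
      .option("url", "jdbc:sqlserver://myserver:1433;databaseName=mydb")
      .option("dbtable", "dbo.my_table")
      .option("user", "my_user")
      .option("password", "my_password")
      .load()

    // Save the result as parquet files (output path is a placeholder).
    df.write.mode("overwrite").parquet("/data/my_table_parquet")

    spark.stop()
  }
}
```

The write side scales with the number of Spark partitions, which is likely why the poster found it fast compared with single-threaded export tools.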
What are some alternatives?
- roapi: Create full-fledged APIs for slowly moving datasets without writing a single line of code.
- sqlpad: Web-based SQL editor. Legacy project in maintenance mode.
- geoparquet: Specification for storing geospatial vector data (point, line, polygon) in Parquet.
- duckdb_fdw: DuckDB Foreign Data Wrapper for PostgreSQL.
- FreeSql: 🦄 .NET AOT ORM for C# and VB.NET, supporting MySQL, PostgreSQL, SQL Server, Oracle, SQLite, Firebird, ClickHouse, QuestDB, MS Access, and Chinese databases (达梦, 人大金仓, 神通, 翰高, 南大通用, 虚谷), among others.
- postgres_vectorization_test: Vectorized executor to speed up PostgreSQL.
- cstore_fdw: Columnar storage extension for Postgres built as a foreign data wrapper. Check out https://github.com/citusdata/citus for a modernized columnar storage implementation built as a table access method.
- parquet2: Fastest and safest Rust implementation of parquet. `unsafe` free. Integration-tested against pyarrow.
- parquet_fdw: Parquet foreign data wrapper for PostgreSQL.
- parquet-wasm: Rust-based WebAssembly bindings to read and write Apache Parquet data.
- delta: An open-source storage framework that enables building a Lakehouse architecture with compute engines including Spark, PrestoDB, Flink, Trino, and Hive, and APIs.