| | geoparquet | odbc2parquet |
|---|---|---|
| Mentions | 3 | 5 |
| Stars | 723 | 206 |
| Growth | 4.1% | - |
| Activity | 5.5 | 9.3 |
| Latest commit | 5 days ago | 6 days ago |
| Language | Python | Rust |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
geoparquet
-
Friends don't let friends export to CSV
That's why I'm working on the GeoParquet spec [0]! It gives you both compression-by-default and super fast reads and writes! So it's usually as small as gzipped CSV, if not smaller, while being faster to read and write than GeoPackage.
Try using `GeoDataFrame.to_parquet` and `geopandas.read_parquet`
[0]: https://github.com/opengeospatial/geoparquet
-
COMTiles (Cloud Optimized Map Tiles) hosted on Amazon S3 and Visualized with MapLibre GL JS
GeoParquet
-
Postgres and Parquet in the Data Lake
> "Generating Parquet"
It is also useful for moving data from Postgres to BigQuery! (batch load)
https://cloud.google.com/bigquery/docs/loading-data-cloud-st...
Thanks for the "ogr2ogr" trick! :-)
I hope the next blog post will be about GeoParquet and storing complex geometries in parquet format :-)
https://github.com/opengeospatial/geoparquet
odbc2parquet
- Postgres and Parquet in the Data Lake
-
MySQL table data to direct parquet output
However, I found a GitHub project (odbc2parquet) which can export a table (or a query's output) to parquet.
-
Parquet best practices
Is this a one-time task? Maybe check out odbc2parquet https://github.com/pacman82/odbc2parquet
-
Thoughts on Using Airbyte to read/write to S3?
I tried writing parquet to s3 with Airbyte a few months ago and gave up. It was extremely slow for small tables and would not work at all for larger tables. I wound up using this https://github.com/pacman82/odbc2parquet + aws cli
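For reference, a sketch of the odbc2parquet + aws cli workflow that comment describes; the driver name, credentials, table, and bucket below are all placeholders:

```shell
# Export a query result from any ODBC source to a local Parquet file,
# then copy it to S3. Connection string values are placeholders.
odbc2parquet query \
  --connection-string "Driver={PostgreSQL Unicode};Server=localhost;Database=mydb;Uid=user;Pwd=secret;" \
  out.parquet \
  "SELECT * FROM my_table"

aws s3 cp out.parquet s3://my-bucket/out.parquet
```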
-
Extract data from ERP systems to Snowflake - Which tools (besides Airbyte)?
Yes, I have been tinkering around with odbc2parquet (https://github.com/pacman82/odbc2parquet) and storing it in a variant column. For dependency/workflow management, maybe Prefect.
What are some alternatives?
mbtiles-spec - specification documents for the MBTiles tileset format
sql-spark-connector - Apache Spark Connector for SQL Server and Azure SQL
geemap - A Python package for interactive geospatial analysis and visualization with Google Earth Engine.
roapi - Create full-fledged APIs for slowly moving datasets without writing a single line of code.
flatgeobuf - A performant binary encoding for geographic data based on flatbuffers
sqlpad - Web-based SQL editor. Legacy project in maintenance mode.
postgres_vectorization_test - Vectorized executor to speed up PostgreSQL
duckdb_fdw - DuckDB Foreign Data Wrapper for PostgreSQL
BlenderGIS - Blender addons to make the bridge between Blender and geographic data
FreeSql - 🦄 .NET AOT ORM for C# and VB.NET, supporting MySQL, PostgreSQL, SQL Server, Oracle, SQLite, Firebird, Dameng (达梦), Kingbase (人大金仓), Shentong (神通), HighGo (翰高), GBase (南大通用), Xugu (虚谷) and other domestic Chinese databases, plus ClickHouse, QuestDB, and MS Access.
parquet_fdw - Parquet foreign data wrapper for PostgreSQL