pyspark-style-guide vs spark-style-guide

| | pyspark-style-guide | spark-style-guide |
|---|---|---|
| Mentions | 3 | 2 |
| Stars | 946 | 250 |
| Growth | 3.7% | - |
| Activity | 0.0 | 0.0 |
| Last commit | over 2 years ago | 12 months ago |
| Language | Python | Jupyter Notebook |
| License | MIT License | - |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
pyspark-style-guide
-
PySpark style guide
For completeness, here is the Palantir PySpark style guide, which offers different guidance that you might also find interesting.
-
Suggestions
Even small things like data engineering best practices or a style guide feel like greenfield territory in this industry (e.g. https://github.com/palantir/pyspark-style-guide ).
-
Courses/content on writing good python code for data engineering?
While this isn’t exactly what you are asking for, you might gain some value from this PySpark style guide: https://github.com/palantir/pyspark-style-guide
spark-style-guide
-
Ask HN: How do you test SQL?
Spark makes it easy to wrap SQL in functions that are easy to test. I am the author of the popular Scala Spark (spark-fast-tests) and PySpark (chispa) testing libraries. Some additional tips that can speed up Spark tests by 70-90%:
* Reuse the same Spark session throughout the test suite
* Set shuffle partitions to 2 (instead of the default of 200)
* Use dependency injection to avoid disk I/O in the test suite
* Use fast DataFrame equality when possible; assertSmallDataFrameEquality is 4x faster than assertLargeDataFrameEquality
* Use column equality to test column functions. Don't compare DataFrames unless you're testing custom DataFrame transformations. See the spark-style-guide for definitions of these terms: https://github.com/MrPowers/spark-style-guide/blob/main/PYSP...
Spark is an underrated tool for testing SQL: it makes it easy to abstract SQL into unit-testable chunks. Configuring your tests properly takes some knowledge, but once set up, the tests run relatively quickly.
-
PySpark style guide
I created a PySpark style guide to help the community write code that's easy to reuse, unit test, and debug. Feel free to open issues / PRs if you have any suggestions / improvements.
What are some alternatives?
policy-bot - A GitHub App that enforces approval policies on pull requests
dbt-unit-testing - This dbt package contains macros to support unit testing that can be (re)used across dbt projects.
windows-event-forwarding - A repository for using windows event forwarding for incident detection and response
pg_temp - create a temporary, disposable, userland pg database
tslint - :vertical_traffic_light: An extensible linter for the TypeScript language
datajudge - Assessing whether data from a database complies with reference information.
python-language-server - An implementation of the Language Server Protocol for Python
data-diff - Compare tables within or across databases
@blueprintjs/core - A React-based UI toolkit for the web
integresql - IntegreSQL manages isolated PostgreSQL databases for your integration tests.
sqlfluff - A modular SQL linter and auto-formatter with support for multiple dialects and templated code.
SS-Unit - A 100% T-SQL based unit testing framework for SQL Server