dbt-unit-testing vs spark-style-guide
| | dbt-unit-testing | spark-style-guide |
|---|---|---|
| Mentions | 7 | 2 |
| Stars | 404 | 250 |
| Growth | 1.5% | - |
| Activity | 7.7 | 0.0 |
| Last commit | 16 days ago | 11 months ago |
| Language | Shell | Jupyter Notebook |
| License | MIT License | - |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars. Activity - a relative number indicating how actively a project is being developed; recent commits have a higher weight than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.
dbt-unit-testing
-
The SQL Unit Testing Landscape: 2023
If you use dbt for transformations, Dbt Unit Testing (https://github.com/EqualExperts/dbt-unit-testing) is getting some attention (https://www.thoughtworks.com/radar/languages-and-frameworks?blipid=202304042).
-
Data-eng related highlights from the latest Thoughtworks Tech Radar
dbt-unit-testing
-
I'm not getting it...what's the point of DBT?
-
Ask HN: How do you test SQL?
We use this and take an example-based testing approach for any non-trivial tables: https://github.com/EqualExperts/dbt-unit-testing
-
SQL should be your default choice for data engineering pipelines
> How do you test some SQL logic in isolation?
I do this using SQL by:
1. Extracting an 'ephemeral model' into a different model file.
2. Mocking out this model in the upstream model's unit tests (https://github.com/EqualExperts/dbt-unit-testing).
3. Writing unit tests for this model.
This is no different from regular software development in a language like Java.
I would argue it's even better, because the unit tests are always in tabular format and pretty easy to understand. Java unit tests, on the other hand, are never read by devs in practice.
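As a rough sketch of what steps 1-3 can look like with dbt-unit-testing's test / mock_ref / expect macros (the model, ref, and column names below are invented for illustration, not taken from the thread):

```sql
-- Hypothetical unit test for an assumed model orders_per_day that groups
-- an upstream stg_orders ref by order_date and counts rows.
{{ config(tags=['unit-test']) }}

{% call dbt_unit_testing.test('orders_per_day', 'counts orders per day') %}

  -- Mock the extracted ref so the model under test never touches real data
  {% call dbt_unit_testing.mock_ref('stg_orders') %}
    select cast('2023-01-01' as date) as order_date, 1 as order_id
    union all
    select cast('2023-01-01' as date) as order_date, 2 as order_id
  {% endcall %}

  -- Expected output rows, written in the same tabular SQL style
  {% call dbt_unit_testing.expect() %}
    select cast('2023-01-01' as date) as order_date, 2 as order_count
  {% endcall %}
{% endcall %}
```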
-
Unit testing with dbt
I haven't done it yet, but there are some popular blog posts as well as a dbt package someone created.
-
Modern Data Modeling: Start with the End?
> I really don’t understand the community's obsession with unwieldy tools like DBT.
It lets me write test-first SQL transforms. I never thought TDD SQL would be possible. My SQL is so much more readable with common logic extracted into ephemeral models. I apply the same methods I use to write clear code to SQL, e.g. too many mocks = refactor into a separate model (class).
I think DBT made this possible with refs that can be swapped out with mocks. This is the awesome library I am using: https://github.com/EqualExperts/dbt-unit-testing
spark-style-guide
-
Ask HN: How do you test SQL?
Spark makes it easy to wrap SQL in functions that are easy to test. I am the author of the popular Scala Spark (spark-fast-tests) and PySpark (chispa) testing libraries. Some additional tips to speed up Spark tests (these can speed up tests by 70-90%):
* Reuse the same Spark session throughout the test suite.
* Set shuffle partitions to 2 (instead of the default, which is 200).
* Use dependency injection to avoid disk I/O in the test suite.
* Use fast DataFrame equality when possible; assertSmallDataFrameEquality is 4x faster than assertLargeDataFrameEquality.
* Use column equality to test column functions. Don't compare DataFrames unless you're testing custom DataFrame transformations. See the spark-style-guide for definitions of these terms: https://github.com/MrPowers/spark-style-guide/blob/main/PYSP...
Spark is an underrated tool for testing SQL: it makes it really easy to abstract SQL into unit-testable chunks. Configuring your tests properly takes some knowledge, but you can make the tests run relatively quickly.
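To make a couple of those tips concrete, here is a minimal pytest sketch (the function and column names are assumptions, not taken from the comment) with a session-scoped SparkSession, shuffle partitions set to 2, and a chispa column-equality assertion:

```python
# Hypothetical sketch: a single SparkSession reused across the test suite
# and a column-level test with chispa. Names are illustrative only.
import pytest
import pyspark.sql.functions as F
from pyspark.sql import SparkSession
from chispa import assert_column_equality


@pytest.fixture(scope="session")
def spark():
    # One Spark session for the whole suite; 2 shuffle partitions instead
    # of the default 200 keeps small test jobs fast.
    return (
        SparkSession.builder.master("local[*]")
        .appName("unit-tests")
        .config("spark.sql.shuffle.partitions", "2")
        .getOrCreate()
    )


def clean_name(col):
    # Example column function: lowercase and trim a string column.
    return F.trim(F.lower(col))


def test_clean_name(spark):
    df = spark.createDataFrame(
        [("  Alice ", "alice"), ("BOB", "bob"), (None, None)],
        ["name", "expected"],
    ).withColumn("actual", clean_name(F.col("name")))
    # Compare just the two columns instead of whole DataFrames.
    assert_column_equality(df, "actual", "expected")
```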
-
PySpark style guide
I created a PySpark style guide to help the community write code that's easy to reuse, unit test, and debug. Feel free to open issues / PRs if you have any suggestions / improvements.
What are some alternatives?
sqlglot - Python SQL Parser and Transpiler
pg_temp - create a temporary, disposable, userland pg database
data-diff - Compare tables within or across databases
pyspark-style-guide - This is a guide to PySpark code style presenting common situations and the associated best practices based on the most frequent recurring topics across the PySpark repos we've encountered.
dbt-expectations - Port(ish) of Great Expectations to dbt test macros
datajudge - Assessing whether data from a database complies with reference information.
sqlx - 🧰 The Rust SQL Toolkit. An async, pure Rust SQL crate featuring compile-time checked queries without a DSL. Supports PostgreSQL, MySQL, and SQLite.
SS-Unit - A 100% T-SQL based unit testing framework for SQL Server
integresql - IntegreSQL manages isolated PostgreSQL databases for your integration tests.
hash-db - Experimental distributed pseudomultimodel keyvalue database (it uses python dictionaries) imitating dynamodb querying with join only SQL support, distributed joins and simple Cypher graph support and document storage
sqlfluff - A modular SQL linter and auto-formatter with support for multiple dialects and templated code.