dbt-unit-testing vs dbt-expectations

| | dbt-unit-testing | dbt-expectations |
|---|---|---|
| Mentions | 7 | 10 |
| Stars | 404 | 947 |
| Growth | 1.5% | 2.4% |
| Activity | 7.7 | 6.6 |
| Latest commit | 15 days ago | 11 days ago |
| Language | Shell | Shell |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
dbt-unit-testing
-
The SQL Unit Testing Landscape: 2023
If you use dbt for transformations, dbt-unit-testing (https://github.com/EqualExperts/dbt-unit-testing) is getting some attention (https://www.thoughtworks.com/radar/languages-and-frameworks?blipid=202304042).
-
Data-eng related highlights from the latest Thoughtworks Tech Radar
dbt-unit-testing
- I'm not getting it...what's the point of DBT?
-
Ask HN: How do you test SQL?
We use this and take an example-based tests approach for any non-trivial tables: https://github.com/EqualExperts/dbt-unit-testing
-
SQL should be your default choice for data engineering pipelines
> How do you test some SQL logic in isolation?
I do this using SQL:
1. Extract an 'ephemeral model' into its own model file.
2. Mock out this model in the upstream model's unit tests: https://github.com/EqualExperts/dbt-unit-testing
3. Write unit tests for this model.
This is no different from regular software development in a language like Java. I would argue it's even better, because the unit tests are always in tabular format and easy to understand. Java unit tests, on the other hand, are rarely read by devs in practice.
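As a sketch of what step 2 looks like with dbt-unit-testing's macro syntax (the model names, columns, and values below are illustrative assumptions, not from the thread):

```sql
-- tests/unit/customer_totals_test.sql (hypothetical model and columns)
{{ config(tags=['unit-test']) }}

{% call dbt_unit_testing.test('customer_totals', 'sums order amounts per customer') %}
  -- Mock the upstream ref so the model under test sees only this fixture data
  {% call dbt_unit_testing.mock_ref('stg_orders') %}
    select 1 as customer_id, 10 as amount
    union all
    select 1 as customer_id, 15 as amount
  {% endcall %}
  -- Expected output of the model, expressed as a tabular query
  {% call dbt_unit_testing.expect() %}
    select 1 as customer_id, 25 as total_amount
  {% endcall %}
{% endcall %}
```

Running `dbt test --select tag:unit-test` would then execute the mocked test in isolation, without touching the real `stg_orders` source.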
-
Unit testing with dbt
I haven't done it yet but there are some popular blogs as well as a DBT package someone created.
-
Modern Data Modeling: Start with the End?
> I really don’t understand the communities obsession with unwieldy tools like DBT.
It lets me write test-first SQL transforms. I never thought TDD SQL would be possible. My SQL is so much more readable with common logic extracted into ephemeral models. I apply the same methods I use for writing clear code to SQL, e.g. too many mocks = refactor into a separate model (class).
I think dbt made this possible with refs that can be swapped out with mocks. This is the awesome library I am using: https://github.com/EqualExperts/dbt-unit-testing
dbt-expectations
-
Dbt tests vs Soda SQL
Have not used Soda, but dbt indeed is pretty good, especially when adding dbt-expectations.
-
Data-eng related highlights from the latest Thoughtworks Tech Radar
dbt-expectations
-
Data Quality Dimensions: Assuring Your Data Quality with Great Expectations
I highly.. highly.. recommend the dbt-expectations extension from Calogica for dbt. It's a port of Great Expectations, except you can quickly drop it into your schema.yml files and have it run as part of your dbt test process. Super powerful, and it's prevented us from shipping bad data many times.
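A minimal sketch of what "dropping it into your schema.yml" looks like — the model and column names here are hypothetical, while the expectation names come from the dbt-expectations package:

```yaml
# models/schema.yml (hypothetical model; tests provided by dbt-expectations)
version: 2
models:
  - name: orders
    tests:
      - dbt_expectations.expect_table_row_count_to_be_between:
          min_value: 1
    columns:
      - name: order_total
        tests:
          - dbt_expectations.expect_column_values_to_be_between:
              min_value: 0
```

These then run alongside built-in tests like `not_null` as part of a plain `dbt test` invocation.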
-
Managing SQL Tests
I'm used to utilising dbt and defining my tests there (along with dbt-utils or https://github.com/calogica/dbt-expectations): I simply add a list item to a column definition and can already define a great number of tests without having to copy code. I can even extend the pre-defined tests using generic tests. Writing custom tests also integrates nicely. Additionally, it's very convenient to tag tests or define a severity. The learning curve for a business engineer is almost flat as long as they know some SQL.
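The tagging and severity the comment mentions can be set per test via dbt's test `config` block; a small sketch with a hypothetical column:

```yaml
# Hypothetical column definition showing per-test severity and tags
columns:
  - name: email
    tests:
      - not_null:
          config:
            severity: warn        # report but don't fail the run
            tags: ['data-quality']
```

With this in place, `dbt test --select tag:data-quality` runs just the tagged checks, and failures surface as warnings rather than errors.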
-
What are some Data Quality check related frameworks for datasets ranging from 100GB to 1TB in size?
Use dbt's testing functionality during your transformations with calogica/dbt-expectations (the Great Expectations framework ported to dbt).
-
Great Expectations is annoyingly cumbersome
Check out dbt-expectations https://github.com/calogica/dbt-expectations
-
CI/CD in data engineering - help a noob
There are certain things I would like to add such as data quality, I can use something like dbt great expectations, but I am not sure how much more I should force it before getting an airflow setup..
- How do you query and quality-check data produced in intermediate steps of an analytics pipeline?
-
ETL Pipelines with Airflow: The Good, the Bad and the Ugly
[dbt Labs employee here]
Check out the dbt-expectations package[1]. It's a port of the Great Expectations checks to dbt as tests. The advantage of this is that you don't need another tool for these pretty standard tests, and they can be easily incorporated into dbt workflows.
[1] https://github.com/calogica/dbt-expectations
-
Unit testing SQL in DBT
Also check out dbt-expectations, a port of Great Expectations that greatly expands the configurable (non-assert) tests.
What are some alternatives?
sqlglot - Python SQL Parser and Transpiler
dbt-utils - Utility functions for dbt projects.
data-diff - Compare tables within or across databases
dbt-oracle - A dbt adapter for oracle db backend
sqlx - 🧰 The Rust SQL Toolkit. An async, pure Rust SQL crate featuring compile-time checked queries without a DSL. Supports PostgreSQL, MySQL, and SQLite.
materialize - The data warehouse for operational workloads.
SS-Unit - A 100% T-SQL based unit testing framework for SQL Server
Scio - A Scala API for Apache Beam and Google Cloud Dataflow.
hash-db - Experimental distributed pseudo-multimodel key-value database (it uses Python dictionaries) imitating DynamoDB querying, with join-only SQL support, distributed joins, simple Cypher graph support, and document storage
NVTabular - NVTabular is a feature engineering and preprocessing library for tabular data designed to quickly and easily manipulate terabyte scale datasets used to train deep learning based recommender systems.
spark-style-guide - Spark style guide
cuetils - CLI and library for diff, patch, and ETL operations on CUE, JSON, and Yaml