dbx vs nutter

| | dbx | nutter |
|---|---|---|
| Mentions | 5 | 2 |
| Stars | 434 | 262 |
| Growth | 2.3% | 2.3% |
| Activity | 4.6 | 0.0 |
| Latest commit | 2 months ago | 13 days ago |
| Language | Python | Python |
| License | GNU General Public License v3.0 or later | MIT License |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
dbx
- Snowpark equivalent on Databricks?
PySpark is the Python API for Spark. You can write code in a notebook on Databricks and run it on a cluster, or you can write code in an IDE and run it using dbx through the `dbx execute` command. If you're more familiar with the Pandas API, you can use Koalas, which is a pandas API on Spark.
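A minimal sketch of how that IDE-centric workflow can be structured (function and column names here are hypothetical, not from dbx itself): the Spark-dependent part is isolated in `main()`, while the transformation is plain Python, so the same module runs as a packaged job while its logic stays testable without a cluster.

```python
# Sketch (hypothetical names): one module usable from a notebook cell or
# as an entry point launched on a cluster via `dbx execute`.

def label_large(value, threshold=100):
    """Pure logic: classify a value; no Spark required to call or test this."""
    return "large" if value >= threshold else "small"

def main():
    # Import pyspark only when actually launching the job, so unit tests
    # of label_large() need no cluster or local Spark install.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import udf
    from pyspark.sql.types import StringType

    spark = SparkSession.builder.getOrCreate()
    df = spark.createDataFrame([(50,), (150,)], ["value"])
    df = df.withColumn("size", udf(label_large, StringType())("value"))
    df.show()

if __name__ == "__main__":
    main()
```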
- How/where do you define your Databricks jobs, tasks and workflows?
- Unit & integration testing in Databricks
Hey, Databricks person here. Check out DBX for a template on how to do unit and integration tests: https://github.com/databrickslabs/dbx
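A hedged illustration of the idea behind that template (the function and test names are hypothetical, not taken from the dbx repository): keep business logic in plain, importable functions so unit tests run locally with pytest, and reserve cluster time for integration tests of the packaged job.

```python
# Sketch of the unit-test half of the pattern (hypothetical names):
# the transformation is a plain function, so covering it needs no cluster.

def normalize_amounts(rows):
    """Add each row's share of the total amount."""
    total = sum(r["amount"] for r in rows)
    return [{**r, "share": r["amount"] / total} for r in rows]

def test_normalize_amounts():
    rows = [{"amount": 2.0}, {"amount": 6.0}]
    out = normalize_amounts(rows)
    assert [r["share"] for r in out] == [0.25, 0.75]
```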
- My top 5 learnings from driving an OSS project
Approximately one year ago I released the first version of dbx, a CLI tool for simple and efficient development and deployment of Databricks jobs.
- Anyone use PySpark notebooks in production?
nutter
- How much object orientation do you use in your projects? Bonus points for integration and unit tests
From my experience, OO gives you much more flexibility in designing your pipeline, but you risk making the project far more complicated. The worst example I have seen is the Nutter library (https://github.com/microsoft/nutter), which uses endless classes that are all nested in each other. I once had a bug when using it, and it was a huge pain to understand what was going on as the code executed. It is a very good example of what can go wrong when you overuse OO. However, in one project I carefully created a few classes, just out of curiosity, and I was impressed by how much they helped me organize and structure my code. A function has a clear, dedicated use, but a good class is like a Swiss army knife with a solid set of functionalities. If you know how to use it in a smart way, you are likely to increase the quality of your code, but the contrary is also very likely, especially when the team members are not ready for it.
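The "few carefully designed classes" idea can be sketched like this (an illustrative example only, with hypothetical names, not code from any project mentioned above): one small class bundles related cleaning steps with their shared configuration, instead of nesting classes inside each other.

```python
# Illustrative sketch: a single focused class with a clear job.
class CsvCleaner:
    """Bundles related row-cleaning steps and their shared config."""

    def __init__(self, na_token="NA"):
        # Shared configuration lives in one place instead of being
        # threaded through every function call.
        self.na_token = na_token

    def strip_fields(self, rows):
        """Trim surrounding whitespace from every field."""
        return [{k: v.strip() for k, v in r.items()} for r in rows]

    def drop_missing(self, rows):
        """Drop rows containing the configured missing-value token."""
        return [r for r in rows if self.na_token not in r.values()]

    def clean(self, rows):
        """The one public entry point: strip, then filter."""
        return self.drop_missing(self.strip_fields(rows))
```

Each method still has a clear, dedicated use, so the class stays easy to test while keeping the configuration in one place.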
- How do you test your pipelines?
- https://github.com/microsoft/nutter
What are some alternatives?
databricks-cli - The missing command line client for Databricks SQL
cicd-templates - Manage your Databricks deployments and CI with code.
dbt-databricks - A dbt adapter for Databricks.
jupyterlab-integration - DEPRECATED: Integrating Jupyter with Databricks via SSH
azure-devops-python-api - Azure DevOps Python API
fastdbfs - An interactive command line client for Databricks DBFS.
Redash - Make Your Company Data Driven. Connect to any data source, easily visualize, dashboard and share your data.
databricks-nutter-projects-demo - Demo of using the Nutter for testing of Databricks notebooks in the CI/CD pipeline [Moved to: https://github.com/alexott/databricks-nutter-repos-demo]
terraform-provider-azuredevops - Terraform Azure DevOps provider