| | api | dbd |
|---|---|---|
| Mentions | 6 | 4 |
| Stars | 669 | 55 |
| Growth | - | - |
| Activity | 8.5 | 0.0 |
| Latest commit | over 2 years ago | about 2 years ago |
| Language | Python | Python |
| License | GNU General Public License v3.0 or later | BSD 3-clause "New" or "Revised" License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
api
- Need help: how to update a JSON file without buying a server
-
This is how they cherrypick the data to create hate against a certain community
I have calculated this for all states/UTs. You can have a look here; the data is taken from the COVID19-India API (covid19india.org).
-
India's daily vaccination number?
You can do that at the end of the page, on the right side, right below the graphs. Or you could download the CSV files from the API page https://api.covid19india.org/, but they are going to be a little difficult to comprehend.
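Those CSV files can be read with nothing more than Python's standard library. Here is a minimal sketch; the column names below are hypothetical stand-ins, not the real files' layout:

```python
import csv
import io

# Hypothetical sample mimicking a state-wise CSV; the real files from
# https://api.covid19india.org/ have a different, wider layout.
sample_csv = """State,Confirmed,Recovered,Deaths
Kerala,100,80,1
Delhi,200,150,5
"""

def total_confirmed(csv_text: str) -> int:
    """Sum the Confirmed column across all rows."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return sum(int(row["Confirmed"]) for row in reader)

print(total_confirmed(sample_csv))  # -> 300
```

Swap the in-memory sample for a downloaded file handle and the same loop applies.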
-
COVID-19 India Dashboard
I have developed an application showing the Impact of COVID-19 spread in India and States of India. I have used the data from https://api.covid19india.org/
-
[OC] Extent of Second COVID-19 wave in India
Source : https://api.covid19india.org/
-
Help extracting data from a json in node
The endpoint is detailed here: https://github.com/covid19india/api/blob/master/documentation/v4_data.md
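The extraction itself is language-agnostic; the sketch below uses Python rather than Node, with an illustrative payload loosely shaped like the documented v4 data (the field names here are assumptions, not the exact schema):

```python
import json

# Illustrative fragment roughly shaped like the v4 data documented at
# https://github.com/covid19india/api/blob/master/documentation/v4_data.md;
# keys and nesting are assumptions for the example.
payload = json.loads("""
{
  "KL": {"total": {"confirmed": 100, "recovered": 80}},
  "DL": {"total": {"confirmed": 200, "recovered": 150}}
}
""")

# Walk the top-level state codes and pull one nested field per state.
confirmed = {state: stats["total"]["confirmed"] for state, stats in payload.items()}
print(confirmed)  # {'KL': 100, 'DL': 200}
```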
dbd
-
Easy loading Kaggle dataset to a database
I've created two examples of how to use the dbd tool to load Kaggle dataset data files (CSV, JSON, XLS, Parquet) into your Postgres, MySQL, or SQLite database. Basically, you don't have to create any tables or run any SQL INSERT or COPY statements. Everything is automated. You just reference the datasets and files with a URL and execute a 'dbd run' command. The examples are here. Perhaps you'll find it useful. Let me know what you think!
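To make concrete what dbd automates here, this is roughly the manual work it saves, sketched with sqlite3 and a made-up CSV (the naive all-TEXT typing is a simplification; dbd does real type inference):

```python
import csv
import io
import sqlite3

# The chore dbd removes: derive a table from a CSV file and load it.
# Sample data and table name are made up for illustration.
sample = io.StringIO("id,name,score\n1,alice,9.5\n2,bob,7.0\n")
rows = list(csv.reader(sample))
header, data = rows[0], rows[1:]

conn = sqlite3.connect(":memory:")
cols = ", ".join(f"{c} TEXT" for c in header)  # naive: every column TEXT
conn.execute(f"CREATE TABLE scores ({cols})")
conn.executemany(
    f"INSERT INTO scores VALUES ({', '.join('?' for _ in header)})", data
)

print(conn.execute("SELECT COUNT(*) FROM scores").fetchone()[0])  # 2
```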
-
Easy loading dataset files to a database
I've created two examples of how to use the [dbd](https://github.com/zsvoboda/dbd) tool to load Kaggle dataset data files (csv, json, xls, parquet) to your Postgres, MySQL, or SQLite database.
-
dbd: create your database from data files on your directory
I work on a new open-source tool called dbd that lets you load data from your local data files into your database and transform it using insert-from-select statements. The tool supports Jinja2 templating. It works with Postgres, MySQL, SQLite, Snowflake, Redshift, and BigQuery.
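The insert-from-select style of transform mentioned here can be sketched in plain SQL via sqlite3; table and column names below are made up for illustration (in dbd the SELECT would live in a file, optionally templated with Jinja2):

```python
import sqlite3

# Build a downstream table from an upstream one with a single
# INSERT ... SELECT statement -- the transform style dbd is built around.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [(1, 10.0), (2, 25.5), (3, 4.5)])

conn.execute("CREATE TABLE big_orders (id INTEGER, amount REAL)")
# The transform itself: insert-from-select with a filter.
conn.execute(
    "INSERT INTO big_orders SELECT id, amount FROM raw_orders WHERE amount > 5"
)
print(conn.execute("SELECT COUNT(*) FROM big_orders").fetchone()[0])  # 2
```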
-
New opensource ELT tool
I was looking for a declarative ELT tool for building my analytics solutions, and DBT was the closest I found. I liked its concept, but I ran into quite a few limitations when I wanted to use it: I couldn't specify or create basic things like data types, indexes, or primary/foreign keys. In the end, I decided to implement my own, more straightforward and more flexible tool. I've published the result, dbd, on GitHub. Perhaps you'll find it helpful. Your feedback is greatly appreciated!
What are some alternatives?
python-vulture-action - Run Vulture on your Python codebase to identify dead code.
Skytrax-Data-Warehouse - A full data warehouse infrastructure with ETL pipelines running inside docker on Apache Airflow for data orchestration, AWS Redshift for cloud data warehouse and Metabase to serve the needs of data visualizations such as analytical dashboards.
covid-certificate-scanner - Web application that scans EU Digital Covid Certificates to extract and display all the information they enclose.
ethereum-etl - Python scripts for ETL (extract, transform and load) jobs for Ethereum blocks, transactions, ERC20 / ERC721 tokens, transfers, receipts, logs, contracts, internal transactions. Data is available in Google BigQuery https://goo.gl/oY5BCQ
covid19-vaccine-tracker-india
pgsync - Postgres to Elasticsearch/OpenSearch sync
sbi-tt-rates-historical - Historical SBI TT rates since 02 July 2020. These are important rates required for ITR purposes and are unfortunately not made readily available by RBI/SBI.
data-toolset - Upgrade from avro-tools and parquet-tools jars to a more user-friendly Python package.
nextrelease - One-click release publishing by merging an automated PR.
sqlmesh - Efficient data transformation and modeling framework that is backwards compatible with dbt.
covid-19_india_data - COVID-19 India Dashboard and Vaccination Centre Search Application
pydwt - A DBT-like modeling tool for using SQLAlchemy Core with a DataFrame-like interface.