PyMySQL
python-mysql-replication
| | PyMySQL | python-mysql-replication |
|---|---|---|
| Mentions | 4 | 5 |
| Stars | 7,551 | 2,254 |
| Growth | 0.7% | - |
| Activity | 7.4 | 9.2 |
| Latest commit | 28 days ago | 23 days ago |
| Language | Python | Python |
| License | MIT License | - |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
PyMySQL
-
What is the de facto Python driver for MySQL?
PyMySQL is the one I have seen used most commonly. https://github.com/PyMySQL/PyMySQL
-
There are many options for connecting to MySQL from Python, but let's use PyMySQL or mysql-connector-python for now.
PyMySQL
-
Python 3.10.1
However, please note you don't need to use MySQLdb as your connector: I was very happy with the pure-Python MySQL client library PyMySQL: https://pypi.org/project/PyMySQL/#installation https://github.com/PyMySQL/PyMySQL/ So far I have been happy with its maintenance and its compatibility with newer MySQL and MariaDB features (e.g. newer authentication methods). It is MIT-licensed.
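To make that concrete, here is a minimal PyMySQL connection sketch (host and credentials are placeholders; the import is placed inside the function only so the snippet stays loadable on machines without PyMySQL installed):

```python
def fetch_server_version(host, user, password, database):
    """Connect with PyMySQL and return the server's VERSION() string."""
    import pymysql.cursors  # imported here so the sketch loads without PyMySQL

    connection = pymysql.connect(
        host=host,
        user=user,
        password=password,
        database=database,
        cursorclass=pymysql.cursors.DictCursor,  # rows come back as dicts
    )
    try:
        with connection.cursor() as cursor:
            # For queries with arguments, pass them separately via %s
            # placeholders rather than string formatting.
            cursor.execute("SELECT VERSION() AS version")
            return cursor.fetchone()["version"]
    finally:
        connection.close()
```

Because PyMySQL is pure Python, this same code works unchanged under PyPy or CPython.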
-
Python3 Mysql Connector Issue. Help!!!
My preferred connector is PyMySQL, which I know supports it (https://github.com/PyMySQL/PyMySQL/issues/651), but use any one that works for you, or disable this feature on the MySQL server, as suggested in that ticket.
python-mysql-replication
-
Is anyone using PyPy for real work?
I'm maintaining an internal change-data-capture application that uses a Python library to decode the MySQL binlog and store the change records as JSON in the data lake (like Debezium). For our busiest databases, a single CPython process couldn't process the volume of incoming changes in real time (thousands of events per second). It's not something that can be easily parallelized, as the bulk of the work happens in the binlog-decoding library (https://github.com/julien-duponchelle/python-mysql-replicati...).
So we made it configurable to run some instances with PyPy, which was able to work through the data in real time, i.e. without generating a lag in the data stream. The downside of using PyPy was increased memory usage (4-8x), which isn't really a problem. An actual problem that I never really tracked down was that the test suite (running pytest) took 2-3 times longer with PyPy than with CPython.
A few months ago I upgraded the system to CPython 3.11, and the 10-20% performance improvements that come with that version allowed us to drop PyPy and run only CPython, which is more convenient and makes deployment and configuration less complex.
-
Why does the binlog grow drastically when the isolation level is set to "Repeatable Read", and shrink when it is set to "Read Committed"?
If you're doing this with Python, https://github.com/julien-duponchelle/python-mysql-replication is the recommended way of doing it.
-
How to Use BinLogs to Make an Aurora MySQL Event Stream
The BinLogStreamReader has several inputs that we need to retrieve. First we'll fetch the cluster's secret with the database host/username/password, and then the serverId we stored in S3.
-
How is everyone ingesting backend relational data?
From backend relational tables to data warehouses, my team has mostly relied on change-data-capture replication. We use MySQL upstream, and historically used AWS DMS or Attunity Replicate to replicate directly to SQL Server. Recently we switched to Snowflake, mostly using AWS DMS to replicate CDC data to S3 (listing individual inserts, updates, and deletes), then Snowpipe to copy it into Snowflake, and finally a job that merges that data into the target table to produce the latest state. In addition, we've used this library in production, https://github.com/noplay/python-mysql-replication, and still use it today for one high-volume, critical data source. Generally we see data go end to end in a matter of minutes, but occasionally there are spikes in latency.
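The per-row change records landing in S3 can be pictured as small JSON documents like the following sketch (the field names are illustrative, not AWS DMS's actual output schema):

```python
import json

def change_record(schema, table, op, row, ts):
    """Flatten one insert/update/delete into a JSON-serializable dict."""
    assert op in ("insert", "update", "delete")
    return {"schema": schema, "table": table, "op": op, "ts": ts, "row": row}

# One hypothetical insert captured from the binlog:
record = change_record("shop", "orders", "insert", {"id": 1, "total": 9.99}, ts=1700000000)
print(json.dumps(record, sort_keys=True))
```

The downstream merge job can then group such records by primary key and keep only the latest `ts` per key to reconstruct current table state.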
- Robust data transfer mechanism?
What are some alternatives?
mysqlclient - MySQL database connector for Python (with Python 3 support)
AWS Data Wrangler - pandas on AWS - Easy integration with Athena, Glue, Redshift, Timestream, Neptune, OpenSearch, QuickSight, Chime, CloudWatchLogs, DynamoDB, EMR, SecretManager, PostgreSQL, MySQL, SQLServer and S3 (Parquet, CSV, JSON and EXCEL).
mysql-python - MySQLdb is a Python DB API-2.0 compliant library to interact with MySQL 3.23-5.1 (unofficial mirror)
PonyORM - Pony Object Relational Mapper
awesome-mysql - A curated list of awesome MySQL software, libraries, tools and resources
sparc-curation - code and files for SPARC curation workflows
asyncpg - A fast PostgreSQL Database Client Library for Python/asyncio.
preshed - 💥 Cython hash tables that assume keys are pre-hashed
oursql - oursql is a set of MySQL bindings for python with a focus on wrapping the MYSQL_STMT API to provide real parameterization and real server-side cursors.
mycli - A Terminal Client for MySQL with AutoCompletion and Syntax Highlighting.
awesome-postgres - A curated list of awesome PostgreSQL software, libraries, tools and resources, inspired by awesome-mysql
psycopg2cffi - Port to cffi with some speed improvements