differential-privacy-library vs PyDP
| | differential-privacy-library | PyDP |
|---|---|---|
| Mentions | 2 | 1 |
| Stars | 779 | 481 |
| Growth | 1.3% | 1.9% |
| Activity | 4.8 | 7.0 |
| Latest commit | 9 days ago | 4 months ago |
| Language | Python | Python |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
differential-privacy-library
-
Well, crackers.
Differential privacy. Basically, I wanted to create a randomly generated database file, akin to medical records, train a Private Aggregation of Teacher Ensembles (PATE) model on 20-60% of its content, and then use that teacher model on the remaining 80-40% of the database, which was just plaintext, not that that matters. The problem is, I barely have any idea how it all works, and the one example I found used CryptoNumerics' library, cn.protect. And that went like I've already described. I've fallen back on the practical part of the paper and found another way to get the practical usage the assignment requires: I'm now trying to use https://github.com/IBM/differential-privacy-library and the example in its 30-second guide, and to instead make the practical part about choosing epsilon (a measure of how much information one query on the database can give away to a malicious third party) by tracking the accuracy of the resulting dataset compared to the original. I hope I'll manage to edit the code to accept my text file: parse it from txt into an ndarray, separate the last column to use as a target, and go from there.
-
Differential Privacy project on Python
IBM's Diffprivlib is a well-documented implementation of differential privacy in Python. Source code and getting-started documentation are available in the IBM differential-privacy-library GitHub repository.
PyDP
-
How can medical data be processed to protect privacy?
As the title says, I've studied some tutorials about differential privacy and the PyDP examples, but they only deal with simple cases (structured text). Which papers/directions should I focus on if I want to make unstructured medical data private? Is it possible to make the data private with some preprocessing before I feed it into the model? A naive idea is to find the sensitive parts (e.g., names) and change them to non-sensitive text manually. Thanks
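The "naive idea" in the question, finding sensitive spans and swapping them for non-sensitive text, can be sketched as a rule-based scrubbing pass. The name list and the regex below are illustrative placeholders, not a real de-identification tool; in practice, clinical text is usually de-identified with a trained named-entity-recognition model rather than hand-written rules.

```python
import re

# Illustrative only: a real pipeline would get these spans from an NER model.
KNOWN_NAMES = {"Alice", "Bob"}

def redact(text: str) -> str:
    """Replace obviously sensitive spans with neutral placeholders."""
    # US Social Security number pattern (example of a structured identifier).
    text = re.sub(r"\b\d{3}-\d{2}-\d{4}\b", "[SSN]", text)
    # Known patient names (stand-in for NER-detected person entities).
    for name in KNOWN_NAMES:
        text = re.sub(rf"\b{re.escape(name)}\b", "[NAME]", text)
    return text

print(redact("Alice (SSN 123-45-6789) was admitted."))
# [NAME] (SSN [SSN]) was admitted.
```

Note that this kind of redaction only removes direct identifiers; it gives no formal guarantee like differential privacy, since rare quasi-identifiers (dates, diagnoses, locations) can still re-identify a patient.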
What are some alternatives?
data-science-ipython-notebooks - Data science Python notebooks: Deep learning (TensorFlow, Theano, Caffe, Keras), scikit-learn, Kaggle, big data (Spark, Hadoop MapReduce, HDFS), matplotlib, pandas, NumPy, SciPy, Python essentials, AWS, and various command lines.
mailjet-apiv3-python - [API v3] Python Mailjet wrapper
awesome-machine-unlearning - Awesome Machine Unlearning (A Survey of Machine Unlearning)
CuVec - Unifying Python/C++/CUDA memory: Python buffered array ↔️ `std::vector` ↔️ CUDA managed memory
fides - The Privacy Engineering & Compliance Framework
Ciphey - ⚡ Automatically decrypt encryptions without knowing the key or cipher, decode encodings, and crack hashes ⚡
PrivacyEngCollabSpace - Privacy Engineering Collaboration Space
Keras - Deep Learning for humans
Code-the-Problem - Register for Hacktoberfest and make four pull requests (PRs) between October 1st-31st to grab free T-shirt and more.
transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
cookietemple - A collection of best practice cookiecutter templates for all domains and languages with extensive Github support ⛺