huggingface_hub vs scikit-learn

| | huggingface_hub | scikit-learn |
|---|---|---|
| Mentions | 104 | 81 |
| Stars | 1,688 | 58,130 |
| Stars growth (monthly) | 4.9% | 0.5% |
| Activity | 9.6 | 9.9 |
| Latest commit | 4 days ago | 7 days ago |
| Language | Python | Python |
| License | Apache License 2.0 | BSD 3-clause "New" or "Revised" License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
huggingface_hub
- OpenAI's employees were given two explanations for why Sam Altman was fired
Something to think about:
https://github.com/huggingface/huggingface_hub
- Thoughts on a "Text Generation CivitAI"
- Civitai alternatives
Yes! We have a well documented Python library (https://github.com/huggingface/huggingface_hub) and public endpoints (https://huggingface.co/docs/hub/api#endpoints-table) you can use to retrieve information about the models and potentially build UIs with specific use cases in mind
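The library mentioned above exposes this kind of metadata lookup through its `HfApi` class. A minimal sketch, assuming network access to the Hub; the search term and result limit here are invented for illustration:

```python
# Query the Hugging Face Hub for model metadata via huggingface_hub.
# The search string "pixel art" and the limit of 5 are arbitrary examples.
from huggingface_hub import HfApi

api = HfApi()
models = list(api.list_models(search="pixel art", limit=5))
for m in models:
    print(m.id)
```

The same information is available from the public REST endpoints linked above if you would rather not depend on the Python client.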
- Fox Fairy @ Diffusion Forest: Unreal Engine + Stable Diffusion
I think if you search for pixel art here, there are some models worth checking out: https://huggingface.co/
- ASK HN: AI is really exciting but where do I start?
- I trained an AI to generate Éric Duhaime as a clown!
- [Guide] DreamBooth Training with ShivamShrirao's Repo on Windows Locally
I received another error saying OSError: We couldn't connect to 'https://huggingface.co' to load this model, couldn't find it in the cached files and it looks like ./vae is not the path to a directory containing a file named diffusion_pytorch_model.bin
- Training a Deep Learning Language Model for Latin text Generation
I plan to release it on https://huggingface.co/, where all this cool AI stuff is available for free for everyone who wishes to try it.
- Image Upscaling Models Compared (General, Photo and Faces)
For this I mainly used the chaiNNer application with models from here, but I also used the Google Colab AUTOMATIC1111 Stable Diffusion web UI (for example, for Lanczos), as well as Spaces from Hugging Face like this one, or the super-resolution collection on the replicate.com website.
- 2D Illustration Styles are scarce on Stable Diffusion so I created a DreamBooth model inspired by Hollie Mengert's work
You will now need to create a Hugging Face account (https://huggingface.co/) if you haven't already. When you have, go here and accept the terms: https://huggingface.co/runwayml/stable-diffusion-v1-5. Once you have done both, click on your profile icon and go to Settings. Click "Access Tokens", then "Create token"; name it whatever you want and select "write". When you are finished with all this, you can run the next cell, which is the Hugging Face cell. It will ask for a token; copy and paste the one you just created.
scikit-learn
- AutoCodeRover resolves 22% of real-world GitHub issues in SWE-bench lite
Thank you for your interest. There are some interesting examples in the SWE-bench-lite benchmark which are resolved by AutoCodeRover:
- From sympy: https://github.com/sympy/sympy/issues/13643. AutoCodeRover's patch for it: https://github.com/nus-apr/auto-code-rover/blob/main/results...
- Another one from scikit-learn: https://github.com/scikit-learn/scikit-learn/issues/13070. AutoCodeRover's patch (https://github.com/nus-apr/auto-code-rover/blob/main/results...) modified a few lines below those in the developer patch and wrote a different comment.
There are more examples in the results directory (https://github.com/nus-apr/auto-code-rover/tree/main/results).
- Polars
sklearn is adding support through the dataframe interchange protocol (https://github.com/scikit-learn/scikit-learn/issues/25896). scipy, as far as I know, doesn't explicitly support dataframes (it just happens to work when you wrap a Series in `np.array` or `np.asarray`). I don't know about PyTorch but in general you can convert to numpy.
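The "just happens to work" point above can be shown in a couple of lines: NumPy coerces a pandas Series into an ndarray, so SciPy routines that expect plain arrays accept it. A minimal sketch:

```python
# A pandas Series coerces cleanly to a NumPy array, which is why SciPy
# functions that expect arrays tend to accept Series without explicit support.
import numpy as np
import pandas as pd
from scipy import stats

s = pd.Series([1.0, 2.0, 3.0, 4.0])
arr = np.asarray(s)            # coerce the Series to a plain ndarray
print(type(arr).__name__)      # ndarray
print(stats.zscore(arr))       # SciPy is happy with the coerced array
```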
- [D] Major bug in Scikit-Learn's implementation of F-1 score
Wow, from the upvotes on this comment, it really seems like a lot of people think that this is the correct behavior! I have to say I disagree, but if that's what you think, don't just sit there upvoting comments on Reddit; instead go to this PR and tell the Scikit-Learn maintainers not to "fix" this "bug", which they are currently planning to do!
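The edge case being argued about can be reproduced directly. When the classifier never predicts the positive class, precision is undefined (0/0), and scikit-learn reports an F1 of 0; the `zero_division` parameter makes that choice explicit rather than a warning. A small sketch:

```python
# The contested edge case: no predicted positives means precision is 0/0.
# scikit-learn sets F1 to the zero_division value instead of raising.
from sklearn.metrics import f1_score

y_true = [0, 1, 0, 1]
y_pred = [0, 0, 0, 0]   # the positive class is never predicted

score = f1_score(y_true, y_pred, zero_division=0)
print(score)  # 0.0
```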
- Contraction Clustering (RASTER): A fast clustering algorithm
- Ask HN: Learning new coding patterns – how to start?
I was in a similar boat to yours: I worked in data science and have since moved into data engineering and software engineering for ML services.
I would recommend you look into the Design Patterns book by the Gang of Four. I found it particularly helpful for writing extensible code that doesn't break, especially with abstract classes, builders, and factories. I would also recommend looking into the book The Object-Oriented Thought Process to understand why traditional OOP is built the way it is.
You can also look into the source code of popular data science libraries such as sklearn (https://github.com/scikit-learn/scikit-learn/tree/main/sklea...) and see how many of them use Base classes to define shared functionality between objects of the same nature.
As others mentioned, I would also encourage you to try and implement design patterns in your everyday work - maybe you can make a Factory to load models or preprocessors that follow the same Abstract class?
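The factory-over-an-abstract-class idea suggested above can be sketched in a few lines. The class names and registry here are invented for illustration, not taken from any particular library:

```python
# Toy sketch: an abstract base class for preprocessors plus a factory that
# maps a config string to a concrete implementation.
from abc import ABC, abstractmethod

class BasePreprocessor(ABC):
    @abstractmethod
    def transform(self, values):
        """Return a transformed copy of the input values."""

class MinMaxScaler(BasePreprocessor):
    def transform(self, values):
        lo, hi = min(values), max(values)
        return [(v - lo) / (hi - lo) for v in values]

class Identity(BasePreprocessor):
    def transform(self, values):
        return list(values)

_REGISTRY = {"minmax": MinMaxScaler, "identity": Identity}

def make_preprocessor(name: str) -> BasePreprocessor:
    """Factory: pick the concrete preprocessor by name, e.g. from a config file."""
    return _REGISTRY[name]()

pre = make_preprocessor("minmax")
print(pre.transform([0.0, 5.0, 10.0]))  # [0.0, 0.5, 1.0]
```

New preprocessors only need to subclass `BasePreprocessor` and register themselves; calling code never changes, which is the extensibility the comment is pointing at.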
- Transformers as Support Vector Machines
It looks like you've been the victim of some misinformation. As Dr_Birdbrain said, an SVM is a convex problem with unique global optimum. sklearn.SVC relies on libsvm which initializes the weights to 0 [0]. The random state is only used to shuffle the data to make probability estimates with Platt scaling [1]. Of the random_state parameter, the sklearn documentation for SVC [2] says
Controls the pseudo random number generation for shuffling the data for probability estimates. Ignored when probability is False. Pass an int for reproducible output across multiple function calls. See Glossary.
[0] https://github.com/scikit-learn/scikit-learn/blob/2a2772a87b...
[1] https://en.wikipedia.org/wiki/Platt_scaling
[2] https://scikit-learn.org/stable/modules/generated/sklearn.sv...
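The determinism claim above is easy to check empirically: with `probability=False`, two SVC fits on the same data should agree exactly regardless of `random_state`. A quick sketch on synthetic data:

```python
# With probability=False, SVC's random_state is ignored, so two fits on the
# same data produce identical solutions even with different seeds.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

a = SVC(probability=False, random_state=0).fit(X, y)
b = SVC(probability=False, random_state=123).fit(X, y)

print(np.allclose(a.dual_coef_, b.dual_coef_))  # True
```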
- How to Build and Deploy a Machine Learning model using Docker
Scikit-learn Documentation
- Planning to get a laptop for ML/DL, is this good enough at the price point or are there better options at/below this price point?
- Link Prediction With node2vec in Physics Collaboration Network
Firstly, we need a connection to Memgraph so we can get the edges and split them into two parts: a train set and a test set. For the edge splitting, we will use scikit-learn. To connect to Memgraph, we will use gqlalchemy.
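The edge split described above maps directly onto scikit-learn's `train_test_split`. A minimal sketch, assuming the edges come back from Memgraph as `(source, target)` pairs; the edge list here is made up:

```python
# Split a list of graph edges into train and test sets with scikit-learn.
from sklearn.model_selection import train_test_split

edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (1, 3)]
train_edges, test_edges = train_test_split(edges, test_size=0.33, random_state=42)

print(len(train_edges), len(test_edges))  # 4 2
```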
- WiFilter is a RaspAP install extended with a squidGuard proxy to filter adult content. A great solution for families, schools, and/or public access points
The ML component is based on scikit-learn which differentiates it from purely list-based filters. It couples this with a full-featured wireless router (RaspAP) in a single device, so it fulfills the needs of a use case not entirely addressed by Pi-hole.
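A hypothetical sketch of what an ML-based filter might look like, as opposed to a pure blocklist: a tiny TF-IDF plus logistic regression text classifier. The training data below is toy data invented for illustration, not WiFilter's actual model:

```python
# Toy content classifier: learns from examples instead of matching a fixed list.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "family friendly cooking recipes",
    "kids homework help and study tips",
    "explicit adult content site",
    "adult only explicit videos",
]
labels = [0, 0, 1, 1]  # 0 = allow, 1 = block

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(texts, labels)

prediction = clf.predict(["explicit adult videos"])[0]
print(prediction)
```

Because the classifier generalizes from word usage, it can flag pages a static list has never seen, which is the advantage the comment points out over Pi-hole-style filtering.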
What are some alternatives?
civitai - A repository of models, textual inversions, and more
Prophet - Tool for producing high quality forecasts for time series data that has multiple seasonality with linear or non-linear growth.
transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
Surprise - A Python scikit for building and analyzing recommender systems
spaCy - 💫 Industrial-strength Natural Language Processing (NLP) in Python
Keras - Deep Learning for humans
mammography_metarepository - Meta-repository of screening mammography classifiers
tensorflow - An Open Source Machine Learning Framework for Everyone
KoboldAI-Client
gensim - Topic Modelling for Humans
Pytorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration
H2O - H2O is an Open Source, Distributed, Fast & Scalable Machine Learning Platform: Deep Learning, Gradient Boosting (GBM) & XGBoost, Random Forest, Generalized Linear Modeling (GLM with Elastic Net), K-Means, PCA, Generalized Additive Models (GAM), RuleFit, Support Vector Machine (SVM), Stacked Ensembles, Automatic Machine Learning (AutoML), etc.