| | courses | course |
|---|---|---|
| Mentions | 7 | 4 |
| Stars | 4,556 | 1,928 |
| Growth | - | 4.1% |
| Activity | 5.4 | 9.3 |
| Latest Commit | 15 days ago | 12 days ago |
| Language | Python | MDX |
| License | - | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
courses
-
If you are looking for free courses about AI, LLMs, CV, or NLP, I created a repository with links to resources that I found super high quality and helpful. The link is in the comment.
I found it: https://github.com/SkalskiP/courses
https://github.com/SkalskiP/courses/discussions/20. Would that format be helpful?
- Repo with free AI courses (in English)
- GitHub - SkalskiP/courses: This repository is a curated collection of links to various courses and resources about Artificial Intelligence (AI)
-
Anyone know of any good video lectures for Computer Vision, from great professors at well-regarded universities?
This is a popular repo with CV-specific resources: https://github.com/SkalskiP/courses
- Curated collection of high-quality AI resources
-
If you are looking for courses about Artificial Intelligence, I created a repository with links to resources that I found super high quality and helpful. The link is in the comment.
link: https://github.com/SkalskiP/courses
course
-
How can I learn more about models, trends, news, etc?
r/learnmachinelearning may be a better subreddit for asking for broad ML questions, but some resources linked in this sub have included pages like the Hugging Face NLP Course.
-
OpenOrca: open source dataset and instruct-tuned LLMs
I recommend the Huggingface courses: https://huggingface.co/course
The courses are geared toward writing Python applications around these models. They're fairly hands-on, so it would still be worth complementing them by reading papers or watching videos on the fundamental principles of AI and ML.
-
[N] First-Ever Course on Transformers: NOW PUBLIC
Btw, Hugging Face has had a course on Transformers for around a year now, I think, with a lot of content and even guides on how to build a demo (course webpage, chapter 1), released alongside the material (GitHub repo), so I am not sure whether yours is the first ever.
-
The Hugging Face course goes open-source!
Hey u/QQut thanks for the offer - we'd love to have your help with the translations! You can open an issue here if your language is not yet listed: https://github.com/huggingface/course/issues
What are some alternatives?
DNABERT - DNABERT: pre-trained Bidirectional Encoder Representations from Transformers model for DNA-language in genome
gradio - Build and share delightful machine learning apps, all in Python. 🌟 Star to support our work!
WhereIsAI - AI company, product, and tool collection.
Awesome-Transformer-Attention - An ultimately comprehensive paper list of Vision Transformer/Attention, including papers, codes, and related websites
vectory - Vectory provides a collection of tools to track and compare embedding versions.
AutoGPTQ - An easy-to-use LLMs quantization package with user-friendly apis, based on GPTQ algorithm.
bearid - Hypraptive BearID project. FaceNet for bears.
refinery - The data scientist's open-source choice to scale, assess and maintain natural language data. Treat training data like a software artifact.
Mask3D - Mask3D predicts accurate 3D semantic instances achieving state-of-the-art on ScanNet, ScanNet200, S3DIS and STPLS3D.
long-range-arena - Long Range Arena for Benchmarking Efficient Transformers
wit - WIT (Wikipedia-based Image Text) Dataset is a large multimodal multilingual dataset comprising 37M+ image-text sets with 11M+ unique images across 100+ languages.
transformers-interpret - Model explainability that works seamlessly with 🤗 transformers. Explain your transformers model in just 2 lines of code.