How to pre-train BERT on different objective tasks using HuggingFace

This page summarizes the projects mentioned and recommended in the original post on /r/deeplearning

  • notebooks

    Jupyter notebooks for the Natural Language Processing with Transformers book (by nlp-with-transformers)

  • HuggingFace has an excellent book called "Natural Language Processing with Transformers". It explains the HF ecosystem nicely, and there is an accompanying GitHub repo with notebooks for all chapters.

  • d2l-en

    Interactive deep learning book with multi-framework code, math, and discussions. Adopted at 500 universities from 70 countries including Stanford, MIT, Harvard, and Cambridge.

  • There is tooling in HuggingFace for pre-training a BERT model, but I suggest you train a BERT model in native PyTorch to understand the details. Limu's course is recommended.
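To make the "understand the details" advice above concrete, here is a minimal sketch of BERT's masked-language-modeling (MLM) corruption step — the core pre-training objective: roughly 15% of positions are selected; of those, 80% become `[MASK]`, 10% become a random token, and 10% are left unchanged. The token id `103` and vocab size `30522` are assumptions matching `bert-base-uncased`; this is plain Python for clarity, not the HuggingFace implementation.

```python
import random

MASK_ID = 103       # [MASK] id in bert-base-uncased (assumption for illustration)
VOCAB_SIZE = 30522  # bert-base-uncased vocab size (assumption for illustration)

def mask_tokens(token_ids, mlm_prob=0.15, rng=None):
    """Apply BERT-style MLM corruption to a list of token ids.

    Returns (inputs, labels): inputs is the corrupted sequence, labels
    holds the original token at selected positions and -100 elsewhere
    (-100 is the index that cross-entropy losses conventionally ignore).
    """
    rng = rng or random.Random()
    inputs = list(token_ids)
    labels = [-100] * len(token_ids)
    for i, tok in enumerate(token_ids):
        if rng.random() < mlm_prob:       # select ~15% of positions
            labels[i] = tok               # the model must predict the original
            roll = rng.random()
            if roll < 0.8:
                inputs[i] = MASK_ID       # 80%: replace with [MASK]
            elif roll < 0.9:
                inputs[i] = rng.randrange(VOCAB_SIZE)  # 10%: random token
            # remaining 10%: keep the original token unchanged
    return inputs, labels
```

In the HuggingFace ecosystem this corruption is handled for you by `DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15)`; writing it by hand once, as above, makes clear what the collator produces.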

NOTE: The number of mentions on this list indicates mentions on common posts plus user suggested alternatives. Hence, a higher number means a more popular project.
