[Research] Deep Critical Learning (i.e., Deep Robustness) In The Era of Big Data

This page summarizes the projects mentioned and recommended in the original post on /r/MachineLearning

  • Improving-Mean-Absolute-Error-against-CCE

    Mean Absolute Error Does Not Treat Examples Equally and Gradient Magnitude’s Variance Matters (a gradient-magnitude comparison sketch appears after this project list)

  • Here are related papers on the fitting and generalization of deep learning:
      • ProSelfLC: Progressive Self Label Correction Towards A Low-Temperature Entropy State
      • Understanding deep learning requires rethinking generalization
      • A Closer Look at Memorization in Deep Networks
      • ProSelfLC: Progressive Self Label Correction for Training Robust Deep Neural Networks
      • Blog link: https://xinshaoamoswang.github.io/blogs/2020-06-07-Progressive-self-label-correction/
      • Mean Absolute Error Does Not Treat Examples Equally and Gradient Magnitude’s Variance Matters
      • Derivative Manipulation: Example Weighting via Emphasis Density Function in the context of DL
      • Novelty: moving from loss design to derivative design

  • DerivativeManipulation

    In the context of deep learning: what is the right way to conduct example weighting, and how should we understand loss functions and the so-called theorems about them? (A minimal example-weighting sketch appears after this project list.)

  • fitting-random-labels

    Example code for the paper "Understanding deep learning requires rethinking generalization"

  • Code for https://arxiv.org/abs/1611.03530 found: https://github.com/pluskid/fitting-random-labels

  • IDN

    AAAI 2021: Beyond Class-Conditional Assumption: A Primary Attempt to Combat Instance-Dependent Label Noise

  • ProSelfLC

    [Discontinued] Noisy labels; missing labels; semi-supervised learning; entropy; uncertainty; robustness and generalisation. [Moved to: https://github.com/XinshaoAmosWang/ProSelfLC-AT] (A simplified label-correction sketch appears after this project list.)

  • Code for https://arxiv.org/abs/2005.03788 found: https://github.com/XinshaoAmosWang/ProSelfLC
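
The gradient-magnitude claim behind Improving-Mean-Absolute-Error-against-CCE can be illustrated directly. The following is a minimal NumPy sketch, not code from the repository: for a softmax classifier with one-hot targets, the CCE derivative with respect to the true-class logit has magnitude 1 − p_y, whereas for MAE (the L1 distance between the softmax output and the one-hot label) it is 2·p_y·(1 − p_y).

```python
# Illustrative sketch (not code from the repository above): per-example gradient
# magnitude w.r.t. the true-class logit z_y of a softmax classifier.
#   CCE = -log p_y           ->  |dCCE/dz_y| = 1 - p_y
#   MAE = sum_j |p_j - y_j|
#       = 2 (1 - p_y)        ->  |dMAE/dz_y| = 2 p_y (1 - p_y)
import numpy as np

p_true = np.linspace(0.01, 0.99, 7)   # probability assigned to the labelled class
cce_grad = 1.0 - p_true
mae_grad = 2.0 * p_true * (1.0 - p_true)

for p, g_cce, g_mae in zip(p_true, cce_grad, mae_grad):
    print(f"p_y={p:.2f}  |grad_CCE|={g_cce:.3f}  |grad_MAE|={g_mae:.3f}")
# MAE's gradient magnitude vanishes for both well-fit (p_y -> 1) and badly-fit
# (p_y -> 0) examples, so its variance across examples is small; CCE puts the
# largest magnitude on the worst-fit (often noisy) examples.
```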

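The example-weighting question raised by DerivativeManipulation can be made concrete with a small sketch. It assumes a Gaussian-shaped emphasis density over p_y; the function name and hyperparameters (emphasis mode, spread) are illustrative choices, not the repository's actual interface:

```python
# Minimal sketch of example weighting via an emphasis density over p_y, in the
# spirit of "derivative design": instead of designing a new loss, reshape how
# strongly each example's gradient contributes. Not the official
# DerivativeManipulation code; the Gaussian emphasis shape is an assumption.
import torch
import torch.nn.functional as F

def emphasis_weighted_ce(logits, targets, emphasis_mode=0.5, spread=0.2):
    """Cross-entropy re-weighted per example by an emphasis function of p_y,
    which rescales each example's gradient magnitude."""
    probs = F.softmax(logits, dim=1)
    p_y = probs.gather(1, targets.unsqueeze(1)).squeeze(1)       # prob of labelled class
    weights = torch.exp(-(p_y - emphasis_mode) ** 2 / (2 * spread ** 2)).detach()
    weights = weights / weights.sum().clamp_min(1e-12)           # normalise over the batch
    per_example_ce = F.cross_entropy(logits, targets, reduction="none")
    return (weights * per_example_ce).sum()

# Usage: loss = emphasis_weighted_ce(model(x), y); loss.backward()
```
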
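For ProSelfLC, the core mechanism is progressive self label correction: the training target is a convex combination of the annotated one-hot label and the model's own prediction, with trust in the prediction growing as training progresses and as the prediction becomes more confident (lower entropy). Below is a simplified sketch; the sigmoid ramp and entropy factor are illustrative assumptions, not the exact schedule from the paper or repository:

```python
# Simplified sketch of progressive self label correction (not the official
# ProSelfLC implementation): corrected targets mix the one-hot label with the
# model's prediction, trusting the prediction more later in training and for
# low-entropy (confident) predictions.
import torch
import torch.nn.functional as F

def corrected_targets(logits, targets, step, total_steps, num_classes):
    one_hot = F.one_hot(targets, num_classes).float()
    pred = F.softmax(logits.detach(), dim=1)
    # Global factor: trust the model more in later training stages (assumed ramp).
    time_trust = torch.sigmoid(torch.tensor(10.0 * (step / total_steps - 0.5)))
    # Per-example factor: trust confident (low-entropy) predictions more.
    entropy = -(pred * pred.clamp_min(1e-12).log()).sum(dim=1)
    max_entropy = torch.log(torch.tensor(float(num_classes)))
    conf_trust = 1.0 - entropy / max_entropy
    eps = (time_trust * conf_trust).unsqueeze(1)                  # trust score in [0, 1]
    return (1.0 - eps) * one_hot + eps * pred                     # soft, corrected targets

def soft_cross_entropy(logits, soft_targets):
    return -(soft_targets * F.log_softmax(logits, dim=1)).sum(dim=1).mean()

# Usage: loss = soft_cross_entropy(logits, corrected_targets(logits, y, step, T, C))
```
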
NOTE: The number of mentions on this list indicates mentions on common posts plus user-suggested alternatives. Hence, a higher number means a more popular project.

Related posts

  • [Research] Not all our papers get published, therefore it is enjoyable to see our released papers become a true foundation for other works

    2 projects | /r/MachineLearning | 25 Jun 2022
  • [R] CVPR 2021-Progressive Self Label Correction (ProSelfLC) for Training Robust Deep Neural Networks

    2 projects | /r/MachineLearning | 4 Jun 2021
  • [D] Should expert opinion be a bigger part of the Machine Learning world?

    2 projects | /r/MachineLearning | 25 Mar 2022