DropoutUncertaintyExps
Experiments used in "Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning"
I found a piece of code by Gal, who defines a log-likelihood and uses it to select the best model: https://github.com/yaringal/DropoutUncertaintyExps/blob/master/net/net.py. Is that roughly the standard approach?
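For reference, the quantity computed in that repo is the predictive log-likelihood of the test data under the MC-dropout posterior: average the Gaussian likelihood over T stochastic forward passes (via log-sum-exp), with `tau` the model precision. A minimal NumPy sketch of that formula (the function name and array shapes here are my own, not from Gal's code):

```python
import numpy as np

def mc_log_likelihood(y_true, y_samples, tau):
    """Predictive log-likelihood from T stochastic (dropout) forward passes.

    y_true:    shape (N,)   -- test targets
    y_samples: shape (T, N) -- predictions from T forward passes with dropout on
    tau:       model precision (inverse observation-noise variance)
    """
    T = y_samples.shape[0]
    # log p(y|x) ~= logsumexp_t[-0.5 * tau * (y - y_hat_t)^2]
    #              - log T + 0.5 * log(tau) - 0.5 * log(2*pi)
    sq_err = -0.5 * tau * (y_true[None, :] - y_samples) ** 2
    ll = (np.logaddexp.reduce(sq_err, axis=0)
          - np.log(T) + 0.5 * np.log(tau) - 0.5 * np.log(2 * np.pi))
    return ll.mean()
```

Picking the model (or the dropout rate / `tau`) that maximizes this on a validation set is the selection criterion used in those experiments.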
I used this with the popular PyTorch implementation of EfficientNet. You can see what I'm talking about here: https://github.com/lukemelas/EfficientNet-PyTorch/blob/master/efficientnet_pytorch/model.py, on line 127. Once you understand that code, it is pretty straightforward to modify your forward pass to allow "stochastic depth" during inference.
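The mechanism in question drops the residual branch for a random subset of examples in the batch and rescales the survivors so the expected value is unchanged; allowing it at inference just means not gating it on training mode. A NumPy sketch of the idea (my own `apply` flag stands in for the training-mode check; this is not the library's exact code):

```python
import numpy as np

def drop_connect(x, drop_prob, apply=True):
    """Stochastic depth on a residual branch.

    Draws one Bernoulli per example in the batch; dropped examples get the
    branch zeroed, kept examples are scaled by 1/keep_prob so the
    expectation matches the deterministic branch.
    Set apply=True at test time to keep the stochasticity during inference.
    """
    if not apply or drop_prob == 0.0:
        return x
    keep_prob = 1.0 - drop_prob
    # one draw per batch element, broadcast over the remaining dimensions
    mask = np.random.rand(x.shape[0], *([1] * (x.ndim - 1))) < keep_prob
    return x * mask / keep_prob
```

In a forward pass this wraps the branch before the skip connection, e.g. `out = identity + drop_connect(branch, p, apply=True)`, so repeated passes give you a distribution over outputs analogous to MC dropout.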