ReLU6 is a modification of the rectified linear unit that caps the activation at a maximum value of $6$: $\text{ReLU6}(x) = \min(\max(0, x), 6)$. The bounded output range improves robustness when the network is used with low-precision computation.
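In PyTorch this activation is available as `torch.nn.ReLU6`; as a framework-free illustration, a minimal scalar sketch of the function itself:

```python
def relu6(x):
    """ReLU6: rectified linear unit clamped to the range [0, 6]."""
    return min(max(0.0, x), 6.0)

# Negative inputs are zeroed, values above 6 are capped:
print(relu6(-2.0))  # 0.0
print(relu6(3.5))   # 3.5
print(relu6(10.0))  # 6.0
```

The fixed upper bound means the activation can be represented with few integer bits, which is why it pairs well with quantized (e.g. 8-bit) inference.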
Image Credit: PyTorch
Source: MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Task | Papers | Share |
---|---|---|
Image Classification | 15 | 14.85% |
Object Detection | 10 | 9.90% |
Classification | 6 | 5.94% |
Quantization | 5 | 4.95% |
Semantic Segmentation | 5 | 4.95% |
Decoder | 5 | 4.95% |
Bayesian Optimization | 3 | 2.97% |
Computational Efficiency | 3 | 2.97% |
Neural Network Compression | 2 | 1.98% |
Component | Type |
---|---|
🤖 No Components Found | You can add them if they exist; e.g. Mask R-CNN uses RoIAlign |