Representative Batch Normalization with Feature Calibration
Oral, IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2021
Introduction
Batch Normalization (BatchNorm) has become a default component in modern neural networks for stabilizing training. In BatchNorm, centering and scaling operations, along with mean and variance statistics, standardize features over the batch dimension. This batch dependency enables stable training and stronger network representations, but it inevitably ignores the representation differences among instances. We propose a simple yet effective feature calibration scheme for the centering and scaling operations of BatchNorm, enhancing instance-specific representations at negligible computational cost. The centering calibration strengthens informative features and suppresses noisy ones. The scaling calibration restricts the feature intensity to form a more stable feature distribution. Our proposed variant of BatchNorm, namely Representative BatchNorm (RBN), can be plugged into existing methods to boost performance on various tasks such as classification, detection, and segmentation.
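To make the two calibrations concrete, below is a minimal PyTorch sketch. It is not the authors' implementation: the class name, the parameter names (center_weight, scale_weight, scale_bias), and the exact forms of the instance statistic (a per-sample spatial mean) and of the intensity restriction (a sigmoid gate) are illustrative assumptions; please refer to the official repository above for the released code.

import torch
import torch.nn as nn

class RepresentativeBatchNorm2d(nn.BatchNorm2d):
    """Illustrative sketch of Representative BatchNorm (RBN).

    Adds a centering calibration before BatchNorm's batch-wise centering
    and a scaling calibration after its scaling, both driven by
    instance-specific statistics.
    """
    def __init__(self, num_features, **kwargs):
        super().__init__(num_features, **kwargs)
        # Learnable per-channel calibration weights (names are illustrative).
        self.center_weight = nn.Parameter(torch.zeros(1, num_features, 1, 1))
        self.scale_weight = nn.Parameter(torch.zeros(1, num_features, 1, 1))
        self.scale_bias = nn.Parameter(torch.ones(1, num_features, 1, 1))

    def forward(self, x):
        # Instance statistic: per-sample, per-channel spatial mean.
        inst_mean = x.mean(dim=(2, 3), keepdim=True)
        # Centering calibration: strengthen informative features and
        # suppress noisy ones before batch-wise standardization.
        x = x + self.center_weight * inst_mean
        # Standard BatchNorm centering and scaling with batch statistics.
        x = super().forward(x)
        # Scaling calibration: restrict feature intensity with a bounded
        # (sigmoid) gate computed from the post-normalization statistic.
        inst_mean = x.mean(dim=(2, 3), keepdim=True)
        gate = torch.sigmoid(self.scale_weight * inst_mean + self.scale_bias)
        return x * gate

As a drop-in replacement for nn.BatchNorm2d, e.g., layer = RepresentativeBatchNorm2d(64) applied to a tensor of shape (8, 64, 32, 32) returns an output of the same shape; the added cost is two spatial means and an elementwise gate per layer.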
Codes
Source Code and pre-trained model: https://github.com/ShangHua-Gao/RBN
Paper Links
Representative Batch Normalization with Feature Calibration, Shang-Hua Gao, Qi Han, Duo Li, Ming-Ming Cheng*, Pai Peng, IEEE CVPR (oral), 2021. [pdf | bib | project | code]
Citation
@inproceedings{gao2021rbn,
  title={Representative Batch Normalization with Feature Calibration},
  author={Gao, Shang-Hua and Han, Qi and Li, Duo and Cheng, Ming-Ming and Peng, Pai},
  booktitle={IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2021}
}
Q&A
If you have any questions, feel free to e-mail Shang-Hua Gao (shgao(at)live.com) or Qi Han (hqer(at)foxmail.com).
Acknowledgement
This work is supported by the CAAI-Huawei MindSpore Academic Award Fund (CAAI-Huawei Open Fund).