Authors: Tsung-Yi Lin, Priya Goyal, Ross Girshick, Kaiming He, Piotr Dollár

Institution: Facebook AI Research (FAIR)

Published: 2017

Venue: IEEE Transactions on Pattern Analysis & Machine Intelligence

Title: Focal Loss for Dense Object Detection

Paper: https://arxiv.org/abs/1708.02002

Code: https://github.com/facebookresearch/Detectron

Significance: Introduced the Focal Loss.

1. Abstract

The highest accuracy object detectors to date are based on a two-stage approach popularized by R-CNN, where a classifier is applied to a sparse set of candidate object locations. In contrast, one-stage detectors that are applied over a regular, dense sampling of possible object locations have the potential to be faster and simpler, but have trailed the accuracy of two-stage detectors thus far. In this paper, we investigate why this is the case. We discover that the extreme foreground-background class imbalance encountered during training of dense detectors is the central cause. We propose to address this class imbalance by reshaping the standard cross entropy loss such that it down-weights the loss assigned to well-classified examples. Our novel Focal Loss focuses training on a sparse set of hard examples and prevents the vast number of easy negatives from overwhelming the detector during training. To evaluate the effectiveness of our loss, we design and train a simple dense detector we call RetinaNet. Our results show that when trained with the focal loss, RetinaNet is able to match the speed of previous one-stage detectors while surpassing the accuracy of all existing state-of-the-art two-stage detectors.

2. Research Background

Note the distinction between the two points above: the first says the total loss from negatives is large because negatives are vastly more numerous, so their summed loss dominates training; the second says the loss of an easy negative is small, which refers to a single example. A back-of-envelope sketch follows.
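The counts and per-example losses in this sketch are illustrative assumptions, not numbers from the paper (though RetinaNet does score on the order of 100k anchors per image):

```python
# Illustrative numbers (assumed): a dense detector scores roughly 100k
# anchors per image, almost all of them easy negatives.
n_easy_neg, loss_each_easy = 100_000, 0.01  # tiny per-example loss
n_hard, loss_each_hard = 100, 2.3           # large per-example loss

total_easy = n_easy_neg * loss_each_easy    # 1000.0
total_hard = n_hard * loss_each_hard        # 230.0
# Each easy negative is individually cheap, yet in aggregate the easy
# negatives dominate the total loss (and hence the gradient).
print(total_easy, total_hard)
```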

3. Focal Loss (the paper's main contribution)

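The focal loss reshapes the standard cross entropy. With $p_t = p$ when $y = 1$ and $p_t = 1 - p$ otherwise, the α-balanced form is

$$
\mathrm{FL}(p_t) = -\alpha_t \, (1 - p_t)^{\gamma} \log(p_t)
$$

The paper reports $\gamma = 2$ and $\alpha = 0.25$ working best. The modulating factor $(1 - p_t)^{\gamma}$ shrinks the loss of well-classified examples ($p_t \to 1$) while leaving hard examples nearly untouched.

A minimal PyTorch sketch of this loss (the function name `focal_loss` and the plain sum reduction are assumptions; the paper normalizes the summed loss by the number of positive anchors):

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, targets, alpha=0.25, gamma=2.0):
    """FL(p_t) = -alpha_t * (1 - p_t)^gamma * log(p_t), summed over anchors.

    logits:  raw binary predictions, shape (N,)
    targets: labels in {0, 1}, shape (N,)
    """
    p = torch.sigmoid(logits)
    # p_t is the model's estimated probability of the true class
    p_t = torch.where(targets == 1, p, 1 - p)
    alpha_t = torch.where(targets == 1,
                          alpha * torch.ones_like(p),
                          (1 - alpha) * torch.ones_like(p))
    # binary_cross_entropy_with_logits gives -log(p_t) per example
    ce = F.binary_cross_entropy_with_logits(logits, targets.float(),
                                            reduction="none")
    # (1 - p_t)^gamma down-weights easy examples; easy negatives (p_t near 1)
    # contribute almost nothing, so they can no longer swamp training
    return (alpha_t * (1 - p_t).pow(gamma) * ce).sum()
```

Note that with $\gamma = 0$ this reduces exactly to α-balanced cross entropy, which is why the paper treats cross entropy as the special case of focal loss.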