focal_loss.md


tl;dr: Focal loss addresses the class imbalance problem with a new loss function that focuses training on hard examples. Concretely, it scales the cross entropy loss by a modulating factor (1 − p_t)^γ that down-weights easy, well-classified examples.

  • Focal loss can be used for classification, as shown here.
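The modulating idea above can be sketched for the binary case as follows. This is a minimal NumPy illustration of the focal loss formula FL(p_t) = −α_t (1 − p_t)^γ log(p_t); the function name and defaults (γ = 2, α = 0.25, the values suggested in the original paper) are illustrative, not from this document.

```python
import numpy as np

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss.

    p: predicted probability of the positive class.
    y: ground-truth label in {0, 1}.
    The (1 - p_t)**gamma factor shrinks the loss of easy examples
    (p_t close to 1), so training focuses on hard ones.
    """
    p_t = np.where(y == 1, p, 1.0 - p)          # prob. of the true class
    alpha_t = np.where(y == 1, alpha, 1.0 - alpha)
    return -alpha_t * (1.0 - p_t) ** gamma * np.log(p_t)
```

With γ = 0 this reduces to (α-weighted) cross entropy; increasing γ suppresses the contribution of confidently correct predictions while leaving hard examples nearly untouched.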

Takeaways

  • Imbalanced training, balanced test: when trained on imbalanced data (up to 100:1), the model trained with focal loss has evenly distributed prediction error on a balanced test set.

  • Imbalanced training, imbalanced test: training with focal loss yields better accuracy than training with cross entropy, and again the prediction error is evenly distributed.