---
date: 2024-08-25
title: "Loss Function"
status: DONE
author:
  - AllenYGY
tags:
  - NOTE
  - LossFunction
  - DeepLearning
publish: True
---
# Loss Function

- Mean Squared Error (MSE)
- Cross Entropy Loss
## Mean Squared Error (MSE)

- Used for regression tasks. Measures the average squared difference between the actual values $y_i$ and the predicted values $\hat{y}_i$:

$$\text{MSE} = \frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2$$
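A minimal NumPy sketch of MSE (the function name `mse` is my own, not from the note):

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error: the average of the squared residuals."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return np.mean((y_true - y_pred) ** 2)

# Residuals are [1, -1, 0] -> squared [1, 1, 0] -> mean 2/3
print(mse([3.0, 2.0, 1.0], [2.0, 3.0, 1.0]))  # 0.666...
```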
## Mean Absolute Error (MAE)

- Also used for regression tasks. Measures the average absolute difference between the actual values $y_i$ and the predicted values $\hat{y}_i$:

$$\text{MAE} = \frac{1}{n}\sum_{i=1}^{n}\left|y_i - \hat{y}_i\right|$$
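The same sketch for MAE (again, `mae` is an illustrative name); note that the squared term is simply replaced by an absolute value, which is why MAE penalizes outliers less severely than MSE:

```python
import numpy as np

def mae(y_true, y_pred):
    """Mean absolute error: the average of the absolute residuals."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return np.mean(np.abs(y_true - y_pred))

# Residuals are [1, -1, 0] -> absolute [1, 1, 0] -> mean 2/3
print(mae([3.0, 2.0, 1.0], [2.0, 3.0, 1.0]))  # 0.666...
```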
## Cross-Entropy Loss (Log Loss)

- Typically used for classification problems:

$$L = -\sum_{c=1}^{C} y_c \log(p_c)$$

- $C$ is the total number of classes
- $y_c$ is the true probability of class $c$
- $p_c$ is the predicted probability of class $c$

Usually the true label is deterministic, so $y$ is a one-hot vector with $y_k = 1$ for the true class $k$, and the loss simplifies to

$$L = -\log(p_k)$$
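A small sketch of the general form (`cross_entropy` and the `eps` clipping are my own additions; the clip guards against $\log 0$). With a one-hot target it reduces to $-\log$ of the predicted probability of the true class:

```python
import numpy as np

def cross_entropy(y_true, p_pred, eps=1e-12):
    """Cross entropy -sum(y * log(p)); eps avoids log(0)."""
    y_true = np.asarray(y_true, float)
    p_pred = np.clip(np.asarray(p_pred, float), eps, 1.0)
    return -np.sum(y_true * np.log(p_pred))

# One-hot target: the loss reduces to -log(p of the true class)
y = [0, 1, 0]
p = [0.1, 0.7, 0.2]
print(cross_entropy(y, p))  # -log(0.7) ≈ 0.3567
```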
## The Derivative of Cross Entropy

When the cross-entropy loss is applied to a softmax output $p_i = \frac{e^{z_i}}{\sum_j e^{z_j}}$, the gradient with respect to the logits $z_i$ takes a remarkably simple form:

$$\frac{\partial L}{\partial z_i} = p_i - y_i$$
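A quick numerical check of the softmax + cross-entropy gradient (all names here are illustrative): the analytic gradient $p - y$ should agree with a central finite difference of the loss.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))  # subtract max for numerical stability
    return e / e.sum()

def loss(z, y):
    return -np.sum(y * np.log(softmax(z)))

z = np.array([1.0, 2.0, 0.5])
y = np.array([0.0, 1.0, 0.0])   # one-hot true label
grad = softmax(z) - y           # analytic gradient: dL/dz = p - y

# Central finite difference on the first coordinate
eps = 1e-6
d = np.array([eps, 0.0, 0.0])
num = (loss(z + d, y) - loss(z - d, y)) / (2 * eps)
print(abs(grad[0] - num) < 1e-6)  # True: analytic and numeric gradients agree
```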
## Hinge Loss

- Typically used for "maximum-margin" classifiers like Support Vector Machines (SVM).
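A sketch of the standard hinge loss $\max(0,\, 1 - y\hat{y})$ for labels $y \in \{-1, +1\}$ (function name `hinge_loss` is my own): correct predictions with margin at least 1 incur zero loss, while violations grow linearly.

```python
import numpy as np

def hinge_loss(y_true, scores):
    """Hinge loss for labels in {-1, +1}: mean(max(0, 1 - y * score))."""
    y_true, scores = np.asarray(y_true, float), np.asarray(scores, float)
    return np.mean(np.maximum(0.0, 1.0 - y_true * scores))

# Correct and confident (margin >= 1) -> zero loss
print(hinge_loss([1, -1], [2.0, -3.0]))  # 0.0
# Inside the margin / wrong side -> penalized: mean(0.5, 2.0) = 1.25
print(hinge_loss([1, -1], [0.5, 1.0]))   # 1.25
```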
## Huber Loss

- A loss function used in regression that is less sensitive to outliers than MSE.
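A sketch of the usual piecewise definition (`huber_loss` and the default `delta=1.0` are my choices): quadratic for residuals within `delta`, linear beyond it, which is what blunts the effect of outliers.

```python
import numpy as np

def huber_loss(y_true, y_pred, delta=1.0):
    """Quadratic for |residual| <= delta, linear beyond it."""
    r = np.asarray(y_true, float) - np.asarray(y_pred, float)
    quad = 0.5 * r**2
    lin = delta * (np.abs(r) - 0.5 * delta)
    return np.mean(np.where(np.abs(r) <= delta, quad, lin))

# Small residual -> behaves like MSE/2: 0.5 * 0.5^2 = 0.125
print(huber_loss([0.0], [0.5]))   # 0.125
# Large residual -> grows linearly: 1 * (10 - 0.5) = 9.5 (MSE would give 100)
print(huber_loss([0.0], [10.0]))  # 9.5
```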
## Kullback-Leibler (KL) Divergence

- Measures how one probability distribution $P$ diverges from a second probability distribution $Q$:

$$D_{\mathrm{KL}}(P \parallel Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}$$

- Often used in variational inference or generative models.
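A sketch for discrete distributions (`kl_divergence` and the `eps` clipping are my own); it illustrates two key properties: KL is zero when the distributions match, and it is asymmetric in its arguments.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """D_KL(P || Q) = sum p * log(p / q); eps guards against log(0)."""
    p = np.clip(np.asarray(p, float), eps, 1.0)
    q = np.clip(np.asarray(q, float), eps, 1.0)
    return np.sum(p * np.log(p / q))

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, p))  # 0.0: identical distributions
# Asymmetric: D_KL(P||Q) != D_KL(Q||P) in general
print(kl_divergence(p, q), kl_divergence(q, p))
```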
## Negative Log-Likelihood Loss (NLL)

- Used for classification problems, particularly in models like neural networks with a softmax output.
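A minimal sketch (`nll_loss` is an illustrative name): given log-probabilities, NLL simply picks out $-\log p$ of the target class, which is why it coincides with cross entropy under a one-hot target.

```python
import numpy as np

def nll_loss(log_probs, target):
    """Negative log-likelihood: -log p of the target class."""
    return -log_probs[target]

log_p = np.log([0.1, 0.7, 0.2])  # e.g. the output of a log-softmax layer
print(nll_loss(log_p, 1))  # -log(0.7), same as cross entropy with a one-hot target
```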