Loss Function

#DeepLearning #LossFunction

  • Mean Squared Error (MSE)
  • Mean Absolute Error (MAE)
  • Cross-Entropy Loss
  • Hinge Loss
  • Huber Loss
  • Kullback-Leibler (KL) Divergence
  • Negative Log-Likelihood Loss (NLL)

Mean Squared Error (MSE):

  • Used for regression tasks. Measures the average squared difference between the actual and predicted values.
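
For $n$ samples with true values $y_i$ and predictions $\hat{y}_i$:

$$\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}(y_i - \hat{y}_i)^2$$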

Mean Absolute Error (MAE):

  • Also used for regression tasks. It measures the average absolute difference between actual and predicted values.
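
With the same notation:

$$\mathrm{MAE} = \frac{1}{n}\sum_{i=1}^{n}\lvert y_i - \hat{y}_i\rvert$$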

Cross-Entropy Loss (Log Loss):

  • Commonly used for classification problems:

    $$L = -\sum_{i=1}^{C} y_i \log(\hat{y}_i)$$

    • $C$ is the total number of classes
    • $y_i$ is the true probability of class $i$
    • $\hat{y}_i$ is the predicted probability of class $i$
  • Usually $y$ is a one-hot vector; since the true class is certain, the loss simplifies to $L = -\log(\hat{y}_c)$, where $c$ is the true class.
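
A minimal NumPy sketch of the simplified one-hot form; the function name and inputs here are illustrative, not from any particular library:

```python
import numpy as np

def cross_entropy(probs, true_class, eps=1e-12):
    # Cross-entropy with a one-hot target: the negative log of the
    # probability the model assigned to the true class.
    return -np.log(probs[true_class] + eps)  # eps guards against log(0)

probs = np.array([0.7, 0.2, 0.1])          # predicted class probabilities
print(cross_entropy(probs, true_class=0))  # -log(0.7) ≈ 0.357
```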


[Figure: plot of the cross-entropy loss]

[Figure: plot of the derivative of the cross-entropy loss]

Hinge Loss:

  • Typically used for "maximum-margin" classifiers like Support Vector Machines (SVM).
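
For a true label $y \in \{-1, +1\}$ and a raw classifier score $\hat{y}$:

$$L(y, \hat{y}) = \max(0,\ 1 - y\,\hat{y})$$

The loss is zero once an example is classified correctly with a margin of at least 1.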

Huber Loss:

  • A loss function used in regression that is less sensitive to outliers than MSE.
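
With residual $a = y - \hat{y}$ and threshold $\delta$, it is quadratic (MSE-like) for small errors and linear (MAE-like) for large ones:

$$L_\delta(a) = \begin{cases} \frac{1}{2}a^2 & \text{if } \lvert a\rvert \le \delta \\ \delta\left(\lvert a\rvert - \frac{1}{2}\delta\right) & \text{otherwise} \end{cases}$$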

Kullback-Leibler (KL) Divergence:

  • Measures how one probability distribution $P$ diverges from a second, reference distribution $Q$. Often used in variational inference or generative models.
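
For discrete distributions $P$ and $Q$ over the same support:

$$D_{\mathrm{KL}}(P \parallel Q) = \sum_x P(x)\,\log\frac{P(x)}{Q(x)}$$

Note that it is not symmetric: in general $D_{\mathrm{KL}}(P \parallel Q) \ne D_{\mathrm{KL}}(Q \parallel P)$.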

Negative Log-Likelihood Loss (NLL):

  • Used for classification problems, particularly in models like neural networks with a softmax output.
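
A minimal NumPy sketch, assuming raw scores (logits) from the network; it applies a log-softmax and returns the negative log-probability of the true class. Names are illustrative:

```python
import numpy as np

def nll_loss(logits, true_class):
    # Log-softmax computed stably: subtract the max before exponentiating.
    shifted = logits - logits.max()
    log_probs = shifted - np.log(np.exp(shifted).sum())
    # NLL: negative log-probability assigned to the true class.
    return -log_probs[true_class]

logits = np.array([2.0, 0.5, -1.0])
print(nll_loss(logits, true_class=0))  # small loss: class 0 scores highest
```

This matches the simplified cross-entropy above; frameworks typically fuse the log-softmax and NLL steps for numerical stability.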