The main purpose of a loss function is to compute the quantity that a model should minimize during training; it measures how far the model's predictions are from the actual values.
Keras provides different loss functions that can be used for a variety of machine learning tasks. Here are some of the most commonly used loss functions:
It is the mean of the squared differences between the values predicted by the model and the actual values. It is used in regression models.
It can be mathematically represented as:

MSE = (1/n) * Σ (y_i − ŷ_i)²

where y_i is the actual value, ŷ_i is the predicted value, and n is the number of samples.
In our model we can import it as follows:
from keras.losses import mean_squared_error
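To make the formula concrete, here is a minimal NumPy sketch of how mean squared error is computed (an illustration of the formula above, not Keras's actual implementation):

```python
import numpy as np

def mean_squared_error(y_true, y_pred):
    # Average of the squared differences between actual and predicted values
    return np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)

print(mean_squared_error([3.0, 5.0, 2.5], [2.5, 5.0, 3.0]))  # ≈ 0.1667
```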
It is the mean of the absolute percentage differences between the values predicted by the model and the actual values. We can use it for regression-related tasks or models.
It can be mathematically represented as:

MAPE = (100 / n) * Σ |(y_i − ŷ_i) / y_i|

where y_i is the actual value, ŷ_i is the predicted value, and n is the number of samples.
In our model we can import it as follows:
from keras.losses import mean_absolute_percentage_error
Note: To avoid division by zero, we add a small value to the denominator.
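A minimal NumPy sketch of the percentage-error formula, including the small value in the denominator mentioned in the note (illustrative only, not Keras's internal code):

```python
import numpy as np

def mean_absolute_percentage_error(y_true, y_pred, eps=1e-7):
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    # eps guards against division by zero when an actual value is 0
    return 100.0 * np.mean(np.abs((y_true - y_pred) / np.maximum(np.abs(y_true), eps)))

print(mean_absolute_percentage_error([100.0, 200.0], [110.0, 180.0]))  # 10.0
```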
It is a probabilistic loss. It is used for binary (0 or 1) classification problems.
It can be mathematically represented as:

BCE = −(1/n) * Σ [y_i log(ŷ_i) + (1 − y_i) log(1 − ŷ_i)]

where y_i is the actual label (0 or 1), ŷ_i is the predicted probability, and n is the number of samples.
In our model we can import it as follows:
from keras.losses import binary_crossentropy
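The formula can be sketched in NumPy as follows (a simplified illustration; Keras's built-in version handles logits and clipping internally):

```python
import numpy as np

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    y_true = np.asarray(y_true, dtype=float)
    # Clip predictions so log() never receives exactly 0 or 1
    y_pred = np.clip(np.asarray(y_pred, dtype=float), eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

# Confident, correct predictions give a small loss
print(binary_crossentropy([1.0, 0.0], [0.9, 0.1]))  # ≈ 0.1054
```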
It is used for classification problems where there are more than two classes and the labels are provided in one-hot encoded form.
Note: One-hot encoding is a way to convert categorical data into a numerical format that can be used for machine learning algorithms or neural networks.
For example:
Apple: [1, 0, 0]
Banana: [0, 1, 0]
Orange: [0, 0, 1]
It can be mathematically represented as:

CCE = −Σ y_i log(ŷ_i)

where the sum runs over the classes, y_i is the one-hot encoded actual label, and ŷ_i is the predicted probability for class i (averaged over the samples).
In our model we can import it as follows:
from keras.losses import categorical_crossentropy
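Here is a minimal NumPy sketch of the formula, using one-hot labels like the fruit example above (illustrative only):

```python
import numpy as np

def categorical_crossentropy(y_true, y_pred, eps=1e-7):
    # y_true: rows of one-hot labels; y_pred: rows of class probabilities
    y_pred = np.clip(np.asarray(y_pred, dtype=float), eps, 1.0)
    return -np.mean(np.sum(np.asarray(y_true) * np.log(y_pred), axis=-1))

# Actual class is Apple ([1, 0, 0]); model assigns it probability 0.7
print(categorical_crossentropy([[1, 0, 0]], [[0.7, 0.2, 0.1]]))  # ≈ 0.3567
```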
It is similar to the categorical cross-entropy but it is used when we use integer values instead of one-hot encoded vectors.
In our model we can import it as follows:
from keras.losses import sparse_categorical_crossentropy
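The only difference from the categorical version is the label format, which a small NumPy sketch makes clear (illustrative only):

```python
import numpy as np

def sparse_categorical_crossentropy(y_true, y_pred, eps=1e-7):
    # y_true holds integer class indices (e.g. 0) instead of one-hot vectors
    y_pred = np.clip(np.asarray(y_pred, dtype=float), eps, 1.0)
    rows = np.arange(len(y_true))
    return -np.mean(np.log(y_pred[rows, np.asarray(y_true)]))

# Same example as before, but the label is the integer 0 rather than [1, 0, 0]
print(sparse_categorical_crossentropy([0], [[0.7, 0.2, 0.1]]))  # ≈ 0.3567
```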
It is the loss function used in the Support Vector Machine (SVM) algorithm and in other margin-based classification tasks.
It can be mathematically represented as:

Hinge = (1/n) * Σ max(0, 1 − y_i · ŷ_i)

where the actual labels y_i are expected to be −1 or +1, ŷ_i is the predicted value, and n is the number of samples.
In our model we can import it as follows:
from keras.losses import hinge
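A minimal NumPy sketch of the hinge formula, assuming labels of −1 or +1 (illustrative only):

```python
import numpy as np

def hinge(y_true, y_pred):
    # Loss is zero once a prediction clears the margin (y_true * y_pred >= 1)
    return np.mean(np.maximum(0.0, 1.0 - np.asarray(y_true) * np.asarray(y_pred)))

print(hinge([1.0, -1.0], [0.8, -0.3]))  # 0.45
```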
compile()
All built-in loss functions can be passed as string identifiers in the compile() function. To use a loss function in compile(), assign the name of the loss function to the loss parameter.
Here's an example of using the categorical cross-entropy loss function with the compile() function:
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])