MeanSquaredError
class keras.metrics.MeanSquaredError(name="mean_squared_error", dtype=None)
Computes the mean squared error between y_true and y_pred.
Formula:
loss = mean(square(y_true - y_pred))
Arguments
name: (Optional) string name of the metric instance.
dtype: (Optional) data type of the metric result.
Example
>>> m = keras.metrics.MeanSquaredError()
>>> m.update_state([[0, 1], [0, 0]], [[1, 1], [0, 0]])
>>> m.result()
0.25
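For the example above, the squared errors are [[1, 0], [0, 0]], so the result is (1 + 0 + 0 + 0) / 4 = 0.25. The instance can also be passed to the compile() API, a usage sketch mirroring the snippets shown for the other metrics below:
model.compile(
    optimizer='sgd',
    loss='mse',
    metrics=[keras.metrics.MeanSquaredError()])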
RootMeanSquaredError
class keras.metrics.RootMeanSquaredError(name="root_mean_squared_error", dtype=None)
Computes root mean squared error metric between y_true and y_pred.
Formula:
loss = sqrt(mean((y_pred - y_true) ** 2))
Arguments
name: (Optional) string name of the metric instance.
dtype: (Optional) data type of the metric result.
Example
>>> m = keras.metrics.RootMeanSquaredError()
>>> m.update_state([[0, 1], [0, 0]], [[1, 1], [0, 0]])
>>> m.result()
0.5
>>> m.reset_state()
>>> m.update_state([[0, 1], [0, 0]], [[1, 1], [0, 0]],
... sample_weight=[1, 0])
>>> m.result()
0.70710677
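sample_weight weights whole samples (rows of the batch): with sample_weight=[1, 0], only the first row ([0, 1] vs. [1, 1]) contributes. Its squared errors are [1, 0], so the weighted mean is 0.5 and the result is sqrt(0.5) ≈ 0.70710677.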
Usage with compile() API:
model.compile(
optimizer='sgd',
loss='mse',
metrics=[keras.metrics.RootMeanSquaredError()])
MeanAbsoluteError
class keras.metrics.MeanAbsoluteError(name="mean_absolute_error", dtype=None)
Computes the mean absolute error between the labels and predictions.
Formula:
loss = mean(abs(y_true - y_pred))
Arguments
name: (Optional) string name of the metric instance.
dtype: (Optional) data type of the metric result.
Examples
>>> m = keras.metrics.MeanAbsoluteError()
>>> m.update_state([[0, 1], [0, 0]], [[1, 1], [0, 0]])
>>> m.result()
0.25
>>> m.reset_state()
>>> m.update_state([[0, 1], [0, 0]], [[1, 1], [0, 0]],
... sample_weight=[1, 0])
>>> m.result()
0.5
Usage with compile() API:
model.compile(
optimizer='sgd',
loss='mse',
metrics=[keras.metrics.MeanAbsoluteError()])
MeanAbsolutePercentageError
class keras.metrics.MeanAbsolutePercentageError(
name="mean_absolute_percentage_error", dtype=None
)
Computes mean absolute percentage error between y_true and y_pred.
Formula:
loss = 100 * mean(abs((y_true - y_pred) / y_true))
Arguments
name: (Optional) string name of the metric instance.
dtype: (Optional) data type of the metric result.
Example
>>> m = keras.metrics.MeanAbsolutePercentageError()
>>> m.update_state([[0, 1], [0, 0]], [[1, 1], [0, 0]])
>>> m.result()
250000000.0
>>> m.reset_state()
>>> m.update_state([[0, 1], [0, 0]], [[1, 1], [0, 0]],
... sample_weight=[1, 0])
>>> m.result()
500000000.0
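These huge values arise because y_true contains zeros. Rather than dividing by zero, the implementation clamps the denominator to a small epsilon; assuming the default backend epsilon of 1e-7 (see keras.config.epsilon()), the first entry contributes abs((0 - 1) / 1e-7) = 1e7 and the other three contribute 0, giving 100 * 1e7 / 4 = 250000000.0 unweighted, and 100 * 1e7 / 2 = 500000000.0 with sample_weight=[1, 0] (only the first row's two entries count). In practice, avoid this metric when y_true can be zero.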
Usage with compile() API:
model.compile(
optimizer='sgd',
loss='mse',
metrics=[keras.metrics.MeanAbsolutePercentageError()])
MeanSquaredLogarithmicError
class keras.metrics.MeanSquaredLogarithmicError(
name="mean_squared_logarithmic_error", dtype=None
)
Computes mean squared logarithmic error between y_true and y_pred.
Formula:
loss = mean(square(log(y_true + 1) - log(y_pred + 1)))
Arguments
name: (Optional) string name of the metric instance.
dtype: (Optional) data type of the metric result.
Example
>>> m = keras.metrics.MeanSquaredLogarithmicError()
>>> m.update_state([[0, 1], [0, 0]], [[1, 1], [0, 0]])
>>> m.result()
0.12011322
>>> m.reset_state()
>>> m.update_state([[0, 1], [0, 0]], [[1, 1], [0, 0]],
... sample_weight=[1, 0])
>>> m.result()
0.24022643
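Only one entry has a nonzero error: log(0 + 1) - log(1 + 1) = -log(2) ≈ -0.6931, which squared is ≈ 0.4805. Averaged over four entries that gives ≈ 0.12011; with sample_weight=[1, 0] it is averaged over the first row's two entries, giving ≈ 0.24023. Because errors are taken between log-transformed values, the metric emphasizes relative rather than absolute differences, which suits targets spanning several orders of magnitude.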
Usage with compile() API:
model.compile(
optimizer='sgd',
loss='mse',
metrics=[keras.metrics.MeanSquaredLogarithmicError()])
CosineSimilarity
class keras.metrics.CosineSimilarity(name="cosine_similarity", dtype=None, axis=-1)
Computes the cosine similarity between the labels and predictions.
Formula:
loss = sum(l2_norm(y_true) * l2_norm(y_pred))
See: Cosine Similarity.
This metric keeps the average cosine similarity between predictions and labels over a stream of data.
Arguments
name: (Optional) string name of the metric instance.
dtype: (Optional) data type of the metric result.
axis: (Optional) Defaults to -1. The dimension along which the cosine similarity is computed.
Example
>>> # l2_norm(y_true) = [[0., 1.], [1./1.414, 1./1.414]]
>>> # l2_norm(y_pred) = [[1., 0.], [1./1.414, 1./1.414]]
>>> # l2_norm(y_true) . l2_norm(y_pred) = [[0., 0.], [0.5, 0.5]]
>>> # result = mean(sum(l2_norm(y_true) . l2_norm(y_pred), axis=1))
>>> # = ((0. + 0.) + (0.5 + 0.5)) / 2
>>> m = keras.metrics.CosineSimilarity(axis=1)
>>> m.update_state([[0., 1.], [1., 1.]], [[1., 0.], [1., 1.]])
>>> m.result()
0.49999997
>>> m.reset_state()
>>> m.update_state([[0., 1.], [1., 1.]], [[1., 0.], [1., 1.]],
... sample_weight=[0.3, 0.7])
>>> m.result()
0.6999999
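Continuing the worked comments above: the per-sample similarities are [0, 1], so the weighted result is (0.3 * 0 + 0.7 * 1) / (0.3 + 0.7) = 0.7.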
Usage with compile() API:
model.compile(
optimizer='sgd',
loss='mse',
metrics=[keras.metrics.CosineSimilarity(axis=1)])
LogCoshError
class keras.metrics.LogCoshError(name="logcosh", dtype=None)
Computes the logarithm of the hyperbolic cosine of the prediction error.
Formula:
error = y_pred - y_true
logcosh = mean(log((exp(error) + exp(-error))/2), axis=-1)
Arguments
name: (Optional) string name of the metric instance.
dtype: (Optional) data type of the metric result.
Example
>>> m = keras.metrics.LogCoshError()
>>> m.update_state([[0, 1], [0, 0]], [[1, 1], [0, 0]])
>>> m.result()
0.10844523
>>> m.reset_state()
>>> m.update_state([[0, 1], [0, 0]], [[1, 1], [0, 0]],
... sample_weight=[1, 0])
>>> m.result()
0.21689045
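log(cosh(x)) behaves like x ** 2 / 2 for small x and like abs(x) - log(2) for large x, so the metric is roughly quadratic near zero but grows only linearly for large errors, making it less sensitive to outliers than mean squared error. In the example, the only nonzero error is 1, and log(cosh(1)) ≈ 0.4338, so the mean over four entries is ≈ 0.1084 (or ≈ 0.2169 over the first row's two entries with sample_weight=[1, 0]).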
Usage with compile() API:
model.compile(optimizer='sgd',
loss='mse',
metrics=[keras.metrics.LogCoshError()])
R2Score
class keras.metrics.R2Score(
class_aggregation="uniform_average", num_regressors=0, name="r2_score", dtype=None
)
Computes R2 score.
Formula:
sum_squares_residuals = sum((y_true - y_pred) ** 2)
sum_squares = sum((y_true - mean(y_true)) ** 2)
R2 = 1 - sum_squares_residuals / sum_squares
This is also called the coefficient of determination.
It indicates how close the fitted regression line is to ground-truth data.
This metric can also compute the "Adjusted R2" score.
Arguments
class_aggregation: Specifies how to aggregate scores corresponding to different output classes (or target dimensions), i.e. different dimensions on the last axis of the predictions. Equivalent to the multioutput argument in Scikit-Learn. Should be one of None (no aggregation), "uniform_average", or "variance_weighted_average". Defaults to "uniform_average".
num_regressors: Number of independent regressors used (for the "Adjusted R2" score). 0 is the standard R2 score. Defaults to 0.
name: (Optional) string name of the metric instance.
dtype: (Optional) data type of the metric result.
Example
>>> y_true = np.array([[1], [4], [3]], dtype=np.float32)
>>> y_pred = np.array([[2], [4], [4]], dtype=np.float32)
>>> metric = keras.metrics.R2Score()
>>> metric.update_state(y_true, y_pred)
>>> result = metric.result()
>>> result
0.57142854
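Worked through by hand: mean(y_true) = 8/3, sum_squares_residuals = (1 - 2) ** 2 + (4 - 4) ** 2 + (3 - 4) ** 2 = 2, and sum_squares = (1 - 8/3) ** 2 + (4 - 8/3) ** 2 + (3 - 8/3) ** 2 = 14/3, so R2 = 1 - 2 / (14/3) = 4/7 ≈ 0.57142854. When num_regressors = k > 0, the score is adjusted for the number of predictors; assuming the standard adjustment, Adjusted R2 = 1 - (1 - R2) * (n - 1) / (n - k - 1), where n is the number of samples.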