Loss Graphs
TensorRec allows you to define the algorithm that will be used to compute loss for a set of recommendations.
You can define a custom loss function yourself, or you can use a pre-made loss function that comes with TensorRec in tensorrec.loss_graphs.
RMSELossGraph
This loss function returns the root mean square error (RMSE) between the predictions and the true interactions.
Interactions can be any positive or negative values, and this loss function is sensitive to magnitude.
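As an illustrative sketch of the RMSE idea (a minimal numpy version, not TensorRec's actual TensorFlow graph code):

```python
import numpy as np

def rmse_loss(predictions, interactions):
    # Root mean square error over the observed (serialized) interactions.
    return np.sqrt(np.mean((predictions - interactions) ** 2))

predictions = np.array([0.8, -0.2, 1.5])
interactions = np.array([1.0, -1.0, 2.0])
print(rmse_loss(predictions, interactions))  # ~0.5568
```

Because the error is squared before averaging, larger interaction magnitudes pull harder on the loss.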
RMSEDenseLossGraph
This loss function returns the root mean square error between the predictions and the true interactions, treating all non-interacted values as 0s.
Interactions can be any positive or negative values, and this loss function is sensitive to magnitude.
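The dense variant can be sketched the same way; the difference is that the loss is computed over a dense interaction matrix, so every non-interacted user/item pair contributes a 0 target (a hypothetical numpy sketch, not the library's graph code):

```python
import numpy as np

def rmse_dense_loss(predictions, dense_interactions):
    # Every user/item pair contributes to the loss; non-interacted
    # pairs appear as 0s in the dense interaction matrix.
    return np.sqrt(np.mean((predictions - dense_interactions) ** 2))

predictions = np.array([[0.9, 0.1],
                        [0.2, 1.1]])
dense_interactions = np.array([[1.0, 0.0],
                               [0.0, 1.0]])
print(rmse_dense_loss(predictions, dense_interactions))  # ~0.1323
```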
SeparationLossGraph
This loss function models the explicit positive and negative interaction predictions as normal distributions and returns the probability of overlap between the two distributions.
Interactions can be any positive or negative values, but this loss function ignores the magnitude of the interaction -- interactions are grouped into {i <= 0} and {i > 0}.
SeparationDenseLossGraph
This loss function models all positive and negative interaction predictions as normal distributions and returns the probability of overlap between the two distributions. Non-interacted items are included as negative interactions.
Interactions can be any positive or negative values, but this loss function ignores the magnitude of the interaction -- interactions are grouped into {i <= 0} and {i > 0}.
WMRBLossGraph
An approximation of WMRB, from "WMRB: Learning to Rank in a Scalable Batch Training Approach".
Interactions can be any positive values, but magnitude is ignored. Negative interactions are ignored.
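The core idea of the batch rank approximation can be sketched as follows (an illustrative sketch of the technique from the paper, with hypothetical names -- not TensorRec's graph code): estimate a positive item's rank from a sample of other items' scores, then penalize the log of that estimated rank.

```python
import math

def wmrb_loss(positive_score, sampled_scores, n_items):
    # Margin-rank approximation: each sampled item scoring within a
    # margin of 1.0 of the positive item counts toward the estimated rank.
    violations = sum(max(0.0, 1.0 - positive_score + s) for s in sampled_scores)
    # Scale the sample-based estimate up to the full catalog size.
    approx_rank = violations * n_items / len(sampled_scores)
    return math.log(1.0 + approx_rank)

# A positive item that outscores every sampled item by the margin: zero loss.
print(wmrb_loss(5.0, [0.0, 1.0, 2.0], n_items=150))  # 0.0
```

Sampling items instead of scoring the full catalog is what makes the method scalable to large item sets.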
BalancedWMRBLossGraph
This loss graph extends WMRB by making it sensitive to interaction magnitude and weighting the loss of each item by 1 / sum(interactions) per item.
Interactions can be any positive values. Negative interactions are ignored.
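The per-item 1 / sum(interactions) weighting can be illustrated with a small hypothetical interaction matrix:

```python
import numpy as np

# Hypothetical dense interaction matrix: rows are users, columns are items.
interactions = np.array([[1.0, 0.0, 2.0],
                         [1.0, 1.0, 0.0]])

# Per-item weight of 1 / sum(interactions): heavily-interacted items are
# down-weighted so popular items do not dominate the loss.
item_weights = 1.0 / interactions.sum(axis=0)
print(item_weights)  # [0.5 1.  0.5]
```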
```python
import tensorflow as tf
import tensorrec

# Define a custom loss graph
class SimpleLossGraph(tensorrec.loss_graphs.AbstractLossGraph):
    def connect_loss_graph(self, tf_prediction_serial, tf_interactions_serial, **kwargs):
        """
        This loss function returns the absolute simple error between the predictions and the interactions.
        :param tf_prediction_serial: tf.Tensor
        The recommendation scores as a Tensor of shape [n_samples, 1]
        :param tf_interactions_serial: tf.Tensor
        The sample interactions corresponding to tf_prediction_serial as a Tensor of shape [n_samples, 1]
        :param kwargs: Other TensorFlow nodes.
        :return: A tf.Tensor containing the learning loss.
        """
        return tf.reduce_mean(tf.abs(tf_interactions_serial - tf_prediction_serial))

# Build a model with the custom loss function
model = tensorrec.TensorRec(loss_graph=SimpleLossGraph())

# Generate some dummy data
interactions, user_features, item_features = tensorrec.util.generate_dummy_data(
    num_users=100,
    num_items=150,
    interaction_density=.05
)

# Fit the model for 5 epochs
model.fit(interactions, user_features, item_features, epochs=5, verbose=True)
```