pyepo.func.spoplus

SPO+ Loss function

Module Contents

Classes

SPOPlus

An autograd module for SPO+ Loss, a surrogate loss function of SPO Loss.

SPOPlusFunc

An autograd function for SPO+ Loss

class pyepo.func.spoplus.SPOPlus(optmodel, processes=1, solve_ratio=1, reduction='mean', dataset=None)

Bases: pyepo.func.abcmodule.optModule

An autograd module for SPO+ Loss, a surrogate loss function of SPO Loss, which measures the decision error of the optimization problem.

For SPO/SPO+ Loss, the objective function is linear and constraints are known and fixed, but the cost vector needs to be predicted from contextual data.

The SPO+ Loss is convex and admits a subgradient, which allows training with algorithms based on stochastic gradient descent.

Reference: <https://doi.org/10.1287/mnsc.2020.3922>
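To make the loss concrete, here is a minimal NumPy sketch of the SPO+ computation for a minimization problem, following the formulation in the reference above. It is an illustration, not PyEPO's implementation: the one-of-n feasible set and the `solve` oracle are toy assumptions standing in for a real `optmodel`.

```python
import numpy as np

def solve(cost):
    # Toy optimization oracle: choose exactly one of n items at minimum cost,
    # i.e. the feasible set is {e_1, ..., e_n}. A real optmodel would solve
    # a linear program here.
    w = np.zeros_like(cost)
    w[np.argmin(cost)] = 1.0
    return w

def spo_plus(pred_cost, true_cost):
    # SPO+ loss for a minimization problem:
    #   l(c_hat, c) = -min_w (2 c_hat - c)^T w + 2 c_hat^T w*(c) - z*(c)
    # where w*(c) is the true optimal solution and z*(c) its objective value.
    w_star = solve(true_cost)                   # true optimal solution w*(c)
    z_star = true_cost @ w_star                 # true optimal value z*(c)
    w_tilde = solve(2 * pred_cost - true_cost)  # inner minimizer
    loss = -(2 * pred_cost - true_cost) @ w_tilde + 2 * pred_cost @ w_star - z_star
    grad = 2 * (w_star - w_tilde)               # a subgradient w.r.t. pred_cost
    return loss, grad

c_hat = np.array([3.0, 1.0, 2.0])  # predicted costs
c = np.array([1.0, 2.0, 3.0])      # true costs
loss, grad = spo_plus(c_hat, c)    # loss = 5.0, grad = [2, -2, 0]
```

Note that the loss is zero at a perfect prediction (`spo_plus(c, c)` returns 0), and it upper-bounds the SPO loss, which here equals `c[np.argmin(c_hat)] - c.min() = 1`.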

forward(pred_cost, true_cost, true_sol, true_obj)

Forward pass

class pyepo.func.spoplus.SPOPlusFunc(*args, **kwargs)

Bases: torch.autograd.Function

An autograd function for SPO+ Loss

static forward(ctx, pred_cost, true_cost, true_sol, true_obj, module)

Forward pass for SPO+

Parameters:
  • pred_cost (torch.Tensor) – a batch of predicted values of the cost

  • true_cost (torch.Tensor) – a batch of true values of the cost

  • true_sol (torch.Tensor) – a batch of true optimal solutions

  • true_obj (torch.Tensor) – a batch of true optimal objective values

  • module (optModule) – SPOPlus module

Returns:

SPO+ loss

Return type:

torch.Tensor

static backward(ctx, grad_output)

Backward pass for SPO+
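The backward pass returns a subgradient of the SPO+ loss with respect to the predicted cost, which in the formulation above is 2(w*(c) − w̃), where w̃ is the inner minimizer. The sketch below checks this closed form against a finite-difference estimate on the same toy one-of-n problem used earlier; the `solve` oracle is again an illustrative assumption, not PyEPO's API.

```python
import numpy as np

def solve(cost):
    # Toy oracle: choose the single minimum-cost item (feasible set {e_1, ..., e_n}).
    w = np.zeros_like(cost)
    w[np.argmin(cost)] = 1.0
    return w

def spo_plus_loss(pred_cost, true_cost):
    w_star = solve(true_cost)
    z_star = true_cost @ w_star
    w_tilde = solve(2 * pred_cost - true_cost)
    return -(2 * pred_cost - true_cost) @ w_tilde + 2 * pred_cost @ w_star - z_star

c_hat = np.array([3.0, 1.0, 2.0])
c = np.array([1.0, 2.0, 3.0])

# closed-form subgradient: 2 * (w*(c) - w_tilde)
analytic = 2 * (solve(c) - solve(2 * c_hat - c))

# finite-difference check (valid at this point: the inner argmin is unique,
# so the piecewise-linear loss is locally linear in pred_cost)
eps = 1e-6
numeric = np.array([
    (spo_plus_loss(c_hat + eps * e, c) - spo_plus_loss(c_hat - eps * e, c)) / (2 * eps)
    for e in np.eye(3)
])
```

At a kink of the piecewise-linear loss (where the inner argmin is not unique) the finite-difference estimate would not match any single subgradient, which is why the check is done at a point with a unique minimizer.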