pyepo.metric.metrics

Metrics for scikit-learn models

Module Contents

Functions

SPOError(pred_cost, true_cost, model_type, args)

A function to calculate normalized true regret

makeSkScorer(optmodel)

A function to create a scikit-learn scorer

makeAutoSkScorer(optmodel)

A function to create an Auto-sklearn scorer

testMSE(pred_cost, true_cost, model_type, args)

A function to calculate MSE for testing

makeTestMSEScorer(optmodel)

A function to create an MSE scorer for testing

pyepo.metric.metrics.SPOError(pred_cost, true_cost, model_type, args)

A function to calculate normalized true regret

Parameters:
  • pred_cost (numpy.ndarray) – predicted costs

  • true_cost (numpy.ndarray) – true costs

  • model_type (ABCMeta) – optModel class type

  • args (dict) – optModel args

Returns:

normalized regret loss

Return type:

float
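To illustrate what SPOError measures, here is a minimal, self-contained sketch of normalized regret: the true cost incurred by the decision made from predicted costs, minus the true optimal value, summed over samples and normalized by the total optimal objective. A toy "pick the cheapest item" model stands in for optModel; the toy_solve helper is illustrative, not part of PyEPO's API.

```python
import numpy as np

def toy_solve(cost):
    # toy "optimization model": select the single cheapest item
    sol = np.zeros_like(cost)
    sol[np.argmin(cost)] = 1.0
    return sol, float(cost.min())

def normalized_regret(pred_cost, true_cost):
    total_regret, total_opt = 0.0, 0.0
    for c_hat, c in zip(pred_cost, true_cost):
        sol, _ = toy_solve(c_hat)      # decision made from predicted costs
        _, z_star = toy_solve(c)       # true optimal objective value
        total_regret += float(np.dot(c, sol)) - z_star
        total_opt += abs(z_star)
    return total_regret / (total_opt + 1e-7)

pred = np.array([[1.0, 2.0], [3.0, 1.0]])  # both predictions rank items wrongly
true = np.array([[2.0, 1.0], [1.0, 2.0]])
print(normalized_regret(pred, true))  # (1 + 1) / (1 + 1) ≈ 1.0
```

A regret of 0 means the predicted costs led to decisions as good as the true optima, even if the cost values themselves were inaccurate.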

pyepo.metric.metrics.makeSkScorer(optmodel)

A function to create a scikit-learn scorer

Parameters:

optmodel (optModel) – optimization model

Returns:

callable object that returns a scalar score; lower is better.

Return type:

scorer
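The scorer wraps a decision loss so it can drive scikit-learn model selection. A hedged sketch of the pattern, using the same toy "pick the cheapest item" regret as a stand-in loss (the regret_loss function is illustrative, not PyEPO's implementation): because the metric is a loss, it is registered with greater_is_better=False, so scikit-learn negates it and then maximizes as usual.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import make_scorer
from sklearn.model_selection import cross_val_score

# illustrative regret loss for a toy "pick the cheapest item" problem:
# true cost of the item chosen from predicted costs, minus the optimum
def regret_loss(true_cost, pred_cost):
    picks = np.argmin(pred_cost, axis=1)
    chosen = true_cost[np.arange(len(true_cost)), picks]
    opt = true_cost.min(axis=1)
    return float((chosen - opt).sum() / (np.abs(opt).sum() + 1e-7))

# greater_is_better=False marks the function as a loss, so sklearn
# reports the negated value during cross-validation and grid search
scorer = make_scorer(regret_loss, greater_is_better=False)

rng = np.random.RandomState(0)
X = rng.rand(30, 3)
y = X @ rng.rand(3, 4)  # 4 cost coefficients per sample, exactly linear
scores = cross_val_score(LinearRegression(), X, y, cv=3, scoring=scorer)
print(scores)  # ~0: a perfect cost predictor incurs no regret
```

Passing the scorer as scoring= to cross_val_score or GridSearchCV selects the predictor whose downstream decisions are best, rather than the one with the smallest prediction error.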

pyepo.metric.metrics.makeAutoSkScorer(optmodel)

A function to create an Auto-sklearn scorer

Parameters:

optmodel (optModel) – optimization model

Returns:

callable object that returns a scalar score; lower is better.

Return type:

scorer

pyepo.metric.metrics.testMSE(pred_cost, true_cost, model_type, args)

A function to calculate MSE for testing

Parameters:
  • pred_cost (numpy.ndarray) – predicted costs

  • true_cost (numpy.ndarray) – true costs

  • model_type (ABCMeta) – optModel class type

  • args (dict) – optModel args

Returns:

mean squared error (MSE)

Return type:

float

pyepo.metric.metrics.makeTestMSEScorer(optmodel)

A function to create an MSE scorer for testing

Parameters:

optmodel (optModel) – optimization model

Returns:

callable object that returns a scalar score; lower is better.

Return type:

scorer