ferret.TauLOO_Evaluation

class ferret.TauLOO_Evaluation(model, tokenizer, task_name)
__init__(model, tokenizer, task_name)

Methods

__init__(model, tokenizer, task_name)

compute_evaluation(explanation, ...)

Evaluate an explanation on the tau-LOO metric, i.e., the Kendall tau correlation between the explanation scores and the leave-one-out (LOO) scores. Each LOO score is obtained by removing one feature at a time and measuring the resulting change in the prediction probability; a sketch of the computation follows below.
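The following minimal sketch illustrates the metric itself, not ferret's internal implementation. The names `tokens`, `explanation_scores`, `predict_proba`, and the `[MASK]` occlusion strategy are hypothetical stand-ins for an explanation and a model wrapper; only `scipy.stats.kendalltau` is an actual library call.

```python
from scipy.stats import kendalltau


def tau_loo(tokens, explanation_scores, predict_proba, mask_token="[MASK]"):
    """Kendall tau between explanation scores and leave-one-out (LOO) scores.

    Assumptions (not part of ferret's API):
      - `tokens`: list of input tokens the explanation refers to.
      - `explanation_scores`: one importance score per token.
      - `predict_proba`: callable returning the target-class probability
        for a list of tokens.
    """
    full_prob = predict_proba(tokens)

    loo_scores = []
    for i in range(len(tokens)):
        # Occlude the i-th token and record the drop in prediction probability.
        occluded = tokens[:i] + [mask_token] + tokens[i + 1:]
        loo_scores.append(full_prob - predict_proba(occluded))

    # Rank agreement between the explanation and the LOO importance estimates.
    tau, _p_value = kendalltau(explanation_scores, loo_scores)
    return tau
```

A higher tau indicates that the explanation ranks features similarly to the leave-one-out estimates of their importance.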

Attributes

BEST_VALUE

LOWER_IS_BETTER

MAX_VALUE

METRIC_FAMILY

MIN_VALUE

NAME

SHORT_NAME

tokenizer