# `metrics`

`orchard.evaluation.metrics`

Metrics Computation Module.

Provides a standardized interface for calculating classification performance metrics from model outputs. Isolates statistical logic from inference loops.
## `compute_classification_metrics(labels, preds, probs)`
Computes accuracy, macro-averaged F1, and macro-averaged ROC-AUC.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `labels` | `NDArray[Any]` | Ground truth class indices. | *required* |
| `preds` | `NDArray[Any]` | Predicted class indices. | *required* |
| `probs` | `NDArray[Any]` | Softmax probability distributions. | *required* |
Returns:

| Type | Description |
|---|---|
| `dict[str, float]` | Metric dictionary containing accuracy, macro-averaged F1, and macro-averaged ROC-AUC. |
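A minimal sketch of how such a function could be implemented with scikit-learn. The key names `accuracy`, `f1_macro`, and `roc_auc_macro` are assumptions for illustration (the actual keys are not listed on this page), and the real `orchard` implementation may differ:

```python
# Hypothetical re-implementation of compute_classification_metrics;
# key names and internals are assumptions, not the orchard source.
from typing import Any

import numpy as np
from numpy.typing import NDArray
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score


def compute_classification_metrics(
    labels: NDArray[Any],
    preds: NDArray[Any],
    probs: NDArray[Any],
) -> dict[str, float]:
    """Compute accuracy, macro F1, and macro ROC-AUC from model outputs."""
    return {
        "accuracy": float(accuracy_score(labels, preds)),
        "f1_macro": float(f1_score(labels, preds, average="macro")),
        # One-vs-rest multiclass AUC, macro-averaged over classes.
        "roc_auc_macro": float(
            roc_auc_score(labels, probs, multi_class="ovr", average="macro")
        ),
    }


# Usage: labels are class indices, probs are row-normalized softmax outputs.
labels = np.array([0, 1, 2, 1])
probs = np.array([
    [0.8, 0.1, 0.1],
    [0.2, 0.7, 0.1],
    [0.1, 0.2, 0.7],
    [0.3, 0.6, 0.1],
])
preds = probs.argmax(axis=1)
metrics = compute_classification_metrics(labels, preds, probs)
```

Keeping the metric computation in one pure function of arrays, as documented here, lets it be unit-tested without running the inference loop.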