4.4. Evaluating model performance

Several metrics could be used to evaluate the performance of each iteration of the analysis pipeline, where each iteration is uniquely defined by 1) the query generation method, 2) the choice of topic model, and 3) the specification of the relevance function.
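To make the notion of an iteration concrete, here is a minimal sketch of how one such pipeline configuration might be represented in code; the class name, field names, and example values are hypothetical, not taken from this analysis plan.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical sketch: one "iteration" of the pipeline, identified by the
# three choices named above. Field names and example values are assumptions.
@dataclass(frozen=True)
class PipelineConfig:
    query_generation_method: str          # e.g., "templated" vs. "manual"
    topic_model: str                      # e.g., "LDA" vs. "BERTopic"
    relevance_fn: Callable[[str], bool]   # judges whether a result is relevant
```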

We considered several options, summarized in the table below.

| # | Evaluation metric | Definition | Benefits | Other considerations |
|---|-------------------|------------|----------|-----------------------|
| 1 | Precision | The fraction of returned results that the relevance function judges relevant. | Directly penalizes irrelevant results and is easy to interpret. | Ignores relevant documents that are never retrieved, so it is best read alongside recall. |
| 2 | Recall | The fraction of all known-relevant documents that appear in the returned results. | Captures how completely the pipeline covers the relevant set. | Requires a gold set of relevant documents; can be inflated by simply returning more results. |
| 3 | Percent unique | The proportion of returned results that are distinct rather than duplicates. | Flags redundancy in the pipeline's output. | Says nothing about relevance; a highly unique result set can still have low precision. |
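As an illustration of how these three metrics could be computed for a single pipeline configuration, the Python sketch below assumes the relevance function is available as a Boolean predicate and that a gold set of relevant documents exists for computing recall; all function and variable names are hypothetical.

```python
from typing import Callable, Sequence

def precision(retrieved: Sequence[str], is_relevant: Callable[[str], bool]) -> float:
    """Fraction of retrieved documents the relevance function judges relevant."""
    if not retrieved:
        return 0.0
    return sum(is_relevant(doc) for doc in retrieved) / len(retrieved)

def recall(retrieved: Sequence[str], relevant: set[str]) -> float:
    """Fraction of known-relevant documents that appear in the retrieved set."""
    if not relevant:
        return 0.0
    return len(set(retrieved) & relevant) / len(relevant)

def percent_unique(retrieved: Sequence[str]) -> float:
    """Proportion of retrieved documents that are distinct (no duplicates)."""
    if not retrieved:
        return 0.0
    return len(set(retrieved)) / len(retrieved)

# Hypothetical example: four results, one of them a duplicate.
retrieved = ["doc_a", "doc_b", "doc_b", "doc_c"]
relevant = {"doc_a", "doc_c", "doc_d"}

print(precision(retrieved, lambda d: d in relevant))  # 0.5  (2 of 4 relevant)
print(recall(retrieved, relevant))                    # 0.666... (2 of 3 found)
print(percent_unique(retrieved))                      # 0.75 (3 distinct of 4)
```

Note that precision uses the relevance function directly, while recall requires an external gold set of relevant documents; this asymmetry is typically what makes recall the harder metric to obtain in practice.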

Our primary metric will be XXX, following the work of YYY.