Set threshold for prediction labels

Dear SCVI-Team,

Many thanks for implementing SOLO! I have a question about the predict() module.

(i) Are the prediction scores returned by predict(soft=True) the probabilities from the logits plus a calibration term?
(ii) When returning labels with predict(soft=False), the label with the highest probability is returned via preds_df = preds_df.idxmax(axis=1). My apologies if I'm confused here, but how can I threshold the class prediction as described in the original publication (default 0.5)?

Many thanks for your help!

No, we do not have the calibration here; I believe that for some reason it wasn't necessary anymore (per first author Nick).

You can try Calico’s wrapper of our code, which includes the calibration correction, here.

We’d also be open to a contribution if you’d like to implement that functionality here.

It might be worth making an issue there with a link to this post.
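In the meantime, thresholding can be applied manually to the soft predictions rather than using the argmax from predict(soft=False). A minimal sketch, assuming the soft prediction DataFrame has one probability column per class (the column names "singlet" and "doublet" and the example values here are illustrative, not taken from an actual SOLO run):

```python
import pandas as pd

# Hypothetical soft predictions, as a stand-in for solo.predict(soft=True):
# one probability column per class (column names assumed here).
preds_df = pd.DataFrame(
    {"singlet": [0.92, 0.40, 0.55], "doublet": [0.08, 0.60, 0.45]}
)

# Manual thresholding on the doublet probability (default 0.5 in the paper),
# instead of the argmax used by predict(soft=False).
threshold = 0.5
labels = preds_df["doublet"].map(
    lambda p: "doublet" if p > threshold else "singlet"
)
```

Raising the threshold above 0.5 trades recall for precision, i.e. fewer cells are called doublets but with higher confidence.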
