Same scVI model giving different values on Mac silicon and Linux?!

Hi.

I have used an scVI model to integrate data from different batches; I ran that on an HPC cluster with a GPU.
When I checked the data on the cluster:

>>> adata.layers['scvi_norm'] = vae.get_normalized_expression(adata=adata, library_size=1e4)
>>> adata.layers['scvi_norm'].min()
3.869816e-10
>>> adata.layers['scvi_norm'].max()
1822.8737

But when I analyse the data locally on a Mac (Apple silicon), I get different values:

>>> adata.layers['scvi_norm'] = vae.get_normalized_expression(adata=adata,library_size=1e4)
>>> adata.layers['scvi_norm'].min()
7.775142e-10
>>> adata.layers['scvi_norm'].max()
1753.7305

Is this normal?

Thanks.

It’s definitely normal to get slightly different results across different operating systems and hardware. The question is whether the model was also trained on different machines; in that case, differences are even more likely.
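If you want to check whether the two runs only differ by numerical noise, you can compare the exported layers directly. A minimal sketch, assuming you saved each run's layer to a file (the file names here are just placeholders):

import numpy as np

# hypothetical file names; point these at wherever you saved each run's output
cluster = np.load("scvi_norm_cluster.npy")
local = np.load("scvi_norm_mac.npy")

# tolerances are a judgment call for float32 outputs
print(np.allclose(cluster, local, rtol=1e-3, atol=1e-6))
print(np.abs(cluster - local).max())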

Also, scvi-tools computes normalized gene expression for each cell by sampling from the latent variables of your trained model, so that sampling can inject randomness into the result.
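One way to make the output less sensitive to that sampling is to average over several posterior samples. A minimal sketch (the number of samples is arbitrary, and return_mean defaults to True anyway):

adata.layers['scvi_norm'] = vae.get_normalized_expression(
    adata=adata,
    library_size=1e4,
    n_samples=25,      # average over multiple posterior samples
    return_mean=True,  # return the mean over those samples
)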

And above all, make sure you set the seed so you can try to reproduce your results.
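A minimal sketch of fixing the seed before generating the layer (the seed value itself is arbitrary):

import scvi

scvi.settings.seed = 0  # fixes the seeds scvi-tools uses internally

adata.layers['scvi_norm'] = vae.get_normalized_expression(
    adata=adata, library_size=1e4
)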
