Ah sorry, I stated above that mu and theta are the same, which is not the case: mu is infinite, which is why theta/mu becomes 0. I also realized that I should sample the library size from a normal distribution, since it gets exponentiated in model.module.generative(); sampling from a normal therefore yields a lognormally distributed library size.
So I use:
import torch

# loc and scale are on the log scale; exponentiation in generative() turns this into a lognormal library size
ln = torch.distributions.normal.Normal(torch.tensor(6.7649703), torch.tensor(0.16759828))
library = ln.sample(sample_shape=torch.Size([n_cells, 1]))  # n_cells: number of cells to simulate
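As a quick sanity check (a minimal sketch; the 100,000-sample count is arbitrary), exponentiating draws from this normal should match a torch.distributions.LogNormal with the same loc and scale:

import torch

loc, scale = torch.tensor(6.7649703), torch.tensor(0.16759828)
normal = torch.distributions.Normal(loc, scale)
lognormal = torch.distributions.LogNormal(loc, scale)

# exp of normal samples should reproduce the lognormal's mean and std
samples = normal.sample(torch.Size([100_000])).exp()
print(samples.mean().item(), lognormal.mean.item())    # both ≈ exp(loc + scale**2 / 2) ≈ 880
print(samples.std().item(), lognormal.stddev.item())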