I am running a loop in Jupyter that calls `sc.rank_genes_groups` on a subset of a larger AnnData object, deletes the temporary smaller object, and returns the stats.
Despite putting `del adata` in the loop, the memory allocation keeps increasing with each iteration.
This happens outside of the loop as well: after `del adata`, the name is no longer accessible, but the memory is not released.
I have also tried `%reset_selective -f adata` and adding `gc.collect()` at the end of the cell.
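For what it's worth, one common cause of exactly this symptom in Jupyter is IPython's output cache (`Out`, `_`, `__`, `___`), which keeps its own reference to any object a cell displays, so `del` plus `gc.collect()` cannot free it. Below is a minimal stdlib sketch of that mechanism; the `out_history` list is just a stand-in for IPython's `Out` cache, and `Big` is a placeholder for the AnnData subset:

```python
import gc
import weakref

class Big:
    """Placeholder for a large object such as an AnnData subset."""
    pass

out_history = []            # stand-in for IPython's Out / _ cache

obj = Big()
out_history.append(obj)     # e.g. displaying the object in a cell caches a reference
ref = weakref.ref(obj)      # weak reference lets us observe whether it was freed

del obj
gc.collect()
assert ref() is not None    # still alive: the cache holds a reference

out_history.clear()         # analogous to clearing Out / avoiding display
gc.collect()
assert ref() is None        # now the object is actually freed
```

If something like this is happening, `del adata` removes the name but not the hidden reference, which would match the behavior described above.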
I definitely have enough memory for the loop I want to run: if the AnnData object were deleted correctly, I would easily stay below 20% of my workstation's memory at any point. The loop itself is functional as well; it runs successfully for several rounds until memory fills up and the kernel crashes.
Does anyone have any idea how to fix this?