How to install solo-sc or use hashsolo from scvi?

Lately, when I attempt to install solo in a conda environment, I have been encountering the following error.

pip install solo-sc
Collecting solo-sc
  Using cached solo-sc-1.3.tar.gz (18 kB)
  Preparing metadata (setup.py) ... done
Collecting ConfigArgParse (from solo-sc)
  Obtaining dependency information for ConfigArgParse from https://files.pythonhosted.org/packages/6f/b3/b4ac838711fd74a2b4e6f746703cf9dd2cf5462d17dac07e349234e21b97/ConfigArgParse-1.7-py3-none-any.whl.metadata
  Using cached ConfigArgParse-1.7-py3-none-any.whl.metadata (23 kB)
Collecting pandas (from solo-sc)
  Obtaining dependency information for pandas from https://files.pythonhosted.org/packages/4a/f6/f620ca62365d83e663a255a41b08d2fc2eaf304e0b8b21bb6d62a7390fe3/pandas-2.0.3-cp310-cp310-macosx_11_0_arm64.whl.metadata
  Downloading pandas-2.0.3-cp310-cp310-macosx_11_0_arm64.whl.metadata (18 kB)
Collecting seaborn (from solo-sc)
  Using cached seaborn-0.12.2-py3-none-any.whl (293 kB)
Collecting tqdm (from solo-sc)
  Obtaining dependency information for tqdm from https://files.pythonhosted.org/packages/00/e5/f12a80907d0884e6dff9c16d0c0114d81b8cd07dc3ae54c5e962cc83037e/tqdm-4.66.1-py3-none-any.whl.metadata
  Using cached tqdm-4.66.1-py3-none-any.whl.metadata (57 kB)
Collecting scvi-tools (from solo-sc)
  Obtaining dependency information for scvi-tools from https://files.pythonhosted.org/packages/44/7a/02fcfcfb8cc9c24da800cc53f127deb5626989da44038b8af58facbb6dfb/scvi_tools-1.0.3-py3-none-any.whl.metadata
  Using cached scvi_tools-1.0.3-py3-none-any.whl.metadata (10 kB)
Collecting leidenalg (from solo-sc)
  Obtaining dependency information for leidenalg from https://files.pythonhosted.org/packages/dc/b1/73130970435246d2a0629b1e315f5d5711e3d747f449e8b90c05cd9a133a/leidenalg-0.10.1-cp38-abi3-macosx_11_0_arm64.whl.metadata
  Using cached leidenalg-0.10.1-cp38-abi3-macosx_11_0_arm64.whl.metadata (10 kB)
Collecting scanpy (from solo-sc)
  Using cached scanpy-1.9.3-py3-none-any.whl (2.0 MB)
Collecting pytorch-lightning==1.3.1 (from solo-sc)
  Using cached pytorch_lightning-1.3.1-py3-none-any.whl (805 kB)
Collecting numpy>=1.17.2 (from pytorch-lightning==1.3.1->solo-sc)
  Obtaining dependency information for numpy>=1.17.2 from https://files.pythonhosted.org/packages/c3/ea/1d95b399078ecaa7b5d791e1fdbb3aee272077d9fd5fb499593c87dec5ea/numpy-1.25.2-cp310-cp310-macosx_11_0_arm64.whl.metadata
  Downloading numpy-1.25.2-cp310-cp310-macosx_11_0_arm64.whl.metadata (5.6 kB)
Collecting torch>=1.4 (from pytorch-lightning==1.3.1->solo-sc)
  Downloading torch-2.0.1-cp310-none-macosx_11_0_arm64.whl (55.8 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 55.8/55.8 MB 3.2 MB/s eta 0:00:00
Collecting future>=0.17.1 (from pytorch-lightning==1.3.1->solo-sc)
  Using cached future-0.18.3.tar.gz (840 kB)
  Preparing metadata (setup.py) ... done
Collecting PyYAML<=5.4.1,>=5.1 (from pytorch-lightning==1.3.1->solo-sc)
  Using cached PyYAML-5.4.1.tar.gz (175 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error
  
  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [62 lines of output]
      /private/var/folders/wt/4f4m9q6s1vb0bb6v9ccjm5280000gn/T/pip-build-env-nkfiqeah/overlay/lib/python3.10/site-packages/setuptools/config/setupcfg.py:293: _DeprecatedConfig: Deprecated config in `setup.cfg`
      !!
      
              ********************************************************************************
              The license_file parameter is deprecated, use license_files instead.
      
              By 2023-Oct-30, you need to update your project and remove deprecated calls
              or your builds will no longer be supported.
      
              See https://setuptools.pypa.io/en/latest/userguide/declarative_config.html for details.
              ********************************************************************************
      
      !!
        parsed = self.parsers.get(option_name, lambda x: x)(value)
      running egg_info
      writing lib3/PyYAML.egg-info/PKG-INFO
      writing dependency_links to lib3/PyYAML.egg-info/dependency_links.txt
      writing top-level names to lib3/PyYAML.egg-info/top_level.txt
      Traceback (most recent call last):
        File "/Users/ab/miniconda3/envs/solo/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
          main()
        File "/Users/ab/miniconda3/envs/solo/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
        File "/Users/ab/miniconda3/envs/solo/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
          return hook(config_settings)
        File "/private/var/folders/wt/4f4m9q6s1vb0bb6v9ccjm5280000gn/T/pip-build-env-nkfiqeah/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 355, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=['wheel'])
        File "/private/var/folders/wt/4f4m9q6s1vb0bb6v9ccjm5280000gn/T/pip-build-env-nkfiqeah/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 325, in _get_build_requires
          self.run_setup()
        File "/private/var/folders/wt/4f4m9q6s1vb0bb6v9ccjm5280000gn/T/pip-build-env-nkfiqeah/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 341, in run_setup
          exec(code, locals())
        File "<string>", line 271, in <module>
        File "/private/var/folders/wt/4f4m9q6s1vb0bb6v9ccjm5280000gn/T/pip-build-env-nkfiqeah/overlay/lib/python3.10/site-packages/setuptools/__init__.py", line 107, in setup
          return distutils.core.setup(**attrs)
        File "/private/var/folders/wt/4f4m9q6s1vb0bb6v9ccjm5280000gn/T/pip-build-env-nkfiqeah/overlay/lib/python3.10/site-packages/setuptools/_distutils/core.py", line 185, in setup
          return run_commands(dist)
        File "/private/var/folders/wt/4f4m9q6s1vb0bb6v9ccjm5280000gn/T/pip-build-env-nkfiqeah/overlay/lib/python3.10/site-packages/setuptools/_distutils/core.py", line 201, in run_commands
          dist.run_commands()
        File "/private/var/folders/wt/4f4m9q6s1vb0bb6v9ccjm5280000gn/T/pip-build-env-nkfiqeah/overlay/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 969, in run_commands
          self.run_command(cmd)
        File "/private/var/folders/wt/4f4m9q6s1vb0bb6v9ccjm5280000gn/T/pip-build-env-nkfiqeah/overlay/lib/python3.10/site-packages/setuptools/dist.py", line 1233, in run_command
          super().run_command(command)
        File "/private/var/folders/wt/4f4m9q6s1vb0bb6v9ccjm5280000gn/T/pip-build-env-nkfiqeah/overlay/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 988, in run_command
          cmd_obj.run()
        File "/private/var/folders/wt/4f4m9q6s1vb0bb6v9ccjm5280000gn/T/pip-build-env-nkfiqeah/overlay/lib/python3.10/site-packages/setuptools/command/egg_info.py", line 319, in run
          self.find_sources()
        File "/private/var/folders/wt/4f4m9q6s1vb0bb6v9ccjm5280000gn/T/pip-build-env-nkfiqeah/overlay/lib/python3.10/site-packages/setuptools/command/egg_info.py", line 327, in find_sources
          mm.run()
        File "/private/var/folders/wt/4f4m9q6s1vb0bb6v9ccjm5280000gn/T/pip-build-env-nkfiqeah/overlay/lib/python3.10/site-packages/setuptools/command/egg_info.py", line 549, in run
          self.add_defaults()
        File "/private/var/folders/wt/4f4m9q6s1vb0bb6v9ccjm5280000gn/T/pip-build-env-nkfiqeah/overlay/lib/python3.10/site-packages/setuptools/command/egg_info.py", line 587, in add_defaults
          sdist.add_defaults(self)
        File "/private/var/folders/wt/4f4m9q6s1vb0bb6v9ccjm5280000gn/T/pip-build-env-nkfiqeah/overlay/lib/python3.10/site-packages/setuptools/command/sdist.py", line 113, in add_defaults
          super().add_defaults()
        File "/private/var/folders/wt/4f4m9q6s1vb0bb6v9ccjm5280000gn/T/pip-build-env-nkfiqeah/overlay/lib/python3.10/site-packages/setuptools/_distutils/command/sdist.py", line 251, in add_defaults
          self._add_defaults_ext()
        File "/private/var/folders/wt/4f4m9q6s1vb0bb6v9ccjm5280000gn/T/pip-build-env-nkfiqeah/overlay/lib/python3.10/site-packages/setuptools/_distutils/command/sdist.py", line 336, in _add_defaults_ext
          self.filelist.extend(build_ext.get_source_files())
        File "<string>", line 201, in get_source_files
        File "/private/var/folders/wt/4f4m9q6s1vb0bb6v9ccjm5280000gn/T/pip-build-env-nkfiqeah/overlay/lib/python3.10/site-packages/setuptools/_distutils/cmd.py", line 107, in __getattr__
          raise AttributeError(attr)
      AttributeError: cython_sources
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.

This problem occurs on both Apple Silicon and Linux platforms.
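For context, the `AttributeError: cython_sources` in the log is a known incompatibility between the PyYAML 5.4.1 source build and Cython 3 (pytorch-lightning 1.3.1 pins `PyYAML<=5.4.1`, which forces pip to build it from source). A workaround that has been reported for other packages hitting the same error (not verified against solo-sc specifically) is to constrain Cython below 3 during the build with a pip constraints file:

```
# constraints.txt
# Assumption: PyYAML 5.4.1 builds cleanly once Cython < 3 is pinned.
cython<3.0
```

and then install with the constraint exported so that pip also applies it inside the isolated build environment: `PIP_CONSTRAINT=constraints.txt pip install solo-sc`.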

I also want to ask: is there a tutorial or notebook for using hashsolo from scvi to demultiplex hashing data?
Thanks in advance.

Hi, Hashsolo is available through scanpy: scanpy.external.pp.hashsolo — Scanpy 1.9.3 documentation

With scanpy.external.pp.hashsolo, I can’t seem to demultiplex because the hashing counts are in .var instead of .obs in the outputs from cellranger multi. How can I modify the cellranger multi/count output to fit the scanpy.external.pp.hashsolo requirement? A similar question has also been asked on Biostars: How to demultiplex data with totalseq B antibodies using hashsolo?

I’m assuming that the hashtag oligo (HTO) names are in .var, which means that the hashing counts themselves are in adata.X. If this is the case, you can format your AnnData for hashsolo doing something like the following:

  1. Get the list of HTOs in your data:
htos = list(adata.var[adata.var["feature_types"] == "Antibody Capture"].index.unique())
  2. Iterate through each HTO, extract its counts, and place them in .obs:
for hto in htos:
    adata.obs[hto] = adata.X[:, adata.var.index == hto].toarray()

No guarantees this will work exactly with your data, you might have to change some variable names, etc.
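To make the reshaping concrete, here is a minimal, self-contained sketch with a dense toy matrix standing in for adata.X (the HTO names and counts are made up; real 10x matrices are sparse, which is why the snippet above needs .toarray()):

```python
import numpy as np

# Toy stand-in for adata.X: 4 cells x 5 features, where the last two
# columns are hypothetical hashtag-oligo (HTO) counts named HTO1/HTO2.
X = np.array([
    [3, 0, 1, 50,  2],
    [0, 2, 0,  1, 40],
    [1, 1, 2, 60,  3],
    [0, 0, 1,  2, 55],
])
var_index = np.array(["GeneA", "GeneB", "GeneC", "HTO1", "HTO2"])
htos = ["HTO1", "HTO2"]

# Same boolean-mask column selection the snippet above applies to adata.X:
# each HTO's counts become a per-cell vector, ready to store as an .obs column.
obs = {hto: X[:, var_index == hto].ravel() for hto in htos}

print(obs["HTO1"])  # per-cell counts of HTO1: [50  1 60  2]
```

Note the `.ravel()`: column selection yields a 2-D array of shape (n_cells, 1), and flattening it to 1-D avoids shape complaints from some pandas versions when assigning to .obs.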

Thank you. It worked for my dataset. The semicolon might be a typo.
The code is as the following:

import scanpy as sc

adata = sc.read_10x_h5("filtered_feature_bc_matrix.h5", gex_only=False)
htos = list(adata.var[adata.var["feature_types"] == "Multiplexing Capture"].index.unique())

for hto in htos:
    adata.obs[hto] = adata.X[:, adata.var.index == hto].toarray()

import scanpy.external as sce

sce.pp.hashsolo(adata, ['hash1', 'hash2', 'hash3', 'hash4'])