AltschulerWu-Lab / scScope


demo for tensorflow CPU #3

Closed MatDal closed 3 years ago

MatDal commented 5 years ago

Hello, can you provide also a demo for tensorflow CPU version?

Thanks a lot! Mattia

feng-bao-ucsf commented 5 years ago

Dear Mattia,

Thanks for your interest in our work. To run scScope on CPU, you need the following setup:

  1. Install "tensorflow" (instead of "tensorflow-gpu"): pip install tensorflow.
  2. Install the scScope CPU version: pip install scScope-cpu.
  3. After that you can run the demo below; it only takes a few minutes to finish with the dataset we provide on the project page (https://github.com/AltschulerWu-Lab/scScope). An optional check that the CPU build is active is sketched right after this list, followed by the full demo.
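
If you want to confirm that TensorFlow is running purely on the CPU before you start, a quick optional check (device_lib is a standard TensorFlow utility module; this snippet is just a sketch and not part of the demo) is:

'''Optional: check that TensorFlow only sees CPU devices.'''
from tensorflow.python.client import device_lib

# On the CPU-only build, this list should contain only CPU entries (no '/device:GPU:...').
print(device_lib.list_local_devices())

The full demo script: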

'''run scScope on CPU.'''
import scscope as DeepImpute
import pandas as pd
import phenograph
import pickle
from sklearn.metrics.cluster import adjusted_rand_score
import numpy as np

# For this demo we normalize data using scanpy, which is not a required package for scScope.
# To install, use: pip install scanpy
import scanpy as sc  # the older 'scanpy.api' entry point is deprecated

def RUN_MAIN():

    # 1. Load gene expression matrix of simulated data
    # gene expression with simulated dropouts
    counts_drop = pd.read_csv('counts_1.csv', header=0, index_col=0)
    # ground-truth subpopulation assignment
    cellinfo = pd.read_csv('cellinfo_1.csv', header=0, index_col=0)

    group = cellinfo.Group
    label_ground_truth = []
    for g in group:
        g = int(g.split('Group')[1])
        label_ground_truth.append(g)

    # 2. Normalize gene expression with scanpy (normalize each cell to the same library size)
    # matrix of cells x genes
    gene_expression = sc.AnnData(counts_drop.values)
    # normalize each cell to have the same total count
    sc.pp.normalize_per_cell(gene_expression)
    # extract the normalized expression matrix from the AnnData object
    gene_expression = gene_expression.X

    # dimension of the latent representation learned by scScope
    latent_dim = 50

    # 3. scScope learning
    if gene_expression.shape[0] >= 100000:
        DI_model = DeepImpute.train(
            gene_expression, latent_dim, T=2, batch_size=512, max_epoch=10)
    else:
        DI_model = DeepImpute.train(
            gene_expression, latent_dim, T=2, batch_size=64, max_epoch=300)

    # 4. latent representations and imputed expressions
    latent_code, imputed_val, _ = DeepImpute.predict(
        gene_expression, DI_model)

    # 5. graph clustering
    if latent_code.shape[0] <= 10000:
        label, _, _ = phenograph.cluster(latent_code)
    else:
        label = DeepImpute.scalable_cluster(latent_code)

    # 6. evaluate clustering performance against the ground-truth labels
    ARI = adjusted_rand_score(label_ground_truth, label)
    print('Adjusted Rand Index:', ARI)

if __name__ == '__main__':
    RUN_MAIN()
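
If you want to keep the results for later analysis, one option (just a sketch; the file name is arbitrary) is to dump them with pickle, which the demo already imports, by adding the following at the end of RUN_MAIN():

    # Optional: save latent representations, imputed expressions and cluster labels.
    with open('scscope_cpu_demo_results.pkl', 'wb') as f:
        pickle.dump({'latent_code': latent_code,
                     'imputed_val': imputed_val,
                     'cluster_label': label}, f)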

Hope this could help.

Best, Feng