bowang-lab / scGPT

https://scgpt.readthedocs.io/en/latest/
MIT License

Where can I find the code for Supplementary Figure 1 #162

Open RYY0722 opened 6 months ago

RYY0722 commented 6 months ago

Hi, is there any update on implementing the generative attention masking? Could you please also provide some explanation in https://github.com/bowang-lab/scGPT/blob/dev-temp/examples/pretrain.py of how the mechanism in Fig. S1 is implemented? Thanks a lot!

subercui commented 6 months ago

Hi, yes, the generative attention masking is implemented in the dev-temp branch. The specific code is here: https://github.com/bowang-lab/scGPT/blob/dev-temp/scgpt/model/flash_layers.py#L19. It takes two parts of input, `pcpt_total_embs` and `gen_total_embs`. The first corresponds to the genes whose expression values are given as input, and the second to the genes whose expression values are to be predicted. We have also recently been working on a faster version that uses the Triton flash attention; we are testing it and will release it soon.
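
To illustrate the idea described above, here is a minimal sketch of how such a two-part ("perception" vs. "generation") attention mask could be constructed in PyTorch. This is not the repository's implementation (see `flash_layers.py` linked above for that); the function name `build_generative_attention_mask` and the sizes `n_pcpt` / `n_gen` are hypothetical, and the exact masking rules in scGPT may differ.

```python
import torch

def build_generative_attention_mask(n_pcpt: int, n_gen: int) -> torch.Tensor:
    """Boolean attention mask for one cell, sketching generative masking.

    Rows are query positions, columns are key positions; True = attention allowed.
    Layout: the first n_pcpt positions are "perception" genes (expression given),
    the last n_gen positions are "generation" genes (expression to be predicted).
    """
    n_total = n_pcpt + n_gen
    mask = torch.zeros(n_total, n_total, dtype=torch.bool)

    # Perception genes attend freely to all perception genes.
    mask[:n_pcpt, :n_pcpt] = True

    # Generation genes attend to the perception genes ...
    mask[n_pcpt:, :n_pcpt] = True

    # ... and each generation gene attends to itself, but not to the other
    # to-be-predicted genes, so predictions do not leak between masked positions.
    gen_idx = torch.arange(n_pcpt, n_total)
    mask[gen_idx, gen_idx] = True

    return mask


if __name__ == "__main__":
    # Toy sizes: 4 genes with observed expression, 3 genes to predict.
    mask = build_generative_attention_mask(n_pcpt=4, n_gen=3)
    print(mask.int())
    # Such a mask (True = keep) can be passed as attn_mask to
    # torch.nn.functional.scaled_dot_product_attention once broadcast
    # to the batch and head dimensions.
```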

RYY0722 commented 6 months ago

Thank you!