lucidrains / enformer-pytorch

Implementation of Enformer, Deepmind's attention network for predicting gene expression, in Pytorch
MIT License

AssertionError: if using tf gamma, only sequence length of 1536 allowed for now #40

Closed jerome-f closed 8 months ago

jerome-f commented 8 months ago

@lucidrains I think this is not getting set properly. According to the docs, from_pretrained should set use_tf_gamma, so the pretrained model should be able to accept a sequence length of 196_608? Or am I missing something?

import torch
import polars as pl
from torch import nn
from enformer_pytorch import from_pretrained, GenomeIntervalDataset
from enformer_pytorch.finetune import HeadAdapterWrapper

enformer = from_pretrained('EleutherAI/enformer-official-rough')

seq = torch.randint(0, 5, (1, 196_608))  # dummy sequence of length 196_608

enformer(seq)

Running the above raises the AssertionError in the title.
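For context, the 1536 in the assertion message likely refers to Enformer's internal attention length rather than the raw input length: the published model takes 196,608 bp and the convolutional stem pools it down before the transformer runs. A quick sanity check of that arithmetic (the 128x pooling factor is an assumption based on the Enformer paper, not something stated in this thread):

```python
# Enformer's transformer operates on the sequence *after* the conv stem
# pools it down; with a 196,608 bp input and (assumed) 128x total pooling,
# the internal length is exactly the 1536 mentioned in the assertion.
input_bp = 196_608       # full input length the pretrained model expects
pool_factor = 128        # total downsampling of the conv stem (assumption)
internal_len = input_bp // pool_factor

print(internal_len)  # 1536
```

So the assertion is about the length seen by the tf-gamma positional embedding inside the model, not a claim that you may only feed 1,536 bp.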

abearab commented 1 month ago

hmm, I'm getting a similar error:

AssertionError: if using tf gamma, only sequence length of 1536 allowed for now

@jerome-f – how did you fix this?

abearab commented 1 month ago

Never mind ... I had to pass use_tf_gamma = False to from_pretrained(), i.e. from_pretrained('EleutherAI/enformer-official-rough', use_tf_gamma = False) — that makes the model accept the full-length input again.