viccc17 opened 8 months ago
Hi, the number of trainable parameters depends mainly on the vocabulary size, and the embedding layer holds by far the most parameters. For example, the 12-mer model has roughly 8M × 100 ≈ 800M parameters in the embedding layer alone; the parameter counts of the other layers are minor by comparison.
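To make the arithmetic above concrete, here is a minimal sketch; the vocabulary size and embedding dimension are the approximate figures quoted above, not exact model values:

```python
# Rough parameter budget for the embedding layer, assuming the
# figures quoted above (~8M k-mer vocabulary, embedding dim 100).
vocab_size = 8_000_000  # approximate 12-mer vocabulary size
embed_dim = 100

embedding_params = vocab_size * embed_dim
print(embedding_params)  # 800000000, i.e. ~800M parameters
```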
Thanks for your reply! Actually, I'm trying to print the total number of parameters of the model for kmer=12 using model.summary(). However, it seems the model is not defined with Keras, and I also can't get the total parameter count with tf.global_variables(). Can you help me with this? Thank you!
Maybe you can try: total_params = tf.reduce_sum([tf.reduce_prod(var.shape) for var in tf.trainable_variables()]). Note that this returns a tensor, so you would still need to evaluate it, e.g. sess.run(total_params). Alternatively, total_params = sum(np.prod(var.get_shape().as_list()) for var in tf.trainable_variables()) gives a plain Python integer without a session, provided all variable shapes are statically known.
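For reference, the counting logic behind that snippet is just a sum of per-variable element counts. A minimal NumPy-only sketch, with hypothetical variable shapes standing in for tf.trainable_variables() (the small non-embedding shapes are made up for illustration):

```python
import numpy as np

def count_params(shapes):
    """Total number of elements across a list of variable shapes."""
    return sum(int(np.prod(s)) for s in shapes)

# Hypothetical shapes: a large embedding matrix plus a few small layers.
shapes = [(8_000_000, 100), (100, 256), (256,), (256, 2)]
print(count_params(shapes))  # 800026368 -- the embedding dominates
```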
Hi! May I know the size of the DeepMicrobes model (the best performing one)? Like how many trainable parameters are there? Thank you!