-
On the VAE: why did you use 2 layers in the encoder and 3 layers in the decoder?
```
def encoder(x):
    # Encoder hidden layer with sigmoid activation #1
    layer_1 = tf.nn.sigmoid(tf.add(tf.matmul(x, weig…
ehfo0 updated
7 years ago
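For context, here is a minimal NumPy sketch of the two-layer sigmoid encoder shape the snippet above suggests. The layer sizes and weight names are hypothetical illustrations, not taken from the repository in question:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical dimensions for illustration only.
n_input, n_hidden_1, n_hidden_2 = 784, 256, 128

rng = np.random.default_rng(0)
weights = {
    "encoder_h1": rng.standard_normal((n_input, n_hidden_1)) * 0.01,
    "encoder_h2": rng.standard_normal((n_hidden_1, n_hidden_2)) * 0.01,
}
biases = {
    "encoder_b1": np.zeros(n_hidden_1),
    "encoder_b2": np.zeros(n_hidden_2),
}

def encoder(x):
    # Encoder hidden layer with sigmoid activation #1
    layer_1 = sigmoid(x @ weights["encoder_h1"] + biases["encoder_b1"])
    # Encoder hidden layer with sigmoid activation #2
    layer_2 = sigmoid(layer_1 @ weights["encoder_h2"] + biases["encoder_b2"])
    return layer_2

x = rng.standard_normal((4, n_input))
z = encoder(x)
print(z.shape)  # (4, 128)
```

The number of layers is a capacity/regularization trade-off rather than a hard requirement, which is presumably what the question is getting at.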
-
@hanzhanggit
@taoxugit
Please help me: what is the main problem behind this?
(base) H:\StackGAN\StackGAN-Pytorch-master\code>python main.py --cfg cfg/coco_eval.yml --gpu 0
Using config:
{'C…
-
Congratulations on shipping FNA backward! Looking forward to using it.
On another note: would it be possible to support arbitrary masking?
MaskDiT outperformed regular DiT, with a 70% reduction …
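For reference, masking in the MaskDiT style amounts to gathering only the visible tokens before the expensive transformer blocks and scattering the results back into the full sequence afterwards. A minimal NumPy sketch of that gather/scatter step follows; the function names and shapes are illustrative, not this repository's API:

```python
import numpy as np

def drop_masked_tokens(tokens, keep_mask):
    # Gather only the visible (unmasked) tokens, shortening the sequence.
    return tokens[keep_mask]

def scatter_back(visible_out, keep_mask, fill_value=0.0):
    # Restore the full sequence length, filling masked slots with a constant.
    out = np.full((keep_mask.shape[0], visible_out.shape[1]), fill_value)
    out[keep_mask] = visible_out
    return out

tokens = np.arange(12.0).reshape(4, 3)       # 4 tokens, 3 channels
keep = np.array([True, False, True, False])  # mask out half the tokens
visible = drop_masked_tokens(tokens, keep)   # shape (2, 3)
restored = scatter_back(visible * 2.0, keep) # compute runs on visible only
print(restored.shape)  # (4, 3)
```

Supporting an arbitrary boolean mask like `keep` above, rather than a fixed ratio or block pattern, is what the request amounts to.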
-
I've found that the MBConv block has some computational inconsistencies. The following corrected code works, where I've changed the stride of the projection operation (`self.proj`) and moved it out of the `i…
-
## Bug Description
Turning on survival biasing significantly reduces the effectiveness of weight windows. The first image shows the neutron flux across a block with a weight window applied and survival bi…
teade updated
9 months ago
-
Thank you for this great research!
I'm working on reproducing the THUMOS'14 results as you explained on GitHub, but I'm getting quite different numbers. Can you tell me what's wrong?
![image](https…
-
For any given origin, the expected destination will be towards the center of mass of the network (or its fringe).
While this is plausible for whole-city networks, it may introduce severe bias in pa…
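As an illustration of the claim above, the "center of mass" of a network can be read as the weighted centroid of its node coordinates (with, say, node degree as the weight). A minimal NumPy sketch with hypothetical node positions:

```python
import numpy as np

def center_of_mass(coords, weights=None):
    # Weighted centroid of node coordinates; weights could be node degree.
    coords = np.asarray(coords, dtype=float)
    w = np.ones(len(coords)) if weights is None else np.asarray(weights, dtype=float)
    return (coords * w[:, None]).sum(axis=0) / w.sum()

# Hypothetical node positions for illustration.
nodes = [(0, 0), (2, 0), (0, 2), (2, 2)]
print(center_of_mass(nodes))                # [1. 1.]
print(center_of_mass(nodes, [3, 1, 1, 1]))  # centroid pulled toward the heavy node
```

For a partial network, this centroid sits inside the sampled area rather than the true city center, which is the source of the bias being described.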
-
## Code:
https://www.kaggle.com/code/liuweiq/coincide-separation-detectron2-training
## Environment:
kaggle
## Error:
```
[07/16 13:14:44 d2.engine.defaults]: Model:
GeneralizedRCNN(
(back…
-
Thanks for your work.
I noticed that you mentioned pre-trained model weights in Section 4.1 of the article, and I'm interested in this part. How should I do this pre-training? What should I…
-
## 🚀 Feature
Loss functions in `torch.nn` module should support complex tensors whenever the operations make sense for complex numbers.
## Motivation
Complex Neural Nets are an active area of …
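As a sketch of what "make sense for complex numbers" could mean here, MSE generalizes naturally by taking the squared modulus of the residual, which keeps the loss real-valued and differentiable. A NumPy illustration (this is not the proposed `torch.nn` API, just the underlying math):

```python
import numpy as np

def complex_mse(pred, target):
    # MSE generalized to complex values: mean squared modulus of the residual.
    diff = pred - target
    # |z|^2 = z * conj(z) is real and non-negative, so the loss stays real.
    return np.mean((diff * diff.conj()).real)

pred = np.array([1 + 1j, 2 - 1j])
target = np.array([1 + 0j, 2 + 0j])
print(complex_mse(pred, target))  # 1.0
```

A real-valued scalar loss is the key requirement, since optimizers need a real quantity to minimize.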