Closed: wentaoyuan closed this issue 4 years ago
@wentaoyuan Hello! The reason is that at initialization, we don't yet know the dimensions of the representation emitted by the designated hidden layer (which is fed into the projection network).
I decided to lazily instantiate it to save users from having to run a torch summary and specify the dimension manually.
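For anyone landing here, a minimal sketch of the lazy-instantiation pattern being described (the class name `LazyProjectionWrapper`, the MLP sizes, and the layer choices are illustrative assumptions, not the library's actual code):

```python
import torch
import torch.nn as nn

class LazyProjectionWrapper(nn.Module):
    """Builds the projection MLP on the first forward pass, once the
    hidden representation's dimensionality is actually known."""

    def __init__(self, projection_dim=256, hidden_dim=4096):
        super().__init__()
        self.projection_dim = projection_dim
        self.hidden_dim = hidden_dim
        self.projector = None  # not built yet; input dim is unknown at init

    def _build_projector(self, input_dim, device):
        # Only now do we know the size of the incoming representation,
        # so the MLP can be constructed without a manual torch summary.
        return nn.Sequential(
            nn.Linear(input_dim, self.hidden_dim),
            nn.BatchNorm1d(self.hidden_dim),
            nn.ReLU(inplace=True),
            nn.Linear(self.hidden_dim, self.projection_dim),
        ).to(device)

    def forward(self, representation):
        if self.projector is None:  # singleton: constructed exactly once
            self.projector = self._build_projector(
                representation.shape[-1], representation.device
            )
        return self.projector(representation)
```

One consequence of this design: the projector's parameters do not exist until the first forward pass, so anything that enumerates parameters up front (e.g. constructing an optimizer) has to happen after a sample batch has been passed through.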
That's brilliant. Thanks!
Glad you like it! Go make some neural nets smarter :)
Forgive me for my unfamiliarity with software design, but I'm wondering why it is necessary to write a singleton wrapper for `projector` and `target_encoder`. Is there any disadvantage to initializing them in `__init__`?