Closed by gauravkuppa 4 years ago
**Experiment Number:** 4.2
**Branch:** `master`
**Timestamp:** 09/12/2020 10am PT
**Epochs:** 1
**Module Parameters:** batch size: 4

*(Loss graphs: train and val curves)*
**Experiment Number:** 4.2.0
**Branch:** `master`
**Timestamp:** 09/14/2020 6pm PT
**Epochs:** finished

I'm extremely impressed by the validation loss curves for L1 and VGG.
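For context, a minimal sketch of how an L1 reconstruction term and a VGG perceptual term are typically combined into one training loss. This is illustrative only: the feature sequences stand in for activations from a frozen VGG network (omitted here), and the `vgg_weight` coefficient is an assumption, not a value taken from this repo.

```python
def l1_loss(pred, target):
    """Mean absolute error between two equal-length sequences."""
    return sum(abs(p - t) for p, t in zip(pred, target)) / len(pred)


def combined_loss(pred, target, pred_feats, target_feats, vgg_weight=1.0):
    """Pixel-space L1 plus an L1 'perceptual' term on VGG features.

    pred_feats/target_feats stand in for feature maps from a frozen VGG;
    in a real pipeline they are extracted inside the loss module.
    """
    pixel_term = l1_loss(pred, target)
    perceptual_term = l1_loss(pred_feats, target_feats)
    return pixel_term + vgg_weight * perceptual_term
```

Tracking the two terms separately in validation (as the curves above do) makes it easy to see whether improvements come from raw pixel fidelity or from the perceptual term.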
```shell
python train.py \
  --model unet \
  --batch 16 \
  --person_inputs densepose agnostic \
  --cloth_inputs cloth \
  --val_check_interval 0.05 \
  --self_attn \
  --accumulated_batches 4 \
  -j 12 \
  --activation gelu \
  --name 4.2.1 \
  --vvt_dataroot ~/data/fw_gan_vvt
```
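Since the variable under test here is `--activation gelu`, a quick reference for what that swap changes: GELU(x) = 0.5 · x · (1 + erf(x/√2)), a smooth alternative to ReLU that passes small negative values through with a small negative output instead of zeroing them. A minimal scalar implementation (illustrative; the actual model would use the framework's built-in activation):

```python
import math


def gelu(x):
    """Exact GELU: 0.5 * x * (1 + erf(x / sqrt(2)))."""
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))


def relu(x):
    """ReLU for comparison: hard zero below x = 0."""
    return max(0.0, x)
```

Unlike ReLU, GELU is differentiable everywhere and non-monotonic near zero (e.g. gelu(-0.5) is slightly negative rather than exactly 0), which is the behavioral difference this experiment is probing.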
To report a result, copy this into a comment below:
# Result Description
<!---
For Experiment Number, use "Major.minor.patch", e.g. 1.2.0.
Major.minor should match the [M.m] in the title.
Patch describes a bug fix (change in the code or branch).
-->
**Experiment Number:** 1.2.0
**Branch:** `master`
**Timestamp:** MM/DD/YYYY 9pm PT
**Epochs:**
# Architecture
**Model Layers:**
<!-- Paste the printed Model Layers -->
**Module Parameters:**
<!-- Paste the Params table -->
# Loss Graphs
<!--- Put detailed loss graphs here. Please include all graphs! -->
# Image Results
<!--- Put detailed image results here. Please include all images! Multiple screenshots is good. -->
# Comments, Observations, or Insights
<!--- Optional -->
# Description
Explain why we're running this and what we expect.

**Planned Start Date:**
**Depends on Previous Experiment?** Y/N

Effects of GELU on UNet training

# Train Command
# Report Results