-
Hi,
Currently, I have tested four algorithms from Stable Baselines on the Roboschool HumanoidFlagrunHarder task. My evaluation metric is the mean reward over 100 episodes. Basically: PPO2 is perfect, A2…
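The metric described above — mean total reward over 100 episodes — can be sketched as a plain evaluation loop. This is a generic sketch, not Stable Baselines' own helper; `model` and `env` are assumed to follow the old Gym / Stable Baselines interface (`predict` returning `(action, state)`, `step` returning `(obs, reward, done, info)`):

```python
def evaluate(model, env, n_episodes=100):
    """Return the mean total reward over n_episodes.

    Assumes the classic Gym / Stable Baselines interface:
    model.predict(obs) -> (action, state), env.step(a) -> (obs, r, done, info).
    """
    totals = []
    for _ in range(n_episodes):
        obs = env.reset()
        done, total = False, 0.0
        while not done:
            action, _state = model.predict(obs, deterministic=True)
            obs, reward, done, _info = env.step(action)
            total += float(reward)
        totals.append(total)
    return sum(totals) / n_episodes
```

Stable Baselines also ships an `evaluate_policy` helper that does essentially this; the loop is shown only to make the metric explicit.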
-
Hello,
I am attempting to perform pre-training and fine-tuning on the AtrialFibrillation dataset, but I am unable to locate the hyperparameters and corresponding performance metrics in the relevant…
-
Problem: I am unable to catch the TOutOfMemoryError with a try/except block while tuning hyperparameters with Optuna.
catboost version: 1.2.3
Operating System: Ubuntu
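A common pattern (a sketch, not a confirmed fix) is to wrap the training call inside the Optuna objective and convert the failure into a pruned or penalized trial. Note the caveat: if `TOutOfMemoryError` is thrown from CatBoost's native code without a Python mapping, no `except` clause will ever see it; catching `catboost.CatBoostError` (or a broad `Exception`) is the most a try/except can do. Here `train_once` is a hypothetical stand-in for the actual CatBoost fit-and-score call:

```python
def safe_objective(params, train_once):
    """Run one tuning trial; turn memory failures into a sentinel score.

    `train_once` is a hypothetical stand-in for fitting a CatBoost model
    with `params` and returning a validation score. In a real Optuna
    study, replace the except clause with `catboost.CatBoostError` and
    `raise optuna.TrialPruned()` so the sampler skips the configuration
    instead of crashing the whole study.
    """
    try:
        return train_once(params)
    except (MemoryError, RuntimeError) as err:
        print(f"trial failed: {err}")
        return float("inf")  # worst possible score for a minimized metric
```

If the native exception still escapes the except block, the usual fallback is to constrain the search space (e.g. cap `depth` or `max_ctr_complexity`) so OOM configurations are never sampled.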
-
# What
Hyperparameter optimization is the search for the set of hyperparameters of a machine learning model that yields the best performance.
# Possible Solution
Perform hyperparameter tuning us…
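As a minimal illustration of the idea, here is a self-contained random-search sketch: sample configurations from a search space, score each one, and keep the best. The quadratic `score` is a toy stand-in for a real validation metric, and the space/names are invented for the example:

```python
import random

def random_search(score, space, n_trials=50, seed=0):
    """Random-search hyperparameter optimization (maximization).

    space: {name: (low, high)} ranges to sample uniformly.
    Returns the best configuration found and its score.
    """
    rng = random.Random(seed)
    best_cfg, best_val = None, float("-inf")
    for _ in range(n_trials):
        cfg = {name: rng.uniform(lo, hi) for name, (lo, hi) in space.items()}
        val = score(cfg)
        if val > best_val:
            best_cfg, best_val = cfg, val
    return best_cfg, best_val

# Toy objective, maximized at lr == 0.1.
space = {"lr": (0.0, 1.0)}
best, val = random_search(lambda c: -(c["lr"] - 0.1) ** 2, space)
```

Libraries such as Optuna or scikit-learn's `RandomizedSearchCV` implement the same loop with smarter sampling and pruning.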
-
First of all, thanks for your work.
I ran the following experiment on my computer vision task:
I used SOAP in place of the AdamW and LAMB optimizers, and under the high learning rate setting, th…
-
For iNaturalist:
the learning rate is decayed by 0.1 at the 60th and 80th epochs (in the paper), but it is decayed at epochs 120 and 160 in your [code](https://github.com/Megvii-Nanjing/BBN/blob/7992e908842f5934f0d1ee3f430d…
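For reference, the paper's schedule for iNaturalist can be sketched as a standard multi-step decay (this is a generic sketch, not the repository's own scheduler):

```python
def step_lr(base_lr, epoch, milestones=(60, 80), gamma=0.1):
    """Learning rate after `epoch` epochs under multi-step decay.

    Multiplies base_lr by `gamma` once for every milestone already
    passed — e.g. milestones (60, 80) with gamma 0.1 gives the paper's
    iNaturalist schedule; (120, 160) matches the code as linked above.
    """
    factor = sum(1 for m in milestones if epoch >= m)
    return base_lr * (gamma ** factor)
```

In PyTorch this corresponds to `torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[60, 80], gamma=0.1)`.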
-
Hi,
I use u5 to train my own dataset (with pretrained weights: `--weights weights/yolov4l-mish.pt`). I notice the losses are very low, as shown below:
` Epoch gpu_mem GIoU obj cls t…
-
How sensitive are the hyperparameters to different ImageNet models? I have tested targeted attacks (with slight modifications to the attack loss and the instance update rule) on a DenseNet model, but the failur…
-
Hi,
Your work is truly impressive and insightful, and I’m genuinely interested in it! I’ve successfully implemented the code, but I haven't **been able to reproduce the exact same results for each …
-
Hi, first of all, great job! I was just wondering what the role of the tuners is. As far as I know, generators are used to create the new dataset with synthetic data, so what is the point of the tuners…