AugurWorks / UI

Augurworks UI code

NN Configuration Updates #189

Open · augurworks1 opened this issue 9 years ago

augurworks1 commented 9 years ago

I've been running several tests over the last few weeks. It seems 2500 rounds and 4 layers is the best combo. The results using 5 layers rarely go negative, and 3 layers tends to be better than 5, but 4 seems to be the best yet. Also, more training rounds seem to track the stock better. See below.

Results: Rounds=1000, Depth=4 (attached chart: cms 9feb 1000-4 -1.jpg)

Rounds=2500, Depth=4 (attached chart: cms 9feb 2500-4 -1.jpg)
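
For anyone reproducing this comparison, here's a minimal sketch of the kind of rounds-by-depth sweep described above. The `train_net` and `score_against_stock` callables and the parameter names are placeholders for illustration, not the actual Alfred API.

```python
# Hypothetical sweep over (rounds, depth) combinations: train a net for each
# combo and compare how closely its prediction tracks the stock.
# Function and parameter names are assumptions, not AugurWorks/Alfred code.
from itertools import product

ROUNDS_OPTIONS = [1000, 2500, 5000]
DEPTH_OPTIONS = [3, 4, 5]

def run_sweep(train_net, score_against_stock, data):
    """train_net(data, rounds, depth) -> model;
    score_against_stock(model, data) -> float (higher = tracks stock better)."""
    results = {}
    for rounds, depth in product(ROUNDS_OPTIONS, DEPTH_OPTIONS):
        model = train_net(data, rounds=rounds, depth=depth)
        results[(rounds, depth)] = score_against_stock(model, data)
    best_combo = max(results, key=results.get)
    return best_combo, results
```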

safreiberg commented 9 years ago

Is this on Dev or Prod? Curious about whether the sigmoid normalization seems to be making a difference in the positive or negative direction. Also, do we know how long a net takes to train with a depth of 4 and 2500 training rounds?

augurworks1 commented 9 years ago

Both have similar characteristics. I haven't seen much variance so far between the sigmoid and linear settings. I have run as high as 5000 training rounds, but it seems like diminishing returns.

Actually, after I restart alfred, 8 inputs and a 1-month date range takes about 15 minutes!! Using sentiment takes a bit longer.
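
For context on the sigmoid-vs-linear question, here's a rough sketch of what the two normalization settings typically mean. These are generic formulas for illustration only, not necessarily what Alfred does internally.

```python
# Illustrative comparison of linear vs. sigmoid input normalization.
# Generic formulas; not claimed to match Alfred's implementation.
import math

def linear_normalize(xs):
    """Scale values linearly into [0, 1] using min/max."""
    lo, hi = min(xs), max(xs)
    span = (hi - lo) or 1.0  # guard against a constant series
    return [(x - lo) / span for x in xs]

def sigmoid_normalize(xs):
    """Squash z-scored values through a sigmoid into (0, 1)."""
    mean = sum(xs) / len(xs)
    std = (sum((x - mean) ** 2 for x in xs) / len(xs)) ** 0.5 or 1.0
    return [1.0 / (1.0 + math.exp(-(x - mean) / std)) for x in xs]
```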

augurworks1 commented 9 years ago

This was prod