
Refine the learning rate schedule #46

Open
alinaselega opened this issue Mar 4, 2015 · 5 comments
@alinaselega (Contributor)

Make the learning rate decay more slowly, as it may currently be getting too small.
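For reference, a decay-to-saturation schedule of the kind discussed in this thread can be sketched as follows. This is a minimal illustrative version, not the project's actual YAML config; the parameter names and defaults (`base_lr`, `saturate_epoch`, `decay_factor`) are assumptions:

```python
def decayed_lr(epoch, base_lr=0.1, saturate_epoch=25, decay_factor=0.01):
    """Linearly anneal the learning rate from base_lr down to
    base_lr * decay_factor, then hold it constant after saturate_epoch.

    Raising saturate_epoch (e.g. 25 -> 100) makes the decay slower,
    which is the change proposed in this issue.
    """
    frac = min(epoch / float(saturate_epoch), 1.0)
    return base_lr * (1.0 - frac) + base_lr * decay_factor * frac
```

With `saturate_epoch=25`, the rate reaches its floor (`0.001` here) after 25 epochs; quadrupling the saturation epoch keeps the rate larger for longer.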

@alinaselega alinaselega self-assigned this Mar 4, 2015
@alinaselega (Contributor, Author)

Testing a model based on alexnet_based.yaml with a larger epoch at which the learning rate (and momentum) saturate, and smaller scaling factors for the learning rate. Also monitoring valid_y_nll.

@alinaselega (Contributor, Author)

valid_y_nll looked quite low after only a few epochs, but the model scored slightly worse on the holdout set than the original alexnet_based model. After running for longer, the score actually worsened.

@alinaselega (Contributor, Author)

Now trying two models, both with saturating epoch = 100 (4x the original) and the original scaling factors for the learning rate (0.9 and 1.1). One model still monitors valid_y_nll; the second monitors valid_objective again. Results pending.
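The 0.9/1.1 factors suggest a monitor-based adjustment: shrink the learning rate when the monitored channel (valid_y_nll or valid_objective) worsens, grow it when it improves. A minimal sketch of that rule, assuming simple better/worse triggers and illustrative clipping bounds (the real training algorithm's trigger thresholds may differ):

```python
def adjust_lr(lr, current_obj, previous_obj,
              shrink_amt=0.9, grow_amt=1.1,
              min_lr=1e-6, max_lr=1.0):
    """Monitor-based learning-rate adjustment (illustrative only):
    multiply the rate by shrink_amt when the monitored objective
    worsens, by grow_amt when it improves, and clip to
    [min_lr, max_lr] so the rate cannot vanish or explode."""
    if current_obj > previous_obj:   # objective got worse -> smaller steps
        lr *= shrink_amt
    else:                            # objective improved -> slightly bolder
        lr *= grow_amt
    return min(max(lr, min_lr), max_lr)
```

Note that repeated shrinking compounds geometrically, which is consistent with the concern above that the rate "might be currently getting too small".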

@gngdb gngdb added this to the Pylearn2 models milestone Mar 4, 2015
@alinaselega (Contributor, Author)

The model that was monitoring valid_objective did a little better. Currently running a variation of the current best model (alexnet with an extra convolutional layer and 8-factor augmentation) with the epoch of learning-rate and momentum saturation set to 100.

@alinaselega (Contributor, Author)

The model didn't do better than the current best after 100 epochs.
