Yes and no. Yes, because I obviously don't want to create the impression that mlrHyperopt is a bridge between mlr and the TPE part of hyperopt. No, because "hyperopt" is the obvious abbreviation for hyperparameter optimization, hyperopt itself is not a package devoted entirely to one specific optimization method, and it lives in the Python world.
I was working on a much less sophisticated optimization wrapper before I discovered mlrMBO: basically a grid search that iteratively zooms in on better-performing regions of the parameter space (a rough sketch is below). I was planning to call it "autotune", both because the name fits the problem we're trying to solve and because it's catchy and memorable; it's also what they call this technique in pop music recording: https://youtu.be/koQksuxzJ4w
Just a thought!
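For illustration, here is a minimal Python sketch of the zooming grid search described above. It is not mlr/mlrHyperopt code; the function name, signature, and defaults are all made up for this example:

```python
# Illustrative sketch only -- not mlr/mlrHyperopt code. The name
# `zooming_grid_search` and all parameters are hypothetical.
import numpy as np

def zooming_grid_search(objective, bounds, n_points=5, n_iters=4, zoom=0.5):
    """Grid-search a box, then repeatedly shrink the box around the best point."""
    lower = np.array([b[0] for b in bounds], dtype=float)
    upper = np.array([b[1] for b in bounds], dtype=float)
    lo, hi = lower.copy(), upper.copy()
    best_x, best_y = None, np.inf
    for _ in range(n_iters):
        # Regular grid over the current box, one axis per parameter.
        axes = [np.linspace(l, h, n_points) for l, h in zip(lo, hi)]
        grid = np.stack(np.meshgrid(*axes), axis=-1).reshape(-1, len(bounds))
        for x in grid:
            y = objective(x)
            if y < best_y:
                best_x, best_y = x, y
        # "Zoom in": centre a smaller box on the incumbent best point,
        # clipped so we never leave the original search space.
        half = (hi - lo) * zoom / 2.0
        lo = np.maximum(best_x - half, lower)
        hi = np.minimum(best_x + half, upper)
    return best_x, best_y

# Toy usage: minimise a shifted quadratic over [-5, 5] x [-5, 5].
x_best, y_best = zooming_grid_search(
    lambda v: (v[0] - 1.2) ** 2 + (v[1] + 0.7) ** 2,
    bounds=[(-5.0, 5.0), (-5.0, 5.0)],
)
print(x_best, y_best)
```

Each iteration evaluates a full grid over the current box and then shrinks the box around the best point found so far, which is what makes the search "zoom in" on promising regions.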
Because of this: https://github.com/hyperopt/hyperopt