Julia HP optimization packages:
- Hyperopt.jl @baggepinnen (random search, Latin hypercube sampling, Bayesian optimization; see the sketch after this list)
- TreeParzen.jl (a Julia port of Python's Hyperopt) @IQVIA-ML @iqml
- NaiveGAflux.jl (genetic-algorithm architecture search for Flux models) @DrChainsaw
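For reference, a minimal sketch of Hyperopt.jl's macro interface; the objective `f` and the candidate ranges below are made up for illustration:

```julia
using Hyperopt

# Toy objective, minimized near a ≈ 3, c ≈ 1 (illustrative only).
f(a, c) = (a - 3)^2 + abs(log10(c))

ho = @hyperopt for i = 50,                      # 50 evaluations
        sampler = RandomSampler(),              # other samplers (e.g. Latin hypercube) are available
        a = LinRange(1, 5, 1000),               # linear grid of candidates
        c = exp10.(LinRange(-1, 3, 1000))       # log-spaced candidates
    f(a, c)                                     # value returned is what gets minimized
end

ho.minimizer, ho.minimum                        # best (a, c) and its objective value
```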
Other HP optimization packages:
- Hyperopt.py (Hyperopt-sklearn.py)
- Optuna.py
- mlaut.py (Benchmark paper)
- Ax.py
- TPOT.py
- AutoKeras.py
- AutoML org: SMAC3.py, AutoSKLearn.py, AutoPyTorch.py, AutoWeka.java (unmaintained)
- H2O (Python & R)
- mlrMBO.r: AutoXGBoost.r (uses mlrMBO) (paper), tuneRanger.r (uses mlrMBO)
- liquidSVM (R, Python, MATLAB / Octave, Java, Spark. Paper.)
- Gama.py
- oboe.py
- Spearmint.py Bayesian optimization (unmaintained)
- Google Vizier (unmaintained)
- Katib
- GPyOpt.py
- Autotune (SAS)
- Tune
There are projects that benchmark different AutoML systems: https://openml.github.io/automlbenchmark/
From our conversation: JuliaAI/MLJ.jl#416 (comment)
I wanted to tell you all about Optuna (repo & paper), a new framework for HP optimization.
A nice comparison with Hyperopt shows what can be done for HP visualization:
https://neptune.ai/blog/optuna-vs-hyperopt
A 3-minute clip: https://www.youtube.com/watch?v=-UeC4MR3PHM
It would really be amazing for MLJ to incorporate this!
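For context, MLJ already exposes a pluggable tuning entry point via `TunedModel`, which is where an Optuna-style strategy could slot in. A hedged sketch of what that interface looks like today, using random search; the model and range choices are illustrative, not a proposal:

```julia
using MLJ

X, y = @load_iris                                  # small built-in demo dataset
Tree = @load DecisionTreeClassifier pkg=DecisionTree
tree = Tree()

# Declare the hyperparameter search range (illustrative choice).
r = range(tree, :max_depth, lower=1, upper=10)

tuned = TunedModel(model=tree,
                   tuning=RandomSearch(),          # the strategy slot an Optuna-style TPE could fill
                   range=r,
                   resampling=CV(nfolds=5),
                   measure=log_loss)

mach = machine(tuned, X, y)
fit!(mach)
report(mach).best_model                            # best hyperparameters found
```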