【Optimization Method】Optuna Tutorial Part 1
In this article, let's look at how to use the Optuna hyperparameter tuning library.
1.1 What is Optuna
Official Page:
Official Tutorial:
Optuna is an open source hyperparameter optimization framework to automate hyperparameter search.
With Optuna, we can find better hyperparameters for machine learning models and other tasks.
This article is a summary of Optuna's official tutorial.
1.2 Example Code
・Example code
import optuna
def objective(trial):
    x = trial.suggest_float("x", -10, 10)
    return (x - 2) ** 2
study = optuna.create_study()
study.optimize(objective, n_trials=100)
best_params = study.best_params
found_x = best_params["x"]
print("Found x: {}, (x - 2)^2: {}".format(found_x, (found_x - 2) ** 2))
・Results
[I 2024-09-24 12:56:04,670] A new study created in memory with name: no-name-f940cbb6-9351-4924-9290-198f161da5f9
[I 2024-09-24 12:56:04,678] Trial 0 finished with value: 100.38536990882245 and parameters: {'x': -8.019249967378919}. Best is trial 0 with value: 100.38536990882245.
[I 2024-09-24 12:56:04,681] Trial 1 finished with value: 22.26164037703732 and parameters: {'x': 6.718224282188938}. Best is trial 1 with value: 22.26164037703732.
[I 2024-09-24 12:56:04,682] Trial 2 finished with value: 123.84835479116522 and parameters: {'x': -9.128717571722504}. Best is trial 1 with value: 22.26164037703732.
...
[I 2024-09-24 12:56:06,398] Trial 98 finished with value: 0.38090005805752836 and parameters: {'x': 1.3828289879964157}. Best is trial 69 with value: 0.0002501075714029701.
[I 2024-09-24 12:56:06,410] Trial 99 finished with value: 0.2405386846498442 and parameters: {'x': 2.4904474331157664}. Best is trial 69 with value: 0.0002501075714029701.
Found x: 1.9841852103585609, (x - 2)^2: 0.0002501075714029701
This is the example code. Here we look for the x that minimizes (x - 2)^2.
Let's go through its components one by one.
1.2.1 Objective Function
・Objective Function
def objective(trial):
    x = trial.suggest_float("x", -10, 10)
    return (x - 2) ** 2
The objective function has two main parts: the trial object and the returned score.
Trial
We need to define an objective function (any function name is fine).
It takes a trial object as an argument and returns something like a score.
The trial object is used to specify the search ranges of the variables.
x = trial.suggest_float("x", -10, 10)
x: the actual variable used in the calculation.
"x": the name of the variable, used later when checking the best value.
suggest_float: the type of variable to sample (a floating-point number here; other suggest methods are sketched below).
-10, 10: the search range of the variable.
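For reference, the trial object also provides suggest methods for other variable types. The sketch below is illustrative only: the parameter names ("n_layers", "lr", "kernel") and the dummy score are made up just to show the call signatures.

import optuna

def objective(trial):
    n_layers = trial.suggest_int("n_layers", 1, 5)                   # integer in [1, 5]
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)             # float sampled on a log scale
    kernel = trial.suggest_categorical("kernel", ["linear", "rbf"])  # choice from a fixed list
    # Dummy score so the sketch runs; in practice this would be a model's validation score.
    return (n_layers - 3) ** 2 + lr + (0.0 if kernel == "rbf" else 1.0)

study = optuna.create_study()
study.optimize(objective, n_trials=10)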
Score
The returned score tells Optuna how changes to the variables affect the result. By default, the optimization tries to minimize this score.
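As a side note, if the score is something you want to maximize (for example, accuracy), you can tell the study so when creating it. A minimal sketch, reusing the toy objective with its sign flipped:

import optuna

def objective(trial):
    x = trial.suggest_float("x", -10, 10)
    # Negated squared distance: larger is better, so we maximize instead of minimize.
    return -((x - 2) ** 2)

# direction="maximize" makes Optuna search for the largest returned value.
study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=100)
print(study.best_params)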
1.2.2 Study object
・Study object
study = optuna.create_study()
study.optimize(objective, n_trials=100)
The Optuna study object optimizes the objective function over n_trials trials.
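For reference, create_study and optimize also accept a few useful options. A small sketch, where the seed, trial count, and timeout values are arbitrary choices:

import optuna

def objective(trial):
    x = trial.suggest_float("x", -10, 10)
    return (x - 2) ** 2

# Fix the sampler's seed to make runs reproducible (42 is an arbitrary choice).
sampler = optuna.samplers.TPESampler(seed=42)
study = optuna.create_study(sampler=sampler)

# Stop after 100 trials or 60 seconds, whichever comes first.
study.optimize(objective, n_trials=100, timeout=60)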
The optimization is now complete, so let's check the results next.
1.2.3 Check the results
・Check the results
best_params = study.best_params
found_x = best_params["x"]
print("Found x: {}, (x - 2)^2: {}".format(found_x, (found_x - 2) ** 2))
study.best_params returns a dict with the best values of all variables that were registered through the trial object, so we can check the final result.
・best_params
{'x': 1.9841852103585609}
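Besides best_params, the study object also exposes the best score and the best trial itself. A short sketch, continuing from the study above:

# Best objective value found (the minimum of (x - 2)^2 here).
print(study.best_value)

# The best trial carries its number, value, and parameters.
best_trial = study.best_trial
print(best_trial.number, best_trial.value, best_trial.params)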
That's it. This is the simplest usage of Optuna.
1.3 Summary
This time, I explained the basic (level 1) usage of Optuna. I'll cover more advanced topics in upcoming articles.
That's the end of Optuna tutorial part 1. Thank you for reading, and I'd be glad if you also read the next article, which will be published the day after tomorrow.