I would like to know how you would set the parameters for the GBM model?

1)

I could optimize the parameters sequentially. First, using a large shrinkage and a small number of iterations (for faster computation), I would optimize n.minobsinnode, interaction.depth, bag.fraction, etc.

Then, keeping those parameters fixed, I would lower the shrinkage and find the optimal number of trees.
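The two-stage idea above could look roughly like this with the `gbm` package (a sketch only; the data frame `d`, outcome `y`, and all parameter values are illustrative, not recommendations):

```r
library(gbm)

# Stage 1: coarse fit with large shrinkage, used to compare
# structure parameters (interaction.depth, n.minobsinnode, ...).
fit1 <- gbm(y ~ ., data = d, distribution = "gaussian",
            shrinkage = 0.1, n.trees = 500,
            interaction.depth = 3, n.minobsinnode = 10,
            bag.fraction = 0.5, cv.folds = 5)

# Stage 2: keep the chosen structure, lower the shrinkage,
# and pick the optimal number of trees by cross-validation.
fit2 <- gbm(y ~ ., data = d, distribution = "gaussian",
            shrinkage = 0.001, n.trees = 20000,
            interaction.depth = 3, n.minobsinnode = 10,
            bag.fraction = 0.5, cv.folds = 5)
best.iter <- gbm.perf(fit2, method = "cv")
```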

2)

Optimize everything at the same time, running a large number of scenario simulations (i.e., a full grid search).

For example, is it possible that the optimal interaction.depth takes different values for shrinkage 0.01 and 0.001?

If so, it would not be valid to fix some parameters while optimizing the others…


#### Best Answer

In theory, smaller shrinkage should always give a better result, but at the expense of requiring more iterations. When I have used GBM I have assumed this is true (admittedly without testing it rigorously).

However, I wouldn't assume that the optimal set of interaction.depth, n.minobsinnode, etc. is the same for all values of the shrinkage (after optimising the number of iterations each time). Possibly this is exactly true in some cases, and it is probably roughly true in most. If you want to squeeze out the best possible performance without regard to computation cost, I would not make this assumption at the outset.
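If you do want to drop that assumption, a joint grid search (approach 2 in the question) over shrinkage, interaction.depth, and the number of trees lets you check directly whether the optimal depth differs across shrinkage values. A sketch using the caret package, assuming a data frame `d` with a numeric outcome `y` (all grid values are illustrative):

```r
library(caret)
library(gbm)

set.seed(1)
grid <- expand.grid(
  shrinkage         = c(0.1, 0.01, 0.001),
  interaction.depth = c(1, 3, 5),
  n.minobsinnode    = c(5, 10),
  n.trees           = seq(500, 5000, by = 500)
)

fit <- train(
  y ~ ., data = d,
  method    = "gbm",
  tuneGrid  = grid,
  trControl = trainControl(method = "cv", number = 5),
  verbose   = FALSE
)

fit$bestTune  # best combination found
fit$results   # compare: does the best interaction.depth change with shrinkage?
```

Note that bag.fraction is not part of caret's tuning grid for `gbm`, so it would have to be varied in an outer loop if you want to include it.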

### Similar Posts:

- Solved – Role of n.minobsinnode parameter of GBM in R
- Solved – Change settings in the prediction model (caret package)
- Solved – a good number of treedepth saturations for a fit stan model