Implementation of (1 + lambda) #288
No, it does not. Indeed, the closest functionality provided would be using the option
I don't quite understand what this means, or why it implies that (1+1)-selection would be better. My first approach to finding a local optimum quickly would be to set a relatively small initial step-size, the third argument of
Thanks for the information. As to the second part, I can elaborate a little on our specific use case. Our problem involves testing several interactions between a user and a robotic device, and we want to find the best device parameters for a given user as defined by our objective function. Some key points
It's always a good idea to use
This looks like one of the worst-case scenarios for having elitism without reevaluations.
I'd say so.
The main "issue" would be that the step-size becomes too small when either the landscape is not moving for some time, or when it is moving too fast (which can then appear as "noise"), where it should instead increase. Using the implemented noise handling should at least address the second scenario. If you have control over the dynamic changes, it also seems highly advisable to make changes only between CMA-ES iterations and not within one.
Thank you for the responses, this has been very helpful. I did have a few more general questions, if you'd be willing. In general, I understand CMA-ES is quite good (and generally consistent) on higher-dimensional problems, and is more tolerant of potentially dynamic landscapes than a method like Bayesian optimization. However, do you have any recommendations for a system with a low-dimensional search space (<10) and few function evaluations (~100-150) that still retains CMA-ES's higher tolerance for dynamic landscapes? Thanks!
Generally, some tweaked versions of Nelder-Mead are very good in low dimension, in particular below five, and SLSQP (available in
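For concreteness, both methods mentioned here are available through `scipy.optimize.minimize`; the toy 4-D objective and the ~150-evaluation cap below are illustrative assumptions, not part of the thread:

```python
import numpy as np
from scipy.optimize import minimize


def objective(x):
    # smooth toy objective; a hypothetical stand-in for the real problem
    return float(np.sum((x - 1.0) ** 2))


x0 = np.zeros(4)

# Nelder-Mead, capped near the evaluation budget mentioned above (~150)
res_nm = minimize(objective, x0, method="Nelder-Mead",
                  options={"maxfev": 150})

# SLSQP, a sequential-quadratic-programming method for smooth problems
res_slsqp = minimize(objective, x0, method="SLSQP")
```

Note the trade-off implied in the reply: SLSQP exploits (approximate) gradients and is very evaluation-efficient on smooth landscapes, while Nelder-Mead is derivative-free and somewhat more robust to mild irregularities.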
Does the current code base provide an implementation of (1 + lambda) or (1 + 1) CMA-ES? I know the recommended implementation is the standard (mu, lambda); however, I would like to evaluate our problem with the other two methods, as we have a higher tolerance for suboptimal local optima. In the current CMAOptions(), I see there are settings for parent number and elitism, but I'm unsure whether this will utilize the step-size adaptation described in this paper, Section 2.1, Algorithm 1, Procedure updateStepSize. Any guidance on this would be appreciated!
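For reference, the success-rule step-size update in the spirit of the cited Procedure updateStepSize can be sketched in a toy (1+1)-ES; everything below (function names, constants, the sphere objective) is an illustrative reconstruction, not pycma code:

```python
import numpy as np


def one_plus_one_es(f, x0, sigma0, max_evals=2000, seed=1):
    """Toy (1+1)-ES with a smoothed success-rule step-size update,
    in the spirit of Procedure updateStepSize; not part of pycma."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    sigma = float(sigma0)
    n = len(x)
    d = 1.0 + n / 2.0      # step-size damping factor
    p_target = 2.0 / 11.0  # target success probability
    c_p = 1.0 / 12.0       # smoothing rate for the success probability
    p_succ = p_target
    for _ in range(max_evals):
        y = x + sigma * rng.standard_normal(n)
        fy = f(y)
        success = 1.0 if fy <= fx else 0.0  # elitist ("plus") selection
        # updateStepSize: smooth the success indicator, then adapt sigma;
        # sigma grows when successes exceed the target rate, shrinks otherwise
        p_succ = (1.0 - c_p) * p_succ + c_p * success
        sigma *= np.exp((p_succ - p_target) / (d * (1.0 - p_target)))
        if success:
            x, fx = y, fy
    return x, fx, sigma


def sphere(x):
    return float(np.dot(x, x))


xbest, fbest, sigma_final = one_plus_one_es(sphere, [1.0] * 5, 0.5)
```

The key difference from the comma-strategy's path-based step-size control is that sigma here reacts only to the offspring success rate, which is what makes the elitist variant behave differently on noisy or dynamic landscapes.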