
How To Force Larger Steps On Scipy.optimize Functions?

I have a function compare_images(k, a, b) that compares two 2D arrays a and b. Inside the function, I apply a gaussian_filter with sigma=k to a. My idea is to estimate how much I must …

Solution 1:

Quick check: you probably really meant fmin(compare_images, init_guess, (a,b))?

If gaussian_filter behaves as you say, your function is piecewise constant, meaning that optimizers relying on derivatives (i.e. most of them) are out. You can try a global optimizer like anneal (note that anneal has since been removed from SciPy; basinhopping or differential_evolution are the current alternatives), or a brute-force search over a sensible range of k's, as sketched below.
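For the brute-force route, scipy.optimize.brute handles the grid evaluation. A minimal sketch follows; the comparison metric (sum of squared differences), the 0-10 range for k, and the toy data are all assumptions, since the question doesn't show them:

    import numpy as np
    from scipy import optimize
    from scipy.ndimage import gaussian_filter

    def compare_images(k, a, b):
        # stand-in for the question's function: smooth a with sigma=k, then score
        return np.sum((gaussian_filter(a, sigma=k) - b) ** 2)

    def objective(params, a, b):
        k, = params                      # brute passes parameters as a 1-element array
        return compare_images(k, a, b)

    rng = np.random.default_rng(0)
    a = rng.random((64, 64))
    b = gaussian_filter(a, sigma=3.0)    # toy data: b is a smoothed copy of a

    # Grid-evaluate k from 0 to 10 in steps of 0.25. finish=None skips the
    # derivative-based local polish, which would stall on the plateaus anyway.
    k_best = optimize.brute(objective, ranges=(slice(0, 10, 0.25),),
                            args=(a, b), finish=None)
    print(k_best)                        # 3.0 for this toy data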

However, as you described the problem, in general compare_images will only have a clear global minimum if b is a smoothed version of a. Your approach makes sense if you want to determine the amount of smoothing of a that makes both images most similar.

If the question is "how similar are the images", then I think pixelwise comparison (maybe with a bit of smoothing) is the way to go. Depending on what images we are talking about, it might be necessary to align the images first (e.g. for comparing photographs). Please clarify :-)

edit: Another idea that might help: rewrite compare_images so that it calculates two versions of smoothed-a -- one with sigma=floor(k) and one with sigma=ceil(k) (i.e. round k to the next-lower/higher int). Then calculate a_smooth = a_floor*(1-kfrac) + a_ceil*kfrac, with kfrac being the fractional part of k. This way the compare function becomes continuous w.r.t. k.
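A minimal sketch of this interpolation trick (the sum-of-squared-differences metric is an assumption, since the question doesn't show the comparison itself):

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def compare_images(k, a, b):
        # Smooth a at the two neighbouring integer sigmas, then blend linearly,
        # so the result (and hence the score) varies continuously with k.
        k_lo, k_hi = np.floor(k), np.ceil(k)
        kfrac = k - k_lo                     # fractional part of k
        a_floor = gaussian_filter(a, sigma=k_lo)
        a_ceil = gaussian_filter(a, sigma=k_hi)
        a_smooth = a_floor * (1 - kfrac) + a_ceil * kfrac
        return np.sum((a_smooth - b) ** 2)   # assumed stand-in for the real metric

At integer k the two smoothed copies coincide, so the blend is seamless there.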

Good Luck!

Solution 2:

Basin hopping may do a bit better, as it has a good chance of continuing anyway when it gets stuck on the plateaus.

I found on this example function that it does reasonably well with a low temperature:

>>> import scipy.optimize as opt
>>> opt.basinhopping(lambda x: int(0.1*x[0]**2 + 0.1*x[1]**2), (5, -5), T=.1)
    nfev: 409
     fun: 0
       x: array([ 1.73267813, -2.54527514])
 message: ['requested number of basinhopping iterations completed successfully']
    njev: 102
     nit: 100

Solution 3:

I realize this is an old question, but I haven't been able to find much discussion of similar topics. I am facing a similar issue with scipy.optimize.least_squares. I found that xtol did not do me much good; it did not seem to change the step size at all. What made a big difference was diff_step. This sets the step size taken when numerically estimating the Jacobian according to the formula step_size = x_i * diff_step, where x_i is each independent variable. You are using fmin, so you aren't calculating Jacobians, but if you used another scipy function like minimize on the same problem, this might help you.
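A minimal sketch of diff_step with least_squares; the exponential model, starting point, and data here are invented for illustration:

    import numpy as np
    from scipy.optimize import least_squares

    def residuals(params, x, y):
        amp, rate = params
        return y - amp * np.exp(-rate * x)   # toy exponential-decay model

    x = np.linspace(0, 5, 50)
    y = 3.0 * np.exp(-0.7 * x)

    # diff_step sets the *relative* step for the finite-difference Jacobian:
    # each parameter x_i is perturbed by roughly x_i * diff_step. Raising it
    # far above the default (~1.5e-8 for 2-point differences) forces larger
    # probing steps, which helps when the objective is flat at fine scales.
    fit = least_squares(residuals, x0=[1.0, 1.0], args=(x, y), diff_step=0.05)
    print(fit.x)                             # approaches [3.0, 0.7]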
