asd
- asd(function, x, args=None, stepsize=0.1, sinc=2, sdec=2, pinc=2, pdec=2, pinitial=None, sinitial=None, xmin=None, xmax=None, maxiters=None, maxtime=None, abstol=1e-06, reltol=0.001, stalliters=None, stoppingfunc=None, randseed=None, label=None, verbose=1, minval=0, **kwargs)
Optimization using adaptive stochastic descent (ASD). Can be used as a faster and more powerful alternative to e.g. scipy.optimize.minimize().

ASD starts at x0 and attempts to find a local minimizer x of the function func(). func() accepts input x and returns a scalar function value evaluated at x. x0 can be a scalar, list, or Numpy array of any size.

- Parameters:
function (func) – The function to minimize
x (arr) – The vector of initial parameters
args (any) – List, tuple, or dictionary of additional parameters to be passed to the function
kwargs (dict) – Additional keywords passed to the function
stepsize (0.1) – Initial step size as a fraction of each parameter
sinc (2) – Step size learning rate (increase)
sdec (2) – Step size learning rate (decrease)
pinc (2) – Parameter selection learning rate (increase)
pdec (2) – Parameter selection learning rate (decrease)
pinitial (None) – Set initial parameter selection probabilities
sinitial (None) – Set initial step sizes; if empty, calculated from stepsize instead
xmin (None) – Min value allowed for each parameter
xmax (None) – Max value allowed for each parameter
maxiters (1000) – Maximum number of iterations (1 iteration = 1 function evaluation)
maxtime (3600) – Maximum time allowed, in seconds
abstol (1e-6) – Minimum absolute change in objective function
reltol (1e-3) – Minimum relative change in objective function
stalliters (10*n) – Number of iterations over which to evaluate the abstol/reltol stopping criteria (n = number of parameters)
stoppingfunc (None) – External function that can be used to stop the calculation from outside the optimization loop
randseed (None) – The random seed to use
label (None) – A label to use to annotate the output
verbose (1) – How much information to print during the run (max 3); values less than 1 print an update only once every 1/verbose iterations
minval (0) – Minimum value the objective function can take
- Returns:
objdict (see below)

The returned object is an objdict, which can be accessed by index, key, or attribute. Its keys/attributes are:

x – The parameter set that minimizes the objective function
fval – The value of the objective function at the final iteration
exitreason – Why the algorithm terminated
details – See below
The details key consists of:

fvals – The value of the objective function at each iteration
xvals – The parameter values at each iteration
probabilities – The probability of each step
stepsizes – The size of each step for each parameter
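For instance, a returned result might be inspected as follows; a minimal sketch, assuming details can be accessed by attribute in the same way as the parent objdict:

    import numpy as np
    import sciris as sc

    result = sc.asd(np.linalg.norm, [1, 2, 3]) # Minimize the vector norm starting from [1, 2, 3]
    print(result.x)                  # Best parameter set found
    print(result['fval'])            # Same object accessed by key: final objective value
    print(result.exitreason)         # Why the algorithm terminated
    print(len(result.details.fvals)) # Number of recorded objective values (one per iteration)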
Examples:

    # Basic usage
    import numpy as np
    import sciris as sc
    result = sc.asd(np.linalg.norm, [1, 2, 3])
    print(result.x)

    # With arguments
    def my_func(x, scale=1.0, weight=1.0): # Example function with keywords
        return abs(x[0] - 1) + abs(x[1] + 2)*scale + abs(x[2] + 3)*weight

    result = sc.asd(my_func, x=[0, 0, 1], args=[0.5, 0.1])                  # Option 1 for passing arguments
    result = sc.asd(my_func, x=[0, 0, 1], args=dict(scale=0.5, weight=0.1)) # Option 2 for passing arguments
    result = sc.asd(my_func, x=[0, 0, 1], scale=0.5, weight=0.1)            # Option 3 for passing arguments
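The bound and iteration-limit parameters listed above can be combined with any of these call styles. A minimal sketch, assuming xmin and xmax accept one value per parameter as their descriptions suggest, and reusing the illustrative my_func from above:

    # Constrain each parameter to [-5, 5] and cap the run at 500 iterations
    result = sc.asd(my_func, x=[0, 0, 1], xmin=[-5, -5, -5], xmax=[5, 5, 5], maxiters=500)
    print(result.exitreason) # Reports which stopping criterion (tolerance, iterations, time, etc.) ended the run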
Please use the following citation for this method:
CC Kerr, S Dura-Bernal, TG Smolinski, GL Chadderdon, DP Wilson (2018). Optimization by adaptive stochastic descent. PLOS ONE 13 (3), e0192944.
New in version 3.0.0: Uses its own random number stream
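Because the optimizer draws from its own random number stream, passing randseed should make runs repeatable without disturbing NumPy's global random state. A minimal sketch, assuming the objective function itself is deterministic:

    import numpy as np
    import sciris as sc

    r1 = sc.asd(np.linalg.norm, [1, 2, 3], randseed=4)
    r2 = sc.asd(np.linalg.norm, [1, 2, 3], randseed=4)
    assert np.allclose(r1.x, r2.x) # Identical seeds should give identical trajectories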