TY - UNPB

T1 - Simple Complexity Analysis of Direct Search

AU - Konecny, Jakub

AU - Richtarik, Peter

N1 - 16 pages, 2 algorithms, 1 table

PY - 2014/10/1

Y1 - 2014/10/1

N2 - We consider the problem of unconstrained minimization of a smooth function in the derivative-free setting. In particular, we study the pattern search method (of directional type). Despite relevant research activity spanning several decades, until recently no complexity guarantees---bounds on the number of function evaluations needed to find a satisfying point---were established for methods of this type. Moreover, existing complexity results require long proofs and the resulting bounds have a complicated form. In this paper we give a very brief and insightful analysis of pattern search for nonconvex, convex and strongly convex objective functions, based on the observation that what the literature calls an "unsuccessful step" is in fact a step that can drive the analysis. We match the existing results in their dependence on the problem dimension ($n$) and error tolerance ($\epsilon$), but the overall complexity bounds are much simpler, easier to interpret, and have better dependence on other problem parameters. In particular, we show that the number of function evaluations needed to find an $\epsilon$-solution is $O(n^2 /\epsilon)$ (resp. $O(n^2 \log(1/\epsilon))$) for the problem of minimizing a convex (resp. strongly convex) smooth function. In the nonconvex smooth case, the bound is $O(n^2/\epsilon^2)$, with the goal being the reduction of the norm of the gradient below $\epsilon$.

KW - math.OC

KW - cs.CC

M3 - Working paper

BT - Simple Complexity Analysis of Direct Search

PB - arXiv

ER -