The Study and Improvement of One-Dimensional
Search in Nonlinear Optimization
Yuhuan Cui, Jingguo Qu, and Weiliang Zhu
Qinggong College, Hebei United University, Tangshan, China
Email: qujingguo@163.com
Abstract—This paper introduces the goal of one-dimensional search, the search interval, and the corresponding solution methods; it improves on the basic methods and builds a faster one-dimensional search method covering both inexact and exact search. The paper then summarizes a method of global optimization and further compares and discusses its convergence. Finally, the effectiveness of the method is checked by applying it to specific examples.
Index Terms—One-Dimensional Search; Global Optimization; Convergence; Inexact Search; Exact Search
I. INTRODUCTION
The one-dimensional search method is a basic tool for solving nonlinear optimization problems, and finding a fast, efficient one-dimensional search is a fundamental issue. At present there are many one-dimensional search methods, which can be grouped into two categories [1]: (1) trial methods, such as the golden section method and the bisection method; (2) function approximation methods, such as Newton's tangent method, the quadratic interpolation method, and the rational interpolation method. This paper introduces a hybrid method safeguarded by an artificial interval, which combines the trial and function approximation approaches.
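The two categories can be illustrated with short sketches: the golden section method as a typical trial method, and Newton's tangent method as a typical function approximation method. This is illustrative code only, not the hybrid method of this paper; the test function f(x) = (x - 2)^2 and its search interval are assumptions.

```python
import math

def golden_section(f, a, b, tol=1e-6):
    """Trial method: shrink [a, b] by the golden ratio until |b - a| < tol."""
    r = (math.sqrt(5) - 1) / 2              # golden ratio ~ 0.618
    x1, x2 = b - r * (b - a), a + r * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 > f2:                         # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + r * (b - a)
            f2 = f(x2)
        else:                               # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - r * (b - a)
            f1 = f(x1)
    return (a + b) / 2

def newton_1d(df, d2f, x0, tol=1e-10, max_iter=50):
    """Function approximation method: Newton's tangent iteration on f'(x) = 0."""
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Example: minimize f(x) = (x - 2)**2 on [0, 5]; the minimizer is x = 2.
f = lambda x: (x - 2) ** 2
print(golden_section(f, 0.0, 5.0))                            # close to 2.0
print(newton_1d(lambda x: 2 * (x - 2), lambda x: 2.0, 0.0))   # 2.0
```

The trial method uses only function values and narrows the interval at a fixed rate, while Newton's method uses derivative information and converges much faster near the solution; the hybrid approach of this paper combines the reliability of the former with the speed of the latter.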
In 2012, Chen Lin, in "Several types of the nonlinear optimization problem solution set" [2], studied the solution sets of several main kinds of nonlinear optimization problems under invariant generalized convexity. The thesis first surveys the research status of characterizing solution sets of nonlinear optimization problems, then studies these solution sets by means of the Dini directional derivative. Based on the definition of the Dini directional derivative, the author gives some properties of several kinds of invariant generalized convexity and characterizes the solution set of the nonlinear optimization problem whose objective function is invex and whose constraint functions are pseudolinear; when both the objective function and the constraint functions are pseudolinear, further results are obtained. Using the Clarke subdifferential, the thesis also gives some properties of nonsmooth pseudoinvex optimization problems, characterizes their solution sets, and provides examples. It further studies two kinds of proper efficient points of vector optimization problems in a general ordered space, namely Henig properly efficient points and Hurwicz properly efficient points, characterizing them mainly through the tangent cone, the normal cone, and the feasible direction cone at points of the feasible set.
In 2006, Yonghong Ren, in "Nonlinear Lagrange method of solving nonlinear optimization problems" [3], established a theoretical framework for a class of nonlinear Lagrange methods in which the function of the multiplier is linear. First, several assumptions are given to ensure the convergence of the nonlinear Lagrange algorithm; these conditions are also necessary for building the duality theory based on the nonlinear Lagrange function and for analyzing the condition number of the Hessian matrix of the Lagrange function. The paper then discusses the second-order convergence of the iterative method: if the Hessian matrices of the problem functions satisfy a Lipschitz condition, the sequence produced by the second-order iteration converges at a second-order rate. Finally, numerical experiments verify the effectiveness of the dual algorithm based on the nonlinear Lagrange function. The paper also establishes a theoretical framework for another class of nonlinear Lagrange methods in which the function of the multiplier is nonlinear, and gives assumptions sufficient to ensure the convergence of this class of algorithms. These conditions are likewise necessary for analyzing the condition number of the Hessian matrix of the Lagrange function and for establishing the corresponding duality theory, and many nonlinear Lagrange functions in the literature are verified to satisfy them. In addition, a theoretical framework is set up for a class of nonlinear Lagrange methods constructed from NCP functions.
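For orientation, a multiplier method of this general kind can be sketched with the classical augmented Lagrangian, in which the multiplier enters linearly. This is a minimal illustration, not Ren's method: the test problem, penalty parameter, step size, and iteration counts are assumptions, and Ren's nonlinear Lagrange functions replace the linear multiplier term with nonlinear ones.

```python
def aug_lagrangian_min():
    """Minimize f(x, y) = x**2 + y**2 subject to x + y = 1 with a classical
    augmented Lagrangian iteration; the true solution is (0.5, 0.5) with
    multiplier -1."""
    lam, rho = 0.0, 10.0        # multiplier estimate and penalty parameter
    x = [0.0, 0.0]
    for _ in range(20):         # outer loop: multiplier updates
        # Inner loop: gradient descent on
        # L(x) = f(x) + lam * c(x) + (rho / 2) * c(x)**2, c(x) = x + y - 1.
        for _ in range(500):
            c = x[0] + x[1] - 1.0
            gx = 2 * x[0] + lam + rho * c
            gy = 2 * x[1] + lam + rho * c
            x[0] -= 0.01 * gx
            x[1] -= 0.01 * gy
        lam += rho * (x[0] + x[1] - 1.0)    # multiplier update step
    return x, lam

x_opt, lam_opt = aug_lagrangian_min()
print(x_opt, lam_opt)           # approaches [0.5, 0.5] and -1.0
```

Each outer iteration minimizes the augmented Lagrangian for a fixed multiplier estimate and then corrects the multiplier by the constraint violation; the convergence and duality questions that Ren analyzes concern exactly this interplay between the multiplier update and the conditioning of the Lagrange function's Hessian.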
In 2007, Jinli, in "A differential equation method for solving constrained nonlinear optimization" [4], constructed differential equation methods for solving nonlinear optimization problems, consisting of two systems of differential equations: the first built on first-order information of the problem functions, the second on second-order information. The two systems share the following properties: a local optimal solution of the nonlinear optimization problem is an asymptotically stable equilibrium point of the systems, and when the initial point is feasible, the solution trajectory stays within the feasible region. The paper proves that the two systems of differential equations
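In its simplest unconstrained form, a first-order system of this type is the gradient flow dx/dt = -grad f(x), whose asymptotically stable equilibria are local minimizers; a forward Euler discretization of it recovers ordinary gradient descent. The sketch below illustrates this; the objective, step size, and iteration count are assumptions, not taken from [4].

```python
def gradient_flow(grad, x0, h=0.1, steps=200):
    """Forward Euler integration of dx/dt = -grad f(x); for a small step h the
    trajectory approaches an asymptotically stable equilibrium, i.e. a local
    minimizer where grad f(x*) = 0."""
    x = list(x0)
    for _ in range(steps):
        g = grad(x)
        x = [xi - h * gi for xi, gi in zip(x, g)]
    return x

# Example: f(x, y) = (x - 1)**2 + 2 * (y + 1)**2, minimizer (1, -1).
grad = lambda x: [2 * (x[0] - 1), 4 * (x[1] + 1)]
print(gradient_flow(grad, [0.0, 0.0]))   # approaches [1.0, -1.0]
```

The second-order system in [4] plays the analogous role with curvature information, much as Newton's method refines gradient descent.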
JOURNAL OF NETWORKS, VOL. 9, NO. 12, DECEMBER 2014
doi:10.4304/jnw.9.12.3494-3501