 GlobalOptimization - Maple Programming Help


GlobalOptimization

 GlobalSolve
 find a global solution to a nonlinear program

 Calling Sequence

 GlobalSolve(obj, constr, bd, opts)

 GlobalSolve(obj, bd, opts)

 GlobalSolve(opfobj, ineqcon, eqcon, opfbd, opts)

 GlobalSolve(opfobj, opfbd, opts)

Parameters

 obj - algebraic; objective function

 constr - set(relation) or list(relation); constraints

 bd - sequence of name = range; bounds for all variables

 opfobj - procedure; objective function

 ineqcon - set(procedure) or list(procedure); inequality constraints

 eqcon - set(procedure) or list(procedure); equality constraints

 opfbd - sequence of ranges; bounds for all variables

 opts - (optional) equation(s) of the form option = value, where option is one of avgstopstepwidth, evaluationlimit, feasibilitytolerance, initialpoint, iterationlimit, localrefinement, maximize, method, numexperiments, numsigma, nugget, objectivetarget, optimalitytolerance, optsearch, populationsize, randomseed, thetamethod, theta, targetweight, timelimit, or variables; specify options for the GlobalSolve command

Description

 • The GlobalSolve command computes a global solution to a nonlinear program (NLP) over a bounded region. An NLP involves the minimization (or maximization) of an objective function, possibly subject to inequality and equality constraints. For a more detailed explanation of the solution obtained, see the following Notes section.
 • The global solver minimizes a merit function, incorporating a penalty term for the constraints, and provides both differential evolution and adaptive stochastic search methods. The global search phase is followed by a local search phase to refine the solution. No derivatives are required.
 The solver is designed to search the specified region for a global solution to a non-convex optimization problem.  If the optimization problem is convex (for example, a linear program) or a local solution is acceptable, it is recommended that you use the commands for local optimization in the Optimization package. The Optimization package commands, which are more efficient, can compute global solutions to convex problems.
 • This help page describes the use of the GlobalSolve command when the NLP is specified in algebraic or operator form. Summaries of these forms are given on the GlobalOptimization/AlgebraicForm and GlobalOptimization/OperatorForm help pages. GlobalSolve also recognizes the problem in Matrix form (see the GlobalOptimization[GlobalSolve] (MatrixForm) help page). Matrix form leads to more efficient computation, but is more complex.
 • The first and second calling sequences use the algebraic form of input. The first parameter obj is the objective function, which must be an algebraic expression.
 Constraints can be provided using the constr parameter. This is a set or list of relations (of type <= or =) involving the problem variables. The problem variables are the indeterminates of type name found in obj and constr. They can also be specified using the variables option.  For unconstrained problems, omit the constr parameter.
 Bounds on the variables must be given as additional arguments, each of the form varname= varrange where varname is a variable name and varrange is its range. There must be exactly one argument for each problem variable, and the endpoints of each range must evaluate to finite numeric values.
 • The third and fourth calling sequences use the operator form of input. The objective function opfobj must be a procedure that accepts $n$ floating-point parameters representing the problem variables x1, x2, ..., xn and returns a float.
 Inequality and equality constraints are provided using the ineqcon and eqcon parameters. An inequality constraint $v\left(\mathrm{x1},\mathrm{x2},\dots ,\mathrm{xn}\right)\le 0$ is specified by a procedure v in ineqcon that has the same form as opfobj and returns the left-hand-side value of the constraint. Similarly, an equality constraint $w\left(\mathrm{x1},\mathrm{x2},\dots ,\mathrm{xn}\right)=0$ is specified by a procedure w in eqcon. Either ineqcon or eqcon can be an empty list or set. For unconstrained problems, omit both of these parameters.
 Bounds on the variables must be provided. These must be a sequence of exactly $n$ ranges corresponding in order to x1, x2, ..., xn. The endpoints of each range must evaluate to finite numeric values.
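 As a sketch of the operator-form input, the problem of minimizing $\mathrm{x1}^2+\mathrm{x2}^2$ subject to $\mathrm{x1}+\mathrm{x2}-1\le 0$ over the box $[-2,2]\times [-2,2]$ could be specified as follows (the procedure names f and v are illustrative):

```maple
with(GlobalOptimization):
f := proc(x1, x2) x1^2 + x2^2 end proc:   # objective: returns a float value
v := proc(x1, x2) x1 + x2 - 1 end proc:   # inequality constraint v(x1,x2) <= 0
GlobalSolve( f, [v], [], -2..2, -2..2 );  # empty list: no equality constraints
```

Note that the two ranges are given in order for x1 and x2, and the empty list in the eqcon position indicates that there are no equality constraints.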
 • Maple returns the solution as a list containing the final minimum (or maximum) value and a point (the extremum). If the input is in algebraic form, the point is a list containing elements of the form $\mathrm{varname}=\mathrm{value}$ where varname is a problem variable and value is its final value. If the input is in operator form, the point is a Vector containing the values of the problem variables.
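 For instance, the two elements of the returned list can be accessed by indexing (a minimal sketch with an illustrative objective):

```maple
with(GlobalOptimization):
sol := GlobalSolve( (x-2)^2 + (y+1)^2, x = -5..5, y = -5..5 ):
minvalue := sol[1];  # the final minimum value, a float
minpoint := sol[2];  # for algebraic form, a list of equations varname = value
```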

Options

 The opts argument can contain one or more of the following general options, which are not specific to a particular solving method. Options specific to a particular solving method are described in more detail on the GlobalOptimization/Options help page.
 • evaluationlimit = posint -- Set the maximum number of iterations performed by the Differential Evolution method. The global search phase terminates if this limit is reached.
 • feasibilitytolerance = positive and numeric -- Set the allowed violation of constraints.
 • initialpoint = set(equation), list(equation), or list(numeric) --  Use the provided initial point, which is a set or list of equations $\mathrm{varname}=\mathrm{value}$ (for algebraic form input) or a list of exactly $n$ values (for operator form input). This is supported by the Differential Evolution method.
 • localrefinement = truefalse -- Perform a local optimization refinement of the solution starting from the optimal point computed by the global solver. The default is 'localrefinement'='true'.
 • maximize or maximize = truefalse -- Maximize the objective function when the value is 'true' and minimize when it is 'false'.  The option 'maximize' is equivalent to 'maximize'='true'. The default is 'maximize'='false'.
 • method = diffevol, ego -- Set the global search algorithm: Differential Evolution (method = diffevol), or Efficient Global Optimization (method = ego).  The default is method = diffevol.
 • objectivetarget = numeric -- Set an acceptable target value for the objective function. If the objective function achieves this value, the search terminates.
 • timelimit = posint -- Set the maximum computation time, in seconds, for the global solver.
 • variables = list(name) or set(name) -- Specify the problem variables when the objective function is in algebraic form.
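 Several of these options can be combined in a single call; for example (a sketch with illustrative option values):

```maple
with(GlobalOptimization):
# constrained minimization with an explicit method, evaluation limit,
# feasibility tolerance, and time limit
GlobalSolve( x^2 + y^2, { 1 - x - y <= 0 }, x = 0..2, y = 0..2,
             method = diffevol, evaluationlimit = 200,
             feasibilitytolerance = 1e-6, timelimit = 60 );
```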

Notes

 • For more information on the methods used by the global solver, with suggestions for achieving best performance, see the GlobalOptimization/Computation help page.
 • The global solver searches for the optimal solution until one of the termination criteria is met.  Then either the best available solution is returned or an error message is displayed stating that a solution could not be obtained.  The termination criteria can be set using options; otherwise, default values are applied.  In particular, for difficult optimization problems the evaluationlimit option must be set to a sufficiently high value or unexpected answers may be produced.
 • If the computation is interrupted, the last solution computed can be retrieved using the GetLastSolution command.
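 For example, after interrupting a long-running search, the best point found so far might be recovered as follows (a sketch; the result depends on where the interrupt occurred):

```maple
with(GlobalOptimization):
# Interrupt a long-running GlobalSolve computation (for example, with the
# stop button or Ctrl+C), then recover the last solution computed:
GetLastSolution();
```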
 • The computation is performed in floating-point. Therefore, all data provided must have type realcons and all returned solutions are floating-point, even if the problem is specified with exact values. The solver uses externally called code that works with hardware floats, but it is possible to evaluate the objective function and the constraints in Maple with higher precision. For details, see the GlobalOptimization/Computation help page.

Examples

 > $\mathrm{with}\left(\mathrm{GlobalOptimization}\right):$

Find the global solution to an unconstrained nonlinear minimization problem.

 > $\mathrm{GlobalSolve}\left(\mathrm{exp}\left(-x\right)-{\mathrm{cos}\left(x\right)}^{3},x=1..4\right)$
 $\left[{0.176954236956028}{,}\left[{x}{=}{1.81002937332731}\right]\right]$ (1)

Find the global solution to a constrained two-variable minimization problem.

 > $\mathrm{GlobalSolve}\left({x}^{3}-{y}^{3}-x+y,\left\{{x}^{2}+2y\le 6\right\},x=0..5,y=0..5\right)$
 $\left[{-24.0191790909674054}{,}\left[{x}{=}{0.0383116419453124}{,}{y}{=}{2.99926610904573}\right]\right]$ (2)

Find the global solution to a constrained least-squares minimization problem.

 > $\mathrm{GlobalSolve}\left({\left(x-1\right)}^{2}+{\left(y-1\right)}^{2}+{\left(z-1\right)}^{2},\left\{{y}^{2}-12x-7z=100\right\},x=-10..10,y=-10..10,z=-10..10\right)$
 $\left[{60.6658882380449995}{,}\left[{x}{=}{-4.02830735637190}{,}{y}{=}{6.17479203580774}{,}{z}{=}{-1.93317929115185}\right]\right]$ (3)

Find the global minimum using the operator input form.

 > $\mathrm{GlobalSolve}\left(x↦\mathrm{\Gamma }\left(x\right),1.1..1.9\right)$
 $\left[{0.885603194410889}{,}\left[\begin{array}{c}{1.46163215294504}\end{array}\right]\right]$ (4)

Initial points can be provided to find the minimum.

 > $\mathrm{GlobalSolve}\left(\frac{\mathrm{cos}\left(x\right)}{x},x=11..21,\mathrm{initialpoint}=\left\{x=12\right\}\right)$
 $\left[{-0.0637915530395936}{,}\left[{x}{=}{15.6441283804492}\right]\right]$ (5)

Use the maximize option to maximize the objective function.

 > $\mathrm{GlobalSolve}\left(\mathrm{add}\left({x‖i}^{2},i=1..8\right),\left\{\mathrm{add}\left({x‖i}^{2},i=1..8\right)\le 1\right\},\mathrm{seq}\left(x‖i=-1..1,i=1..8\right),\mathrm{maximize}\right)$
 $\left[{1.00000000000000067}{,}\left[{\mathrm{x1}}{=}{-0.138687931023509}{,}{\mathrm{x2}}{=}{-0.0413711220237186}{,}{\mathrm{x3}}{=}{-0.0154159434505829}{,}{\mathrm{x4}}{=}{-0.232829087709284}{,}{\mathrm{x5}}{=}{-0.333994844827890}{,}{\mathrm{x6}}{=}{0.00865198080109979}{,}{\mathrm{x7}}{=}{-0.854946138236215}{,}{\mathrm{x8}}{=}{-0.286438021614627}\right]\right]$ (6)

GlobalSolve solves real-valued optimization problems. Trying to solve a complex problem can produce an error message.

 > $\mathrm{Digits}≔20:$
 > $\mathrm{GlobalSolve}\left(\mathrm{sin}\left(x\mathrm{log}\left(x-y\right)\right),x=1..2,y=5..6\right)$

However, you can apply the Re command to use the real portion of the complex-valued expressions in a complex optimization problem to define a different problem.

 > $\mathrm{Digits}≔10:$
 > $\mathrm{GlobalSolve}\left(\mathrm{sin}\left(x\mathrm{\Re }\left(\mathrm{log}\left(x-y\right)\right)\right),x=1..2,y=5..6\right)$
 $\left[{0.360686590689181019}{,}\left[{x}{=}{2.}{,}{y}{=}{6.}\right]\right]$ (7)

Using Re can create distinct problems with distinct solutions. Notice the difference between the previous example and the following.

 > $\mathrm{GlobalSolve}\left(\mathrm{\Re }\left(\mathrm{sin}\left(x\mathrm{log}\left(x-y\right)\right)\right),x=1..2,y=5..6\right)$
 $\left[{11.3952154460165396}{,}\left[{x}{=}{1.00000002213202}{,}{y}{=}{5.00000558219173}\right]\right]$ (8)

Set infolevel to $1$ or higher to display details about the solution procedure.

 > $\mathrm{infolevel}\left[\mathrm{GlobalOptimization}\right]≔3:$
 > $\mathrm{GlobalSolve}\left(\mathrm{exp}\left(x\right)-{x}^{3},x=-1..0\right)$
 GlobalSolve:   calling NLP solver
 GlobalSolve:   calling global optimization solver
 GlobalSolve:   number of problem variables   1
 GlobalSolve:   number of nonlinear inequality constraints   0
 GlobalSolve:   number of nonlinear equality constraints   0
 GlobalSolve:   method    OptimusDEVOL
 GlobalSolve:   maximum iterations   80
 GlobalSolve:   population size   50
 GlobalSolve:   average stopping stepwidth   .1e-3
 GlobalSolve:   time limit    100
 GlobalSolve:   trying evalhf mode
 GlobalSolve:   performing local refinement
 $\left[{0.728617821489263}{,}\left[{x}{=}{-0.458962267755628}\right]\right]$ (9)