
The Golden Section Search^{7} algorithm finds minima of functions which
are expensive to compute or for which derivatives are not available.
Although optimal for the general case, convergence is slow,
requiring nearly 100 iterations for the example (x^3 - 2x - 5).

If the derivative is available, Newton-Raphson is probably a better choice. If the function is inexpensive to compute, consider approximating the derivative.
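The bracketing idea behind the algorithm can be sketched as follows. This is an illustrative reimplementation in Python, not SLIB's Scheme code; the function name `golden_section_search` and the `tol` parameter here are hypothetical stand-ins for the interface described below.

```python
import math

def golden_section_search(f, x0, x1, tol=1e-6):
    """Minimize a unimodal function f on the interval (x0, x1).

    Returns (x, f(x)) once the bracketing interval is narrower
    than tol.  Illustrative sketch, not SLIB's implementation.
    """
    invphi = (math.sqrt(5) - 1) / 2      # 1/phi, about 0.618
    a, b = x0, x1
    # Two interior probe points dividing the interval in the
    # golden ratio; one of them is reused on every iteration.
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    fc, fd = f(c), f(d)
    while (b - a) > tol:
        if fc < fd:
            # Minimum lies in (a, d): d becomes the new right
            # endpoint, old c becomes the new right probe.
            b, d, fd = d, c, fc
            c = b - invphi * (b - a)
            fc = f(c)
        else:
            # Minimum lies in (c, b): c becomes the new left
            # endpoint, old d becomes the new left probe.
            a, c, fc = c, d, fd
            d = a + invphi * (b - a)
            fd = f(d)
    x = (a + b) / 2
    return (x, f(x))
```

Because each iteration shrinks the bracket by only the constant factor 1/phi (about 0.618) and uses a single new function evaluation, convergence is linear, which is why the example below needs so many iterations.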

- Function: **golden-section-search** *f x0 x1 prec*

  `x0` and `x1` are real numbers. The (single argument) procedure `f` is unimodal over the open interval (`x0`, `x1`). That is, there is exactly one point in the interval for which the derivative of `f` is zero.

  `golden-section-search` returns a pair (`x` . `f`(`x`)) where `f`(`x`) is the minimum. The `prec` parameter is the stop criterion. If `prec` is a positive number, the iteration continues until `x` is within `prec` of the true value. If `prec` is a negative integer, the procedure iterates `-prec` times or until convergence. If `prec` is a procedure of seven arguments, `x0`, `x1`, `a`, `b`, `fa`, `fb`, and `count`, the iteration stops when that procedure returns `#t`.

  Analytically, the minimum of x^3 - 2x - 5 occurs at x = 0.816497.

```scheme
(define func (lambda (x) (+ (* x (+ (* x x) -2)) -5)))

(golden-section-search func 0 1 (/ 10000))
  ==> (816.4883855245578e-3 . -6.0886621077391165)

(golden-section-search func 0 1 -5)
  ==> (819.6601125010515e-3 . -6.088637561916407)

(golden-section-search func 0 1
                       (lambda (a b c d e f g) (= g 500)))
  ==> (816.4965933140557e-3 . -6.088662107903635)
```

^{7} David Kahaner, Cleve Moler, and Stephen Nash, *Numerical Methods and Software*, Prentice-Hall, 1989, ISBN 0-13-627258-4.