HP3000-L Archives

March 2001, Week 4

HP3000-L@RAVEN.UTC.EDU

Subject:
From:     Bruce Toback <[log in to unmask]>
Reply To: Bruce Toback <[log in to unmask]>
Date:     Mon, 26 Mar 2001 16:36:59 -0700

Tom Hula writes:

>Therefore they start out with a basic
>premise ... that God does not exist and therefore everything we see around
>us is completely random.

At the risk of dragging this back on-topic, let me point out that while
mutations are random, natural selection is the very opposite of random,
and this fact is the basis for some interesting optimization algorithms.

There are analytical techniques for optimizing linear functions, but very
few for nonlinear functions, and then only for special cases. So we have
to turn to numerical experimentation. One such class of experimental
techniques is genetic algorithms, so named because they were inspired by
the process of evolution.

Suppose that you have a large nonlinear optimization problem. The problem
has a large vector of inputs and a vector of outputs (not necessarily
large). There is some weighting function for the output vector, such that
it's possible to calculate a numerical score that expresses the
"goodness" of the output vector as a scalar value.

Now, generate a collection of semi-random input vectors. They're
semi-random because there are presumably some limits on the input
parameters, but other than fitting within the limits, there are no
restrictions. Evaluate the nonlinear function for each of these input
vectors, and select the input vector that results in the "best" output,
according to the output score.
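
Continuing that sketch (the limits and the number of trial vectors are,
again, made up):

    import random

    LOWER, UPPER = -10.0, 10.0   # assumed limits on each input parameter
    POPULATION = 50              # assumed number of trial vectors

    def fitness(x):
        return score(nonlinear_function(x))

    def random_vector():
        # "Semi-random": uniform within the limits, otherwise unconstrained.
        return [random.uniform(LOWER, UPPER) for _ in range(N_INPUTS)]

    population = [random_vector() for _ in range(POPULATION)]
    best = max(population, key=fitness)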

Then generate another set of input vectors by random perturbations of
each element of the "best" input vector. Run these through the nonlinear
function and select the best. Repeat until good enough, or until no
further progress is made.
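
Still in sketch form, reusing fitness() and best from above; the size of
the perturbations and the iteration cap are arbitrary:

    SIGMA = 0.5          # size of the random perturbations (arbitrary)
    GENERATIONS = 200    # stop after this many rounds (arbitrary)

    def perturb(x):
        # Jitter each element of the best vector, clamped to the limits.
        return [min(UPPER, max(LOWER, xi + random.gauss(0, SIGMA)))
                for xi in x]

    for _ in range(GENERATIONS):
        candidates = [perturb(best) for _ in range(POPULATION)]
        champion = max(candidates, key=fitness)
        if fitness(champion) > fitness(best):
            best = champion
        # A real version would also quit early once progress stalls.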

The result may not be the absolutely optimal solution, but rather a local
maximum. In fact, depending on the weighting function applied to the
output vector, there may be several maxima. But this provides a
probabilistic method for doing an optimization that's too complex to be
done deterministically.

There are variations on the theme. For example, the algorithm can mimic
sexual reproduction, by taking the top N input vectors and "mating" them,
selecting each element for the "child" vector by taking the corresponding
element from one of the two "parent" vectors at random.
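
A sketch of that crossover step, building on the hypothetical fitness()
and population above (the size of the breeding pool is arbitrary):

    N_PARENTS = 10       # how many top vectors get to breed (arbitrary)

    def mate(parent_a, parent_b):
        # Each child element comes from one parent or the other, at random.
        return [a if random.random() < 0.5 else b
                for a, b in zip(parent_a, parent_b)]

    parents = sorted(population, key=fitness, reverse=True)[:N_PARENTS]
    children = [mate(*random.sample(parents, 2)) for _ in range(POPULATION)]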

There's a large literature on the subject; try a Google search for
"genetic algorithms". There's also a related technique called simulated
annealing.
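
The essential twist in simulated annealing is that a worse candidate is
sometimes accepted, with a probability that shrinks as a "temperature"
parameter is lowered, which helps the search climb out of local maxima.
A toy version, reusing the sketches above (the schedule is made up):

    import math

    temperature, cooling = 1.0, 0.995   # assumed annealing schedule
    current = random_vector()
    while temperature > 0.01:
        candidate = perturb(current)
        delta = fitness(candidate) - fitness(current)
        # Always accept improvements; sometimes accept a worse candidate.
        if delta >= 0 or random.random() < math.exp(delta / temperature):
            current = candidate
        temperature *= cooling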

-- Bruce


--------------------------------------------------------------------------
Bruce Toback    Tel: (602) 996-8601| My candle burns at both ends;
OPT, Inc.            (800) 858-4507| It will not last the night;
11801 N. Tatum Blvd. Ste. 142      | But ah, my foes, and oh, my friends -
Phoenix AZ 85028                   | It gives a lovely light.
btoback AT optc.com                |     -- Edna St. Vincent Millay
Mail sent to [log in to unmask] will be inspected for a
fee of US$250. Mailing to said address constitutes agreement to
pay, including collection costs.
