Genetic algorithms: an introduction. Artem Eremin, junior researcher, IMMI KSU
Motivation
• Material properties (Cij): unknown, to be identified from experimental data
• Doppler laser vibrometry for measuring out-of-plane velocities
• Time-of-Flight (TOF) extracted with the wavelet transform
Optimization
"Optimization is the process of making something better."
Every day we subconsciously solve some optimization problems!
Minimum-seeking algorithms
1. Exhaustive Search = Brute Force
2. Analytical Optimization
3. Nelder-Mead Downhill Simplex Method
4. Optimization based on Line Minimization (the coordinate search method, the steepest descent algorithm, Newton's method, the Davidon-Fletcher-Powell (DFP) algorithm, etc.)
Natural optimization methods
Minimum-seeking algorithms 1–4 can converge to a local minimum!
Not a panacea, but…
• Simulated annealing (Kirkpatrick et al., 1983)
• Particle swarm optimization (Parsopoulos and Vrahatis, 2002)
• Genetic algorithms (Holland, 1975)
• Evolutionary algorithms (Schwefel, 1995)
No derivatives, large search spaces, "nature-based"
Biological background (Cell and Chromosomes)
• Every animal cell is a complex of many small "factories" working together; at the center of it all is the cell nucleus; the nucleus contains the genetic information in chromosomes – strings of DNA
• Each chromosome contains a set of genes – blocks of DNA
• Each gene determines some aspect of the organism (like eye colour)
• A collection of genes is sometimes called a genotype
• A collection of aspects (like eye colour) is sometimes called a phenotype
Biological background (Reproduction)
Organisms produce a number of offspring similar to themselves but which can have variations due to:
– Mutations (random changes)
– Sexual reproduction (offspring have combinations of features inherited from each parent)
Biological background (Natural Selection)
• The Origin of Species: "Preservation of favourable variations and rejection of unfavourable variations."
• There are more individuals born than can survive, so there is a continuous struggle for life.
• Individuals with an advantage have a greater chance of survival: survival of the fittest.
• Important aspects of natural selection are: adaptation to the environment and isolation of populations into different groups which cannot mutually mate.
Genetic algorithms (GA)
• GAs were initially developed by John Holland, University of Michigan (1970s)
• Popularized by his student David Goldberg (solved some very complex engineering problems, 1989)
• Based on ideas from Darwinian evolution
• Provide efficient techniques for optimization and machine learning applications; widely used in business, science and engineering
GA main features • Optimizes with continuous or discrete variables • Doesn’t require derivative information • Simultaneously searches from a wide sampling of the cost surface • Deals with a large number of variables • Is well suited for parallel computers • Optimizes variables with extremely complex cost surfaces (they can jump out of a local minimum) • Provides a list of optimum variables, not just a single solution • May encode the variables so that the optimization is done with the encoded variables • Works with numerically generated data, experimental data, or analytical functions.
To start with…
Encoding (representation) maps the phenotype space into the genotype space {0,1}^L, so candidate solutions are represented as bit strings such as 10010001 or 10010010; decoding (inverse representation) maps the bit strings back to phenotypes.
To start with…
• Gene – a single encoding of part of the solution space, i.e. either a single bit or a short block of adjacent bits that encodes an element of the candidate solution
• Chromosome – a string of genes that represents a solution
• Population – the number of chromosomes available to test
Chromosomes
Chromosomes can be:
– Bit strings (0110, 0011, 1101, …)
– Real numbers (33.2, -12.11, 5.32, …)
– Permutations of elements (1234, 3241, 4312, …)
– Lists of rules (R1, R2, R3, … Rn)
– Program elements (genetic programming)
– …
Chromosome = array of Nvar variables (genes) p_i
How does it work?
So…
produce an initial population of individuals
evaluate the fitness of all individuals
while termination condition not met do
    select fitter individuals for reproduction
    recombine between individuals
    mutate individuals
    evaluate the fitness of the modified individuals
    generate a new population
end while
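A minimal sketch of this loop in Python (not the presenter's code), assuming a binary chromosome and a user-supplied fitness function; it combines tournament selection, 1-point crossover and bit-flip mutation, which are all discussed on the following slides. The names run_ga, p_cross and p_mut are illustrative.

import random

def run_ga(fitness, n_genes, n_pop=20, p_cross=0.8, p_mut=0.05, n_gen=100):
    # start from a random population of bit strings
    pop = [[random.randint(0, 1) for _ in range(n_genes)] for _ in range(n_pop)]
    for _ in range(n_gen):
        scores = [fitness(ind) for ind in pop]
        def pick():  # tournament of size 2: the fitter of two random individuals wins
            i, j = random.randrange(n_pop), random.randrange(n_pop)
            return pop[i] if scores[i] >= scores[j] else pop[j]
        children = []
        while len(children) < n_pop:
            p1, p2 = pick(), pick()
            if random.random() < p_cross:  # 1-point crossover with probability p_cross
                cut = random.randint(1, n_genes - 1)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            else:
                c1, c2 = p1[:], p2[:]
            for c in (c1, c2):  # bit-flip mutation with probability p_mut per gene
                for k in range(n_genes):
                    if random.random() < p_mut:
                        c[k] = 1 - c[k]
                children.append(c)
        pop = children[:n_pop]  # the new generation replaces the old one
    return max(pop, key=fitness)

best = run_ga(fitness=sum, n_genes=10)  # e.g. maximise the number of ones (OneMax)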
How does it work?
Or so… The evolutionary cycle: the population is initiated and evaluated; parents are selected and modified (recombination, mutation); the modified offspring are evaluated and inserted into the population, while the weakest members are deleted (discarded).
Generation of the initial population (example, Npop = 6):
s1 = 1111010101, f(s1) = 7
s2 = 0111000101, f(s2) = 5
s3 = 1110110101, f(s3) = 7
s4 = 0100010011, f(s4) = 4
s5 = 1110111101, f(s5) = 8
s6 = 0100110000, f(s6) = 3
(here the fitness f simply counts the number of ones in each string)
Selection
We could be kind and save everybody… or we select the fitter individuals into a mating pool.
Selection: roulette wheel weighting
Selection
The roulette wheel method: individual i is chosen with probability proportional to its fitness,
p_i = f(s_i) / Σ_j f(s_j),
i.e. the area of each individual's slice of the wheel is proportional to its fitness value. We repeat the draw as many times as necessary to fill the mating pool.
Selection
Tournament selection:
• randomly pick a small subset
• perform a "tournament"
• "the winner takes it all"
Tournament + threshold = no sorting needed!
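A short Python sketch of both selection schemes, the roulette wheel from the previous slide and the tournament described above (helper names are illustrative):

import random

def roulette_select(pop, scores):
    # fitness-proportionate (roulette wheel) selection of one individual
    r = random.uniform(0, sum(scores))
    acc = 0.0
    for ind, s in zip(pop, scores):
        acc += s
        if acc >= r:
            return ind
    return pop[-1]

def tournament_select(pop, scores, k=3):
    # pick k individuals at random; the fittest of them wins (no sorting of the whole population)
    contenders = random.sample(range(len(pop)), k)
    return pop[max(contenders, key=lambda i: scores[i])]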
Mating (Crossover)
Simple 1-point crossover
• Choose a random point on the two parents
• Split parents at this crossover point
• Create children by exchanging tails
• p_c typically in the range (0.6, 0.9)
• Performance with 1-point crossover depends on the order in which variables occur in the representation
• more likely to keep together genes that are near each other
• can never keep together genes from opposite ends of the string
• this is known as positional bias
• can be exploited if we know about the structure of our problem, but this is not usually the case
Mating (Crossover) n-point crossover • Choose n random crossover points • Split along those points • Glue parts, alternating between parents • Generalisation of 1 point (still some positional bias)
Mating (Crossover)
Uniform crossover
Looks at each bit in the parents and randomly assigns the bit from one parent to one offspring and the bit from the other parent to the other offspring
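A small Python sketch of the two bit-string crossover operators from these slides (helper names are illustrative):

import random

def one_point_crossover(p1, p2):
    # split both parents at a random point and exchange the tails
    cut = random.randint(1, len(p1) - 1)
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

def uniform_crossover(p1, p2):
    # each position is assigned from either parent with equal probability
    c1, c2 = [], []
    for a, b in zip(p1, p2):
        if random.random() < 0.5:
            c1.append(a); c2.append(b)
        else:
            c1.append(b); c2.append(a)
    return c1, c2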
Mutation
• Alter each gene (or bit) independently with a probability p_m
• p_m is called the mutation rate
• Typically between 1/Npop and 1/|s|, where |s| is the length of the bit string
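A minimal Python sketch of bit-flip mutation (the name bit_flip_mutation is illustrative):

import random

def bit_flip_mutation(chromosome, p_m):
    # flip each bit independently with probability p_m (the mutation rate)
    return [1 - g if random.random() < p_m else g for g in chromosome]

With p_m = 1/|s|, on average one gene per chromosome is mutated.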
Crossover or/and Mutation
• A long debate: which one is better / necessary / the main operator rather than a background one?
• Answer (at least, rather wide agreement):
– it depends on the problem, but
– in general, it is good to have both
– each has a role the other cannot fill
• a mutation-only GA is possible; a crossover-only GA would not work
Crossover or/and Mutation
• Exploration: discovering promising areas in the search space, i.e. gaining information on the problem
• Exploitation: optimising within a promising area, i.e. using information
• There is co-operation AND competition between them
• Crossover is explorative: it makes a big jump to an area somewhere "in between" the two (parent) areas
• Mutation is exploitative: it creates random small diversions, thereby staying near (in the area of) the parent
• Only crossover can combine information from two parents
• Only mutation can introduce new information
• To hit the optimum you often need a 'lucky' mutation
Real valued problems
Mapping real values onto bit strings: p_i ∈ [a_i, b_i] ⊂ R is represented by a bit string (a_1, …, a_L) ∈ {0,1}^L
• The encoding [a_i, b_i] → {0,1}^L must be invertible (one phenotype per genotype)
• The decoding map {0,1}^L → [a_i, b_i] defines the representation
• Only 2^L values out of infinitely many are represented
• L determines the possible maximum precision of the solution
• High precision means long chromosomes (slow evolution)
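A possible Python sketch of such a mapping, assuming a uniform grid of 2^L points over [a, b] (the helper names encode and decode are illustrative):

def encode(x, a, b, L):
    # map a real value x in [a, b] to one of the 2**L grid points, written as an L-bit string
    k = round((x - a) / (b - a) * (2**L - 1))
    return format(k, '0{}b'.format(L))

def decode(bits, a, b):
    # inverse map: an L-bit string back to a real value in [a, b]
    L = len(bits)
    return a + int(bits, 2) * (b - a) / (2**L - 1)

# the grid step (maximum precision) is (b - a) / (2**L - 1): higher precision needs a longer chromosome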
Floating point mutations
• General scheme: x = (x_1, …, x_n) → x' = (x'_1, …, x'_n), with x_i, x'_i ∈ [L_i, U_i]
• Uniform mutation: the new value x'_i is drawn uniformly at random from [L_i, U_i]
• Analogous to bit-flipping (binary) or random resetting (integers)
Floating point mutations
• Non-uniform mutations:
• Many methods proposed, such as a time-varying range of change etc.
• Most schemes are probabilistic but usually only make a small change to the value
• Most common method is to add a random deviate to each variable separately, taken from an N(0, σ) Gaussian distribution, and then curtail the result to the allowed range
• The standard deviation σ controls the amount of change (about 2/3 of the deviations will lie in the range (−σ, +σ))
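A brief Python sketch of this most common scheme, Gaussian perturbation curtailed to the allowed range (names are illustrative):

import random

def gaussian_mutation(x, lower, upper, sigma):
    # add N(0, sigma) noise to each variable separately, then curtail to [lower_i, upper_i]
    return [min(upper[i], max(lower[i], x[i] + random.gauss(0.0, sigma)))
            for i in range(len(x))]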
Crossover for real valued GAs
• Discrete:
• each gene value in offspring z comes from one of its parents (x, y) with equal probability: z_i = x_i or y_i
• could use n-point or uniform crossover
• Intermediate:
• exploits the idea of creating children "between" the parents (hence a.k.a. arithmetic recombination)
• z_i = α·x_i + (1 − α)·y_i, where 0 ≤ α ≤ 1
• The parameter α can be:
• constant: uniform arithmetical crossover
• variable (e.g. depending on the age of the population)
• picked at random every time
Single arithmetic crossover
• Parents: (x_1, …, x_n) and (y_1, …, y_n)
• Pick a single gene (k) at random
• child 1 is (x_1, …, x_{k−1}, α·y_k + (1 − α)·x_k, x_{k+1}, …, x_n)
• reverse the roles of x and y for the other child, e.g. with α = 0.5
Simple arithmetic crossover
• Parents: (x_1, …, x_n) and (y_1, …, y_n)
• Pick a random gene (k); after this point mix the values
• child 1 is (x_1, …, x_k, α·y_{k+1} + (1 − α)·x_{k+1}, …, α·y_n + (1 − α)·x_n)
• reverse the roles of x and y for the other child, e.g. with α = 0.5
"Whole" arithmetic crossover
• Most commonly used
• Parents: (x_1, …, x_n) and (y_1, …, y_n)
• child 1 is α·x + (1 − α)·y, computed component-wise
• reverse the roles of x and y for the other child, e.g. with α = 0.5
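A compact Python sketch of arithmetic recombination, using the whole-arithmetic variant described above (the function name is illustrative):

def whole_arithmetic_crossover(x, y, alpha=0.5):
    # child1 = alpha*x + (1 - alpha)*y, computed component-wise; child2 is the reverse
    c1 = [alpha * xi + (1 - alpha) * yi for xi, yi in zip(x, y)]
    c2 = [alpha * yi + (1 - alpha) * xi for xi, yi in zip(x, y)]
    return c1, c2

# with alpha = 0.5 both children are the component-wise average of the two parents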
micro-GA
Flowchart of the micro-GA loop: first generation (random values) → tournament selection → SBX crossover → good results / enough iterations? If not, select the fittest individual and start a new generation around it; otherwise stop.
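A hedged Python sketch of this restart loop (not the presenter's implementation): a tiny population is evolved with tournament selection and SBX crossover for a fixed number of inner generations, then re-seeded around the best individual found so far. The function names, the population size of 5 and the SBX distribution index eta are assumptions.

import random

def micro_ga(fitness, bounds, n_pop=5, eta=2.0, restarts=50, inner_gens=20):
    rand_ind = lambda: [random.uniform(lo, hi) for lo, hi in bounds]

    def sbx_child(x, y):
        # simulated binary crossover (SBX); one child per pair, for brevity
        child = []
        for xi, yi, (lo, hi) in zip(x, y, bounds):
            u = random.random()
            beta = (2 * u) ** (1 / (eta + 1)) if u <= 0.5 else (1 / (2 * (1 - u))) ** (1 / (eta + 1))
            child.append(min(hi, max(lo, 0.5 * ((1 + beta) * xi + (1 - beta) * yi))))
        return child

    elite = rand_ind()
    for _ in range(restarts):
        pop = [elite] + [rand_ind() for _ in range(n_pop - 1)]  # new generation around the elite
        for _ in range(inner_gens):
            scores = [fitness(p) for p in pop]
            def pick():  # tournament selection of size 2
                i, j = random.randrange(n_pop), random.randrange(n_pop)
                return pop[i] if scores[i] >= scores[j] else pop[j]
            pop = [sbx_child(pick(), pick()) for _ in range(n_pop)]
        elite = max(pop + [elite], key=fitness)  # keep the fittest individual found so far
    return elite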
Benefits of GA
• Concept is easy to understand
• Modular – separate from the application (representation); building blocks can be used in hybrid applications
• Supports multi-objective optimization
• Good for "noisy" environments
• Always results in an answer, which becomes better and better with time
• Can easily run in parallel
• The fitness function can be changed from iteration to iteration, which allows incorporating new data into the model if it becomes available
Issues with GA Choosing parameters: –Population size –Crossover and mutation probabilities –Selection, deletion policies –Crossover, mutation operators, etc. –Termination criteria Performance: –Can be too slow but covers a large search space –Is only as good as the fitness function
Experimental specimens: 4 CFRP plates
GA for Permutations • Ordering/sequencing problems form a special type • Task is (or can be solved by) arranging some objects in a certain order • Example: sort algorithm: important thing is which elements occur before others (order) • Example: Travelling Salesman Problem (TSP) : important thing is which elements occur next to each other (adjacency) • These problems are generally expressed as a permutation: • if there are n variables then the representation is as a list of n integers, each of which occurs exactly once
The Traveling Salesman Problem (TSP)
The traveling salesman must visit every city in his territory exactly once and then return to the starting point; given the cost of travel between all cities, how should he plan his itinerary for minimum total cost of the entire tour?
TSP is NP-complete. The search space is BIG: for 30 cities there are 30! ≈ 10^32 possible tours.
TSP (Representation, Initialization and Selection)
A vector v = (i_1 i_2 … i_n) represents a tour (v is a permutation of {1, 2, …, n})
Fitness f of a solution is the inverse cost of the corresponding tour
Initialization: use either some heuristics, or a random sample of permutations of {1, 2, …, n}
We shall use fitness-proportionate selection
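A tiny Python sketch of this fitness definition, assuming cities are numbered 0 … n−1 and dist is a cost matrix (names are illustrative):

def tour_cost(tour, dist):
    # total cost of the closed tour; dist[i][j] is the travel cost between cities i and j
    return sum(dist[tour[k]][tour[(k + 1) % len(tour)]] for k in range(len(tour)))

def tour_fitness(tour, dist):
    # fitness is the inverse of the tour cost, so cheaper tours are fitter
    return 1.0 / tour_cost(tour, dist)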
Mutation operations for permutations • Normal mutation operators lead to inadmissible solutions • e.g. bit-wise mutation : let gene i have value j • changing to some other value k would mean that k occurred twice and j no longer occurred • Therefore must change at least two values • Mutation parameter now reflects the probability that some operator is applied once to the whole string, rather than individually in each position
Insert Mutation for permutations • Pick two allele values at random • Move the second to follow the first, shifting the rest along to accommodate • Note that this preserves most of the order and the adjacency information
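A short Python sketch of insert mutation on a permutation (the helper name is illustrative):

import random

def insert_mutation(perm):
    # pick two positions at random and move the second value to just after the first,
    # shifting the values in between; most order and adjacency information is preserved
    p = perm[:]
    i, j = sorted(random.sample(range(len(p)), 2))
    p.insert(i + 1, p.pop(j))
    return p

# e.g. with positions i = 1 and j = 4, [1, 2, 3, 4, 5, 6] becomes [1, 2, 5, 3, 4, 6]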