Explaining Optimization in Genetic Algorithms with Uniform Crossover
Keki Burjorjee
kekib@cs.brandeis.edu
Foundations of Genetic Algorithms Conference XII (2013)
Keki Burjorjee Explaining Optimization in GAs with UX
Uniform Crossover
Uniform crossover (UX) first used by Ackley (1987)
Studied and popularized by Syswerda (1989) and Spears & De Jong (1990, 1991)
Numerous accounts of optimization in GAs with UX
In practice, frequently outperforms crossover with tight linkage (Fogel, 2006)
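A minimal sketch of the operator, assuming the standard per-bit coin flip with probability 0.5 (names and parameters are illustrative):

```python
import random

def uniform_crossover(x, y, rng=random.Random(0)):
    """Uniform crossover: each offspring bit is drawn independently
    from one of the two parents with equal probability."""
    child1, child2 = [], []
    for xi, yi in zip(x, y):
        if rng.random() < 0.5:
            child1.append(xi)
            child2.append(yi)
        else:
            child1.append(yi)
            child2.append(xi)
    return child1, child2

# Note: which loci get exchanged does not depend on their positions
# on the chromosome -- the source of UX's lack of positional bias.
```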
Optimization in GAs with UX Unexplained
Cannot be explained within the rubric of the Building Block Hypothesis (BBH)
No viable alternative has been proposed
Hyperclimbing Hypothesis
A scientific explanation for optimization in GAs with UX
What Does “Scientific” Mean?
Logical Positivism (Proof or Bust)
Scientific truth is absolute
Emphasis on Verifiability (i.e. mathematical proof)
The Popperian method
Scientific truth is provisional
Emphasis on Falsifiability (testable predictions)
Popperian Method
Logic behind the Popperian Method : Contrapositive
(Theory ⇒ Phenomenon) ⇔ (¬Phenomenon ⇒ ¬Theory)
Additional tightening by Popper
Unexpected phenomenon confirmed → more credence owed to the theory
e.g. gravitational lensing → general theory of relativity
Additional Requirements in a Science of EC
Weak assumptions about distribution of fitness
This is just Occam’s Razor
Upfront proof of concept
Avoid another “Royal Roads moment”
(Nice to have) Identification of a core computational efficiency
Hyperclimbing Hypothesis
GAs with UX perform optimization by efficiently
implementing a global search heuristic called hyperclimbing
Outline
1. Highlight the symmetry of UX
2. Describe the hyperclimbing heuristic
3. Provide proof of concept: show a GA implementing hyperclimbing efficiently
4. Make a prediction and validate it on
   MAX-3SAT
   the Sherrington–Kirkpatrick spin glasses problem
5. Outline future work
Variation in GAs
Two parent chromosomes, written as strings of random variables:
0 1 0 0 1 0 1 0 1 ......... 0
X1 X2 X3 X4 X5 X6 X7 X8 X9 ......... Xℓ
1 0 0 1 1 0 0 1 1 ......... 1
Y1 Y2 Y3 Y4 Y5 Y6 Y7 Y8 Y9 ......... Yℓ
UX: X1, . . . , Xℓ are independent
Absence of positional bias (Eshelman et al., 1989): attributes can be arbitrarily permuted
Suppose Y1, . . . , Yℓ are independent, and independent of X1, . . . , Xℓ
If locus i is immaterial to fitness, it can be spliced out
The Hyperclimbing Heuristic
Hyperclimbing: A Stochastic Search Heuristic
The Setting
f : {0, 1}^ℓ → R
f may be stochastic
Schema Partitions and Schemata
Given an index set I ⊆ {1, . . . , ℓ} s.t. |I| = k
I partitions the search space into 2^k schemata
The schema partition is denoted ⟦I⟧
Coarseness of the partition is determined by k (the order)
E.g. for I = {3, 7}, the partition consists of the four schemata given by A3 = 0, A7 = 0; A3 = 0, A7 = 1; A3 = 1, A7 = 0; A3 = 1, A7 = 1
The number of schema partitions of order k is (ℓ choose k); for fixed k, (ℓ choose k) ∈ Ω(ℓ^k)
The Effect of a Schema Partition
Effect of ⟦I⟧: the variance of the average fitness values of the schemata in ⟦I⟧
If I ⊂ I′, then Effect(⟦I⟧) ≤ Effect(⟦I′⟧)
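The effect of a schema partition can be computed directly over a sample; a small sketch (the fitness function and index sets are illustrative):

```python
from itertools import product
from statistics import mean, pvariance

def effect(f, I, samples):
    """Effect of the schema partition given by index set I: the variance
    of the average fitness values of the schemata in the partition."""
    buckets = {}
    for x in samples:
        key = tuple(x[i] for i in I)   # which schema x belongs to
        buckets.setdefault(key, []).append(f(x))
    averages = [mean(v) for v in buckets.values()]
    return pvariance(averages)

# Illustrative fitness: parity of loci 0 and 1 on 4-bit strings.
f = lambda x: 1.0 if x[0] ^ x[1] else -1.0
samples = [tuple(b) for b in product((0, 1), repeat=4)]
# The partition for {0, 1} has a detectable effect; {2, 3} has none.
```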
How a Hyperclimbing Heuristic Works
Finds a coarse schema partition with a detectable effect
Limits future sampling to a schema with above-average fitness
Raises the expected fitness of all future samples
Limiting future sampling to some schema is equivalent to
fixing the defining loci of the schema in the population
The hyperclimbing heuristic now recurses: search occurs over the unfixed loci
And so on . . .
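The find–fix–recurse loop above can be sketched as follows (a toy illustration, not the paper's algorithm: the partition search is brute force over small index sets, "detectable" is an arbitrary variance threshold, and all names are made up):

```python
from itertools import combinations
from statistics import mean, pvariance
import random

def hyperclimb(f, ell, order=2, threshold=0.1, n_samples=2000,
               rng=random.Random(0)):
    """Toy hyperclimbing: repeatedly find a small set of unfixed loci
    whose schema partition has a detectable effect (conditional on the
    loci fixed so far), then fix those loci to the schema with the
    highest average fitness, and recurse over the unfixed loci."""
    fixed = {}  # locus -> fixed bit value
    while True:
        free = [i for i in range(ell) if i not in fixed]
        if len(free) < order:
            return fixed
        # Sample chromosomes consistent with the loci fixed so far.
        samples = []
        for _ in range(n_samples):
            x = [fixed.get(i, rng.randint(0, 1)) for i in range(ell)]
            samples.append((x, f(x)))
        best = None
        for I in combinations(free, order):
            buckets = {}
            for x, fx in samples:
                buckets.setdefault(tuple(x[i] for i in I), []).append(fx)
            avgs = {k: mean(v) for k, v in buckets.items()}
            if len(avgs) > 1 and pvariance(list(avgs.values())) > threshold:
                best = (I, max(avgs, key=avgs.get))
                break
        if best is None:
            return fixed  # no coarse partition with a detectable effect
        I, setting = best
        fixed.update(dict(zip(I, setting)))  # fix the defining loci
```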
Fitness Distribution Assumption
∃ a small set of unfixed loci, I, such that
the conditional effect of I is detectable (conditional upon the loci that have already been fixed)
the unconditional effect of I may be undetectable
E.g. (‘*’ = wildcard, ‘#’ = defined locus):
effect of 1*#*0*#* is detectable
effect of **#***#* is undetectable
Called staggered coarse conditional effects
Fitness Distribution Assumption is Weak
The staggered coarse conditional effects assumption is weak
Weaker than the fitness distribution assumption in the BBH, which assumes unstaggered coarse unconditional effects
Weaker assumptions are more likely to be satisfied in practice
Good, because we aspire to explain optimization in all GAs with UX
Hyperclimbing Heuristic is in Good Company
Hyperclimbing is an example of a global decimation heuristic
Global decimation heuristics iteratively reduce the size of a search space
They use non-local information to find and fix partial solutions
No backtracking
E.g. survey propagation (Mézard et al., 2002), a state-of-the-art solver for large, random SAT problems
Proof Of Concept
4-Bit Stochastic Learning Parities Problem
fitness(x) ∼ N(+0.25, 1) if x_{i1} ⊕ x_{i2} ⊕ x_{i3} ⊕ x_{i4} = 1, and N(−0.25, 1) otherwise
[Figure: the two fitness densities, centered at −0.25 and +0.25]
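A direct transcription of this fitness function (the four hidden parity loci are a parameter; the defaults are illustrative):

```python
import random

def parity_fitness(x, parity_loci=(0, 1, 2, 3), rng=random.Random(0)):
    """Stochastic fitness: drawn from N(+0.25, 1) if the parity of the
    four hidden loci is 1, and from N(-0.25, 1) otherwise."""
    parity = 0
    for i in parity_loci:
        parity ^= x[i]
    mu = 0.25 if parity == 1 else -0.25
    return rng.gauss(mu, 1.0)
```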
Solving the 4-Bit Stochastic Learning Parities Problem
Naive approach:
Visit loci in groups of four
Check for differentiation in the multivariate marginal fitness values
Time complexity is Ω(ℓ^4)

Expected marginal fitness for each setting of x_{i1} x_{i2} x_{i3} x_{i4}:
0 0 0 0 : −0.25    1 0 0 0 : +0.25
0 0 0 1 : +0.25    1 0 0 1 : −0.25
0 0 1 0 : +0.25    1 0 1 0 : −0.25
0 0 1 1 : −0.25    1 0 1 1 : +0.25
0 1 0 0 : +0.25    1 1 0 0 : −0.25
0 1 0 1 : −0.25    1 1 0 1 : +0.25
0 1 1 0 : −0.25    1 1 1 0 : +0.25
0 1 1 1 : +0.25    1 1 1 1 : −0.25
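The naive approach can be sketched as follows (illustrative only; the C(ℓ, 4) = Θ(ℓ⁴) group count is what makes it impractical for large ℓ, and the differentiation threshold is made up):

```python
from itertools import combinations
from statistics import mean
import random

def find_parity_loci(f, ell, n_samples, rng):
    """Naive search: for every group of four loci, estimate the 16
    multivariate marginal fitness values and look for differentiation.
    The number of groups is C(ell, 4), i.e. Theta(ell^4)."""
    samples = []
    for _ in range(n_samples):
        x = [rng.randint(0, 1) for _ in range(ell)]
        samples.append((x, f(x)))
    for I in combinations(range(ell), 4):
        buckets = {}
        for x, fx in samples:
            buckets.setdefault(tuple(x[i] for i in I), []).append(fx)
        avgs = [mean(v) for v in buckets.values()]
        if max(avgs) - min(avgs) > 0.3:   # differentiated marginals
            return I
    return None
```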
Solving the 4-Bit Stochastic Learning Parities Problem
Visiting loci in groups of three or fewer won’t work: every multivariate marginal is flat
x_{j1} : expected marginal fitness 0.0 for both settings
x_{j1} x_{j2} : expected marginal fitness 0.0 for all 4 settings
x_{j1} x_{j2} x_{j3} : expected marginal fitness 0.0 for all 8 settings
Watch On YouTube
Dynamics of a SGA on the 4-Bit Learning Parities problem
#Loci=200
Symmetry Analytic Conclusion
Expected #generations for the red dots to diverge is constant w.r.t.
the location of the red dots
the number of blue dots
The dynamics of a blue dot are invariant to
the location of the red dots
the number of blue dots
Watch On YouTube
Dynamics of a SGA on the 4-Bit Learning Parities problem
#Loci=1000
Watch On YouTube
Dynamics of a SGA on the 4-Bit Learning Parities problem
#Loci=10000
Staircase Functions
Staircase function descriptor: (m, n, δ, σ, ℓ, L, V)
L: an m × n matrix of distinct loci, e.g. with rows (3, 87, · · · , 93), (42, 58, · · · , 72), . . . , (67, 73, · · · , 81)
V: an m × n matrix of bit values, e.g. with rows (1, 0, · · · , 1), (1, 0, · · · , 0), . . . , (0, 1, · · · , 1)
A staircase function is a stochastic function f : {0, 1}^ℓ → R
[Figure: expected fitness rises in increments of δ: 0, δ, 2δ, . . . , (m−1)δ, mδ]
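A sketch of a staircase fitness function under this reading of the descriptor (a toy reconstruction, not the paper's exact definition: each successive step, taken in order, adds δ to the expected fitness, and σ is the noise level):

```python
import random

def staircase_fitness(x, L, V, delta, sigma, rng=random.Random(0)):
    """Stochastic staircase function: expected fitness is delta times
    the number of leading steps whose loci (rows of L) carry the target
    values (rows of V); Gaussian noise with std sigma is added."""
    steps = 0
    for loci, values in zip(L, V):
        if all(x[i] == v for i, v in zip(loci, values)):
            steps += 1
        else:
            break  # steps must be climbed in order
    return steps * delta + rng.gauss(0.0, sigma)
```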
Staircase Function
[Figure: population snapshots on a 256 × 256 grid, shown over four successive slides]
Staircase Functions
δ = 0.2, σ = 1
Uniform Random MAX-3SAT
#variables=1000, #clauses=4000
[Figure: satisfied clauses (3550–4000) vs. generations (0–6000)]
#trials=10. Error bars show standard error.
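The quantity plotted is the number of satisfied clauses; a minimal scorer for uniform random 3-CNF instances (the clause encoding, signed 1-based literals, is an implementation assumption):

```python
import random

def random_3sat(n_vars, n_clauses, rng=random.Random(0)):
    """Uniform random 3-CNF: each clause picks 3 distinct variables,
    each negated with probability 1/2 (literals are signed, 1-based)."""
    clauses = []
    for _ in range(n_clauses):
        vs = rng.sample(range(1, n_vars + 1), 3)
        clauses.append(tuple(v if rng.random() < 0.5 else -v for v in vs))
    return clauses

def satisfied(clauses, assignment):
    """Count clauses satisfied by a 0/1 assignment, where assignment[i]
    is the value of variable i+1."""
    def lit(l):
        return assignment[abs(l) - 1] == (l > 0)
    return sum(any(lit(l) for l in c) for c in clauses)
```

With #variables=1000 and #clauses=4000, a random assignment satisfies 7/8 of the clauses (3500) in expectation, which is why the plot starts near 3550.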
[Subsequent slides add panels: a second satisfied-clauses plot, and the number of unmutated loci (0–900) vs. generations]
SK Spin Glasses System
#spins=1000
fitness(σ) = ∑_{1≤i<j≤ℓ} J_ij σ_i σ_j, with each J_ij drawn from N(0, 1) and each σ_i drawn from {−1, +1}
[Figure: fitness (0–2.5 × 10⁴) vs. generations (0–8000)]
#trials=10. Error bars show standard error.
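The SK fitness function in code (a direct transcription of the sum above; storing the couplings in a dict keyed by index pairs is an implementation choice):

```python
import random

def sk_instance(n, rng=random.Random(0)):
    """Couplings J_ij ~ N(0, 1) for all pairs i < j."""
    return {(i, j): rng.gauss(0.0, 1.0)
            for i in range(n) for j in range(i + 1, n)}

def sk_fitness(J, spins):
    """fitness(sigma) = sum over i < j of J_ij * sigma_i * sigma_j,
    with each spin in {-1, +1}."""
    return sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
```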
[Subsequent slides add panels: a second fitness plot, and the number of unmutated loci (0–1000) vs. generations]
Conclusion
Summary
Proposed the Hyperclimbing Hypothesis
Weak assumptions about the distribution of fitness
Based on a computational efficiency of GAs with UX
Upfront proof of concept
Testable predictions + Validation on
Uniform Random MAX-3SAT
SK Spin Glasses Problem
Corollaries
Implicit parallelism is real
Not your grandmother’s implicit parallelism:
the effects of coarse schema partitions are evaluated in parallel
NOT the average fitness values of short schemata
the defining length of a schema partition is immaterial
The new implicit parallelism is more powerful
Think in terms of hyperscapes, not landscapes
What Now?
Flesh out the Hypothesis
What does a deceptive hyperscape look like?
More Testable Predictions + Validation
Are staggered coarse conditional effects indeed common?
Generalization
Explain optimization in other EAs
Outreach
Identify and efficiently solve problems familiar to theoretical computer scientists
Exploitation
Construct better representations
Choose better parameters for existing EAs
Construct better EAs (e.g. backtracking EAs)
Questions?
follow me @evohackr