An Exploration of Generalization and Overfitting in Genetic Programming: Standard and Geometric Semantic Approaches. Ivo Carlos Pereira Gonçalves
Ivo Carlos Pereira Gonçalves. An Exploration of Generalization and Overfitting in Genetic Programming: Standard and Geometric Semantic Approaches. Doctoral thesis submitted to the Doctoral Program in Information Science and Technology, supervised by Associate Professor Carlos Manuel Mira da Fonseca and Principal Investigator Sara Guilherme Oliveira da Silva, and presented to the Department of Informatics Engineering of the Faculty of Sciences and Technology of the University of Coimbra. November 2016.
[Figure: syntax-tree examples encoding the expressions X3/X1 · (X2 + 0.42) and X3²/X2]
[Figure: example with k = 1.1]
[Figure: training error and two generalization-error scenarios versus iterations, with the generalization optimum marked]
[Figure: prefix-tree representations — the tree (* (/ X3 X1) (+ X2 0.42)) encodes X3/X1 · (X2 + 0.42), and the tree (/ (* X3 X3) X2) encodes X3²/X2]
[Figure: standard subtree crossover — a crossover point is chosen in each parent and the subtrees rooted at those points are swapped, producing Offspring 1 and Offspring 2]
[Figure: standard subtree mutation — a mutation point is chosen in the parent and the subtree rooted there is replaced by a randomly generated tree, producing the offspring]
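Subtree crossover and mutation, as depicted, can be sketched on nested-tuple trees. This is a minimal illustration under assumed conventions (a tree is a terminal or an operator tuple); all names are hypothetical and not the thesis's implementation:

```python
import random

# A tree is either a terminal (str/float) or a tuple (op, left, right).
def nodes(t, path=()):
    """Enumerate (path, subtree) pairs for every node in the tree."""
    yield path, t
    if isinstance(t, tuple):
        for i, child in enumerate(t[1:], start=1):
            yield from nodes(child, path + (i,))

def replace(t, path, new):
    """Return a copy of t with the subtree at path replaced by new."""
    if not path:
        return new
    lst = list(t)
    lst[path[0]] = replace(t[path[0]], path[1:], new)
    return tuple(lst)

def subtree_crossover(p1, p2, rng=random):
    """Swap a randomly chosen subtree of each parent."""
    path1, sub1 = rng.choice(list(nodes(p1)))
    path2, sub2 = rng.choice(list(nodes(p2)))
    return replace(p1, path1, sub2), replace(p2, path2, sub1)

def subtree_mutation(p, random_tree, rng=random):
    """Replace a randomly chosen subtree of the parent with a random tree."""
    path, _ = rng.choice(list(nodes(p)))
    return replace(p, path, random_tree)

# The example trees from the figures above:
t1 = ('*', ('/', 'X3', 'X1'), ('+', 'X2', 0.42))  # X3/X1 * (X2 + 0.42)
t2 = ('/', ('*', 'X3', 'X3'), 'X2')               # X3^2 / X2
o1, o2 = subtree_crossover(t1, t2)
```

Because crossover points are chosen independently of semantics, the offspring's behavior can differ arbitrarily from both parents — the contrast with the geometric semantic operators discussed later.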
[Figure/notation residue: ER(n) and ER(min); y, ymin, ymax; α = 0.5]
[Figures: training error, generalization error, overfitting, generalization-error boxplots, tree size, and tree depth for standard GP versus the RST variants (RST 5%, 25%, 50%, 75%, 95%); statistical significance (p-values) reported]
[Figures: training error, generalization error, tree size, and tree depth for standard GP, RST, and the IA and IS variants (5%, 10%, 15%, 20%, 25%); statistical significance (p-values) reported]
[Figures: training error, generalization error, overfitting, generalization-error boxplots, tree size, and tree depth for standard GP versus the RI variants (RI 99%, 95%, 75%, 50%, 25%, 5%), plus a generalization-error comparison of RI against NRI (75% and 50%); statistical significance (p-values) reported]
Given two parent functions T1, T2 : Rⁿ → R, geometric semantic crossover produces the offspring T_XO = (T1 · T_R) + ((1 − T_R) · T2), where T_R is a random real function with codomain [0, 1]. Given a parent function T : Rⁿ → R and a mutation step ms, geometric semantic mutation produces the offspring T_M = T + ms · (T_R1 − T_R2), where T_R1 and T_R2 are random real functions. When the outputs of the random trees are bounded to [0, 1] (e.g., by passing them through a logistic function), the difference T_R1 − T_R2 lies in [−1, 1], so each mutation perturbs the semantics by at most ms in each dimension, i.e., within [−ms, ms].
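Because these operators act geometrically on semantics, their effect can be computed directly on output vectors. A minimal sketch in plain Python (function names are illustrative; the random trees are assumed already evaluated on the training inputs):

```python
def gs_crossover_semantics(s1, s2, sr):
    """Semantics of T_XO = (T1 * T_R) + ((1 - T_R) * T2).
    s1, s2: parent output vectors; sr: random-tree outputs in [0, 1]."""
    return [r * a + (1 - r) * b for a, b, r in zip(s1, s2, sr)]

def gs_mutation_semantics(s, sr1, sr2, ms):
    """Semantics of T_M = T + ms * (T_R1 - T_R2).
    With sr1, sr2 in [0, 1], each perturbation lies in [-ms, ms]."""
    return [a + ms * (r1 - r2) for a, r1, r2 in zip(s, sr1, sr2)]
```

On each training case, the crossover offspring's output is a convex combination of the parents' outputs, so it always lies between them — the geometric property that makes the semantic fitness landscape unimodal.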
[Figures: training and generalization error for GSGP with unbounded mutation (UM) and bounded mutation (BM) under different mutation steps (ms)]
Two alternatives to the logistic function are considered for bounding the outputs of the random trees: a piecewise-linear function, f(x) = 0 for x < 0, f(x) = 1 for x > 1, and f(x) = x otherwise; and a sine-based function, f(x) = sin(x), with the codomain restricted to [0, 1].
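The bounding functions used to constrain the random trees' outputs — logistic, piecewise-linear clamp, and sine — can be sketched as follows. The logistic form 1/(1 + e^(−x)) is the standard choice in the GSGP literature; how the sine variant is rescaled to [0, 1] is not recoverable here, so a commonly used affine rescaling is shown as an assumption:

```python
import math

def logistic(x):
    """Standard logistic function; codomain (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def linear_clamp(x):
    """f(x) = 0 for x < 0, 1 for x > 1, x otherwise; codomain [0, 1]."""
    if x < 0:
        return 0.0
    if x > 1:
        return 1.0
    return x

def sine_bounded(x):
    """sin(x) affinely rescaled to [0, 1] (assumed rescaling)."""
    return (math.sin(x) + 1.0) / 2.0
```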
[Figures: training and generalization error for GSGP BM with logistic, linear, and sine bounding functions]
[Figures: training error, generalization error, overfitting, percentage of crossover improvements (over the best individual and over the best parent), percentage of mutation improvements (over the best individual and over the parent), tree size, and tree depth for standard GP, standard GP with depth limit 17 (Standard DL 17), GSGP UM/BM, GSGP UM NC / GSGP BM NC, and the semantic stochastic hill climbers SSHC UM/BM]
The mutation step ms bounds each semantic perturbation to [−ms, ms]. An individual obtained from parent P by two mutations can be written I_2M = P + R1 · ms + R2 · ms for random individuals R1 and R2. Given a parent P and a random individual RI, one can instead ask for the step that takes the offspring exactly to the target t: P + RI · ms = t, i.e., RI · ms = (t − P). This is a linear system of the form A x = y with the single unknown ms, which can be solved from the outputs of RI, P, and t (in the least-squares sense). The same relation holds pointwise (pw): P_pw + RI · ms = t.
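The condition P + RI · ms = t is an overdetermined linear system in the single unknown ms; its least-squares solution is the projection coefficient of (t − P) onto RI. A sketch in plain Python (all names illustrative):

```python
def best_mutation_step(p, ri, t):
    """Least-squares ms for P + RI * ms = t over all training cases.
    p, ri, t: output vectors of the parent, the random individual,
    and the target. Minimizes sum_i (p_i + ri_i * ms - t_i)^2."""
    num = sum(r * (ti - pi) for r, ti, pi in zip(ri, t, p))
    den = sum(r * r for r in ri)
    return num / den  # assumes RI is not the zero vector

# If RI happens to point exactly along (t - P), a single mutation
# with the computed step corrects the parent to the target:
p = [1.0, 2.0, 3.0]
t = [2.0, 4.0, 6.0]
ri = [0.5, 1.0, 1.5]          # proportional to t - p
ms = best_mutation_step(p, ri, t)
```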
[Figures: training and generalization error, and percentage of mutation improvements over the best individual, for SSHC UM/BM and their adaptive variants SSHC AUM, ABM, DAUM, and DABM]
Let X = {x1, x2, ..., xn} be the set of training inputs and t = [t1, t2, ..., tn] the target vector, where ti is the desired output for input xi. Denoting by I(xi) the output of individual I on xi, the semantics of I is the vector s_I = [I(x1), I(x2), ..., I(xn)], a point in the n-dimensional semantic space. The error vector of I is e_I = s_I − t.
Two individuals A and B are optimally aligned if there is a constant k such that e_A = k · e_B. More generally, the alignment of A and B can be measured by the cosine of the angle θ between their error vectors: cos θ = (e_A · e_B) / (‖e_A‖ ‖e_B‖). If A and B are optimally aligned, then s_A − t = k · (s_B − t), which can be solved for the target: t = (1/(1 − k)) · s_A − (k/(1 − k)) · s_B.
Consequently, from two optimally aligned individuals A and B the target is reconstructed exactly by the combined individual I_opt = (1/(1 − k)) · A − (k/(1 − k)) · B, where k can be estimated from the component-wise ratios of the error vectors: e_A1/e_B1, e_A2/e_B2, ..., e_An/e_Bn.
In practice, given an individual A and a chosen k, the error vector its partner B must have is e_B = e_A / k, and hence the semantics B must exhibit are s_B = e_B + t. The search for B then reduces to searching for a tree whose semantics approximate s_B (the experiments use k = 1.1).
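The error-space alignment identities above can be checked numerically. A sketch in plain Python (illustrative names) that builds two optimally aligned semantics vectors and reconstructs the target:

```python
import math

def error_vector(s, t):
    """e_I = s_I - t, component-wise."""
    return [si - ti for si, ti in zip(s, t)]

def cosine(u, v):
    """Cosine of the angle between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def combine(s_a, s_b, k):
    """Semantics of I_opt = (1/(1-k)) A - (k/(1-k)) B (requires k != 1)."""
    c1, c2 = 1.0 / (1.0 - k), k / (1.0 - k)
    return [c1 * a - c2 * b for a, b in zip(s_a, s_b)]

t = [1.0, 2.0, 3.0]                              # target vector
e_b = [0.4, -0.2, 1.0]                           # error vector of B
k = 1.1
s_b = [ti + ei for ti, ei in zip(t, e_b)]        # semantics of B
s_a = [ti + k * ei for ti, ei in zip(t, e_b)]    # e_A = k * e_B
```

Substituting s_A = t + k·e_B and s_B = t + e_B into the combination makes the e_B terms cancel, so `combine(s_a, s_b, k)` recovers t exactly (up to floating-point error), and the cosine between the two error vectors has absolute value 1.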
[Figures: absolute cosine similarity between error vectors, with k = 1.1]
[Figures: training and generalization error for ACA SSHC versus SSHC, with k = 1.1; additional results with k = 0.9 and k = 0.1; statistical significance (p-values) reported]
A neuron combines its inputs xi through weights wi and a bias: the output is y = f(x · w + bias), where f is the activation function. [Figure: a neuron with inputs x1, x2, x3, weights w1, w2, w3, bias b, a summation node Σ, an activation function f, and output y]
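The neuron's computation can be sketched as follows (a logistic activation is shown for illustration; the surrounding text considers several activation functions):

```python
import math

def neuron(x, w, bias, f=lambda z: 1.0 / (1.0 + math.exp(-z))):
    """y = f(x . w + bias): weighted sum of the inputs through activation f."""
    z = sum(xi * wi for xi, wi in zip(x, w)) + bias
    return f(z)

# Weighted sum is 0.5 - 0.5 + 0.0 = 0, and logistic(0) = 0.5:
y = neuron([1.0, 2.0, 3.0], [0.5, -0.25, 0.0], bias=0.0)
```

Passing `f=lambda z: z` gives the identity activation, i.e., a plain weighted sum.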
The identity activation f(x) = x is used for the networks playing the role of T_R1 and T_R2.
[Figure: a feedforward neural network with an input layer of five inputs, one hidden layer, and an output layer producing a single output]
Connection weights are drawn from [−ls, ls], where ls is the learning step; with outputs kept within [−1, 1], scaling by the mutation step ms bounds the semantic perturbation to [−ms, ms]. [Figure: geometric semantic mutation for neural networks — a parent neural network is combined with a random neural network to produce the resulting neural network]
[Figures: training and generalization error over iterations/generations for different learning step (LS) values]
[Figures: training and generalization error over iterations/generations for different SS settings]
[Figures: training and generalization error over iterations/generations, and generalization-error boxplots, for SP 0%, 5%, 25%, 50%, 75%, and 95%]
[Figures: training and generalization error, and generalization-error boxplots, for SLM FLS, SLM FLS SP 75%, SLM FLS SP 95%, and SSHC BM over iterations/generations]
[Figures: training and generalization error, and generalization-error boxplots, for SLM OLS and its SP variants (5%, 25%, 50%, 75%, 95%) and SSHC ABM over iterations/generations]
[Figures: training and generalization error over iterations/generations for SLM FLS NWD and SLM OLS NWD with SP 0%, 5%, 25%, 50%, 75%, and 95%]
[Figures: percentage of models per Error Deviation (ED) variation (ED decrease versus ED increase) and percentage of models per error variation against the current best (error decrease versus error increase), over iterations/generations]
[Figures: generalization error and number of iterations/generations for various percentage thresholds; statistical significance (p-values) reported]
[Figures: generalization error and number of iterations/generations comparing the EDV and TIE stopping criteria]
[Figures: percentage of models per ED variation and per error variation against the current best, over iterations/generations; statistical significance (p-values) reported]
[Figures: generalization error and number of iterations/generations for various percentage thresholds; statistical significance (p-values) reported]
[Figures: generalization error and number of iterations/generations for SLM FLS EDV, SLM FLS TIE, and SLM OLS EDV]
[Figures: percentage of models per ED variation and per error variation against the current best, over iterations/generations; statistical significance (p-values) reported]
[Figures: generalization error and number of iterations/generations for SSHC BM EDV, SSHC BM TIE, and SSHC ABM EDV; statistical significance (p-values) reported]
[Figures: generalization-error comparison of FLS EDV, FLS TIE, and OLS EDV against SVR, MLP, RT, and LR]
[Figures: computational time (in seconds) for FLS EDV, FLS TIE, and OLS EDV]
arXiv:16.6195v1 [cs.NE] 19 Jun 2017. Unsure When to Stop? Ask Your Semantic Neighbors. ABSTRACT. Ivo Gonçalves, NOVA IMS, Universidade Nova de Lisboa, -312 Lisbon, Portugal, igoncalves@novaims.unl.pt. Carlos M.
More informationModified Order Crossover (OX) Operator
Modified Order Crossover (OX) Operator Ms. Monica Sehrawat 1 N.C. College of Engineering, Israna Panipat, Haryana, INDIA. Mr. Sukhvir Singh 2 N.C. College of Engineering, Israna Panipat, Haryana, INDIA.
More informationEvolutionary Algorithms. CS Evolutionary Algorithms 1
Evolutionary Algorithms CS 478 - Evolutionary Algorithms 1 Evolutionary Computation/Algorithms Genetic Algorithms l Simulate natural evolution of structures via selection and reproduction, based on performance
More informationMULTI-RESPONSE SIMULATION OPTIMIZATION USING STOCHASTIC GENETIC SEARCH WITHIN A GOAL PROGRAMMING FRAMEWORK. Felipe F. Baesler José A.
Proceedings of the 000 Winter Simulation Conference J. A. Joines, R. R. Barton, K. Kang, and P. A. Fishwick, eds. MULTI-RESPONSE SIMULATION OPTIMIZATION USING STOCHASTIC GENETIC SEARCH WITHIN A GOAL PROGRAMMING
More informationNEURO-PREDICTIVE CONTROL DESIGN BASED ON GENETIC ALGORITHMS
NEURO-PREDICTIVE CONTROL DESIGN BASED ON GENETIC ALGORITHMS I.Sekaj, S.Kajan, L.Körösi, Z.Dideková, L.Mrafko Institute of Control and Industrial Informatics Faculty of Electrical Engineering and Information
More informationExploration of Pareto Frontier Using a Fuzzy Controlled Hybrid Line Search
Seventh International Conference on Hybrid Intelligent Systems Exploration of Pareto Frontier Using a Fuzzy Controlled Hybrid Line Search Crina Grosan and Ajith Abraham Faculty of Information Technology,
More informationNCGA : Neighborhood Cultivation Genetic Algorithm for Multi-Objective Optimization Problems
: Neighborhood Cultivation Genetic Algorithm for Multi-Objective Optimization Problems Shinya Watanabe Graduate School of Engineering, Doshisha University 1-3 Tatara Miyakodani,Kyo-tanabe, Kyoto, 10-031,
More informationDE/EDA: A New Evolutionary Algorithm for Global Optimization 1
DE/EDA: A New Evolutionary Algorithm for Global Optimization 1 Jianyong Sun, Qingfu Zhang and Edward P.K. Tsang Department of Computer Science, University of Essex, Wivenhoe Park, Colchester, CO4 3SQ,
More informationResearch Article Designing Artificial Neural Networks Using Particle Swarm Optimization Algorithms
Computational Intelligence and Neuroscience Volume, Article ID 99, pages http://dx.doi.org/.//99 Research Article Designing Artificial Neural Networks Using Particle Swarm Optimization Algorithms Beatriz
More informationA New Multi-objective Multi-mode Model for Optimizing EPC Projects in Oil and Gas Industry
A New Multi-objective Multi-mode Model for Optimizing EPC Projects in Oil and Gas Industry Vida Arabzadeh, Hassan Haleh, and S.M.R. Khalili Abstract the objective of this paper is implementing optimization
More informationCAD Algorithms. Placement and Floorplanning
CAD Algorithms Placement Mohammad Tehranipoor ECE Department 4 November 2008 1 Placement and Floorplanning Layout maps the structural representation of circuit into a physical representation Physical representation:
More informationLECTURE NOTES Professor Anita Wasilewska NEURAL NETWORKS
LECTURE NOTES Professor Anita Wasilewska NEURAL NETWORKS Neural Networks Classifier Introduction INPUT: classification data, i.e. it contains an classification (class) attribute. WE also say that the class
More informationSurrogate Genetic Programming: A Semantic Aware Evolutionary Search
Surrogate Genetic Programming: A Semantic Aware Evolutionary Search Ahmed Kattan AI Real-world Application Lab, UQU, Saudi Arabia Yew-Soon Ong Computer Science Department in Nanyang Technological University,
More informationA Learning Automata-based Memetic Algorithm
A Learning Automata-based Memetic Algorithm M. Rezapoor Mirsaleh and M. R. Meybodi 2,3 Soft Computing Laboratory, Computer Engineering and Information Technology Department, Amirkabir University of Technology,
More informationChapter 5 Components for Evolution of Modular Artificial Neural Networks
Chapter 5 Components for Evolution of Modular Artificial Neural Networks 5.1 Introduction In this chapter, the methods and components used for modular evolution of Artificial Neural Networks (ANNs) are
More informationIntroduction to Design Optimization: Search Methods
Introduction to Design Optimization: Search Methods 1-D Optimization The Search We don t know the curve. Given α, we can calculate f(α). By inspecting some points, we try to find the approximated shape
More informationBinary Representations of Integers and the Performance of Selectorecombinative Genetic Algorithms
Binary Representations of Integers and the Performance of Selectorecombinative Genetic Algorithms Franz Rothlauf Department of Information Systems University of Bayreuth, Germany franz.rothlauf@uni-bayreuth.de
More informationAPPLICATIONS OF INTELLIGENT HYBRID SYSTEMS IN MATLAB
APPLICATIONS OF INTELLIGENT HYBRID SYSTEMS IN MATLAB Z. Dideková, S. Kajan Institute of Control and Industrial Informatics, Faculty of Electrical Engineering and Information Technology, Slovak University
More informationSearch Space Boundary Extension Method in Real-Coded Genetic Algorithms
Information Sciences, Vol. 33/3-4, pp. 9-47 (00.5) Search Space Boundary Extension Method in Real-Coded Genetic Algorithms Shigeyoshi Tsutsui* and David E. Goldberg** * Department of Management and Information
More informationA Ranking and Selection Strategy for Preference-based Evolutionary Multi-objective Optimization of Variable-Noise Problems
A Ranking and Selection Strategy for Preference-based Evolutionary Multi-objective Optimization of Variable-Noise Problems COIN Report Number 2016002 Florian Siegmund, Amos H.C. Ng School of Engineering
More informationEnergy-Aware Scheduling of Distributed Systems Using Cellular Automata
Energy-Aware Scheduling of Distributed Systems Using Cellular Automata Pragati Agrawal and Shrisha Rao pragati.agrawal@iiitb.org, shrao@ieee.org Abstract In today s world of large distributed systems,
More informationIntroduction to Design Optimization: Search Methods
Introduction to Design Optimization: Search Methods 1-D Optimization The Search We don t know the curve. Given α, we can calculate f(α). By inspecting some points, we try to find the approximated shape
More informationCatholic Central High School
Catholic Central High School Algebra II Practice Examination I Instructions: 1. Show all work on the test copy itself for every problem where work is required. Points may be deducted if insufficient or
More informationSupervised Learning in Neural Networks (Part 2)
Supervised Learning in Neural Networks (Part 2) Multilayer neural networks (back-propagation training algorithm) The input signals are propagated in a forward direction on a layer-bylayer basis. Learning
More informationA novel supervised learning algorithm and its use for Spam Detection in Social Bookmarking Systems
A novel supervised learning algorithm and its use for Spam Detection in Social Bookmarking Systems Anestis Gkanogiannis and Theodore Kalamboukis Department of Informatics Athens University of Economics
More informationA Comparison of the Iterative Fourier Transform Method and. Evolutionary Algorithms for the Design of Diffractive Optical.
A Comparison of the Iterative Fourier Transform Method and Evolutionary Algorithms for the Design of Diffractive Optical Elements Philip Birch, Rupert Young, Maria Farsari, David Budgett, John Richardson,
More informationEvolution Strategies in the Multipoint Connections Routing
408 L. KRULIKOVSKÁ, J. FILANOVÁ, J. PAVLOVIČ, EVOLUTION STRATEGIES IN THE MULTIPOINT CONNECTIONS ROUTING Evolution Strategies in the Multipoint Connections Routing Lenka KRULIKOVSKÁ, Jana FILANOVÁ, Juraj
More informationMulti-Objective Optimization Using Genetic Algorithms
Multi-Objective Optimization Using Genetic Algorithms Mikhail Gaerlan Computational Physics PH 4433 December 8, 2015 1 Optimization Optimization is a general term for a type of numerical problem that involves
More informationHYBRID GENETIC ALGORITHM WITH GREAT DELUGE TO SOLVE CONSTRAINED OPTIMIZATION PROBLEMS
HYBRID GENETIC ALGORITHM WITH GREAT DELUGE TO SOLVE CONSTRAINED OPTIMIZATION PROBLEMS NABEEL AL-MILLI Financial and Business Administration and Computer Science Department Zarqa University College Al-Balqa'
More informationEvolutionary Approaches for Resilient Surveillance Management. Ruidan Li and Errin W. Fulp. U N I V E R S I T Y Department of Computer Science
Evolutionary Approaches for Resilient Surveillance Management Ruidan Li and Errin W. Fulp WAKE FOREST U N I V E R S I T Y Department of Computer Science BioSTAR Workshop, 2017 Surveillance Systems Growing
More informationDept. of Computer Science. The eld of time series analysis and forecasting methods has signicantly changed in the last
Model Identication and Parameter Estimation of ARMA Models by Means of Evolutionary Algorithms Susanne Rolf Dept. of Statistics University of Dortmund Germany Joachim Sprave y Dept. of Computer Science
More informationAbstract. 1 Introduction
Shape optimal design using GA and BEM Eisuke Kita & Hisashi Tanie Department of Mechano-Informatics and Systems, Nagoya University, Nagoya 464-01, Japan Abstract This paper describes a shape optimization
More informationRecursive Similarity-Based Algorithm for Deep Learning
Recursive Similarity-Based Algorithm for R Tomasz Maszczyk & W lodzis law Duch Nicolaus Copernicus University Toruń, Poland ICONIP 2012 {tmaszczyk,wduch}@is.umk.pl 1 / 21 R Similarity-Based Learning ()
More informationPreliminary Background Tabu Search Genetic Algorithm
Preliminary Background Tabu Search Genetic Algorithm Faculty of Information Technology University of Science Vietnam National University of Ho Chi Minh City March 2010 Problem used to illustrate General
More informationAero-engine PID parameters Optimization based on Adaptive Genetic Algorithm. Yinling Wang, Huacong Li
International Conference on Applied Science and Engineering Innovation (ASEI 215) Aero-engine PID parameters Optimization based on Adaptive Genetic Algorithm Yinling Wang, Huacong Li School of Power and
More informationLinear Separability. Linear Separability. Capabilities of Threshold Neurons. Capabilities of Threshold Neurons. Capabilities of Threshold Neurons
Linear Separability Input space in the two-dimensional case (n = ): - - - - - - w =, w =, = - - - - - - w = -, w =, = - - - - - - w = -, w =, = Linear Separability So by varying the weights and the threshold,
More informationUSING REGRESSION TREES IN PREDICTIVE MODELLING
Production Systems and Information Engineering Volume 4 (2006), pp. 115-124 115 USING REGRESSION TREES IN PREDICTIVE MODELLING TAMÁS FEHÉR University of Miskolc, Hungary Department of Information Engineering
More informationArtificial Neural Networks MLP, RBF & GMDH
Artificial Neural Networks MLP, RBF & GMDH Jan Drchal drchajan@fel.cvut.cz Computational Intelligence Group Department of Computer Science and Engineering Faculty of Electrical Engineering Czech Technical
More informationLecture 6: Genetic Algorithm. An Introduction to Meta-Heuristics, Produced by Qiangfu Zhao (Since 2012), All rights reserved
Lecture 6: Genetic Algorithm An Introduction to Meta-Heuristics, Produced by Qiangfu Zhao (Since 2012), All rights reserved Lec06/1 Search and optimization again Given a problem, the set of all possible
More informationA Pareto Archive Evolutionary Strategy Based Radial Basis Function Neural Network Training Algorithm for Failure Rate Prediction in Overhead Feeders
A Pareto Archive Evolutionary Strategy Based Radial Basis Function Neural Network raining Algorithm for Failure Rate Prediction in Overhead Feeders Grant Cochenour grc3484@ksu.edu Jerad Simon jesimon@ksu.edu
More informationEffects of Constant Optimization by Nonlinear Least Squares Minimization in Symbolic Regression
Effects of Constant Optimization by Nonlinear Least Squares Minimization in Symbolic Regression Michael Kommenda, Gabriel Kronberger, Stephan Winkler, Michael Affenzeller, and Stefan Wagner Contact: Michael
More informationReinforcement Learning-Based Path Planning for Autonomous Robots
Reinforcement Learning-Based Path Planning for Autonomous Robots Dennis Barrios Aranibar 1, Pablo Javier Alsina 1 1 Laboratório de Sistemas Inteligentes Departamento de Engenharia de Computação e Automação
More informationOPTIMIZATION OF MACHINING PARAMETERS FOR FACE MILLING OPERATION IN A VERTICAL CNC MILLING MACHINE USING GENETIC ALGORITHM
OPTIMIZATION OF MACHINING PARAMETERS FOR FACE MILLING OPERATION IN A VERTICAL CNC MILLING MACHINE USING GENETIC ALGORITHM Milon D. Selvam Research Scholar, Department of Mechanical Engineering, Dr.A.K.Shaik
More informationChapter 7: Computation of the Camera Matrix P
Chapter 7: Computation of the Camera Matrix P Arco Nederveen Eagle Vision March 18, 2008 Arco Nederveen (Eagle Vision) The Camera Matrix P March 18, 2008 1 / 25 1 Chapter 7: Computation of the camera Matrix
More informationArtificial Intelligence
Artificial Intelligence Dr Ahmed Rafat Abas Computer Science Dept, Faculty of Computers and Informatics, Zagazig University arabas@zu.edu.eg http://www.arsaliem.faculty.zu.edu.eg/ Informed search algorithms
More informationCHAPTER 4 GENETIC ALGORITHM
69 CHAPTER 4 GENETIC ALGORITHM 4.1 INTRODUCTION Genetic Algorithms (GAs) were first proposed by John Holland (Holland 1975) whose ideas were applied and expanded on by Goldberg (Goldberg 1989). GAs is
More informationHardware Neuronale Netzwerke - Lernen durch künstliche Evolution (?)
SKIP - May 2004 Hardware Neuronale Netzwerke - Lernen durch künstliche Evolution (?) S. G. Hohmann, Electronic Vision(s), Kirchhoff Institut für Physik, Universität Heidelberg Hardware Neuronale Netzwerke
More informationentire search space constituting coefficient sets. The brute force approach performs three passes through the search space, with each run the se
Evolving Simulation Modeling: Calibrating SLEUTH Using a Genetic Algorithm M. D. Clarke-Lauer 1 and Keith. C. Clarke 2 1 California State University, Sacramento, 625 Woodside Sierra #2, Sacramento, CA,
More informationA Genetic Algorithm to the Strategic Pricing Problem in Competitive Electricity Markets
A Genetic Algorithm to the Strategic Pricing Problem in Competitive Electricity Markets Wagner Pimentel Centro Federal de Educação Tecnológica Celso Suckow da Fonseca Unidade de Ensino Descentralizada
More informationA Genetic Algorithm for Robust Motion Planning
A Genetic Algorithm for Robust Motion Planning Domingo Gallardo, Otto Colomina, Francisco Flórez, Ramón Rizo domingo,otto,florez,rizo@dtic.ua.es Grupo i3a: Informatica Industrial e Inteligencia Artificial
More informationRecombination of Similar Parents in EMO Algorithms
H. Ishibuchi and K. Narukawa, Recombination of parents in EMO algorithms, Lecture Notes in Computer Science 341: Evolutionary Multi-Criterion Optimization, pp. 265-279, Springer, Berlin, March 25. (Proc.
More informationSemantically-Driven Search Techniques for Learning Boolean Program Trees
Semantically-Driven Search Techniques for Learning Boolean Program Trees by Nicholas Charles Miller Bachelor of Science Computer Engineering Georgia Institute of Technology 2006 A thesis submitted to Florida
More information