A Multi-Objective Constraint-Handling Method with PSO Algorithm for Constrained Engineering Optimization Problems

Lily D. Li (1), Xiaodong Li (2), Xinghuo Yu (2)
(1) Central Queensland University, Australia
(2) RMIT University, Australia

IEEE Congress on Evolutionary Computation 2008

Outline
- Background
- Problem formulation
- The proposed approach
  - PSO algorithm
  - Constraint handling
- Simulation results
- Conclusion

Background
- Real-world optimization problems are constrained
- Constraint handling is a key issue
- Evolutionary algorithms for optimization
  - successful on unconstrained optimization problems
  - constraint handling remains problematic because their original versions have no mechanism to incorporate constraints into the fitness function

Background (cont.)
- Constraint handling via Genetic Algorithms has been studied
- Constraint handling via the Particle Swarm Optimization algorithm has seen only a few attempts

Problem formulation
- Constrained single-objective optimization problem

  minimize f(X)
  subject to g_i(X) ≤ 0,  i = 1, 2, ..., m        (1)

  where X = [x_1, x_2, ..., x_n], each x_i (i = 1, 2, ..., n) is bounded by lower and upper limits L_i ≤ x_i ≤ U_i, and m is the total number of constraints.

Problem formulation (cont.)
- Multi-objective constraint-handling method
  - Treat constraints as objectives
  - Transform the single-objective constrained problem into a bi-objective unconstrained problem

  minimize F(X) = (f(X), Φ(X))
  where Φ(X) = Σ_{i=1}^{m} max(0, g_i(X))        (2)

  Φ is the total amount of constraint violation
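As an illustration, a minimal Python sketch of the violation measure Φ in Equation (2); the constraint functions are assumed to be supplied as a list of callables, and the function name is illustrative, not from the paper:

```python
def total_violation(x, constraints):
    """Phi(X) = sum of max(0, g_i(X)) over all inequality constraints g_i(X) <= 0."""
    return sum(max(0.0, g(x)) for g in constraints)
```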


Problem formulation (cont.)
- A general multi-objective optimization (MOO) problem
  - Find the Pareto front
  - Apply high-level information to find the final solution
- For the problem in Equation (2)
  - Constraint satisfaction (Φ = 0 or Φ ≤ ε) is a must
  - It can be used as the high-level information

Problem formulation (cont.)
[Figure: the Pareto front, the feasible solutions, and the desired constrained minimum for a bi-objective constraint-handling optimization problem]

Problem formulation (cont.)
- If a predefined target is given, the problem can be transformed into

  minimize F(X) = (f(X) − t, Φ(X))
  goal     Φ(X) ≤ ε and f(X) − t ≤ Δ        (3)

  t: predefined target
  ε: allowed error on the constraints
  Δ: allowed error on the target
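A small Python sketch of the goal test in Equation (3); `f`, `phi`, `target`, `eps`, and `delta` are illustrative names for f(X), Φ(X), t, ε, and Δ:

```python
def goal_reached(f, phi, target, eps, delta):
    """True when the constraints are satisfied within eps and the objective is within delta of the target."""
    return phi <= eps and f - target <= delta
```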


Problem formulation (cont.)
[Figure: the solution area for a bi-objective constraint-handling optimization problem]

Proposed approach
- Integrate the multi-objective constraint-handling method into the PSO algorithm
- Adopt the domination concept from multi-objective optimization to decide particles' behaviour

PSO algorithm
- A stochastic search method
- A metaphor of the social behaviour of flocks of birds and schools of fish
- Linked to evolutionary computation
- No "survival of the fittest"

PSO algorithm
- Formula

  v_id(t+1) = χ [ w v_id(t) + c1 r1_id(t) (pBest_id(t) − x_id(t)) + c2 r2_id(t) (lBest_id(t) − x_id(t)) ]
  v_id(t+1) = Vmax,    if v_id(t+1) > Vmax
  v_id(t+1) = −Vmax,   if v_id(t+1) < −Vmax        (a)

  x_id(t+1) = x_id(t) + v_id(t+1)        (b)

  v_id(t+1): velocity of particle i, dimension d, generation t+1
  x_id(t+1): position of particle i, dimension d, generation t+1
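A minimal Python sketch of the update rules (a) and (b); the parameter names follow the slides (χ, w, c1, c2, Vmax), while the function name and array handling are illustrative assumptions:

```python
import random

def update_particle(x, v, p_best, l_best, chi, w, c1, c2, v_max):
    """One PSO step for a single particle: velocity update (a), clamping, position update (b)."""
    new_v, new_x = [], []
    for d in range(len(x)):
        r1, r2 = random.random(), random.random()
        vd = chi * (w * v[d]
                    + c1 * r1 * (p_best[d] - x[d])
                    + c2 * r2 * (l_best[d] - x[d]))
        vd = max(-v_max[d], min(v_max[d], vd))   # clamp to [-Vmax, Vmax]
        new_v.append(vd)
        new_x.append(x[d] + vd)                  # position update (b)
    return new_x, new_v
```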


Constraint Handling
- Multi-objective constraint handling
- Adopt the domination concept

  Definition: a solution X(1) is said to dominate another solution X(2) if both conditions hold:
  1. X(1) is no worse than X(2) in all objectives (for all j = 1, 2, ..., m).
  2. X(1) is strictly better than X(2) in at least one objective (for at least one j ∈ {1, 2, ..., m}).

- The second objective Φ has high priority
  - The final result must be selected from solutions with Φ = 0 or Φ ≤ ε
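A small Python sketch of the domination test for the bi-objective pair (F, Φ); minimization of both objectives is assumed, and the function name is illustrative:

```python
def dominates(a, b):
    """True if objective vector a dominates b (minimization): no worse in all, strictly better in at least one."""
    no_worse = all(ai <= bi for ai, bi in zip(a, b))
    strictly_better = any(ai < bi for ai, bi in zip(a, b))
    return no_worse and strictly_better
```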


Selection rules and Diversity Control
- Selection rules
  - Non-dominated particles are better than dominated ones
  - A particle with lower Φ is better than one with higher Φ (it is closer to the feasible area)
- Perturbation
  - With a minor probability p, a few particles are disturbed by regenerating them at random (see the sketch below)
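A minimal Python sketch of the two selection rules combined with the perturbation step; `dominates` is the function sketched above, `evaluate` is assumed to return the pair (F, Φ), and all names are illustrative:

```python
import random

def is_better(obj_a, obj_b):
    """Selection rules: prefer the non-dominated particle; otherwise prefer the lower violation Phi."""
    if dominates(obj_a, obj_b):
        return True
    if dominates(obj_b, obj_a):
        return False
    return obj_a[1] < obj_b[1]   # obj = (F, Phi); a lower Phi is closer to the feasible region

def maybe_perturb(x, obj, lower, upper, p, evaluate):
    """With a small probability p, regenerate the particle at random and keep it only if it is better."""
    if random.random() <= p:
        candidate = [random.uniform(lo, hi) for lo, hi in zip(lower, upper)]
        cand_obj = evaluate(candidate)
        if is_better(cand_obj, obj):
            return candidate, cand_obj
    return x, obj
```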


Modified PSO algorithm

1.  GlobalF = POSITIVE_INFINITY
2.  P0 = URand(Li, Ui)
3.  V0 = 0
4.  F0 = Fitness_F(P0)
5.  Φ0 = Fitness_Φ(P0)
6.  pBest0 = P0
7.  For i = 0 To n
8.      lBesti = LocalBest(Pi-1, Pi, Pi+1)        (Selection rules)
9.  End For
10. Do
11.   For i = 0 To n
12.     Vi+1 = Speed(Pi, Vi, pBesti, lBesti)      (Equation (a))
13.     Pi+1 = Pi + Vi+1                          (Equation (b))
14.     r = URand(0, 1)
15.     If (r ≤ p)
16.       TempPi+1 = Rand(Li, Ui)

Modified PSO algorithm (cont.)

17.       If (TempPi+1 isBetterThan Pi+1)         (Selection rules)
18.         Pi+1 = TempPi+1
19.     Fi+1 = Fitness_F(Pi+1)
20.     Φi+1 = Fitness_Φ(Pi+1)
21.     If (Pi+1 isBetterThan pBesti)             (Selection rules)
22.       pBesti = Pi+1
23.     If (Φi+1 ≤ ε)
24.       If (Fi+1 < GlobalF)
25.         GlobalF = Fi+1
26.   End For
27.   For i = 0 To n
28.     lBesti = LocalBest(Pi-1, Pi, Pi+1)        (Selection rules)
29.   End For
30.   If (GlobalF ≤ Δ)
31.     Output GlobalF and Pi+1
32.     Stop
33. End Do

Features of the modified PSO algorithm
- Whenever fitness is calculated, both objectives F = f(X) − t and Φ are evaluated
- If a particle's new location is better than its past best location, pBest is updated (decided by the selection rules)

Features of the modified PSO algorithm (cont.)
- A particle's best neighboring particle (lBest) is determined in two steps (see the sketch below):
  - Find all the non-dominated particles in the neighborhood
  - If there is only one non-dominated particle in the neighborhood, select it as lBest; otherwise select the one with the lowest Φ as lBest (a lower Φ means closer to the feasible region)
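A minimal Python sketch of this two-step lBest selection over a neighborhood; `dominates` is the function sketched earlier, each particle carries its objective pair (F, Φ), and the data layout is an illustrative assumption:

```python
def local_best(neighborhood_objs, neighborhood_positions):
    """Pick lBest: the single non-dominated neighbor, or the non-dominated neighbor with the lowest Phi."""
    indices = range(len(neighborhood_objs))
    non_dominated = [i for i in indices
                     if not any(dominates(neighborhood_objs[j], neighborhood_objs[i])
                                for j in indices if j != i)]
    best = min(non_dominated, key=lambda i: neighborhood_objs[i][1])  # lowest Phi among non-dominated
    return neighborhood_positions[best]
```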


Features of the modified PSO algorithm (cont.)
- A minor perturbation with probability p is introduced after calculating the next particle position
- If the perturbed particle is better than the particle before perturbation, the particle is replaced with the perturbed one
- After each iteration, check whether the goal has been reached; if so, the iteration ends

Simulation
- Three well-known engineering design problems
  - E01: Welded beam design problem (this problem differs slightly from the original one proposed by Reklaitis et al. in 1983, which has 5 constraints; since many researchers have studied the version with 7 constraints, we include that version for comparison purposes)
  - E02: Pressure vessel design problem
  - E03: Spring design problem

E01: Welded beam design problem

Minimize
  f(X) = 1.10471 x1² x2 + 0.04811 x3 x4 (14.0 + x2)

Subject to
  g1(X) = τ(X) − τmax ≤ 0
  g2(X) = σ(X) − σmax ≤ 0
  g3(X) = x1 − x4 ≤ 0
  g4(X) = 0.10471 x1² + 0.04811 x3 x4 (14 + x2) − 5 ≤ 0
  g5(X) = 0.125 − x1 ≤ 0
  g6(X) = δ(X) − δmax ≤ 0
  g7(X) = P − Pc(X) ≤ 0

E02: Pressure vessel design problem

Minimize
  f(X) = 0.6224 x1 x3 x4 + 1.7781 x2 x3² + 3.1661 x1² x4 + 19.84 x1² x3

Subject to
  g1(X) = 0.0193 x3 − x1 ≤ 0
  g2(X) = 0.00954 x3 − x2 ≤ 0
  g3(X) = 1296000 − π x3² x4 − (4/3) π x3³ ≤ 0
  g4(X) = x4 − 240 ≤ 0
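As an illustration of how a problem of this form plugs into the bi-objective formulation, a Python sketch of E02's objective and constraints, reusing the `total_violation` helper sketched earlier (the function names and target default are illustrative assumptions):

```python
import math

def pressure_vessel_objective(x):
    """f(X) for E02; x = [x1, x2, x3, x4]."""
    x1, x2, x3, x4 = x
    return (0.6224 * x1 * x3 * x4 + 1.7781 * x2 * x3**2
            + 3.1661 * x1**2 * x4 + 19.84 * x1**2 * x3)

# Each g_i(X) <= 0 when satisfied, as listed in the slide.
pressure_vessel_constraints = [
    lambda x: 0.0193 * x[2] - x[0],
    lambda x: 0.00954 * x[2] - x[1],
    lambda x: 1296000 - math.pi * x[2]**2 * x[3] - (4.0 / 3.0) * math.pi * x[2]**3,
    lambda x: x[3] - 240,
]

def evaluate_e02(x, target=0.0):
    """Bi-objective evaluation (F, Phi) of Equation (3) for E02."""
    f = pressure_vessel_objective(x)
    phi = total_violation(x, pressure_vessel_constraints)
    return f - target, phi
```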


E03: Spring design problem

Minimize
  f(X) = (x3 + 2) x2 x1²

Subject to
  g1(X) = 1 − (x2³ x3) / (71785 x1⁴) ≤ 0
  g2(X) = (4 x2² − x1 x2) / (12566 (x2 x1³ − x1⁴)) + 1 / (5108 x1²) − 1 ≤ 0
  g3(X) = 1 − 140.45 x1 / (x2² x3) ≤ 0
  g4(X) = (x2 + x1) / 1.5 − 1 ≤ 0

Parameters
- Neighbourhood topology: ring (circular)
- w = 0 (empirically derived)
- c1 = c2 = 2
- χ = 0.63
- Vmax = 0.5 × (decision variable range)
- p = 0.1%
- Number of particles: 100
- Maximum number of iterations: 10,000
- ε = 1.0E-9
- Δ = 1.0E-04 (E01 and E03), 1.0E-01 (E02), because E02 has a larger magnitude than E01 and E03
- 30 independent runs


Result for E01 - Design variables

Comments:
- Best-known: 1.724852
- Our best result: 1.724852321
- Mean: 1.724861948
- Standard deviation: 2.05462E-05
- Search quality: good
- Consistency: good

Result for E01 - Algorithm performance

Comments:
- Best solution found: better than the other three
- Mean result found: better than the other three
- Standard deviation: better than C&M; slightly worse than CPSO and HES-PSO but acceptable

Result for E02 - Design variables

Comments:
- Best-known: 6059.94634
- Our best result: 5971.4003
- Mean: 6049.1590
- Standard deviation: 22.841537
- Search quality: good
- Consistency: good

Result for E02 - Algorithm performance

Comments:
- Best solution found: better than the other three
- Mean result found: not as good as the other three
- Standard deviation: not as good as the other three
- Looking at Table III, it seems our approach needs more iterations to achieve consistent results

Result for E03 - Design variables

Comments:
- Best-known: 0.012665
- Our best result: 0.012665236
- Mean: 0.012714543
- Standard deviation: 6.28E-05
- Search quality: good
- Consistency: good

Result for E03 - Algorithm performance

Comments:
- Best solution found: better than the other three
- Mean result found: similar to the other three
- Standard deviation: similar to the other three

Discussion: Performance

Comments:
- Performance depends on ε and Δ
  - Larger ε and Δ make the search easier
- We use small ε and Δ
  - ε = 1.0E-9
  - Δ = 1.0E-04, 1.0E-01 and 1.0E-04 (for E01, E02 and E03 respectively)
  - Efficient
  - Reasonable cost

Conclusion
- Multi-objective constraint-handling method
- Adopts the domination concept
- Selection rules
- Perturbation for diversity
- No problem-dependent parameters
- Reasonable computation cost
- Simulation on 3 engineering design problems demonstrated capability and efficiency
- The methodology can be applied to a wider range of applications
