- (PSO) Suppose you want to use a PSO system to maximize the function
$$f(x) = \sum_{i=1}^{2} x(i)\,\sin\!\left(\sqrt{|x(i)|}\right),$$
where $-500 \le x(i) \le 500$. A graph of this function is shown in the slides of the lecture (the function's global maximum is $f(x) = 837.9658$ for $x(i) = 420.9687$, $i = 1, 2$).
Consider an illustrative example of a PSO system composed of three particles. The space of solutions is a two-dimensional real-valued space with $-500 \le x(i) \le 500$. Consider the update rule for each particle $i$:
$$v_i \leftarrow \omega\, v_i + \alpha_1 r_1 (\hat{x}_i - x_i) + \alpha_2 r_2 (\hat{x} - x_i), \qquad x_i \leftarrow x_i + v_i,$$
where $\hat{x}_i$ denotes the personal best of particle $i$ and $\hat{x}$ the social (global) best. To facilitate calculation, for this exercise we will ignore the fact that $r_1$ and $r_2$ are random numbers and fix them to $r_1 = r_2 = 0.5$, and set $\alpha_1 = \alpha_2 = 1$. Suppose the current state of the swarm is as follows.
- Position of particles: $x_1 = (400, 400)$; $x_2 = (410, 410)$; $x_3 = (415, 415)$;
- Individual best positions: $\hat{x}_1 = x_1$; $\hat{x}_2 = x_2$; $\hat{x}_3 = x_3$;
- Social best position: $\hat{x} = x_3$; Velocities: $v_1 = v_2 = v_3 = (50, 50)$.
- (a) Compute the fitness of each particle.
- (b) What would be the next position and fitness of each particle after one iteration of the PSO algorithm, when using $\omega = 2$, $\omega = 0.5$, and $\omega = 0.1$? (In case a component of a new position falls outside the range $-500 \le x(i) \le 500$, it is mapped to its closest value in the range. For instance, if the computation of the new position gives $(550, 500)$, it is set to $(500, 500)$.) A small numeric sketch of this computation is given after this exercise.
- (c) Explain the effect of the parameter $\omega$.
- (d) Give an advantage and a disadvantage of a high value of $\omega$.
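The following is a minimal sketch of the computation in parts (a) and (b), assuming the Schwefel-type fitness reconstructed above and that each particle's personal best equals its current position; the function names, the three values of $\omega$, and the boundary-clipping step follow the exercise text, everything else is illustrative rather than a prescribed implementation.

```python
import numpy as np

def fitness(x):
    # Schwefel-type objective to be maximized (assumed from the stated optimum).
    return np.sum(x * np.sin(np.sqrt(np.abs(x))))

def pso_step(x, v, p_best, g_best, omega, a1=1.0, a2=1.0, r1=0.5, r2=0.5):
    # One velocity/position update with r1, r2 fixed to 0.5 as in the exercise.
    v_new = omega * v + a1 * r1 * (p_best - x) + a2 * r2 * (g_best - x)
    x_new = np.clip(x + v_new, -500, 500)  # map out-of-range components to the boundary
    return x_new, v_new

# Swarm state from the exercise (personal bests assumed equal to current positions).
xs = [np.array([400., 400.]), np.array([410., 410.]), np.array([415., 415.])]
vs = [np.array([50., 50.]) for _ in range(3)]
p_bests = [x.copy() for x in xs]
g_best = xs[2]  # social best is particle 3

for omega in (2.0, 0.5, 0.1):
    print(f"omega = {omega}")
    for x, v, pb in zip(xs, vs, p_bests):
        x_new, _ = pso_step(x, v, pb, g_best, omega)
        print(f"  x = {x} -> {x_new}, fitness = {fitness(x_new):.4f}")
```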
- (PSO) [2 points] Consider a particle swarm consisting of a single member. How would it perform in a trivial task such as the minimization of $f(x) = x^2$ when $\omega < 1$, assuming the particle starts with its velocity pointing away from the optimum (e.g. in a state with velocity $v = 10$ and position $x = 20$)?
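A small simulation sketch for exploring this question, assuming the same update rule as above with a single particle (so the personal and social bests coincide); the value $\omega = 0.7$ and the acceleration coefficients are illustrative choices, not part of the exercise.

```python
import numpy as np

def f(x):
    return x ** 2

# Single-particle PSO on f(x) = x^2: personal best and global best are the same point.
omega, a1, a2 = 0.7, 1.0, 1.0   # omega < 1; acceleration coefficients are illustrative
x, v = 20.0, 10.0               # starts moving away from the optimum at x = 0
best = x                        # best position seen so far

rng = np.random.default_rng(0)
for step in range(20):
    r1, r2 = rng.random(2)
    v = omega * v + a1 * r1 * (best - x) + a2 * r2 * (best - x)
    x = x + v
    if f(x) < f(best):
        best = x
    print(f"step {step:2d}: x = {x:8.3f}, v = {v:8.3f}, best = {best:8.3f}")
```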
- (PSO) [2 points] Implement the PSO algorithm for clustering described in Van der Merwe, D. W., and A. P. Engelbrecht, "Data clustering using particle swarm optimization," in Proceedings of the 2003 Congress on Evolutionary Computation (CEC 2003), Vol. 1, IEEE, 2003 (see also the swarm intelligence slides). Also implement k-means clustering.
Apply and compare the performance of the two algorithms in terms of quantization error on Artificial dataset 1 and on the Iris dataset (the latter is available at the UCI ML repository, see https://archive.ics.uci.edu/ml/datasets/iris). In both algorithms, use the true number of clusters as the value of the parameter that sets the number of clusters.
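As a starting point, here is a minimal sketch of the quantization-error measure used for the comparison (the average, over clusters, of the mean distance of a cluster's points to its centroid) together with a k-means baseline on Iris via scikit-learn; the PSO clustering part and Artificial dataset 1 (defined in the paper) are not included here, and the scikit-learn usage is an assumed convenience, not a requirement of the exercise.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import load_iris

def quantization_error(data, centroids, labels):
    """Average over clusters of the mean distance of assigned points to their centroid."""
    errors = []
    for k, c in enumerate(centroids):
        members = data[labels == k]
        if len(members) > 0:
            errors.append(np.linalg.norm(members - c, axis=1).mean())
    return np.mean(errors)

# k-means baseline on Iris with the true number of clusters (3).
data = load_iris().data
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(data)
print("k-means quantization error on Iris:",
      quantization_error(data, km.cluster_centers_, km.labels_))
```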
- (ACO) Read the paper: Blum, Christian, and Marco Dorigo, "Search bias in ant colony optimization: On the role of competition-balanced systems," IEEE Transactions on Evolutionary Computation 9.2 (2005): 159-174. Figure 1 shows a (toy) problem instance of the 2-cardinality tree problem. The 2-cardinality tree problem amounts to finding a subtree T of a given undirected graph G with exactly 2 edges and the minimum possible weight.
- Is ACO for this problem a Competition-Balanced System (CBS)? Justify your answer. (A definition of CBS is also given in the slides.)
- If a combination of an ACO algorithm and a problem instance is not a CBS, is the induced bias always harmful? Justify your answer.
Figure 1: 2-cardinality problem instance
- (ACO) The figure below shows an example of an instance of a source-destination problem from the ACO book by Dorigo and Stuetzle. The goal is to reach the destination node from the source node via a shortest path through the given graph. What results do you expect for an ant colony algorithm that does not use tabu lists (except for inhibiting an immediate return to the previous node)?
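To make the setting concrete, here is a small sketch of how a single ant might construct a walk when only the immediately preceding node is excluded (no full tabu list); the example graph and the uniform choice rule are illustrative assumptions, not the instance from the book or the full pheromone-based selection rule.

```python
import random

# Illustrative undirected graph as an adjacency list (not the instance from the book).
graph = {
    "S": ["A", "B"],
    "A": ["S", "B", "D"],
    "B": ["S", "A", "D"],
    "D": ["A", "B"],
}

def construct_walk(source, destination, max_steps=50):
    """One ant's walk: choose the next node among the neighbours of the current node,
    excluding only the node visited in the previous step (no tabu list)."""
    walk = [source]
    previous = None
    while walk[-1] != destination and len(walk) <= max_steps:
        current = walk[-1]
        options = [n for n in graph[current] if n != previous] or graph[current]
        nxt = random.choice(options)
        previous, walk = current, walk + [nxt]
    return walk

random.seed(1)
for _ in range(3):
    print(construct_walk("S", "D"))  # walks may revisit nodes and form loops
```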