Please use this identifier to cite or link to this item:
http://localhost:8081/xmlui/handle/123456789/6992
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Pant, Millie | - |
dc.date.accessioned | 2014-11-05T04:22:57Z | - |
dc.date.available | 2014-11-05T04:22:57Z | - |
dc.date.issued | 2003 | - |
dc.identifier | Ph.D | en_US |
dc.identifier.uri | http://hdl.handle.net/123456789/6992 | - |
dc.guide | Deep, Kusum | - |
dc.description.abstract | Optimization problems arise in various disciplines such as engineering design, agricultural sciences, manufacturing systems, economics, physical sciences, pattern recognition, etc. In fact, newly developed computational techniques are being used in every sphere of human activity where decisions are to be taken in some complex situation that can be represented by a mathematical model. In view of their practical utility, there is a need for efficient and robust computational algorithms which can numerically solve, on computers, mathematical models of medium- as well as large-size optimization problems arising in different fields.

Mathematical models of many real-life problems turn out to be nonlinear in nature. Such problems often have local as well as global optimal solutions. The global optimal solution is usually more difficult to obtain than a local optimal solution, but in many cases it is advantageous, and sometimes even necessary, to search for it. In order to determine the global optimal solution of a nonlinear optimization problem, two approaches are usually followed: (1) the deterministic approach and (2) the probabilistic approach. Deterministic methods make extensive use of analytical properties such as continuity, convexity and differentiability of the objective and the constraints to locate a neighborhood of the global optimum. It is now established that deterministic methods are best suited to restricted classes of functions, namely convex functions, one-dimensional rational or Lipschitz functions, polynomials, etc., whose mathematical properties can be easily determined and utilized at specified points or in specified intervals. The stochastic methods utilize randomness in an efficient way to explore the set over which the objective function is to be optimized. Contrary to general expectation, stochastic methods perform well on most of the realistic problems on which they have been tried. Stochastic or probabilistic methods generally employ two phases. In the first phase, also called the global phase, the function is evaluated at a number of randomly sampled points. In the second phase, also called the local phase, these points are manipulated by local searches to yield a possible candidate for the global optimum (a minimal sketch of this two-phase idea is given below). Although probabilistic methods do not give an absolute guarantee of determining the global minimum, they are sometimes preferred over deterministic methods because they are applicable to a wider class of functions: they depend on function evaluations alone and do not assume any mathematical properties of the functions being used.

Evolutionary Computation is a relatively new, biologically motivated optimization and search paradigm; it may also be considered an optimization metaheuristic. It represents a powerful, rapidly growing field of Artificial Intelligence known as intelligent computation. Evolutionary Computation uses the biological metaphors of natural genetics and natural selection to evolve a population of candidate solutions to a given problem. The main classes of evolutionary algorithms are Genetic Algorithms, Evolution Strategies, Evolutionary Programming, Genetic Programming and Learning Classifier Systems. The main characteristic of evolutionary algorithms is the intensive use of randomness and genetics-inspired operations to evolve a set of candidate solutions.
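The two-phase stochastic scheme mentioned above can be illustrated with a minimal sketch. The toy objective (a sphere function), the box bounds [-10, 10]^n, and the shrinking random perturbation used as the local search are all illustrative placeholders, not the methods developed in the thesis.

```cpp
// Minimal sketch of the two-phase stochastic scheme: a global phase of
// random sampling followed by a local phase of refinement. The objective,
// bounds, and local search are assumed for illustration only.
#include <cstdio>
#include <cstdlib>
#include <vector>

double f(const std::vector<double>& x) {        // hypothetical objective
    double s = 0.0;
    for (double xi : x) s += xi * xi;           // simple sphere function
    return s;
}

int main() {
    const int n = 5, samples = 200;
    std::vector<double> best(n), x(n);
    double fbest = 1e300;

    // Global phase: evaluate f at a number of randomly sampled points.
    for (int i = 0; i < samples; ++i) {
        for (int j = 0; j < n; ++j)
            x[j] = -10.0 + 20.0 * std::rand() / RAND_MAX;   // bounds [-10,10]
        double fx = f(x);
        if (fx < fbest) { fbest = fx; best = x; }
    }

    // Local phase: refine the best sample with shrinking perturbations.
    for (double step = 1.0; step > 1e-6; step *= 0.5) {
        for (int t = 0; t < 50; ++t) {
            x = best;
            for (int j = 0; j < n; ++j)
                x[j] += step * (2.0 * std::rand() / RAND_MAX - 1.0);
            double fx = f(x);
            if (fx < fbest) { fbest = fx; best = x; }
        }
    }
    std::printf("approximate global minimum value: %g\n", fbest);
    return 0;
}
```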
A population of candidate solutions is generated using search operations inspired by biology, such as recombination, mutation and selection. The candidate solutions to a particular problem are referred to as chromosomes or individuals. The roots of evolutionary algorithms lie in evolution-inspired algorithms developed in the 1950s. In the present work the concept of Genetic Algorithms has been used. Genetic Algorithms represent the main paradigm of Evolutionary Computation. A population of chromosomes or individuals encoding potential solutions to a given problem is considered. In the traditional GA model, the chromosomes are bit strings of a fixed length and each chromosome represents a point in the search space. The search process is carried out by modifications in the search space with the help of GA operators. The most commonly used operators are reproduction, encoding, crossover and mutation.

The five objectives of this thesis are: (1) to design an efficient and reliable computational technique for obtaining the global optimal solution of constrained/unconstrained nonlinear optimization problems; (2) to develop algorithms for special classes of nonlinear optimization problems, i.e. Integer and Mixed Integer Programming Problems, Fractional Programming Problems, Geometric Programming Problems, Large Scale Problems and Multi-Objective Programming Problems; (3) to develop a parallel implementation of the algorithm so as to run on the PARAM 10000 parallel computer installed at I.I.T. Roorkee; (4) to test the algorithms on test problems appearing in the literature; and (5) to apply the algorithms to real-life problems arising in science and engineering.

The RST2 algorithm developed by C. Mohan and K. Shanker (Deep) (1994) for obtaining the global optimal solution of nonlinear optimization problems has been chosen as the basis of the present work because of its efficiency and robustness. The RST2 algorithm works in two phases. In the first phase, random feasible solutions are generated and the objective function is evaluated at them. In the second phase, these solutions are manipulated using quadratic approximation to yield a possible candidate for the global optimum. The procedure stops when all the solutions cluster around the global optimum.

In this thesis, the concept of the RST2 algorithm has been merged with the concept of Genetic Algorithms to develop new algorithms with a view to increasing efficiency and reliability. In the first instance, three versions, namely GEN1, GEN2 and GRST, have been proposed for obtaining the global optimal solution of nonlinear unconstrained optimization problems. GEN1 and GEN2 are methods based on the genetic operators, including a fitness-function criterion. The proposed algorithms GEN1 and GEN2 have only two phases: the Global Phase and the Genetic Algorithm Phase. In GEN1 the first phase is the Global Phase, where a random set of feasible solutions is generated. The objective function is evaluated at each of these solutions and stored in an array A. On this array A, reproduction is performed using the Roulette Wheel Selection method (a sketch of these operators is given below). On the reproduced solutions, binary encoding is performed and the crossover operator is applied, after which a single-point mutation is performed. The only difference between GEN1 and GEN2 lies in the Genetic Algorithm Phase.
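As a rough illustration of the operators just named, the following sketch shows roulette wheel selection, single-point crossover on binary-encoded chromosomes, and single-point mutation. The function names and the treatment of fitness are assumptions made for illustration; this is not the thesis code.

```cpp
// Illustrative GA operators: roulette wheel selection, single-point
// crossover on binary strings, and single-point mutation. Fitness is
// assumed non-negative and maximization-style (a simplifying assumption).
#include <cstdlib>
#include <utility>
#include <vector>

// Pick a population index with probability proportional to its fitness.
int rouletteSelect(const std::vector<double>& fitness) {
    double total = 0.0;
    for (double fi : fitness) total += fi;
    double spin = total * std::rand() / RAND_MAX;   // spin the wheel
    double cum = 0.0;
    for (std::size_t i = 0; i < fitness.size(); ++i) {
        cum += fitness[i];
        if (spin <= cum) return static_cast<int>(i);
    }
    return static_cast<int>(fitness.size()) - 1;    // guard against rounding
}

// Single-point crossover: swap the tails of two binary chromosomes.
void crossover(std::vector<int>& a, std::vector<int>& b) {
    std::size_t point = std::rand() % a.size();
    for (std::size_t i = point; i < a.size(); ++i) std::swap(a[i], b[i]);
}

// Single-point mutation: flip one randomly chosen bit.
void mutate(std::vector<int>& chrom) {
    chrom[std::rand() % chrom.size()] ^= 1;
}
```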
In GEN2, mutation is performed immediately after crossover, whereas in GEN1, if the candidate with the best function value after crossover is better than the candidate with the highest function value after reproduction, the latter is replaced by the former; otherwise the algorithm enters the Genetic Algorithm Phase. After that, mutation is performed. In the third version, GRST, the concept of Genetic Algorithms has been merged with that of the Controlled Random Search using the quadratic approximation approach (the RST2 algorithm) of Mohan and Shanker (1994). GRST works in three phases. In the first phase the objective function is evaluated at a number of randomly generated feasible solutions, while in the second phase, at each iteration, these solutions are manipulated by local searches (using quadratic approximation; a sketch of this move is given below) to yield a possible candidate for the global optimum. If in an iteration no satisfactory approximation is obtained, then instead of abandoning that iteration the algorithm enters the third phase, where the Genetic Algorithm operators, namely reproduction, crossover and mutation, are activated in the hope of finding a better approximation. The Genetic Random Search operators used in GRST are reproduction, encoding and crossover. Flow charts of all three proposed versions have been presented.

The three versions have been tested on a set of 44 cases of 20 unconstrained test problems taken from the literature, whose global optimal solutions also appear in the literature. Due to the probabilistic nature of these algorithms, each problem has been run 50 times. A run is considered a success if the obtained result lies within 1% of the known global optimal solution, and the percentage of success has been calculated as the ratio of successful runs to the total number of runs. The average number of function evaluations and the number of times the Genetic Algorithm phase was activated have also been reported. Based on a suitable performance index, a comparison has been made between GEN1, GEN2, GRST and RST2. The numerical and graphical results indicate that GRST outperforms the remaining versions in the unconstrained cases.

Since the GRST version emerged as the best, its use has been extended to solve constrained problems as well. The performance of GRST vis-a-vis RST2 has been reported on a set of 15 test problems using a performance index. Based on the numerical and graphical results, it is concluded that GRST outperforms RST2 in the constrained cases as well. In addition to constrained and unconstrained test problems, GRST has been applied to solve Integer Programming Problems, Fractional Programming Problems, Geometric Programming Problems, etc. GRST has also been used to solve Large Scale Programming Problems. Due to the hardware restrictions of the Pentium IV, for the large-scale problems GRST has been implemented on the parallel computer PARAM 10000 and has been coded in C as well as C++. The algorithm has been tested on 32 cases of 12 test problems in which the number of variables varies from 10 to 100. In order to visualize the effect of an increase in the number of variables, a sensitivity analysis of GRST has also been performed, analyzing the effect of the size of the initial array. The initial array of random feasible solutions has been taken as 2(n+1), 5(n+1) and 10(n+1), where n denotes the number of variables of the optimization problem. The numerical and graphical results are reported.
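The quadratic-approximation move of the local phase can be sketched as follows: a parabola is fitted through three points and its minimum is taken as the trial point. The coordinate-wise form and the degeneracy tolerance below are assumptions based on the description above, not necessarily the exact RST2/GRST formula.

```cpp
// Sketch of a three-point quadratic interpolation move, applied
// coordinate-wise. Returns false if the fit degenerates (the case in
// which GRST, as described, falls back to its Genetic Algorithm phase).
#include <cmath>
#include <cstddef>
#include <vector>

bool quadraticTrial(const std::vector<double>& r1, double f1,
                    const std::vector<double>& r2, double f2,
                    const std::vector<double>& r3, double f3,
                    std::vector<double>& trial) {
    const std::size_t n = r1.size();
    trial.resize(n);
    for (std::size_t j = 0; j < n; ++j) {
        // Minimum of the parabola through (r1[j],f1), (r2[j],f2), (r3[j],f3).
        double num = (r2[j]*r2[j] - r3[j]*r3[j]) * f1
                   + (r3[j]*r3[j] - r1[j]*r1[j]) * f2
                   + (r1[j]*r1[j] - r2[j]*r2[j]) * f3;
        double den = (r2[j] - r3[j]) * f1
                   + (r3[j] - r1[j]) * f2
                   + (r1[j] - r2[j]) * f3;
        if (std::fabs(den) < 1e-12) return false;  // degenerate fit
        trial[j] = 0.5 * num / den;
    }
    return true;
}
```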
It is concluded that 2(n+1) is too small a size for the initial array; for large-scale problems the size of the initial array can be taken as 5(n+1), while otherwise it should be 10(n+1) to ensure convergence.

A Multi-Objective Programming Problem is a problem with more than one conflicting objective. Since no single solution can be termed optimal in the presence of multiple conflicting objectives, such problems give rise to a number of trade-off optimal solutions, or Pareto optimal solutions. Conventional optimization methods can find only one optimal solution in one run and are thus not suitable for solving Multi-Objective Programming Problems. GRST can find multiple optimal solutions in a single run due to its population-based approach. GRST has been used to obtain the Pareto optimal solutions of 10 Multi-Objective Programming Problems, and the Pareto optimal front has been plotted in each case.

The GRST algorithm is probabilistic in nature and depends upon the generation of the initial random feasible solutions. Based on this random sample, the solutions are manipulated to obtain a possible candidate for the global optimal solution. This depends on the seed used for generating random numbers on the computer, so no two runs are expected to reach exactly the same solution; hence more than one run of GRST is required to obtain the global optimal solution. One way of addressing this difficulty is to make a parallel version of GRST, so that a copy of GRST is placed on each processor and each processor in turn produces its own set of random feasible solutions based on an independently selected seed for random number generation. With this objective in mind, the GRST algorithm has been implemented on the parallel computer PARAM 10000 installed at the Indian Institute of Technology Roorkee. The parallel version of GRST has been named P-GRST. MPI (Message Passing Interface) has been used to program the algorithm, with the MPI library routines accessed from the C and C++ programs (a sketch of this structure is given below). Since a copy of GRST is placed on each processor, each processor selects a seed based on its rank and consequently generates its initial array of random feasible points. At the end of the execution, the P-GRST algorithm displays all the global minima obtained by the different processors, and the user is free to select the best of the obtained solutions. It may be noted that the structure of P-GRST is such that it does not require any inter-process communication, which is a big hurdle in parallelization; as a result, there is no delay in the execution of the program due to inter-process communication.

The performance of the P-GRST algorithm has been reported on 21 cases of 5 test problems appearing in the literature. The numerical and graphical results clearly indicate that all the processors converge to the same global optimum although they start from different random samples of feasible solutions. The rate of convergence plotted against the iteration number also indicates that P-GRST is a natural parallelization of GRST: it is the best alternative to running GRST a number of times, and it does not require any inter-processor communication. It may be concluded that P-GRST is well suited to large-scale problems as well.
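The parallel structure just described (an independent copy of the search on each processor, seeded by the processor's rank, with no inter-process communication) might be sketched as follows. Here runSearch() is a stand-in for a full GRST run, and the seeding formula is an assumption for illustration.

```cpp
// Sketch of the P-GRST structure: each MPI process runs one independent
// copy of the search, seeded by its rank; no communication is needed
// during the search, and each process reports its own minimum.
#include <mpi.h>
#include <cstdio>
#include <cstdlib>

double runSearch() {                // placeholder for one full GRST run
    double best = 1e300;
    for (int i = 0; i < 1000; ++i) {
        double x = -10.0 + 20.0 * std::rand() / RAND_MAX;
        if (x * x < best) best = x * x;
    }
    return best;
}

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    std::srand(12345u + 1000u * rank);  // seed depends on processor rank
    double result = runSearch();        // independent run, no communication

    std::printf("processor %d of %d: minimum found = %g\n",
                rank, size, result);
    MPI_Finalize();
    return 0;
}
```

Launched with, for example, `mpirun -np 4 ./pgrst`, each process prints the minimum it found and the user selects the best, mirroring the behavior described above.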
The algorithms developed in this thesis, namely GRST, I-GRST and P-GRST, have been used to solve a variety of real-life problems arising in fields such as civil engineering, architecture and system reliability. These problems are single- as well as multi-objective optimization problems; most are nonlinear, and some have integer restrictions imposed on them. In all the cases the results obtained by the algorithms developed in this thesis were comparable to, and sometimes even better than, the quoted results. The thesis ends with the conclusions drawn on the basis of this work; some suggestions for further work in this direction are also presented. | en_US |
dc.language.iso | en | en_US |
dc.subject | MATHEMATICS | en_US |
dc.subject | GENETIC ALGORITHMS | en_US |
dc.subject | GLOBAL OPTIMIZATION | en_US |
dc.subject | GENETIC RANDOM SEARCH OPERATORS | en_US |
dc.title | GENETIC ALGORITHMS FOR GLOBAL OPTIMIZATION AND THEIR APPLICATIONS | en_US |
dc.type | Doctoral Thesis | en_US |
dc.accession.number | G11508 | en_US |
Appears in Collections: | DOCTORAL THESES (Maths) |
Files in This Item:
File | Description | Size | Format
---|---|---|---
TH MTD G11508.pdf | Restricted Access | 11.54 MB | Adobe PDF