Lecture Notes on Design and Analysis of Algorithms 18CS42

Basic Concepts

Algorithm

An algorithm is a finite sequence of instructions, each of which has a clear meaning and can be performed with a finite amount of effort in a finite length of time. No matter what the input values may be, an algorithm terminates after executing a finite number of instructions. In addition, every algorithm must satisfy the following criteria:

- Input: zero or more quantities are externally supplied.
- Output: at least one quantity is produced.
- Definiteness: each instruction must be clear and unambiguous.
- Finiteness: if we trace out the instructions of the algorithm, then for all cases the algorithm terminates after a finite number of steps.
- Effectiveness: every instruction must be sufficiently basic that it can, in principle, be carried out by a person using only pencil and paper. It is not enough that each operation be definite; it must also be feasible.

In formal computer science, one distinguishes between an algorithm and a program. A program does not necessarily satisfy the fourth criterion (finiteness). One important example is a computer's operating system, which never terminates (except for system crashes) but continues in a wait loop until more jobs are entered. We represent algorithms using a pseudo language that combines the constructs of a programming language with informal English statements.
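As a small illustration (not part of the original notes), the following C sketch finds the largest element of an array and shows how the five criteria apply: the array is the externally supplied input, the printed value is the output, each step is an unambiguous elementary operation (definiteness, effectiveness), and the loop executes exactly n-1 times, so the procedure always terminates (finiteness). The function name index_of_max is chosen here purely for the example.

```c
#include <stdio.h>

/* Returns the index of the largest element in a[0..n-1].
 * The loop runs exactly n-1 times, so the procedure always
 * terminates, and every step is a simple comparison or
 * assignment that could be carried out by hand. */
int index_of_max(const int a[], int n)
{
    int best = 0;                     /* index of largest value seen so far */
    for (int i = 1; i < n; i++) {
        if (a[i] > a[best]) {
            best = i;
        }
    }
    return best;
}

int main(void)
{
    int data[] = {12, 7, 25, 3, 18};          /* input: externally supplied */
    int n = sizeof data / sizeof data[0];
    printf("largest = %d\n", data[index_of_max(data, n)]);   /* output */
    return 0;
}
```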