Asymptotic Notation

It's been a while since I blogged. To start with, I am posting the basics of asymptotic analysis of algorithms, which we study at the start of the course. Here we are going to be less precise and worry only about approximate answers for large inputs. The Big-Oh Notation. Definition: Let f(n) and g(n) be real-valued functions of a single non-negative integer argument. We write f(n) is O(g(n)) if there is a positive real number c and a positive integer n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀. [Read More]
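
To make the definition concrete, here is a minimal sketch in Python (the function name and the constants are my own, chosen for illustration) that numerically checks whether a candidate pair (c, n₀) witnesses f(n) being O(g(n)) over a sampled range:

    # Check the Big-Oh condition f(n) <= c*g(n) for every sampled n >= n0.
    # Sampling cannot prove the bound for all n; it only illustrates it.
    def witnesses_big_oh(f, g, c, n0, n_max=10_000):
        return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

    # Example: f(n) = 3n + 10 is O(n), witnessed by c = 4 and n0 = 10,
    # since 3n + 10 <= 4n exactly when n >= 10.
    f = lambda n: 3 * n + 10
    g = lambda n: n

    print(witnesses_big_oh(f, g, c=4, n0=10))  # True
    print(witnesses_big_oh(f, g, c=3, n0=10))  # False: 3n + 10 > 3n for every n

Note that a True result only shows the inequality holds on the sampled range; the definition itself quantifies over all n ≥ n₀.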

Growth Rate - The Importance of Asymptotics

It really is true that if (the running time of) algorithm A is o(the running time of algorithm B), then for large problems A will take much less time than B. Definition: If (the number of operations in) algorithm A is o(algorithm B), we call A asymptotically faster than B. Example: The following sequence of functions is ordered by growth rate, i.e., each function is little-oh of the subsequent one: log(log(n)), log(n), (log(n))^2, n^(1/3), n^(1/2), n, n·log(n), n^2/log(n), n^2, n^3, 2^n, 4^n, n! [Read More]
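
To see this ordering in action, the following sketch (again Python; I picked a handful of the functions from the list above) tabulates their values as n grows, so the later functions visibly dwarf the earlier ones:

    import math

    # A few functions from the growth-rate list, slowest to fastest.
    funcs = [
        ("log(log(n))", lambda n: math.log(math.log(n))),
        ("log(n)",      lambda n: math.log(n)),
        ("n^(1/2)",     lambda n: n ** 0.5),
        ("n",           lambda n: float(n)),
        ("n*log(n)",    lambda n: n * math.log(n)),
        ("n^2",         lambda n: float(n ** 2)),
        ("2^n",         lambda n: float(2 ** n)),
    ]

    for n in (4, 16, 64, 256):
        row = "  ".join(f"{name}={fn(n):.3g}" for name, fn in funcs)
        print(f"n={n}: {row}")

Already at n = 256, 2^n is around 10^77 while n·log(n) is still below 1500, which is the whole point of asymptotic comparison.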

Algorithm Analysis

In computer science, the analysis of algorithms is the determination of the amount of resources (such as time and storage) necessary to execute them. Most algorithms are designed to work with inputs of arbitrary length. Usually, the efficiency or running time of an algorithm is stated as a function relating the input length to the number of steps (time complexity) or storage locations (space complexity). Multiple algorithms can exist for solving the same problem; for example, the sorting problem has many algorithms, such as insertion sort, selection sort, quick sort, and many more. [Read More]
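
To tie running time to input length concretely, here is a minimal sketch (Python, with instrumentation of my own) that counts the element comparisons insertion sort performs on random inputs; doubling the input length roughly quadruples the count, matching its quadratic time complexity:

    import random

    # Sort a copy of `a` with insertion sort, returning the number of element
    # comparisons performed (a simple proxy for the step count / running time).
    def insertion_sort_comparisons(a):
        a = list(a)
        comparisons = 0
        for i in range(1, len(a)):
            key = a[i]
            j = i - 1
            while j >= 0:
                comparisons += 1       # compare key against a[j]
                if a[j] <= key:
                    break
                a[j + 1] = a[j]        # shift the larger element right
                j -= 1
            a[j + 1] = key
        return comparisons

    random.seed(0)
    for n in (100, 200, 400, 800):
        data = [random.random() for _ in range(n)]
        print(n, insertion_sort_comparisons(data))
    # Expect roughly n^2/4 comparisons on random input, so each doubling
    # of n multiplies the count by about 4.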