Find next higher number using exact same digits

Problem: Given a number X, find the number Y that is the next higher number than X using exactly the same digits as in X. Example: given 38276, return 38627. Solution Method 1 - Brute force: generate all numbers formed by permuting the digits, consider those that are higher than the given number, and return the minimum among them. [Read More]
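The excerpt only describes the brute-force method; a minimal sketch of the standard linear-time "next permutation" approach (not necessarily the one the full post develops) looks like this:

```python
def next_higher(x: int) -> int:
    """Return the next higher number with the same digits, or -1 if none exists."""
    digits = list(str(x))
    # Scan from the right for the first digit smaller than its successor.
    i = len(digits) - 2
    while i >= 0 and digits[i] >= digits[i + 1]:
        i -= 1
    if i < 0:
        return -1  # digits are in descending order; no higher permutation
    # Find the smallest digit to the right of i that is larger than digits[i].
    j = len(digits) - 1
    while digits[j] <= digits[i]:
        j -= 1
    digits[i], digits[j] = digits[j], digits[i]
    # Reverse the suffix so the remainder is as small as possible.
    digits[i + 1:] = reversed(digits[i + 1:])
    return int("".join(digits))
```

For the example above, `next_higher(38276)` yields 38627.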

Maximum value in a Sliding Window

Problem Statement Question: A long array A[] is given to you. There is a sliding window of size w, which is moving from the very left of the array to the very right. You can only see the w numbers in the window. Each time, the sliding window moves rightwards by one position. Example: The array is [1 3 -1 -3 5 3 6 7], and w is 3.

    Window position                Max
    ---------------               -----
    [1  3  -1] -3  5  3  6  7      3
     1 [3  -1  -3] 5  3  6  7      3
     1  3 [-1  -3  5] 3  6  7      5
     1  3  -1 [-3  5  3] 6  7      5
     1  3  -1  -3 [5  3  6] 7      6
     1  3  -1  -3  5 [3  6  7]     7

Input: A long array A[], and a window width w [Read More]
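The excerpt stops at the problem statement; a common O(n) solution (a sketch, not necessarily the one the full post uses) keeps a deque of candidate indices whose values are strictly decreasing:

```python
from collections import deque

def sliding_window_max(a, w):
    """Max of each window of size w in O(n), using a deque of indices."""
    dq = deque()  # indices of candidates; their values are decreasing
    out = []
    for i, x in enumerate(a):
        # Drop the front index once it falls out of the window.
        if dq and dq[0] <= i - w:
            dq.popleft()
        # Drop smaller values from the back; they can never be a future max.
        while dq and a[dq[-1]] <= x:
            dq.pop()
        dq.append(i)
        if i >= w - 1:
            out.append(a[dq[0]])
    return out
```

On the example above, `sliding_window_max([1, 3, -1, -3, 5, 3, 6, 7], 3)` reproduces the Max column: `[3, 3, 5, 5, 6, 7]`.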

Closest Pair Algorithm

Here we will discuss the closest pair algorithm. Input: a set P = {p1, p2, …, pn} of n points in the plane (R²). Notation: d(pi, pj) = Euclidean distance between pi and pj. Assumption: all points have distinct x-coordinates and distinct y-coordinates. Brute force method: say you are given a set of points in the plane and you want to find the two points closest to one another. [Read More]
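The brute-force method the excerpt introduces can be sketched in a few lines: compare every pair and keep the closest, which is O(n²). (The function name and point representation here are assumptions for illustration.)

```python
from itertools import combinations
from math import dist  # Euclidean distance between two points (Python 3.8+)

def closest_pair_brute(points):
    """O(n^2) brute force: examine every pair, return the closest one."""
    pair = min(combinations(points, 2), key=lambda pq: dist(*pq))
    return pair, dist(*pair)
```

For example, `closest_pair_brute([(0, 0), (3, 4), (1, 1)])` returns the pair (0, 0), (1, 1) at distance √2. The divide-and-conquer version the post builds toward improves this to O(n log n).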

Randomized Selection : Selecting the i'th smallest element

Input: array A with n distinct numbers and a number i ∈ {1, 2, …, n}. Output: the i-th order statistic, i.e. the i-th smallest element of the input array. O(n log n) algorithm: apply mergesort/quicksort, then return the i-th element of the sorted array. This is a reduction to sorting, and it is the naive approach: “Hey, let’s sort it and then we can just pick out the i-th one! [Read More]
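Beyond the naive sort-then-index reduction, the randomized-selection idea the title refers to can be sketched as quickselect with a random pivot, giving expected O(n) time (a sketch assuming distinct elements, as the input specification states):

```python
import random

def rselect(a, i):
    """Return the i-th smallest (1-indexed) element of a; expected O(n)."""
    if len(a) == 1:
        return a[0]
    pivot = random.choice(a)
    less = [x for x in a if x < pivot]      # elements below the pivot
    greater = [x for x in a if x > pivot]   # elements above the pivot
    if i <= len(less):
        return rselect(less, i)             # answer lies in the left part
    if i > len(a) - len(greater):
        # Answer lies in the right part; shift the rank accordingly.
        return rselect(greater, i - (len(a) - len(greater)))
    return pivot                            # the pivot is the i-th smallest
```

For example, `rselect([5, 2, 9, 1, 7], 3)` returns 5, the 3rd smallest element.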

Meaning behind Master Method

After establishing the running time of an algorithm in terms of the three variables a, b, d, we might wonder what the meaning behind this method is. Where do these logs and exponents come from? If we look at the amount of work done at a single level j of the recursion tree, we get that it is a^j · c·(n/b^j)^d, where a^j is the number of level-j subproblems and c·(n/b^j)^d is the amount of work per level-j subproblem. [Read More]
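Spelled out (a standard derivation consistent with the variables a, b, d above, where c is the constant hidden in the combine step), the level-j work simplifies to a geometric term, and summing over all levels of the tree gives:

$$\text{work at level } j \;=\; a^j \cdot c\left(\frac{n}{b^j}\right)^d \;=\; c\,n^d\left(\frac{a}{b^d}\right)^j$$

$$\text{total work} \;\le\; c\,n^d \sum_{j=0}^{\log_b n} \left(\frac{a}{b^d}\right)^j$$

The three cases of the Master method then correspond to the ratio a/b^d being equal to, less than, or greater than 1, i.e. to whether the geometric sum is flat, shrinking, or dominated by the leaves.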

Master Method for Solving Recurrences

Why the Master method? The Master method helps us analyse recursive algorithms and determine their running time. So, let’s consider the integer multiplication - here. Now we get equation 1. The way we reason about recursive algorithms like these is with the help of recurrences. Recurrences What is a recurrence? T(n) = the maximum number of operations this algorithm needs to multiply two n-digit numbers. A recurrence expresses T(n) in terms of its recursive calls. [Read More]
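For recurrences of the standard form T(n) = a·T(n/b) + O(n^d), the three cases of the Master method can be sketched as a tiny classifier (the function name is an assumption for illustration; the post's "equation 1" for naive integer multiplication corresponds to a = 4, b = 2, d = 1):

```python
import math

def master_method(a, b, d):
    """Classify T(n) = a*T(n/b) + O(n^d) into the three Master-method cases."""
    if a == b ** d:
        return f"O(n^{d} log n)"        # work is balanced across levels
    if a < b ** d:
        return f"O(n^{d})"              # root dominates
    return f"O(n^{math.log(a, b):.3g})"  # leaves dominate: O(n^log_b a)
```

For instance, `master_method(4, 2, 1)` gives O(n²) for the naive recursive multiplication, while MergeSort's recurrence, `master_method(2, 2, 1)`, gives O(n¹ log n).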

Strassen's Subcubic Matrix Multiplication Algorithm

A Review of Matrix Multiplication Given two n×n matrices X, Y and their product matrix Z, X·Y = Z, we know that Z_ij = (i-th row of X) · (j-th column of Y). So, we take a row from X and a column from Y: z_ij = ∑ x_ik · y_kj, where 1 ≤ k ≤ n. For e.g.: Let X = a b and Y = e f [Read More]
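The definition z_ij = ∑ x_ik·y_kj translates directly into the naive O(n³) algorithm that Strassen's method improves upon (a sketch; the full post presents Strassen's seven-product scheme):

```python
def matmul(X, Y):
    """Naive O(n^3) matrix product: z_ij = sum over k of x_ik * y_kj."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]
```

For the 2×2 case in the excerpt, `matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]])` gives `[[19, 22], [43, 50]]`. Strassen's insight is that the 2×2 block product can be done with 7 multiplications instead of 8, which yields the subcubic O(n^2.81) bound.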

Inversions

This was taught in Algorithms: Design and Analysis Part I on Coursera - how to count inversions. What is an inversion? An inversion is a pair of objects that are out of order. If we are given an array and our goal is to sort from low to high, an inversion is a pair of items a_i and a_j such that a_i > a_j but i < j. We can see that they are misplaced. [Read More]
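The standard O(n log n) way to count inversions, and likely what the full post covers, piggybacks on merge sort: whenever an element from the right half is merged before remaining elements of the left half, each of those remaining elements forms one inversion with it. A sketch:

```python
def count_inversions(a):
    """Return (sorted copy of a, number of inversions), in O(n log n)."""
    if len(a) <= 1:
        return list(a), 0
    mid = len(a) // 2
    left, x = count_inversions(a[:mid])
    right, y = count_inversions(a[mid:])
    merged, z = [], 0
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            # right[j] jumps ahead of all remaining left elements:
            # each of those pairs is an inversion.
            merged.append(right[j]); j += 1
            z += len(left) - i
    merged += left[i:] + right[j:]
    return merged, x + y + z
```

For example, `count_inversions([2, 4, 1, 3, 5])` reports 3 inversions: (2,1), (4,1), and (4,3).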

Time complexity - Omega of algorithms

a) if statement - Ω(n)
b) for loop - Ω(n)
c) for loop within a for loop - Ω(n·n), i.e. Ω(n²)
d) for loop within an if statement, where the if statement is itself within a for loop - Ω(n + n·n)
e) while loop - Ω(n)
f) MergeSort - Ω(n log n) [Read More]

Time complexity of algorithms - In particular Big Omega of algorithms

Big O, the asymptotic upper bound, of algorithms built from sequential statements, if statements and looping constructs. (i) Sequential statements: the Big O of the algorithm equals the sum of the time complexities of the individual statements, i.e. O[g(n)] = O[f1(n) + f2(n) + … + fk(n)]. Conclusion: the overall complexity of this algorithm equates to O[g(n)]. (ii) If statements: the Big O of the algorithm equals the maximum of the time complexities of the alternative branches, i. [Read More]