Algorithm Design

  1. Exhaustive algorithms (brute force):

    Exhaustive (brute-force) algorithms examine every possible alternative to find the solution. An exhaustive algorithm always finds the global optimum and can recognize it as such. However, it does not scale beyond toy data sets: for most real-world problems the number of alternatives grows so quickly that brute force cannot finish within any practical time limit. A minimal sketch is given below.

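As a minimal illustration, consider brute-force search for a toy 0/1 knapsack instance; the item values, weights, and capacity below are made-up example data, and the function name is only illustrative.

```python
from itertools import combinations

def knapsack_brute_force(values, weights, capacity):
    """Examine every subset of items and return (best_value, best_subset).

    Runs in O(2^n) time, which is why brute force only works on toy inputs.
    """
    n = len(values)
    best_value, best_subset = 0, ()
    for r in range(n + 1):
        for subset in combinations(range(n), r):
            weight = sum(weights[i] for i in subset)
            value = sum(values[i] for i in subset)
            if weight <= capacity and value > best_value:
                best_value, best_subset = value, subset
    return best_value, best_subset

# Example usage with made-up data: 4 items, capacity 10.
print(knapsack_brute_force([10, 40, 30, 50], [5, 4, 6, 3], 10))  # (90, (1, 3))
```
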
  2. Branch and Bound algorithm:

    Branch and bound also explores nodes of an exponential search tree, but it investigates the most promising nodes first and prunes away worthless ones. For each node it computes an optimistic bound: the best possible score that node could lead to. If a node's optimistic bound is no better than the global pessimistic bound (the best complete solution found so far), that node is pruned away, together with the entire branch of subnodes beneath it. In this way branch and bound avoids searching through a large number of alternatives; a sketch for the same toy knapsack instance follows below.

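Continuing the toy knapsack instance, the following Python sketch uses the fractional (greedy) relaxation of the remaining items as the optimistic bound and the best complete solution found so far as the global pessimistic bound; the function names and node layout are illustrative only.

```python
def knapsack_branch_and_bound(values, weights, capacity):
    """Depth-first branch and bound for 0/1 knapsack.

    A node is (index, current_weight, current_value); its two branches
    are "take item index" and "skip item index".
    """
    # Sort items by value density so the fractional bound is valid.
    items = sorted(zip(values, weights), key=lambda vw: vw[0] / vw[1], reverse=True)

    def optimistic_bound(i, weight, value):
        # Fill the remaining capacity greedily, allowing a fractional last item.
        remaining = capacity - weight
        bound = value
        for v, w in items[i:]:
            if w <= remaining:
                remaining -= w
                bound += v
            else:
                bound += v * remaining / w
                break
        return bound

    best = 0  # global pessimistic bound: best complete solution so far

    def explore(i, weight, value):
        nonlocal best
        if i == len(items):
            best = max(best, value)
            return
        if optimistic_bound(i, weight, value) <= best:
            return  # prune: this branch cannot beat the best known solution
        if weight + items[i][1] <= capacity:           # branch 1: take item i
            explore(i + 1, weight + items[i][1], value + items[i][0])
        explore(i + 1, weight, value)                   # branch 2: skip item i

    explore(0, 0, 0)
    return best

# Same made-up data as the brute-force sketch; should also print 90.
print(knapsack_branch_and_bound([10, 40, 30, 50], [5, 4, 6, 3], 10))
```
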
  3. Divide and conquer algorithms:

    Divide and Conquer is broadly a 3-step strategy:

    1. Divide the actual problem into sub-problems (A subproblem is just a smaller instance of the same problem).

    2. Conquer i.e. recursively solve each sub-problem.

    3. Combine the solutions of the sub-problems to get the solution to the actual problem.

For example, a problem of size N might be divided into 4 sub-problems of sizes n1, n2, n3, and n4; each sub-problem is solved recursively, and the four partial solutions are then combined into a solution for the original problem.

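Merge sort is a standard illustration of this strategy: divide the list in half, conquer each half recursively, and combine by merging the two sorted halves. A minimal Python sketch:

```python
def merge_sort(a):
    """Sort a list using divide and conquer."""
    if len(a) <= 1:                      # base case: already sorted
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])           # divide + conquer left half
    right = merge_sort(a[mid:])          # divide + conquer right half
    return merge(left, right)            # combine

def merge(left, right):
    """Combine two sorted lists into one sorted list."""
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 7, 3]))  # [1, 2, 3, 5, 7, 9]
```
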
  4. Greedy algorithms: find a solution by always choosing the currently “best” alternative, without ever revisiting earlier choices. This makes greedy algorithms fast, but for many problems the locally best choice does not lead to the global optimum (see the sketch below).
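
For instance, greedy change-making always takes the largest coin that still fits and never reconsiders a choice. The denominations below are an assumed canonical (euro-style) coin system, for which the greedy choice happens to be optimal; for arbitrary denominations it can return more coins than necessary.

```python
def greedy_change(amount, denominations=(200, 100, 50, 20, 10, 5, 2, 1)):
    """Make change by always taking the largest coin that still fits."""
    coins = []
    for coin in sorted(denominations, reverse=True):
        while amount >= coin:            # greedy choice: largest coin first
            amount -= coin
            coins.append(coin)
    return coins

print(greedy_change(188))  # [100, 50, 20, 10, 5, 2, 1]
```
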
  5. Dynamic programming: The dynamic programming paradigm was formalized and popularized by Richard Bellman in the mid-1950s while working at the RAND Corporation, although he was far from the first to use the technique. Dynamic programming is not about filling in tables; it is about smart recursion.

    Dynamic programming algorithms are best developed in two distinct stages.

    1. Formulate the problem recursively: Write down a recursive formula or algorithm for the whole problem in terms of the answers to smaller subproblems. This is the hard part. A complete recursive formulation has two parts:

       (a) Specification: Describe the problem that you want to solve recursively, in coherent and precise English, not how to solve that problem, but what problem you're trying to solve. Without this specification, it is impossible, even in principle, to determine whether your solution is correct.

       (b) Solution: Give a clear recursive formula or algorithm for the whole problem in terms of the answers to smaller instances of exactly the same problem.
    2. Build solutions to your recurrence from the bottom up: Write an algorithm that starts with the base cases of your recurrence and works its way up to the final solution, by considering the intermediate subproblems in the correct order. This stage can be broken down into several smaller, relatively mechanical steps:

       (a) Identify the subproblems. What are all the different ways your recursive algorithm can call itself, starting with some initial input?

       (b) Choose a memoization data structure. Find a data structure that can store the solution to every subproblem you identified in step (a). This is usually, but not always, a multidimensional array.

       (c) Identify dependencies. Except for the base cases, every subproblem depends on other subproblems; which ones? Draw a picture of your data structure, pick a generic element, and draw arrows from each of the other elements it depends on. Then formalize your picture.

       (d) Find a good evaluation order. Order the subproblems so that each one comes after the subproblems it depends on. You should consider the base cases first, then the subproblems that depend only on base cases, and so on, eventually building up to the original top-level problem. The dependencies you identified in the previous step define a partial order over the subproblems; you need to find a linear extension of that partial order. Be careful!

       (e) Analyze space and running time. The number of distinct subproblems determines the space complexity of your memoized algorithm. To compute the total running time, add up the running times of all possible subproblems, assuming deeper recursive calls are already memoized. You can actually do this immediately after step (a).

       (f) Write down the algorithm. You know what order to consider the subproblems, and you know how to solve each subproblem. So do that! If your data structure is an array, this usually means writing a few nested for-loops around your original recurrence, and replacing the recursive calls with array look-ups. A short sketch of both stages on a simple recurrence follows below.
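
As a minimal illustration of the two stages, consider the Fibonacci recurrence. The Python sketch below gives the recursive formulation first, then a bottom-up evaluation that considers the subproblems in dependency order; the function names are illustrative only.

```python
def fib_recursive(n):
    """Stage 1 -- recursive formulation.

    Specification: Fib(n) is the n-th Fibonacci number.
    Solution:      Fib(n) = n                     if n < 2
                   Fib(n) = Fib(n-1) + Fib(n-2)   otherwise
    Correct but exponentially slow, because subproblems are recomputed.
    """
    if n < 2:
        return n
    return fib_recursive(n - 1) + fib_recursive(n - 2)

def fib_bottom_up(n):
    """Stage 2 -- build solutions to the recurrence from the bottom up.

    Subproblems:      Fib(0), ..., Fib(n)
    Data structure:   a one-dimensional array `table`
    Dependencies:     table[i] depends on table[i-1] and table[i-2]
    Evaluation order: increasing i
    Space and time:   O(n) entries, O(1) work per entry
    """
    table = [0] * (n + 1)
    if n >= 1:
        table[1] = 1
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]   # recursive call -> array look-up
    return table[n]

print(fib_recursive(10), fib_bottom_up(10))  # 55 55
```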