ProAlgos / ProAlgos-Cpp

C++ implementations of well-known (and some rare) algorithms, while following good software development practices
MIT License

Mention time and space complexities in comments #8

Closed: faheel closed this issue 7 years ago

faheel commented 7 years ago

For each algorithm, mention its time complexity and space complexity in the "description comment" of its implementation program.

The format for the "description comment" (which is written at the beginning) should be:

/*
    <Name of algorithm>
    -------------------
    <Brief description>

    Time complexity
    ---------------
    O(...), where <description of variable(s)>    

    Space complexity
    ----------------
    O(...), where <description of variable(s)>
*/

The following are the algorithms for which time and space complexities haven't been added yet:

ilayaraja97 commented 7 years ago

I think it's a good idea to put this in the README or the coding guidelines.

faheel commented 7 years ago

Yes. Thanks for the suggestion!

Anamika28 commented 7 years ago
Name of algorithm: Binary Search
Binary search is a searching algorithm that uses the divide-and-conquer technique to find a given element. A prerequisite for applying binary search is that the elements must be in sorted order.

Time complexity:
O(nlogn), where n is the number of input elements    

Space complexity:
O(n), where n is the number of input elements
faheel commented 7 years ago

Time complexity of binary search is O(log N) and its space complexity is O(1)
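A minimal iterative sketch matching those bounds (function name and signature are my own, not the repo's):

```cpp
#include <vector>

// Iterative binary search: O(log N) time, O(1) extra space.
// Returns the index of `target` in the sorted vector, or -1 if absent.
int binary_search(const std::vector<int>& sorted, int target) {
    int left = 0;
    int right = static_cast<int>(sorted.size()) - 1;
    while (left <= right) {
        int mid = left + (right - left) / 2;   // avoids overflow of left + right
        if (sorted[mid] == target) {
            return mid;
        } else if (sorted[mid] < target) {
            left = mid + 1;    // discard the left half
        } else {
            right = mid - 1;   // discard the right half
        }
    }
    return -1;
}
```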

abhishekabk5 commented 7 years ago

Name of algorithm: Bubble Sort
Bubble sort is a simple sorting algorithm that repeatedly steps through the list to be sorted, compares each pair of adjacent items, and swaps them if they are in the wrong order.

Time Complexity: O(n^2), where n is the number of elements in the list.

Space Complexity: O(1), temporary space for swapping.

abhishekabk5 commented 7 years ago

Name of algorithm: Insertion Sort
Insertion sort is a simple sorting algorithm that builds the final sorted array (or list) one item at a time. It is less suitable for large lists.

Time Complexity: O(n^2), where n is the number of elements in the list.

Space Complexity: O(1), temporary space for swapping.

abhishekabk5 commented 7 years ago

Name of algorithm: Selection Sort
The selection sort algorithm sorts an array by repeatedly finding the minimum element (considering ascending order) from the unsorted part and putting it in its right place.

Time Complexity: O(n^2), where n is the number of elements in the list.

Space Complexity: O(1), temporary space for swapping.

abhishekabk5 commented 7 years ago

Name of algorithm: Merge Sort
Merge sort is a divide-and-conquer algorithm. It divides the input array into two halves, recursively calls itself for the two halves, and then merges the two sorted halves.

Time Complexity: O(n*logn), where n is the number of elements in the list.

Space Complexity: O(n), where n is the number of elements in the list.

abhishekabk5 commented 7 years ago

Name of algorithm: Quick Sort
Like merge sort, quick sort is a divide-and-conquer algorithm. It picks an element as the pivot, partitions the given array around the picked pivot, and then recursively calls itself on those partitions. For large data sets, it tends to be the fastest sorting algorithm in practice.

Time Complexity: worst case O(n^2), average case O(n log n), where n is the number of elements in the list.

Space Complexity: O(1)

faheel commented 7 years ago

@abhishekabk5 For sorting programs, the complexities have already been added.

But your descriptions of the algorithms are much better :+1:. Would you mind sending a PR for those? (It would be a good opportunity to get into open source!)

NOTE: For quick sort, the worst-case time complexity would also be O(N * log(N)) because this variant of quick sort uses random pivot selection which makes the worst-case complexity equal to the average-case complexity which is O(N * log(N)).
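The random pivot selection being discussed could look like this (a sketch using a Lomuto-style partition; the names and the use of `std::mt19937` are my own, not necessarily how the repo implements it):

```cpp
#include <vector>
#include <random>
#include <utility>

// Randomised quick sort: pick the pivot uniformly at random before
// partitioning, so the expected running time is O(n log n) on any input.
void quick_sort(std::vector<int>& v, int low, int high, std::mt19937& rng) {
    if (low >= high) return;
    std::uniform_int_distribution<int> pick(low, high);
    std::swap(v[pick(rng)], v[high]);   // random pivot moved to the end
    int pivot = v[high];
    int i = low;                        // boundary of the "< pivot" region
    for (int j = low; j < high; j++) {
        if (v[j] < pivot) {
            std::swap(v[i++], v[j]);
        }
    }
    std::swap(v[i], v[high]);           // place pivot between the partitions
    quick_sort(v, low, i - 1, rng);
    quick_sort(v, i + 1, high, rng);
}
```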

abhishekabk5 commented 7 years ago

Sure, I'll get to it.

abhishekabk5 commented 7 years ago

@faheel About quick sort: I don't believe random pivot selection technically helps the worst-case time complexity, since an unsorted array can already be considered random. So the worst-case time complexity would still be O(n^2). Tell me if I'm wrong.

Also, in that case, random selection is redundant. It should be removed.

tb25 commented 7 years ago

Binary Search

Divide and conquer algorithm - Given an array of presorted integers: A(0), A(1), A(2), ... , A(n-1) and a target value X, find and return i such that X = A(i), or return -1 if X is not in the list.

Solution:

Best case: θ(1), when X happens to be the middle element right away.

Worst/average case: T(N) = θ(log(N)), where T(N) is the time complexity and θ(f(N)) denotes the tight asymptotic bound.

Space complexity

Recursive: O(log(N)), where log(N) is the depth of the recursion tree, i.e. how many recursive calls are on the stack at once.

Iterative: O(1) space, since there are no recursive stack frames.

faheel commented 7 years ago

@abhishekabk5 The randomised version of quick sort selects a pivot randomly for a given range of indices, and then partitions the elements around this randomly selected pivot. Therefore, the expected worst-case time complexity comes out to be O(N * log(N)) (see §3.4 in this document for a detailed proof).

The absolute worst-case time complexity, even for the randomised version, would still be O(N^2). But as the number of elements increases, the probability that this case arises decreases (exponentially, I guess), and the algorithm "expectedly" runs in log-linear time rather than quadratic time.

So I think that mentioning both the expected and the absolute worst-case complexity for quick sort (or for any other randomised algorithm) would be the better thing.

faheel commented 7 years ago

@tb25 The overhead space of the call stack is not considered while calculating the space complexity of recursive functions. Moreover, the implementation of binary search here uses tail recursion which eliminates the stack overhead (if the compiler optimises for it).
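A sketch of the tail-recursive form being described: the recursive call is the function's last action, so an optimising compiler may reuse the same stack frame (names and signature are my own):

```cpp
#include <vector>

// Tail-recursive binary search on sorted[left..right]: each recursive call
// is in tail position, so it can be compiled down to a loop (O(1) space).
int binary_search(const std::vector<int>& sorted, int target,
                  int left, int right) {
    if (left > right) {
        return -1;                     // base case: empty range
    }
    int mid = left + (right - left) / 2;
    if (sorted[mid] == target) {
        return mid;
    } else if (sorted[mid] < target) {
        return binary_search(sorted, target, mid + 1, right);  // tail call
    } else {
        return binary_search(sorted, target, left, mid - 1);   // tail call
    }
}
```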

tb25 commented 7 years ago

@faheel so you're saying that the space complexity of the recursive algorithm is O(N) or O(1)?

tb25 commented 7 years ago

Binary Search

Divide and conquer algorithm - Given an array of presorted integers: A(0), A(1), A(2), ... , A(n-1) and a target value X, find and return i such that X = A(i), or return -1 if X is not in the list.

Solution:

recursive: check if the middle value is X and return its index if it is; else if the middle value is greater than X, search the left half of the list, thus halving the search space; otherwise search the right half. Since the function is recursive, the search stops once X is found or the base case (an empty range) is reached.

iterative: use a loop and conditionals with the same logic: while (left <= right) { mid = (left + right) / 2; if X... }

Time complexity

Best case: θ(1), when X happens to be the middle element right away.

Worst/average case: T(N) = θ(log(N)), where T(N) is the time complexity and θ(f(N)) denotes the tight asymptotic bound.

Space complexity

O(1), since only a constant number of variables (such as the target X and the index bounds) are created.