Complexity of Algorithms Using Big-Oh Notation
A function f(n) is Big-Oh of a function g(n), written f(n) = O(g(n)), if there exist positive constants c and n0 such that f(n) <= c * g(n) for all n > n0.
O(1000n) = O(n)
O(n^2 + 3n + 2) = O(n^2)
O(3n^3 + 6n^2 - 4n + 2) = O(3n^3) = O(n^3)
If f(n) = n^2 * log n, then f(n) = O(n^2 log n)
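As an illustration of the definition (not part of the original notes), suitable constants for the third example can be exhibited directly: for all n >= 1,

3n^3 + 6n^2 - 4n + 2 <= 3n^3 + 6n^3 + 2n^3 = 11n^3

so c = 11 and n0 = 1 satisfy f(n) <= c * g(n) with g(n) = n^3, confirming that 3n^3 + 6n^2 - 4n + 2 = O(n^3).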
Example 1
We can often analyze the running time of a program by determining the number of times selected statements are executed. We can usually get a good estimate of the running time by counting one type of statement, such as the statement in the body of a for loop.
for (i = 0; i < n; i++)
for (j = 0; j < n; j++)
cin >> A[i][j];
Number of times cin >> A[i][j]; is executed: n^2
Complexity: O(n^2)
Analysis: In a nested for loop with fixed lower and upper limits (here 0 and n), the number of times the innermost statement is executed is just n x n, so f(n) = n^2 and this program section runs in O(n^2) time.
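As a sanity check, here is a minimal sketch (not from the original notes) that replaces the input statement with a counter and confirms the n^2 count:

#include <iostream>
using namespace std;

int main()
{
    int n = 8;                        // sample size, chosen arbitrarily
    long count = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            count++;                  // stands in for: cin >> A[i][j];
    // Prints: n = 8, count = 64, n*n = 64
    cout << "n = " << n << ", count = " << count << ", n*n = " << n * n << endl;
    return 0;
}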
Example 2
for (i = 0; i < n; i++)
for (j = 0; j < i; j++)
A[i][j] = 0;
Analysis: This code zeroes out the lower triangular part of a two-dimensional array. Since the inner loop is executed a variable number of times, we can't simply multiply the two loop bounds:
When i = 0 the inner loop is executed 0 times.
When i = 1 the inner loop is executed 1 time.
When i = 2 the inner loop is executed 2 times.
...
When i = n-1 the inner loop is executed n-1 times.
So the number of times the statement "A[i][j] = 0;" is executed is: 0 + 1 + 2 + ... + (n-1)
This sum comes up a lot in algorithm analysis. You should memorize the formula:
1 + 2 + 3 + ... + n = n(n+1)/2 = 1/2(n^2 + n)
Applying the formula with n-1 in place of n gives 0 + 1 + ... + (n-1) = (n-1)n/2 = 1/2(n^2 - n). Since in Big-Oh analysis we ignore leading constants (the 1/2 in the equation above) and lower order terms (the n in the equation above), this algorithm runs in O(n^2) time.
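A quick empirical check (a sketch, not part of the original notes) confirms the closed form for small n:

#include <iostream>
using namespace std;

int main()
{
    for (int n = 1; n <= 5; n++)
    {
        long count = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < i; j++)
                count++;              // stands in for: A[i][j] = 0;
        // count equals n(n-1)/2 for every n
        cout << "n = " << n << ": count = " << count
             << ", n(n-1)/2 = " << n * (n - 1) / 2 << endl;
    }
    return 0;
}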
Example 3
Dim iSum, i, j, k As Integer
For i = 1 To n
    For j = 1 To n
        iSum = iSum + 1                ' executed 1 x n times per pass of the outer loop
    Next j
    For k = 1 To 2 * n                 ' this loop runs 2n times
        iSum = iSum + 1                '
        iSum = iSum + 1                ' executed 3 x 2n times per pass of the outer loop
        iSum = iSum + 1                '
    Next k
Next i
Number of times iSum = iSum + 1 is executed: n x (1 x n + 3 x 2n) = n^2 + 6n^2 = 7n^2
Complexity: O(n^2)
Since in Big-Oh analysis we ignore leading constants (the 7 in the equation above), this algorithm runs in O(n^2) time.
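As a quick sanity check (not in the original notes), try n = 2: the j loop contributes 1 x 2 = 2 executions per pass of the outer loop and the k loop contributes 3 x 4 = 12, so the total is 2 x (2 + 12) = 28 = 7 x 2^2, matching 7n^2.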
Example 4
Dim iSum, i, j, k, m As Integer
For i = 1 To n
    For j = 1 To n
        iSum = iSum + 1                ' executed 1 x n times per pass of the outer loop
    Next j
    For k = 1 To 2 * n                 ' this loop runs 2n times
        iSum = iSum + 1                ' executed once per pass of the k loop
        For m = 1 To 4 * n Step 2      ' this loop runs 2n times (step 2 over 4n values)
            iSum = iSum + 1            ' executed 1 x 2n times per pass of the k loop
        Next m
    Next k
Next i
Number of times executed: n x (1 x n + 2n x (1 + 1 x 2n))
= n (n + 2n (1 + 2n))
= n (n + 2n + 4n^2)
= n^2 + 2n^2 + 4n^3
= 3n^2 + 4n^3
Complexity: O(n^3)
Since in Big-Oh analysis we ignore leading constants (the 3 and 4 in the equation above) and lower order terms (the n^2 in the equation above), this algorithm runs in O(n^3) time.
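A C++ translation with a counter (a sketch, not from the original notes; note how the Step 2 loop becomes m += 2) can be used to verify the 4n^3 + 3n^2 total:

#include <iostream>
using namespace std;

int main()
{
    int n = 5;                                 // sample size, chosen arbitrarily
    long count = 0;
    for (int i = 1; i <= n; i++)
    {
        for (int j = 1; j <= n; j++)
            count++;                           // the j-loop statement
        for (int k = 1; k <= 2 * n; k++)
        {
            count++;                           // the k-loop statement
            for (int m = 1; m <= 4 * n; m += 2)
                count++;                       // the m-loop statement
        }
    }
    // count should equal 4n^3 + 3n^2 (575 when n = 5)
    cout << "count = " << count << ", 4n^3 + 3n^2 = "
         << 4 * n * n * n + 3 * n * n << endl;
    return 0;
}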
Example 5
int main()
{
    int i, tofind, A[100], n;
    cin >> n;                          // number of elements (assume n <= 100)
    for (i = 0; i < n; i++)
        cin >> A[i];                   // read the array
    cin >> tofind;
    i = 0;
    while ((i < n) && (A[i] != tofind))   // test i < n first so A[i] is never read out of bounds
        i++;
    if (i >= n) cout << "not found";
    else cout << "found";
    return 0;
}
Analysis: This code contains a comparison, (A[i] != tofind), and we count comparisons to determine the running time. This means the running time is essentially the number of times the while loop is executed, which depends on the input values and on tofind, neither of which we know in advance. So we must consider the worst possible case. In the worst case the value of tofind is not found and the while loop is executed n times. Therefore the worst-case running time is O(n).
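The worst case can also be observed directly (a minimal sketch, not from the original notes; the array values are arbitrary) by counting comparisons while searching for a value that is absent:

#include <iostream>
using namespace std;

int main()
{
    const int n = 6;
    int A[n] = {4, 8, 15, 16, 23, 42};   // arbitrary sample data
    int tofind = 99;                     // absent from the array: the worst case
    int i = 0, comparisons = 0;
    while (i < n)
    {
        comparisons++;                   // one element comparison per iteration
        if (A[i] == tofind) break;
        i++;
    }
    // Prints: comparisons = 6 (= n), not found
    cout << "comparisons = " << comparisons << ", "
         << (i < n ? "found" : "not found") << endl;
    return 0;
}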
Example 6
All the previous examples have been actual
code. You should be able to analyze pseudo-code too, although this can be
tricky because pseudo-code is more vague. In pseudo-code the last example above
would be:
search the array starting from location 0
comparing each element to the search value until it is either found
or the end of the array is reached.
Analysis: The worst case occurs when the search value is not found and is compared to every element of the array. If there are n elements, this takes O(n) time.
Example 7
Some algorithms have logarithmic
(base 2) complexity:
1. i = n; // i starts from n
2. while (i >= 1)
3. {
4. x = x + 1; // count this line
5. i = i / 2; // i becomes half at the end of every iteration
6. }
iteration | value of i    | number of times
1         | n             | 1
2         | n/2^1         | 1
3         | n/2^2         | 1
...       | ...           | ...
k         | n/2^(k-1) = 1 | 1
k+1       | n/2^k = 0     | 0
We are interested in what k is (because that's the number of times line 4 is executed). In other words,

T(n) = 1 + 1 + 1 + ... + 1 = k x 1 = k
       |<----- k of them ----->|

To derive k, we look at the relation at the last (kth) iteration: n/2^(k-1) = 1, so 2^(k-1) = n, which gives k - 1 = log2 n, i.e. k = log2 n + 1.

So T(n) = log2 n + 1 (where log2 n is alternatively written as log n), and this algorithm runs in O(log n) time.
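The count can be checked empirically (a sketch, not from the original notes) by instrumenting the halving loop:

#include <iostream>
#include <cmath>
using namespace std;

int main()
{
    for (int n = 1; n <= 64; n *= 2)
    {
        int k = 0;
        for (int i = n; i >= 1; i /= 2)
            k++;                       // counts executions of line 4
        // k should equal log2(n) + 1 when n is a power of two
        cout << "n = " << n << ": k = " << k
             << ", log2(n) + 1 = " << (int)log2((double)n) + 1 << endl;
    }
    return 0;
}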
Common asymptotic growth functions (complexity classes) in ascending order:

T(n)    | Name
1       | constant
log n   | logarithmic
n       | linear
n log n | n log n
n^2     | quadratic
n^3     | cubic
n^m     | (general) polynomial (m is a fixed, non-negative integer; e.g. n^4, n^5)
m^n     | exponential (m >= 2; e.g. 2^n, 3^n)
n!      | factorial
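To see how quickly these classes separate, here is a small sketch (not from the original notes) that prints several of the growth functions for a few doubling values of n (factorial is omitted because it overflows almost immediately):

#include <iostream>
#include <cmath>
using namespace std;

int main()
{
    cout << "n\tlog n\tn log n\tn^2\tn^3\t2^n" << endl;
    for (int n = 2; n <= 32; n *= 2)
    {
        cout << n << "\t"
             << log2((double)n) << "\t"
             << n * log2((double)n) << "\t"
             << n * n << "\t"
             << (long long)n * n * n << "\t"
             << (1LL << n) << endl;      // 2^n computed with a bit shift
    }
    return 0;
}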