Temporary Work Space
Ignore this page; it's temporary working, to be integrated into other pages or deleted.
So really, everything that dominates $f(n)=n$ must grow at least as fast as $f$ itself, and in a very real sense that absolutely is the best we can do. In particular, the linear function is dominated by every one of the other functions! In other words, of all the upper bounds (in this sense), the linear function is the "smallest."
(We are using the option given to us of multiplying by a suitable constant, so that our dominating function is then always larger.)
In words, for this (hypothetical) algorithm, the time taken is less than a linear function, and the linear function is the "best" upper bound.
That is why we say that this algorithm takes linear time.
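To pin down the sense of "dominates" being used here, this is a sketch of the standard formal statement; the constant $c$ is exactly the "suitable constant" from the parenthetical above:

$$ g \text{ dominates } f \quad\iff\quad \exists\, c > 0 \ \exists\, n_0 \ \forall n \ge n_0 :\ f(n) \le c \cdot g(n). $$

For example, $g(n)=n$ dominates $f(n)=\tfrac{1}{2}n+7$: take $c=2$ and $n_0=5$, and then $c \cdot g(n) = 2n \ge \tfrac{1}{2}n+7$ for every $n \ge 5$.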
So what have we ended up with?
This can be summarised as: sketch how long we think the algorithm takes.
Look at the long-term behaviour of that function.
Then find the slowest-growing function that dominates (in the technical sense) the algorithm's behaviour.
That is the time-complexity of our analysis of that algorithm.
Usually we are less precise, less rigorous, less pedantic, and we say that it's "the time-complexity of the algorithm," because usually our analysis is "close enough".
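As a concrete (and entirely hypothetical) illustration of that recipe, here is a small Python sketch: a toy linear search instrumented with a step counter, so that we can watch the long-term behaviour of the step count as the input grows. The function name and the choice of linear search are assumptions made for illustration, not taken from the discussion above.

    def linear_search(items, target):
        """Toy linear search that also counts how many comparisons it makes."""
        steps = 0
        for item in items:
            steps += 1                # one comparison per element examined
            if item == target:
                return steps
        return steps                  # target absent: n comparisons in all

    # Worst case (target absent): the step count is exactly n, so the
    # long-term behaviour is the linear function f(n) = n.
    for n in (10, 100, 1000, 10000):
        print(n, linear_search(list(range(n)), -1))

In the worst case the count is exactly $n$, so the slowest-growing function that dominates it is the linear function, and by the recipe above this algorithm takes linear time.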
So when we say that an algorithm is $O(n^2)$, what we mean is this: there is some constant $c$ such that, for all sufficiently large $n$, the time the algorithm takes is at most $c \cdot n^2$. In other words, the function $n^2$ dominates (in the technical sense above) the time the algorithm takes.
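In symbols, writing $T(n)$ for the time taken on inputs of size $n$, and reusing the sketch of "dominates" given earlier:

$$ T(n) \in O(n^2) \quad\iff\quad \exists\, c > 0 \ \exists\, n_0 \ \forall n \ge n_0 :\ T(n) \le c \cdot n^2. $$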