In short, time-complexity analysis is the abstract, mathematical approach to determining the running time of an algorithm as its input size changes. There are a number of categories that algorithms can fall under, depending on their aforementioned running times: Big-Oh, Big-Omega, and Big-Theta [among others not covered in this course].
Each of these notations tells us something about an algorithm's growth rate (there is a small code sketch after the list):
- Big-Oh. Example: f ∈ O(g). This, "function f is in Big-Oh of g," means that function f grows no faster than function g. In other words, g is an upper bound on f.
- Big-Omega. Example: f ∈ Ω(g). This, "function f is in Big-Omega of g," means that function f grows no slower than function g. In other words, g is a lower bound on f.
- Big-Theta. Example: f ∈ Θ(g). This, "function f is in Big-Theta of g," means that function f grows as fast as function g; that is, f is in both O(g) and Ω(g).
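To make this concrete, here is a minimal Python sketch (my own illustration, not anything from the course) that numerically spot-checks the definitions for f(n) = 3n + 5 and g(n) = n. The constants c and n0 are assumptions I picked by hand:

```python
# A small experiment: f(n) = 3n + 5 should be in Theta(n).
def f(n):
    return 3 * n + 5  # example running-time function

def g(n):
    return n          # comparison function

# Big-Oh witness: 3n + 5 <= 4n for all n >= 5, so take c = 4, n0 = 5.
c_upper, n0 = 4, 5
# Big-Omega witness: 3n + 5 >= 3n for all n >= 1, so take c = 3.
c_lower = 3

# Verify that f is sandwiched between c_lower*g and c_upper*g.
for n in range(n0, 10000):
    assert c_lower * g(n) <= f(n) <= c_upper * g(n)

print("f lies between 3*g and 4*g for n >= 5, so f is in Theta(g).")
```

Of course, the loop only checks finitely many values of n; the real argument is the algebra in the comments, which holds for every n ≥ n0.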
This is definitely fascinating: not only can we devise an estimate of a program's efficiency, we can also compare different programs against each other to find the one with the best performance. Computer science fascinates me!