Saturday, November 1, 2014

A Big Family: Big-Oh, Big-Omega, Big-Theta

I was surprised during our lecture this week by how much I enjoyed analyzing different run-times.  I had a very brief introduction to related ideas in a previous course; however, I never cared much for the densely theoretical approach that was emphasized.  Run-time analysis, I now understand, requires a meticulous, calculating nature, which makes analyzing different algorithms an enjoyable, deeply stimulating experience.

In short, time-complexity analysis is the abstract, mathematical approach to determining the running time of an algorithm as its input size changes. There are a number of categories that algorithms can fall under, depending on their aforementioned running times: Big-Oh, Big-Omega, and Big-Theta [among others not covered in this course].
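To make that concrete, here is a minimal Python sketch (my own toy example, not something from lecture) that counts the steps a simple linear search performs as its input grows; in the worst case the count grows in direct proportion to the input size n, which is exactly the kind of growth these categories describe:

    def linear_search_steps(lst, target):
        """Return the number of loop iterations a linear search performs."""
        steps = 0
        for item in lst:
            steps += 1
            if item == target:
                break
        return steps

    # Worst case (target absent): the step count grows linearly with n.
    for n in (10, 100, 1000):
        print(n, linear_search_steps(list(range(n)), -1))
    # Prints: 10 10, then 100 100, then 1000 1000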

Each of these has certain properties that tell us about the growth rate of an algorithm.
  1. Big-Oh.  Example: f ∈ O(g).  This, "function f is in Big-Oh of g," means that function f grows no faster than function g.  In other words, g is an upper bound on f (see the quick check after this list).
  2. Big-Omega.  Example: f ∈ Ω(g).  This, "function f is in Big-Omega of g," means that function f grows no slower than function g.  In other words, g is a lower bound on f.
  3. Big-Theta.  Example: f ∈ Θ(g).  This, "function f is in Big-Theta of g," means that function f grows as fast as function g; that is, g is both an upper and a lower bound on f.
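Formally, f ∈ O(g) means there exist a constant c > 0 and a breakpoint B such that f(n) ≤ c · g(n) for all n ≥ B.  As a quick sanity check in Python, here is a toy f(n) = 5n + 3 tested against g(n) = n; the witnesses c = 6 and B = 3 are my own choices (since 5n + 3 ≤ 6n whenever n ≥ 3):

    def f(n):
        return 5 * n + 3

    def g(n):
        return n

    c, B = 6, 3  # chosen witnesses: 5n + 3 <= 6n whenever n >= 3
    print(all(f(n) <= c * g(n) for n in range(B, 10000)))
    # True -- consistent with f being in O(g)

Of course, checking finitely many values of n is only evidence, not a proof; the real argument is the algebra that 5n + 3 ≤ 6n once n ≥ 3.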
This is definitely fascinating: not only can an estimate of a program's efficiency be devised, but different programs can also be compared and contrasted to find the one with the best performance.  Computer science fascinates me!
