Late Night Big O Revelations

Big O complexity describes the relationship between the length of the input and the number of “steps” it takes to complete some operation.

I am observing a function, with no actual numbers. If there is any multiplication or division by constants that don’t depend on the input, delete them. Just delete them. O(1), which is constant time, means the number of steps (y) relates to n, the size of the input, as a horizontal line!

y = 1 (* whatever) is a horizontal line, right!?
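A toy step counter makes this concrete (just my own sketch, not any standard library thing): no matter how long the list is, the function does the same fixed amount of work.

```python
def constant_steps(xs):
    """Do a fixed amount of work regardless of len(xs)."""
    steps = 0
    _ = xs[0]    # index lookup: 1 step
    steps += 1
    _ = xs[-1]   # another lookup: 1 step
    steps += 1
    _ = len(xs)  # length check: 1 step
    steps += 1
    return steps  # always 3 steps, and O(3) = O(1)
```

Three steps, three thousand steps, whatever — as long as it doesn’t depend on n, the constant gets deleted and the line stays horizontal.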

O(n) means a straight, diagonal line, like a linear function. Hence, linear time. y = n.
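Same toy counter, linear version (again just an illustration of mine): one step per element, so the step count is exactly n.

```python
def linear_steps(xs):
    """One step per element: the step count grows with len(xs)."""
    steps = 0
    for _ in xs:  # touch each element once
        steps += 1
    return steps  # exactly n steps: O(n)
```

Double the input, double the steps — a straight diagonal line.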

( I should mention this is all worst case — the actual time an operation takes can be less than any of these lines. )

O(n²) means a polynomial curve! y = n²!
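The classic way to land on n² is a loop inside a loop — here’s the same step-counting sketch for that:

```python
def quadratic_steps(n):
    """Nested loops: for each of n outer passes, n inner steps."""
    steps = 0
    for i in range(n):
        for j in range(n):  # inner loop runs n times per outer pass
            steps += 1
    return steps  # n * n steps: O(n^2)
```

At n = 20 that’s already 400 steps, which is why the curve takes off so fast.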

Now… logarithmic…

O(log n)… the base doesn’t matter. So y = log n, say base 10: if n goes up like 10, 20, 30, then y is 1, 1.something, 1.somethingbigger. Because log n (base 10) means y is the number of 10s that must be multiplied together to create n.

And the curve flattens out like that because the input (n) is not growing exponentially — for the number of bases (y) to grow linearly, the result (n) would have to grow exponentially. It doesn’t, so y lags behind and becomes the curve you see.
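You can see the lag with a halving counter (my own sketch, base 2 this time — the shape is the same in any base, which is why the base doesn’t matter):

```python
def log_steps(n):
    """Count how many times n can be halved before reaching 1."""
    steps = 0
    while n > 1:
        n //= 2     # each step cuts the input in half, like binary search
        steps += 1
    return steps    # roughly log2(n) steps: O(log n)
```

To earn one more step, n has to double — the input must grow exponentially just to move y linearly. That’s the lag.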

 