Submitted by moazkhan on Friday, August 22, 2014 - 10:50.

While writing the previous tutorials, I found it difficult to explain various functions because I could not yet talk in terms of efficiency and order. So I have decided to cover this topic earlier.
In this tutorial, you will learn
1. What efficiency means.
2. Why time is more important than space.
3. The meaning of growth rate.
4. Different orders of growth.
**What is efficiency of a program?**
The efficiency of a program is measured by the time and memory (space) it needs to execute. In other words, when we measure the efficiency of a program, we study the time and space it requires. A single problem can be solved in a variety of ways; e.g. an array can be sorted by several different algorithms. Efficiency analysis comes in handy when we are comparing different solutions to the same problem. Efficiency analysis should be independent of the following factors:
1. The specific language of coding.
2. Computer architecture and processor speed.
3. Data.
The first two are easy to grasp, but the third factor needs some explanation. Basically, the amount of data should not be a factor: if some algorithm uses few resources for small inputs but many resources for large inputs, we still want a generic measure of efficiency that applies to any amount of data. In fact, efficiency analysis is mostly applied to large inputs, where resources actually become a concern, so we usually assume the amount of data is large. You will see later how this is achieved.
**Why is time more important than space?**
Modern computers have plenty of RAM and hard disk space, and even though processors are also very fast, we still give more importance to time. If I am playing a game on a desktop, I first want the game to run without getting stuck; the space it requires is a secondary concern. Similarly, in a telephone exchange where a huge number of calls are made and data is processed, time is the primary concern. I read an interesting illustration of this: scientists have estimated that fully computing the various transformations of a single DNA chain for a single protein would require a processor on the order of 1 THz (tera means 10^12, so 1 THz is 10^12 Hz). Computation time on that scale is a real issue for such code, and for many other programs too, while space is rarely the bottleneck nowadays.
**What does growth rate mean?**
Growth rate is the measurement of an algorithm's time requirement as a function of the problem size N. The problem size is sometimes called the amount of data, and how it is measured depends on the data structure you are using, i.e. on the application: for a linked list it is the number of nodes, and for an array it is the size of the array. Suppose there are two codes with orders N^2 and 5*N. The better option is 5*N because its growth rate is lower. Again, efficiency analysis is done for big problems, so N will be larger than 5; then N^2 will be greater than 5*N, and we would prefer the solution with growth rate 5*N.
**What are some orders of growth rate?**
Given below are some of the main orders of growth rate.
1. Constant O(1)
2. Linear O(N)
3. Quadratic O(N^2)
4. Logarithmic O(log(N))
5. Exponential O(a^N)
Constant growth rate means that the time requirement of a certain task does not depend on the problem size: the computer takes the same time to perform that task no matter how much data is present. E.g. accessing an element of an array does not depend on the amount of data; no matter what the size of the array is, it requires unit time.
Linear growth rate means that the time required for the computation is directly proportional to the problem size or amount of data. E.g. traversing a linked list depends on the amount of data, and in fact is directly proportional to it. Basically, single-loop operations are linear.
Quadratic growth rate means that if the data/problem size is doubled, the time required becomes four times as large. E.g. a loop nested inside another loop is considered to have quadratic growth rate. Note: I said a nested loop. If there are two separate loops one after the other, the order is still linear, because the constant before O(N) is ignored; more on this later.
Logarithmic growth means that the time requirement increases quickly at the start, but then the rate of increase becomes quite slow. It mostly arises in divide and conquer algorithms like binary search. In binary search, each step divides the remaining data in half, so much less computation is required. It works like searching a dictionary: you open it at a pivot point, and if the word you are looking for comes before that pivot alphabetically, only the first half is searched (otherwise the second half). You can search for binary search on Google for further information and a C++ implementation.
Exponential growth, O(a^N), refers to very rapid growth. Thus, it should be avoided in our programs.
Other important growth rates include cubic O(N^3) and O(N*log(N)).
Note: The next tutorial will discuss how to find the growth rate of an algorithm, so do read it to grasp the concept of efficiency analysis.

