The time complexity of an algorithm
In computer science, the time complexity of an algorithm quantifies the amount of time taken by an algorithm to run as a function of the length of the string representing the input.[1]:226 The time complexity of an algorithm is commonly expressed using big O notation, which excludes coefficients and lower-order terms. When expressed this way, the time complexity is said to be described asymptotically, i.e., as the input size goes to infinity. For example, if the time required by an algorithm on all inputs of size n is at most 5n³ + 3n, the asymptotic time complexity is O(n³).
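To make that example concrete, the short Python sketch below (illustrative only, not part of the original text) prints the ratio (5n³ + 3n) / n³ for growing n. The ratio settles toward the constant 5, which is why the coefficient and the lower-order term are dropped and the complexity is written simply as O(n³).

    # The ratio (5*n**3 + 3*n) / n**3 approaches the constant 5 as n grows,
    # so asymptotically only the n**3 term matters.
    for n in [10, 100, 1000, 10_000]:
        exact = 5 * n**3 + 3 * n
        dominant = n**3
        print(n, exact / dominant)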
Time complexity is
commonly estimated by counting the number of elementary operations performed by
the algorithm, where an elementary operation takes a fixed amount of time to
perform. Thus the amount of time taken and the number of elementary operations
performed by the algorithm differ by at most a constant factor.
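As a minimal sketch of this operation-counting view (the function name and setup below are assumptions chosen for illustration, not taken from the source), the Python snippet counts one elementary operation per element while summing a list; the count grows in direct proportion to the input size n, so the running time differs from it by at most a constant factor.

    # Count elementary operations while summing a list: each addition is
    # treated as one fixed-cost operation.
    def summed_with_count(values):
        operations = 0
        total = 0
        for v in values:
            total += v        # one elementary operation per element
            operations += 1
        return total, operations

    for n in [10, 100, 1000]:
        _, ops = summed_with_count(list(range(n)))
        print(n, ops)         # ops == n, a linear number of operations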
Since an algorithm's
performance time may vary with different inputs of the same size, one commonly
uses the worst-case time complexity of an algorithm, denoted as T(n), which is
defined as the maximum amount of time taken on any input of size n. Time
complexities are classified by the nature of the function T(n). For instance,
an algorithm with T(n) = O(n) is called a linear time algorithm, and an
algorithm with T(n) = O(2ⁿ) is said to be an exponential time algorithm.
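The following Python sketch (purely illustrative; the functions are assumptions, not taken from the source) contrasts the two classes: a linear search whose worst case examines all n elements, so T(n) = O(n), and a naive subset enumeration that produces 2ⁿ subsets, so its running time grows exponentially in n.

    # Worst-case linear time, T(n) = O(n): scan every element once.
    def linear_search(items, target):
        for i, x in enumerate(items):      # at most n comparisons
            if x == target:
                return i
        return -1

    # Exponential time: the number of subsets doubles with each element,
    # so the output alone has size 2**n.
    def all_subsets(items):
        if not items:
            return [[]]
        rest = all_subsets(items[1:])
        return rest + [[items[0]] + s for s in rest]

    print(linear_search([4, 8, 15, 16], 16))   # worst case touches all n items
    print(len(all_subsets([1, 2, 3, 4])))      # 2**4 == 16 subsets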