Deterministic time complexity class
Definition
Given a model of computation for a single function
We consider here a model of computation in which the machines are deterministic Turing machines (possibly subject to certain constraints) and the inputs are strings over a fixed alphabet. The goal of the Turing machine is to determine whether a given string is in a particular language or not.
Suppose $f$ is an (eventually) increasing function from positive integers to positive integers. The deterministic time complexity class corresponding to $f$, often denoted $\mathsf{DTIME}(f(n))$, is defined as the collection of all languages for which there exists a deterministic Turing machine that, for all sufficiently large $n$, determines in at most $f(n)$ steps whether any given string of length $n$ is in the language or not.
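A symbolic restatement of the definition (a sketch only; the notation $\mathsf{DTIME}$, the machine name $M$, and the step-count function $\mathrm{time}_M$ are notational assumptions rather than notation fixed by the text above):

$$\mathsf{DTIME}(f(n)) = \{ L \mid \exists \text{ a deterministic Turing machine } M \text{ deciding } L \text{ with } \mathrm{time}_M(x) \le f(|x|) \text{ for all sufficiently long inputs } x \}$$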
There are three key aspects to note:
- The bound $f$ is a function of the length $n$ of the input string, but the Turing machine itself is independent of the length of the string: a single machine must work for inputs of all lengths.
- The time complexity is the worst-case complexity over all strings of length $n$ (made precise in the formula after this list).
- We are interested in the behavior for sufficiently large $n$ only.
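The second and third points can be combined as follows (again, $\mathrm{time}_M$ is assumed notation): writing

$$\mathrm{time}_M(n) = \max_{x \,:\, |x| = n} \bigl(\text{number of steps taken by } M \text{ on input } x\bigr),$$

the requirement is that $\mathrm{time}_M(n) \le f(n)$ for all sufficiently large $n$.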
Given a model of computation for a family of functions
Fill this in later
General definition
In general, deterministic time complexity classes for single functions are ill-defined, because the smallest number of steps that can be taken by an algorithm depends heavily on the model of computation. Roughly speaking, changing the specifics of the model of computation can change the running time by a multiplicative factor that is roughly polynomial in the length of the string. Thus, in practice, we consider deterministic time complexity classes for families of functions with the property that multiplying a function in the family by a polynomial with positive leading coefficient gives a function that is still bounded from above by some function in the family.
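For example, the family of all polynomials with positive leading coefficient has this closure property, and the resulting class is the familiar polynomial-time class; this is a standard illustration, not spelled out in the text above:

$$\mathsf{P} = \bigcup_{k \ge 1} \mathsf{DTIME}(n^k)$$

Multiplying $n^k$ by a polynomial of degree $d$ with positive leading coefficient gives a function bounded above by $n^{k+d+1}$ for sufficiently large $n$, so polynomial overheads introduced by switching between reasonable models of computation do not change the class.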