Theory of Computing
Lecture 21
MAS 714
Hartmut Klauck
Time Hierarchy
• We briefly return to complexity theory
• We believe that NP is larger than P
• Now we prove that EXP is larger than P
• Why now?
• The idea of the proof is similar to
undecidability proofs / diagonalization
Time Hierarchy
• Informal statement:
– If we increase the computing time enough, more
languages can be decided
– DTIME(f(n)) is larger than DTIME(g(n)) as long as
f(n) is suitably larger than g(n)
• Problem: in this generality the statement is
false
• Fix: f(n) and g(n) need to be 'nice' bounds
Constructible bounds
• What is nice? We should be able to compute f(n), so we can
use a counter from 1…f(n)
• Definition: A function f: ℕ → ℕ is called time-constructible
if f(n) can be computed by a TM from input 1^n in time O(f(n))
• Example: everything reasonable, e.g. polynomial and
exponential bounds etc. Usually first compute n in binary
• Example: given 1^n we can compute n in binary in time
O(n log n) and then n², taking time O(n log n) in total
• Example: given 1^n we can compute 2^n in time O(n)
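As a rough illustration (ordinary Python, not a TM; the function names are invented for this sketch), the two ingredients of a clock are converting the unary input 1^n to binary and counting down a step budget:

```python
# Sketch only: illustrates why time-constructibility matters.
# A clock machine first converts 1^n to binary, then counts down f(n).

def unary_to_binary(tape: str) -> str:
    """Convert 1^n to the binary representation of n.
    A TM can do this in O(n log n) steps by repeated halving."""
    assert set(tape) <= {"1"}
    return bin(len(tape))[2:]

def clock(n: int, f) -> int:
    """Count down a budget of f(n) steps using a binary counter
    of O(log f(n)) bits; returns how many steps were counted."""
    budget = f(n)
    steps = 0
    while budget > 0:
        budget -= 1      # one counter decrement per simulated step
        steps += 1
    return steps

print(unary_to_binary("1111"))    # "100"
print(clock(4, lambda n: n * n))  # 16
```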
Time Hierarchy
• Theorem:
– Let f(n),g(n)>n be time-constructible functions
with f(n) = Ω(g²(n) · log² n)
– Then there is a language L such that
L is in DTIME(f(n))
L is not in DTIME(g(n))
Proof
• L = {⟨M⟩, x : M accepts x in at most g(|x|)·log|x| steps}
– M is a 1-tape TM
• We first need to show that L is in DTIME(f(n))
– On a 1-tape TM
– We give a 2-tape machine that runs in time O(g(|x|) log |x|),
then there is a 1-tape machine that runs in time O(f(n))
• old exercise
– 2-tape machine: tape 1 simulates M on x using the universal TM
– Tape 2: first construct g(|x|) [time O(g(n))]
– Then use tape 2 to count the steps of M
– If more than g(|x|)·log|x| steps are used: reject
– Accept if M accepts within g(|x|)·log|x| steps
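The clocked simulation can be sketched as follows. Here `step` and `is_accepting` stand in for the universal TM's single-step simulation; they are assumptions of this sketch, not part of the lecture's formal construction:

```python
import math

# Sketch of the clocked simulation: run the machine step by step
# (tape 1) while decrementing a binary counter (tape 2); reject
# when the budget of g(|x|) * log|x| steps is exhausted.
def clocked_accept(step, is_accepting, config, g, n):
    budget = g(n) * max(1, math.ceil(math.log2(max(2, n))))
    while budget > 0:
        if is_accepting(config):
            return True          # M accepted within the budget
        config = step(config)    # tape 1: one simulated step of M
        budget -= 1              # tape 2: decrement the counter
    return False                 # budget exhausted: reject

# Toy "machine": its configuration is a counter; it accepts at 3.
print(clocked_accept(lambda c: c + 1, lambda c: c == 3, 0,
                     lambda n: n, 8))    # True: 3 steps fit in 8*3 = 24
print(clocked_accept(lambda c: c + 1, lambda c: c == 100, 0,
                     lambda n: n, 8))    # False: times out before 100
```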
Proof
• Show that L not in DTIME(g(n))
• Assume A decides L in time O(g(n))
• Intuition: time O(g(n)) is not enough to
simulate machines for g(|x|)·log|x| steps
• Construct a new machine N that on input x simulates A on
⟨x, x⟩ and rejects if A accepts ⟨x, x⟩, and vice versa
• N is O(g(n)) time bounded [use universal 1-tape TM]
• What happens on input x=<N> ?
– If N accepts ⟨N⟩ in time at most g(|x|)·log|x|, then A says
accept, and hence N will reject. This takes time O(g(|x|))
– If N does not accept ⟨N⟩ in time g(|x|)·log|x|, then A says
reject, and N will accept. This takes time O(g(|x|))
• Contradiction for large enough |x|
Note
• Tighter time hierarchies are known: it is
enough to increase the time bound by an
ω(log g(n)) factor
– Idea: simulate the 2-tape machine by keeping the
counter on the first tape, and moving it in each
step together with the head movement
– Counter needs O(log g(n)) bits
• For the nondeterministic time hierarchy the
log-factor is not needed
Corollary
• DTIME(n^k) is a proper subset of DTIME(n^(2k+1))
for all constants k
• P is a proper subset of DTIME(n^(log n))
– log n grows faster than any constant
• P is a proper subset of EXP
• Improved hierarchy: DTIME(n^k) is a proper
subset of DTIME(n^(k+ε))
Space Complexity
• We list a few results
• NP ⊆ PSPACE ⊆ EXP
• PSPACE = NPSPACE
• More specifically, NSPACE(f(n)) ⊆ DSPACE(f²(n))
– Savitch’s theorem
• Def: co-NSPACE(f(n)) = {L : the complement of L is in NSPACE(f(n))}
• co-NSPACE(f(n))=NSPACE(f(n))
– Immerman–Szelepcsényi
• Space hierarchy: like the time hierarchy, but
no log-factor is needed
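The midpoint recursion behind Savitch's theorem can be sketched like this. The toy integer "configurations" and the hardcoded configuration space {0, …, 7} are invented for illustration:

```python
# Sketch of Savitch's idea: s-t reachability in at most 2^k steps,
# decided by recursing on a midpoint. Recursion depth is k and each
# level stores one configuration, so with configurations of size
# f(n) the total space is O(f^2(n)).
def reach(edge, u, v, k):
    """Can u reach v in at most 2**k steps?"""
    if k == 0:
        return u == v or edge(u, v)
    # Try every possible midpoint m deterministically (the toy
    # configuration space here is {0, ..., 7}).
    return any(reach(edge, u, m, k - 1) and reach(edge, m, v, k - 1)
               for m in range(8))

edge = lambda u, v: v == u + 1   # toy graph: path 0 -> 1 -> ... -> 7
print(reach(edge, 0, 5, 3))      # True: 5 steps <= 2^3
print(reach(edge, 5, 0, 3))      # False: no backward edges
```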
Provably intractable
• Problems that are complete for EXP can
provably not be computed in polynomial time
• Problems that are complete for EXPSPACE can
provably not be computed with polynomial
space
• Completeness: polynomial time reductions
• Proof:
– If the complete problem is in P, then EXP = P (resp.
EXPSPACE = PSPACE), contradicting the hierarchy theorem
P vs. NP
• Why can we not prove that P is not equal to
NP using diagonalization?
• The diagonalization technique relativizes, i.e.,
it applies to oracle TMs
• There are oracles such that relative to them
P=NP (and oracles where the opposite is the
case)
Last Part of the Course
• Turing machines with constant space
• I.e., machines that cannot write on the tape
– Any additional constant space can be moved into the
internal state
• Definition: A two-way deterministic finite
automaton is a Turing machine that cannot write
on its tape and cannot leave the section where
the input is
• Definition: A deterministic finite automaton
(DFA) is a two-way DFA whose head only moves
from left to right on the input tape
Finite Automata
• Definition: A DFA is a 5-tuple (Q, Σ, δ, q0, F)
– Q: states
– Σ: alphabet
– δ: Q × Σ → Q: transition function
– q0: starting state
– F ⊆ Q: accepting states
• No need to specify halting states: the machine
halts after reading the last input symbol
Graph of a finite automaton
• A DFA can be described by a directed graph
with |Q| vertices and |Q| · |Σ| edges
• Every vertex has |Σ| outgoing edges
• Computation starts on vertex q0
• In each step we follow the edge labeled by the
next input symbol from the current state to
the next state
• The DFA accepts iff the state reached after
reading the whole input is in F
Example 1: Parity
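A minimal sketch of a parity automaton, assuming the intended language is the set of binary strings with an even number of 1s (the slide's figure is not reproduced here; the state names are invented):

```python
# 2-state DFA for even parity of 1s: delta as a dict from
# (state, symbol) to state, mirroring the graph view of a DFA.
PARITY = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}

def run_dfa(delta, start, accepting, word):
    state = start
    for symbol in word:               # follow one edge per input symbol
        state = delta[(state, symbol)]
    return state in accepting         # accept iff final state is in F

print(run_dfa(PARITY, "even", {"even"}, "1101"))  # False: three 1s
print(run_dfa(PARITY, "even", {"even"}, "1001"))  # True: two 1s
```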
Example:
• Set of all strings that contain the substring 001
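One possible DFA for this language tracks the longest suffix of the input read so far that is a prefix of 001 (the state names q0–q3 are chosen for this sketch):

```python
# DFA for strings over {0,1} containing "001" as a substring.
DELTA = {
    ("q0", "0"): "q1", ("q0", "1"): "q0",   # q0: no progress
    ("q1", "0"): "q2", ("q1", "1"): "q0",   # q1: suffix "0"
    ("q2", "0"): "q2", ("q2", "1"): "q3",   # q2: suffix "00"
    ("q3", "0"): "q3", ("q3", "1"): "q3",   # q3: saw "001" (accepting sink)
}

def accepts_001(word):
    state = "q0"
    for symbol in word:
        state = DELTA[(state, symbol)]
    return state == "q3"

print(accepts_001("110010"))  # True: contains "001"
print(accepts_001("0101"))    # False
```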
Regular languages
• Definition: A language is called regular, if it
can be decided by a DFA
– This is definition 1, name is explained by definition
2 later (via regular expressions)
Topics:
1. Closure Properties
2. Power of Non-determinism
3. Which languages are regular and which ones
are not?
4. Two-way versus one-way automata
5. Regular expressions
6. DFAs as a data structure for languages
– DFA minimization
1) Closure: complementation
• Observation:
– If L is regular, then so is the complement of L
• Proof: swap accepting and non-accepting
states of a DFA
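The proof can be illustrated directly: keep Q, δ, q0 and replace F by Q \ F. The parity DFA used here is an invented example:

```python
# Sketch of closure under complement: same states and transitions,
# accepting set swapped to Q \ F.
def complement_dfa(states, accepting):
    """New accepting set F' = Q \\ F; transitions are unchanged."""
    return states - accepting

def run(delta, start, accepting, word):
    state = start
    for symbol in word:
        state = delta[(state, symbol)]
    return state in accepting

delta = {("e", "0"): "e", ("e", "1"): "o",
         ("o", "0"): "o", ("o", "1"): "e"}   # parity of 1s
states, F = {"e", "o"}, {"e"}                # F: even number of 1s
F_comp = complement_dfa(states, F)           # {"o"}: odd number of 1s

word = "101"                                  # two 1s: even
print(run(delta, "e", F, word))       # True: accepted by the DFA
print(run(delta, "e", F_comp, word))  # False: rejected by the complement
```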
• Closure under union and intersection: later