Primer on Fourier Analysis
Dana Moshkovitz
Princeton University and
The Institute for Advanced Study
Fourier Analysis in Theoretical Computer Science
Fourier Analysis in Theoretical Computer Science (Unofficial List)
• Polynomial multiplication (FFT)
• Collective Coin Flipping [BL, KKL]
• Computational Learning [KM]
• Analysis of threshold phenomena
• Voting/social choice schemes
• Quantum Computing
• List Decoding [AGS03]
• Analysis of expansion/sampling (e.g., [MR06])
• Linearity testing [BLR]
• Hardness of Approximation (dictator testing) [H97]
• …
“The Fourier Magic”
Fourier Analysis turns “something that looks scary to analyze” into a “bunch of (in)equalities”.
Today: Explain the “Fourier Magic”
• What is it?
• Why is it useful?
• What does it do?
• When to use it?
• What do we know about it?
It’s Just a Different Way to Look at Functions
It’s Changing Basis
• Background: Real/complex-valued functions form a vector space.
• Idea: Represent functions in the Fourier basis, which is the common eigenbasis of the shift operators (representation by frequency).
• Advantage: Convolution (a complicated “global” operation on functions) becomes simple (“local”) in the Fourier basis.
• Generality: Here we only consider the Boolean case – a very special case.
Fourier Basis (Boolean Cube Case)
• Boolean cube: the additive group Z₂ⁿ.
• Space of functions: real-valued functions on Z₂ⁿ.
  – Inner product space where ⟨f,g⟩ = E_x[f(x)g(x)].
• Characters: the functions χ with χ(x+y) = χ(x)·χ(y).
Foundations
• Claim (Characterization): The characters are the eigenvectors of the shift operators S_s: f(x) ↦ f(x+s).
• Corollary (Basis): The characters form an orthonormal basis.
• Claim (Explicit): The characters are the functions χ_S(x) = (-1)^{Σ_{i∈S} x_i} for S ⊆ [n].
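These claims can be checked by brute force on a small cube. The sketch below (a minimal example with n = 3; the helper names chi, inner, add are ad hoc, not from the slides) verifies that each χ_S is multiplicative, χ_S(x+y) = χ_S(x)·χ_S(y), and that the characters are orthonormal under ⟨f,g⟩ = E_x[f(x)g(x)].

```python
from itertools import product

n = 3
cube = list(product([0, 1], repeat=n))       # points of the Boolean cube Z_2^n
subsets = list(product([0, 1], repeat=n))    # sets S of [n], as indicator vectors

def chi(S, x):
    """Character chi_S(x) = (-1)^{sum_{i in S} x_i}."""
    return (-1) ** sum(s * xi for s, xi in zip(S, x))

def inner(f, g):
    """<f, g> = E_x[f(x) g(x)], expectation over the uniform distribution."""
    return sum(f(x) * g(x) for x in cube) / len(cube)

def add(x, y):
    """Coordinate-wise addition in Z_2^n."""
    return tuple((xi + yi) % 2 for xi, yi in zip(x, y))

# Multiplicativity: chi_S(x + y) = chi_S(x) * chi_S(y) for every S, x, y.
assert all(chi(S, add(x, y)) == chi(S, x) * chi(S, y)
           for S in subsets for x in cube for y in cube)

# Orthonormality: <chi_S, chi_T> equals 1 if S = T and 0 otherwise.
for S in subsets:
    for T in subsets:
        expected = 1.0 if S == T else 0.0
        assert abs(inner(lambda x: chi(S, x), lambda x: chi(T, x)) - expected) < 1e-12
```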
Fourier Transform = Polynomial Expansion
• Fourier coefficients: f^(S) = ⟨f, χ_S⟩.
• Note: f^(∅) = E_x[f(x)].
• Polynomial expansion: substituting y_i = (-1)^{x_i},
  f(y_1, …, y_n) = Σ_{S⊆[n]} f^(S) Π_{i∈S} y_i
• Fourier transform: f ↦ f^
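As an illustration, here is a brute-force sketch (the helper names fourier_coeff and maj are ad hoc) that computes the Fourier coefficients of Majority on 3 bits directly from f^(S) = ⟨f, χ_S⟩, and checks that f^(∅) = E_x[f(x)] and that the expansion Σ_S f^(S)·χ_S recovers f.

```python
from itertools import product

n = 3
cube = list(product([0, 1], repeat=n))
subsets = list(product([0, 1], repeat=n))

def chi(S, x):
    """chi_S(x) = (-1)^{sum_{i in S} x_i}, with S given as an indicator vector."""
    return (-1) ** sum(s * xi for s, xi in zip(S, x))

def fourier_coeff(f, S):
    """f^(S) = <f, chi_S> = E_x[f(x) chi_S(x)], by brute force."""
    return sum(f(x) * chi(S, x) for x in cube) / len(cube)

def maj(x):
    """Majority on 3 bits, as a +/-1-valued function."""
    return 1 if sum(x) >= 2 else -1

coeffs = {S: fourier_coeff(maj, S) for S in subsets}

# f^(empty set) = E_x[f(x)]; Majority is balanced, so both sides are 0.
mean = sum(maj(x) for x in cube) / len(cube)
assert abs(coeffs[(0, 0, 0)] - mean) < 1e-12

# The expansion sum_S f^(S) chi_S(x) recovers f(x) at every point.
for x in cube:
    assert abs(sum(coeffs[S] * chi(S, x) for S in subsets) - maj(x)) < 1e-12
```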
The Fourier Spectrum
[Figure: the Fourier coefficients of f arranged by level |S| = 0, 1, …, n/2, …, n−1, n]
Degree-k Polynomial
[Figure: spectrum of a degree-k polynomial – all Fourier weight lies on levels |S| ≤ k; levels above k carry weight 0]
k-Junta
[Figure: spectrum of a k-junta – all Fourier weight lies on subsets of the k relevant variables, hence on levels |S| ≤ k]
Orthonormal Bases
Parseval Identity (generalized Pythagorean Thm): For any f,
Σ_S f^(S)² = E_x[f(x)²]
So, for Boolean f: {±1}ⁿ → {±1}, we have:
Σ_S f^(S)² = 1
In general, for any f, g:
⟨f,g⟩ = 2ⁿ·⟨f^, g^⟩ = Σ_S f^(S)·g^(S)
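A quick numerical check of Parseval (a sketch; the brute-force fourier_coeff helper is not from the slides): for a random ±1-valued f on 4 bits, Σ_S f^(S)² should equal E_x[f(x)²] = 1.

```python
import random
from itertools import product

n = 4
cube = list(product([0, 1], repeat=n))
subsets = list(product([0, 1], repeat=n))

def chi(S, x):
    return (-1) ** sum(s * xi for s, xi in zip(S, x))

def fourier_coeff(f, S):
    """f^(S) = E_x[f(x) chi_S(x)] for a function given as a table."""
    return sum(f[x] * chi(S, x) for x in cube) / len(cube)

random.seed(0)
f = {x: random.choice([-1, 1]) for x in cube}   # a random Boolean f: {0,1}^n -> {+1,-1}

# Parseval: sum_S f^(S)^2 = E_x[f(x)^2], which equals 1 for a +/-1-valued f.
lhs = sum(fourier_coeff(f, S) ** 2 for S in subsets)
rhs = sum(f[x] ** 2 for x in cube) / len(cube)
assert abs(lhs - rhs) < 1e-9 and abs(lhs - 1.0) < 1e-9
```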
Convolution
Convolution: (f∗g)(x) = E_y[f(y)·g(x−y)]
Example – weighted average: (f∗w)(0) = E_y[f(y)·w(y)]
Convolution in Fourier Basis
Claim: For any f, g,
(f∗g)^(S) = f^(S)·g^(S)
Proof: By expanding according to the definition.
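The claim can also be checked numerically. Below is a sketch (with ad-hoc helpers convolve and fourier_coeff) that convolves two random real-valued functions on Z₂³ using (f∗g)(x) = E_y[f(y)·g(x−y)] and compares (f∗g)^(S) with f^(S)·g^(S).

```python
import random
from itertools import product

n = 3
cube = list(product([0, 1], repeat=n))
subsets = list(product([0, 1], repeat=n))

def chi(S, x):
    return (-1) ** sum(s * xi for s, xi in zip(S, x))

def fourier_coeff(f, S):
    return sum(f[x] * chi(S, x) for x in cube) / len(cube)

def sub(x, y):
    """x - y in Z_2^n (same as x + y: coordinate-wise XOR)."""
    return tuple((xi - yi) % 2 for xi, yi in zip(x, y))

def convolve(f, g):
    """(f * g)(x) = E_y[f(y) g(x - y)]."""
    return {x: sum(f[y] * g[sub(x, y)] for y in cube) / len(cube) for x in cube}

random.seed(1)
f = {x: random.uniform(-1, 1) for x in cube}
g = {x: random.uniform(-1, 1) for x in cube}
h = convolve(f, g)

# Convolution becomes pointwise product in the Fourier basis: (f*g)^(S) = f^(S) g^(S).
for S in subsets:
    assert abs(fourier_coeff(h, S) - fourier_coeff(f, S) * fourier_coeff(g, S)) < 1e-9
```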
Things You Can Do with Convolution
Parts of The Spectrum
• Variance:
  Var_x[f(x)] = E_x[f(x)²] − E_x[f(x)]² = Σ_{S≠∅} f^(S)²
• Influence of the i’th variable:
  Inf_i(f) = P_x[f(x) ≠ f(x⊕e_i)] = Σ_{S∋i} f^(S)²
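Both identities can be verified by brute force; the sketch below (ad-hoc helpers, Majority on 3 bits as the running example) checks the variance formula and compares Inf_i computed from its definition with Σ_{S∋i} f^(S)².

```python
from itertools import product

n = 3
cube = list(product([0, 1], repeat=n))
subsets = list(product([0, 1], repeat=n))

def chi(S, x):
    return (-1) ** sum(s * xi for s, xi in zip(S, x))

def fourier_coeff(f, S):
    return sum(f(x) * chi(S, x) for x in cube) / len(cube)

def maj(x):
    """Majority on 3 bits as a +/-1-valued function."""
    return 1 if sum(x) >= 2 else -1

coeffs = {S: fourier_coeff(maj, S) for S in subsets}

# Variance: Var[f] = E[f^2] - E[f]^2 = sum over nonempty S of f^(S)^2.
mean = sum(maj(x) for x in cube) / len(cube)
variance = sum(maj(x) ** 2 for x in cube) / len(cube) - mean ** 2
assert abs(variance - sum(c ** 2 for S, c in coeffs.items() if any(S))) < 1e-12

# Influence: Inf_i(f) = P_x[f(x) != f(x + e_i)] = sum over S containing i of f^(S)^2.
for i in range(n):
    def flip(x, i=i):
        return tuple((xj + 1) % 2 if j == i else xj for j, xj in enumerate(x))
    inf_direct = sum(1 for x in cube if maj(x) != maj(flip(x))) / len(cube)
    inf_fourier = sum(c ** 2 for S, c in coeffs.items() if S[i] == 1)
    assert abs(inf_direct - inf_fourier) < 1e-12
```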
Smoothening f
• Perturbation: y ∼_δ x: for each i independently,
  – y_i = x_i with probability 1−δ
  – y_i = 1−x_i otherwise
• T_δ f(x) = E_{y∼_δ x}[f(y)]
• Convolution: T_δ f = f ∗ (the noise distribution)
• Fourier: (T_δ f)^(S) = (1−2δ)^{|S|}·f^(S)
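A small sketch of the noise operator (exact summation over all noise patterns; the helper noisy is ad hoc, not from the slides): it applies T_δ to Majority on 3 bits and checks that each Fourier coefficient shrinks by exactly the factor (1−2δ)^{|S|}.

```python
from itertools import product

n = 3
delta = 0.1
cube = list(product([0, 1], repeat=n))
subsets = list(product([0, 1], repeat=n))

def chi(S, x):
    return (-1) ** sum(s * xi for s, xi in zip(S, x))

def fourier_coeff(f, S):
    return sum(f[x] * chi(S, x) for x in cube) / len(cube)

def noisy(f, delta):
    """T_delta f(x) = E_{y ~_delta x}[f(y)]: flip each bit independently w.p. delta."""
    out = {}
    for x in cube:
        total = 0.0
        for flips in cube:                    # flips[i] = 1 means coordinate i is flipped
            prob = 1.0
            for b in flips:
                prob *= delta if b == 1 else 1 - delta
            y = tuple((xi + b) % 2 for xi, b in zip(x, flips))
            total += prob * f[y]
        out[x] = total
    return out

f = {x: (1 if sum(x) >= 2 else -1) for x in cube}   # Majority on 3 bits
Tf = noisy(f, delta)

# Each Fourier coefficient shrinks by exactly (1 - 2*delta)^{|S|}.
for S in subsets:
    assert abs(fourier_coeff(Tf, S) - (1 - 2 * delta) ** sum(S) * fourier_coeff(f, S)) < 1e-9
```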
Smoothed Function is Close to Low Degree!
Tail: the part of ‖T_δ f‖₂² on levels ≥ k is
≤ (1−2δ)^{2k}·‖f‖₂² ≤ e^{−cδk}·‖f‖₂²
Hence, the weight on levels ≥ C·(1/δ)·log(1/ε) is at most ε·‖f‖₂².
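A numerical illustration of the tail bound (a sketch for a random ±1-valued f on 5 bits; the choices of n, δ, k and the helper names are ad hoc): by Parseval, the weight of T_δ f above level k is Σ_{|S|≥k} (1−2δ)^{2|S|}·f^(S)², which is at most (1−2δ)^{2k}·‖f‖₂².

```python
import random
from itertools import product

n = 5
delta = 0.2
k = 3
cube = list(product([0, 1], repeat=n))
subsets = list(product([0, 1], repeat=n))

def chi(S, x):
    return (-1) ** sum(s * xi for s, xi in zip(S, x))

def fourier_coeff(f, S):
    return sum(f[x] * chi(S, x) for x in cube) / len(cube)

random.seed(0)
f = {x: random.choice([-1, 1]) for x in cube}
coeffs = {S: fourier_coeff(f, S) for S in subsets}

# Weight of T_delta f on levels >= k:
#   sum_{|S| >= k} (1 - 2*delta)^{2|S|} f^(S)^2  <=  (1 - 2*delta)^{2k} * ||f||_2^2.
rho = 1 - 2 * delta
tail = sum(rho ** (2 * sum(S)) * coeffs[S] ** 2 for S in subsets if sum(S) >= k)
bound = rho ** (2 * k) * sum(c ** 2 for c in coeffs.values())
assert tail <= bound + 1e-12
print(f"tail weight above level {k}: {tail:.4f} <= bound {bound:.4f}")
```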
Hypercontractivity
Theorem (Bonami, Gross): For every f and 1 ≤ p ≤ q,
if ρ := 1−2δ ≤ √((p−1)/(q−1)), then
‖T_δ f‖_q ≤ ‖f‖_p
Roughly, and incorrectly ;-):
“T_δ f is much smoother than f [even in a ‘tougher’ norm]”
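A spot check of the (2,4) case (one random f on 4 bits; an illustration, not a proof, and the helper names are ad hoc): with p = 2, q = 4 the condition is ρ = 1−2δ ≤ 1/√3, and the code verifies ‖T_δ f‖₄ ≤ ‖f‖₂ for that δ, with norms taken with respect to the uniform distribution.

```python
import random
from itertools import product

n = 4
cube = list(product([0, 1], repeat=n))

def norm(g, p):
    """||g||_p = (E_x |g(x)|^p)^(1/p), expectation over the uniform distribution."""
    return (sum(abs(g[x]) ** p for x in cube) / len(cube)) ** (1.0 / p)

def noisy(f, delta):
    """T_delta f(x) = E_{y ~_delta x}[f(y)], computed exactly."""
    out = {}
    for x in cube:
        total = 0.0
        for flips in cube:
            prob = 1.0
            for b in flips:
                prob *= delta if b == 1 else 1 - delta
            y = tuple((xi + b) % 2 for xi, b in zip(x, flips))
            total += prob * f[y]
        out[x] = total
    return out

random.seed(0)
f = {x: random.uniform(-1, 1) for x in cube}

# (p, q) = (2, 4): the condition is rho = 1 - 2*delta <= sqrt((p-1)/(q-1)) = 1/sqrt(3).
rho = 1 / 3 ** 0.5
delta = (1 - rho) / 2
assert norm(noisy(f, delta), 4) <= norm(f, 2) + 1e-12
```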
Noise Sensitivity and Stability
• Noise Sensitivity:
  NS_δ(f) = P_{x∼_δ y}[f(x) ≠ f(y)]
• Correlation (for 0/1-valued f):
  NS_δ(f) = 2·(E[f] − ⟨f, T_δ f⟩)
• Stability: set δ := 1/2 − ρ/2 (i.e., ρ = 1−2δ):
  S_ρ(f) = ⟨f, T_δ f⟩
• Fourier:
  S_ρ(f) = ⟨f^, ρ^{|S|}·f^⟩ = Σ_S ρ^{|S|}·f^(S)²
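The sketch below (Majority on 3 bits taken as a 0/1-valued function, the convention under which the correlation identity holds; helper names are ad hoc) computes NS_δ(f) from its definition and checks it against 2(E[f] − ⟨f, T_δ f⟩), with ⟨f, T_δ f⟩ evaluated via the Fourier formula Σ_S ρ^{|S|} f^(S)².

```python
from itertools import product

n = 3
delta = 0.15
rho = 1 - 2 * delta
cube = list(product([0, 1], repeat=n))
subsets = list(product([0, 1], repeat=n))

def chi(S, x):
    return (-1) ** sum(s * xi for s, xi in zip(S, x))

def fourier_coeff(f, S):
    return sum(f[x] * chi(S, x) for x in cube) / len(cube)

# Majority on 3 bits as a 0/1-valued function (so that f^2 = f and the
# identity NS_delta(f) = 2(E[f] - <f, T_delta f>) applies).
f = {x: (1 if sum(x) >= 2 else 0) for x in cube}

def flip_prob(flips):
    p = 1.0
    for b in flips:
        p *= delta if b == 1 else 1 - delta
    return p

# Noise sensitivity straight from the definition: P_{x ~_delta y}[f(x) != f(y)].
ns_direct = sum(
    flip_prob(flips) * (f[x] != f[tuple((xi + b) % 2 for xi, b in zip(x, flips))])
    for x in cube for flips in cube
) / len(cube)

# Stability via the Fourier formula: <f, T_delta f> = sum_S rho^{|S|} f^(S)^2.
stab = sum(rho ** sum(S) * fourier_coeff(f, S) ** 2 for S in subsets)

mean = sum(f.values()) / len(cube)
assert abs(ns_direct - 2 * (mean - stab)) < 1e-9
```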
Thresholds Are Stablest and Hardness of Approximation
• What is it? An isoperimetric inequality for noise stability [MOO05].
• Applications to hardness of approximation (e.g., Max-Cut [KKMO04]).
• Derived from the “Invariance Principle” (an extension of the Central Limit Theorem), used in the [R08] extension of [KKMO04].
Thresholds Are Stablest
Theorem [MOO’05]: Fix 0 < ρ < 1. For balanced f (i.e., E[f] = 0) with Inf_i(f) ≤ τ for all i,
S_ρ(f) ≤ 2/π · arcsin ρ + O(loglog(1/τ) / log(1/τ))
≈ the noise stability of threshold functions t(x) = sign(∑ a_i·x_i), ∑ a_i² = 1
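As a rough illustration (a sketch; Maj_7 and ρ = 0.6 are arbitrary choices, not from the slides), the code below computes S_ρ(Maj_n) exactly for n = 7 and prints it next to the limiting value (2/π)·arcsin ρ, which Majority approaches as n grows.

```python
import math
from itertools import product

n = 7                      # a small odd n; Maj_n is balanced
rho = 0.6
delta = (1 - rho) / 2
cube = list(product([0, 1], repeat=n))

def maj(x):
    """Majority as a +/-1-valued function."""
    return 1 if sum(x) > n / 2 else -1

def flip_prob(flips):
    p = 1.0
    for b in flips:
        p *= delta if b == 1 else 1 - delta
    return p

# S_rho(Maj_n) = <Maj, T_delta Maj> = E_{x, y ~_delta x}[Maj(x) Maj(y)], computed exactly.
stab = sum(
    flip_prob(flips) * maj(x) * maj(tuple((xi + b) % 2 for xi, b in zip(x, flips)))
    for x in cube for flips in cube
) / len(cube)

print(f"S_rho(Maj_{n}) = {stab:.4f}")
print(f"(2/pi) * arcsin(rho) = {2 / math.pi * math.asin(rho):.4f}")
```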
More Material