Description
Boolean functions, f : {0,1}^n → {0,1}, are a basic object of
study in theoretical computer science. In this seminar we will study Boolean
functions via their Fourier transform and other analytic methods. The
powerful techniques from this field have application in numerous areas of
computer science. We will spend some time developing the area's basic
mathematics; however, the main focus will be on applications, in CS theory and beyond. Highlights will include applications in constraint satisfaction problems, learning theory, circuit complexity, pseudorandomness, additive combinatorics, hypercontractivity, property testing, and probabilistic invariance principles.
General Information
Time
Thursdays, 10:00 AM - 12:00 PM
Location (varies)
2/28: G882, 3/7: D463, 3/21: D463, 4/4: G882, 4/11: G882
Tentative Schedule
Lecture 1: 2/7
Tensor space of functions
Tensor bases (computational and Fourier)
Linearity Testing [BLR]: test if a function is a tensor
Lecture 2: 2/14
The Boolean Hypercube as a graph
Eigenvalues of the Hamming cube & influence formula
KKL, influence of a variable on a Boolean function
Lecture 3: 2/21
The noisy hypercube
Small set expansion
Hypercontractivity
Lecture 4: 2/28
Finish SSE, KKL
Dictatorship testing
Lecture 5: 3/7
Guest lecture by Eli Ben-Sasson: Additive combinatorics, Sanders' Theorem, Croot-Sisask.
** no class 3/14 **
Lecture 6: 3/21
Invariance Principle.
** no class 3/28 (spring break) **
Lecture 7: 4/4
Guest lecture by Eric Blais. Learning?
Lecture 8: 4/11
The parallel repetition theorem
Lecture 9: 5/2
Short Code, possibly a derandomized BLR test.
Announcements
Scribe volunteers needed
3/07/13 11:52 AM
I still need scribe volunteers for the last two lectures:
Lecture 9: 4/18
Direct sums and products, and derandomized DP testing
Lecture 10: 4/25
Short Code, possibly a derandomized BLR test.
For the next three lectures I have:
Lecture 6: 3/21 (scribe by Mat)
Invariance Principle.
Lecture 7: 4/4 (scribe by Yilei)
Guest lecture by Eric Blais. Learning?
Lecture 8: 4/11 (scribe by Omer)
Direct products and the parallel repetition theorem
Reading - to be done before lecture 2
2/07/13 3:25 PM
Please read Section 1 in the Minicourse Lecture Notes by Ryan O'Donnell: http://www.cs.cmu.edu/~odonnell/papers/barbados-aobf-lecture-notes.pdf
My notes for Lecture 1
2/07/13 2:16 PM
Main object of study
Tools
Applications
The Fourier Basis
The space of functions f : {0,1}^n → R is a tensor space.
Definition: Let f, g be two vectors; then the tensor product f ⊗ g is …
Let U, V be two vector spaces; then the tensor product space U ⊗ V is spanned by all f ⊗ g.
Claim: The space of all functions on n variables is the tensor product of n copies of the space of all functions on one variable.
Proof: by explicit construction.
Two possible bases for n=1, then tensorize.
Standard Basis: (0,1) , (1,0)
and
Fourier Basis: (1,1) , (1,-1)
Tensorize and get the standard and Fourier basis.
Define f̂_S, the coefficient of f in the Fourier basis.
Claim: orthogonality
Proof: using product property of tensors.
Claim: Parseval: E[f²] = Σ_S f̂_S².
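These claims are easy to check numerically. Below is a short NumPy sketch (my own illustration, not from the lecture) that builds the Fourier basis by tensorizing the n = 1 basis (1,1), (1,-1) via Kronecker products, then verifies orthogonality and Parseval for a random Boolean function.

```python
import numpy as np

def fourier_basis(n):
    """Tensorize the n=1 Fourier basis (1,1), (1,-1) into 2^n basis vectors.

    Row S (as a bitmask) is the character chi_S; columns are the points
    of {0,1}^n in lexicographic order.
    """
    b1 = np.array([[1, 1], [1, -1]])  # rows: chi_emptyset, chi_{1}
    B = np.array([[1]])
    for _ in range(n):
        B = np.kron(B, b1)  # Kronecker product = tensor product of bases
    return B

n = 3
B = fourier_basis(n)
N = 2 ** n

# Orthogonality: distinct characters are orthogonal under the uniform measure.
gram = B @ B.T / N
assert np.allclose(gram, np.eye(N))

# Parseval: E[f^2] = sum_S fhat_S^2 for a random Boolean function f.
rng = np.random.default_rng(0)
f = rng.choice([-1.0, 1.0], size=N)
fhat = B @ f / N          # fhat_S = E_x[f(x) chi_S(x)]
assert np.isclose(np.mean(f ** 2), np.sum(fhat ** 2))
print("orthogonality and Parseval verified for n =", n)
```

Since f is ±1-valued, both sides of Parseval equal 1 here; the check also passes for arbitrary real-valued f.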
Linearity Testing
A Boolean function can also be written as p : GF(2)^n → GF(2)
Define linear, show local characterization, and explain “robust” version (noise is here!).
Talk about property testing, quasirandomness, and how this kind of thinking helps understand whether a characterization is inherent, and how so.
Theorem: BLR ..
Proof:
Two proofs. First, "elementary": try to find the linear function and then prove that it is good.
Second, using Fourier analysis.
Technically, move from p to f by defining f(x) = (-1)^{p(x)}, i.e., the shift 0 → 1 and 1 → -1.
Interpret as tensor testing
Claim:
Compare with the original BLR proof:
Define g(x) = E_y [ f(y)f(xy) ]. The Fourier transform of g has ĝ_S = f̂_S²,
and the proof above looks at the correlation of g with f.
In the BLR proof we look at sgn(g) = the majority vote, and show that it is a linear function itself & close to f
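As an illustration (my own sketch, not part of the notes), the BLR test is easy to simulate: pick x, y uniformly in GF(2)^n and check p(x) + p(y) = p(x + y). A truly linear function is never rejected, while a function far from linear, such as majority, is rejected with constant probability.

```python
import numpy as np

def blr_reject_rate(p, n, trials=20000, seed=0):
    """Estimate Pr[p(x) + p(y) != p(x + y)] over uniform x, y in GF(2)^n."""
    rng = np.random.default_rng(seed)
    x = rng.integers(0, 2, size=(trials, n))
    y = rng.integers(0, 2, size=(trials, n))
    bad = (p(x) ^ p(y)) != p(x ^ y)   # ^ is coordinatewise addition in GF(2)
    return bad.mean()

n = 5
# A genuinely linear function: p(x) = x_0 + x_2 (mod 2) -- never rejected.
linear = lambda x: x[:, 0] ^ x[:, 2]
# A far-from-linear function: majority of the n bits.
majority = lambda x: (x.sum(axis=1) > n // 2).astype(int)

print("linear  :", blr_reject_rate(linear, n))    # exactly 0.0
print("majority:", blr_reject_rate(majority, n))  # a constant fraction
```

The empirical rejection rate lower-bounds (up to constants) the distance to the nearest linear function, which is the content of the robust version of the theorem.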
- The range is {0,1}, as opposed to R or [0,1]: these are decision functions, not valuation functions. This difference creates a tension familiar from life, where we hold an internal valuation but must make a hard decision based on it; it manifests in math as well, when a function "wants" to be smooth but needs to decide.
- The domain is a product space {0,1} x {0,1} x … x {0,1} consisting of n independent coordinates. Not the same as a function f:X → {0,1}, because we care how the function’s behavior is related to the structure of the domain.
- The domain also has a graph structure - and the underlying Hamming metric. We will care how a function respects this structure.
- Analysis: we will be interested in various statements about f "on the average". This presupposes an underlying distribution, and we will mostly focus on the uniform distribution, which lets us think of n independent coordinates.
Tools
- Tensor structure of the function space - Fourier basis.
- Hypercube graph: expansion, eigenvectors, subgraphs and quotient graphs.
- Noise operator and noisy hypercube graph; generalization to product spaces
- Hypercontractivity
- Central Limit Theorem & Invariance Principle
- Inverse Theorems
Applications
- Property testing
- Inapproximability
- Learning
- Hardness amplification
- Pseudorandomness
Name | Office Hours
---|---
Irit Dinur | When? Where?
Lecture Notes
Lecture Date
Apr 30, 2013
Apr 4, 2013
Mar 21, 2013
Feb 28, 2013
Feb 21, 2013
Feb 14, 2013
Feb 7, 2013
Feb 7, 2013
General Resources