So at dinner on Wednesday, I noticed that my younger brother (age 14, non-Honors Algebra II, smart but unmotivated, pretty much what I would have been had I had an academically overachieving older sibling) was filling out 16-man brackets. But these weren't NCAA basketball brackets (what he would have been doing with those in December is beyond me anyway), or fantasy NCAA football playoffs, or anything of the sort. They were 16 competitors labeled from "A" to "P."

So I asked him: What are you doing?

Him: Filling out brackets.

Me: Why?

(Silence).

(A little later)

Me: Chris, why are you filling out brackets all over your paper?

(Silence).

(Later still)

Him: Harrison, I have a question.

Me: Shoot.

Him: If you have 16 people and you know their initial positions in a bracket, what's the minimum number of spots you have to fill in so you can fill in the rest of the bracket?

I'm pretty sure I've been more proud of a member of my family than I was of him at that moment. But I'm having trouble thinking of an example.

[PS: I told him he should guest-post the problem on this blog. He politely declined.]

## Saturday, December 29, 2007

## Wednesday, December 19, 2007

### P, BPP, VDW, and all that jazz: A Complex Conversation in Three Parts

BACKGROUND

Louis and I were discussing the problem of finding an arithmetic progression of "special elements" in a general sequence. It all started like this:

Louis: Awesome question:

Me: ?

Louis: Find the longest arithmetic sequence of awesome XKCD comics

Me: XD

Me: 1. that's totally subjective, 2. that's AWESOME

Louis: well, it's different for every person of coures

Louis: *course

Me: well, of course, since they do have positive density, if xkcd were to continue forever

Me: there'd be arbitrarily long APs...

[...]

Louis: I'm talking a computational

Louis: problem

Me: oh

Me: hm.

Louis: not VDW :P

Me: well that's technically szemeredi...

Me: wait, is finding long APs...

Louis: VDW.

Me: szemeredi, and is it in P?

Louis: VDW.

Louis: And yes, I believe so

Louis: given that there are only n^2 possible AP's

Me: oh, d'oh. gp.

Me: lol

Louis: ;-)

Me: (incidentally, there's more like n^3 partial APs, but still polynomial)

Louis: ah, partial AP's.

Louis: well, we needn't consider only partial AP's

Louis: for each a_i, for each a_j, search the arithmetic sequence going a_i, a_j, a_i + 2(a_j - a_i),...

Louis: as far as you can go until you reach something not in the set.

Me: yeah, yeah

Me: and, btw, it is about n^2 log n [Exercise for the reader: Why?]

I'm many things, but I'm not an Algorithms person, and this was now, to me, an Algorithms question and therefore dead. (It didn't help that I couldn't find any way of checking for an AP in o(n^2) time). But then:
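For concreteness, Louis's pair-and-extend search can be sketched like this (a minimal sketch of the idea from the chat; the set-based membership test and function name are my choices, not from the conversation):

```python
def longest_ap(elements):
    """Longest arithmetic progression inside a set of integers, found by
    extending forward from every ordered pair, as in the conversation."""
    s = set(elements)
    items = sorted(s)
    best = []
    for i, a in enumerate(items):
        for b in items[i + 1:]:
            d = b - a
            ap = [a, b]
            nxt = b + d
            while nxt in s:  # O(1) average-case membership test
                ap.append(nxt)
                nxt += d
            if len(ap) > len(best):
                best = ap
    return best
```

(One way to see the exercise's bound: when the ground set sits inside an interval of length n, extending a pair with common difference d costs about n/d steps, and summing n·(n/d) over all d gives roughly n^2 log n.)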

AN EPIPHANY

Me: I'm wondering if this might

Me: if this could provide an oracle separation between P and BPP

Louis: XD

Me: seriously!

Louis: how?

Louis: it's in P!

Me: yeah, but

Me: let's take sets of size

Me: well, let's consider the problem of finding an AP of length at least n in the range [1, superpoly(n)], 2-colored according to some cuh-razy random oracle.

Louis: okay

Me: (actually, we should probably specify that it has to be one specific color)

Louis: you do realize you're now reminding me of the quantum database search problem, too?

Me: let's say that all of the elements of the AP have to be yellow, and the oracle colors a number yellow with some fairly high probability

Louis: okay

Me: actually, let's not make it length n, let's make it length [O(log(superpoly(n)))]

Louis: how will BPP help us?

Me: for almost all oracles, all we have to do is pick a random AP and there's positive probability it's yellow.

Louis: but wait

Louis: if the numbers are colored randomly

Louis: then why would randomized search be any better than brute force?

Me: on average? duh, it wouldn't. worst-case, though...

Me: it seems less hard to show brute-force won't work well in some fraction of oracle universes

Mathematical backing-up of my assertions in the conversation: Consider a random coloring of [1, superpoly(n)] s.t. the probability of a given number being colored yellow is \delta. There are approximately superpoly(n)^2 APs of length log(superpoly(n)) [I'm totally fudging the numbers there, but the basic argument should still hold.] If the implied constant of the log(superpoly(n)) is sufficiently small, then the probability of a randomly chosen AP of that length being yellow is at least \delta^log(superpoly(n)) > 1/superpoly(n)^2.
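To make the exponent arithmetic explicit (writing M = superpoly(n) and taking the AP length to be c log M for a small constant c):

```latex
\delta^{c \log M} = M^{c \log \delta} = M^{-c \log(1/\delta)} > \frac{1}{M^2}
\quad\text{whenever } c < \frac{2}{\log(1/\delta)},
```

and since there are roughly M^2 candidate APs of that length, the expected number of all-yellow ones is at least M^2 \cdot M^{-c \log(1/\delta)} \geq 1.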

N.B., in fact, that by doing nothing more than messing around with a few constants, we can get the probability to be > k/superpoly(n)^2 for any constant k we want to work with. (Or even, for that matter, > k/superpoly(n).) Then the expected value of the number of APs of the...oh, screw it, of the right length is k.

So! In retrospect, the functions I picked don't work very well, but it's not difficult to change them around a bit so that our above argument (after throwing in some stuff about the Chernoff bound) actually ends up telling us the following:

1. For almost all oracles we've considered here, there exists f(n) superpolynomial, g(n) really slow-growing such that a BPP algorithm can pick polynomially many APs of length g(n) in the range [1, f(n)] s.t. with positive, constant probability, at least one will be yellow.
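As a toy illustration of that randomized strategy (a Monte Carlo sketch, not the actual oracle construction: the simulated oracle, the choice delta = 1/2, and all names and parameters here are mine):

```python
import random

def random_oracle(seed, delta=0.5):
    """A simulated stand-in for the random 2-coloring oracle: each integer
    is yellow independently with probability delta."""
    rng = random.Random(seed)
    cache = {}
    def is_yellow(x):
        if x not in cache:
            cache[x] = rng.random() < delta
        return cache[x]
    return is_yellow

def bpp_search(is_yellow, M, length, trials, seed=0):
    """Sample random APs of the given length in [1, M]; accept the first
    one whose elements are all yellow."""
    rng = random.Random(seed)
    for _ in range(trials):
        a = rng.randrange(1, M)
        max_d = max(2, (M - a) // (length - 1) + 1)
        d = rng.randrange(1, max_d)
        ap = [a + i * d for i in range(length)]
        if ap[-1] <= M and all(is_yellow(x) for x in ap):
            return ap
    return None
```

With delta = 1/2 and length c·log M, each sample is all yellow with probability about M^(-c), so polynomially many samples succeed with constant probability; a deterministic algorithm has no such guarantee against a worst-case oracle.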

And here's the crux: There's no obvious way that a deterministic algorithm can do the same thing. Actually, with all these oracle universes floating around, it should be obvious to even the most dimwitted individual who holds an advanced degree in hyperbolic topology, n'hey [sorry], that some of them won't.

CONCLUSION

The problem is, though, that like most lower bounds, this fact is easier noticed than proved. And since there are already known oracle separations of P from BPP, one more wouldn't be particularly exciting. Still, though, it's fun to see two of my (and some other occasional readers') main interests -- computational complexity and additive combinatorics -- work together in a way I can understand, even if I can't quite finish the proof.
