## Wednesday, September 19, 2007

### Y'arr! Cryptography and piracy.

Ahoy, me mathematical mateys. Seein' as how today be the International Talk Like a Pirate Day, I decided to make a piratey post. Y'arr.

Y'arr! Here be a rough formal definition o' what ye can use for public-key encryption. Arr. A function f:Zx(Z/nZ) -> Z/nZ be a trapdoor function if:

1. Ye may compute f(m,k) smartly (in polynomial time).
2. The problem o' findin' a preimage o' k \in Z/nZ, given fixed m, is in NP (and, if the system be worth its salt, not in P).
3. There be some t not dependin' on k so, given t, findin' a preimage o' k with fixed m is in P.
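Fer a concrete (toy-sized) example o' the definition, textbook RSA fits the bill, with the index m fixed to a single hard-coded key. This be a sketch o' me own, with parameters far too wee for real seafarin':

```python
# Toy trapdoor function via textbook RSA. Tiny, hard-coded parameters:
# real use needs large random primes and padding. The "index" m of the
# definition is fixed here to this one key.
P, Q = 61, 53
N = P * Q                          # modulus: our Z/nZ has n = 3233
E = 17                             # public exponent, coprime to (P-1)*(Q-1)
T = pow(E, -1, (P - 1) * (Q - 1))  # the trapdoor t: d = e^(-1) mod phi(N)

def f(x):
    """Condition 1: easy to compute (modular exponentiation is poly-time)."""
    return pow(x, E, N)

def invert_with_trapdoor(k):
    """Condition 3: given the trapdoor t, findin' a preimage o' k is easy."""
    return pow(k, T, N)

# Condition 2: without T, inverting f is believed hard -- the best known
# classical route is factoring N.
assert invert_with_trapdoor(f(42)) == 42
```

(The three-argument `pow` with exponent -1 computes a modular inverse in Python 3.8+.)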

Aye, I claim that these be the functions from which ye can construct a public-key cryptosystem. If ye disagree with me, ye can walk the plank. Otherwise, we'll continue:

Call the complexity class which consists o' findin' preimages under trapdoor functions PK (for public-key; or plank, which be like a trapdoor, but better suited for seafarin' folk). Is PK contained in FBQP?

If the answer be "aye," then pirates rejoice! With a quantum computer, ye can crack any classical public-key system, which would allow ye to commandeer credit card information and much, much more, without the risks of scurvy and losin' limbs that make pirates' health-insurance premiums so #@#\$in' high.

However, I believe that the answer be "nay." Since P is contained in BQP, an answer o' "nay" would imply P \neq NP, so I'm not expectin' any o' ye landlubbers to solve it. But if ye can give me evidence of why my hunch is correct, this ol' scurvy dog'll be forever indebted. Y'arr.

## Tuesday, September 18, 2007

### The Riemann Hypothesis for Dummies; or, How Bad Music can Make You Obscenely Rich

No, this blog post isn't about 50 Cent.

Here's the scenario:

I'm an ultra-modern, avant-garde composer. I...compose music. One day, I decide, "Hey! I'm gonna write the weirdest piece EVER!"

Here's what I do:

For strings, I write a waltz (which, as you probably know, is in 3/4 time). But actually, I lied. I don't write a waltz; I only write one measure of a waltz. Which seems like a weird thing to do, but you'll see my plan.

Then, for horns, I write, say, one measure of 4/4 time. And for piano, one measure of, let's say, 7/4 time. And so on.

Then I instruct each section to play their one single measure over and over and over again until they get tired, and then to keep playing it until all the sections finish their respective measures at the same moment.

The musicians will gripe about this arrangement, as musicians often do, but the overall sound won't repeat for a very long time! For example: if, as above, the strings play in 3/4, the horns in 4/4, and the piano in 7/4, then the piece will last a total of lcm(3, 4, 7) = 3*4*7 = 84 beats, since the whole pattern repeats only when all three measures line up again, and 3, 4, and 7 are pairwise coprime.
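If you want to check that arithmetic yourself, Python (3.9+) will do it directly:

```python
from math import lcm  # math.lcm requires Python 3.9+

# The combined sound first repeats after lcm(3, 4, 7) beats; since 3, 4,
# and 7 are pairwise coprime, the lcm is just their product.
assert lcm(3, 4, 7) == 3 * 4 * 7 == 84
```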

But there's a catch! Yes, I'm an edgy and avant-garde composer, but I'm also environmentally conscious. Which means I want to make this piece as long as I possibly can (without repeating itself, of course) while only using a fixed number of beats, so that I won't waste paper. (Don't think too hard about that part. It's avant-garde composer logic.)

So, here's my question: Exactly how long can I make my piece last without writing more than, oh, n total beats?

Believe it or not, this question is actually unsolved! More than 100 years ago, in 1902, a German mathematician named Edmund Landau proved that when n is really big, I can't make the piece go on for much longer than e^\sqrt{n ln n} beats (where e is the base of the natural logarithm, about 2.71828). This is a weird function! It grows a LOT slower than any exponential function (e.g. 1, 2, 4, 8, ...) but a LOT faster than any polynomial (for example, n^2, or 1, 4, 9, 16, ...).
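For small n you can actually compute the composer's optimum exactly. Here's a sketch of my own (not Landau's method): only prime-power measure lengths matter, since any partition's lcm is matched by one using powers of distinct primes (pad the rest with 1-beat measures), so a small dynamic program over primes works.

```python
def longest_piece(n):
    """Max number of beats before the piece repeats, writing at most n
    total beats of measures: the max lcm over partitions of n
    (this is Landau's function g(n))."""
    primes = [p for p in range(2, n + 1)
              if all(p % q for q in range(2, int(p ** 0.5) + 1))]
    best = [1] * (n + 1)   # best[j] = max lcm using at most j total beats
    for p in primes:
        # descending j ensures each prime contributes at most one power
        for j in range(n, p - 1, -1):
            pk = p
            while pk <= j:
                best[j] = max(best[j], best[j - pk] * pk)
                pk *= p
    return best[n]

# 3 + 4 + 7 = 14 written beats buy the 84-beat piece from the example.
assert longest_piece(14) == 84
```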

But mathematicians think they can do better than Landau. In fact, they think that something similar to Landau's answer -- exactly the same, in fact, except that n ln n has been replaced by Li^{-1}(n), the inverse of the logarithmic integral function, which behaves very similarly -- provides a true upper bound for all big enough n, not just a close-to-upper bound.

But no one's been able to prove their conjecture! In fact, mathematicians have shown that this conjecture is precisely equivalent to the famous Riemann Hypothesis.

And the RH is considered to be one of the hardest problems in mathematics. It's so hard, in fact, that the Clay Mathematics Institute is offering a million-dollar reward to the first person who can publish a correct proof. In other words, my composer's idle question is actually worth seven figures.

## Sunday, September 16, 2007

### Random graph labelings

OK. Here's what I think I maybe've got:

Let G be the random graph G(n,M) on n vertices and M edges. Then RS(G) = O(M*\delta(G)).

Proof (outline): Set the implied constant in the big-O notation as k, to be determined later.

Base Case: M = 0; trivial.

For M>0, choose a vertex v with degree \delta(G), and remove an edge incident to it to obtain the graph G'. Then, by the IH, RS(G') \leq k*(M-1)*(\delta(G)-1) = k*M*\delta(G) - (kM + k\delta(G) - k). For suitable k (I suspect), the probability that there exists some number in [0, kM + k\delta(G) - k] which we can add to the label on v so that the labeling remains range-relaxed graceful is 1.

...Yeah. I guess I hope that this is workable? I dunno though. Eh.

## Thursday, September 6, 2007

### Graph labellings and coding theory

So, here's what I did today (among, of course, other stuff):

Call a graph G (n,k)Hamming-representable if there is a labeling f: V(G) -> (F_2)^n such that two vertices v, w are adjacent iff the Hamming distance between f(v) and f(w) is at most k. If G is (n,k)Hamming-representable for some choice of n and k, we'll just call it Hamming-representable.

In addition, say that G is (n,k)Hamming-regular when two vertices are adjacent iff the Hamming distance between their labels is k. Same deal with just "Hamming-regular."

So it's not hard to show that all trees on n edges are (n,1)Hamming-regular: root the tree, assign each vertex apart from the root a number from 1 to n, give the root the label (0,0,0,...,0), and give each vertex v a label which has a 1 in the mth spot iff the path from the root to v passes through the vertex numbered m. (If you don't follow that, leave a comment; I'll attempt to make a diagram so it'll hopefully be clearer.)
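In code, the construction looks like this (a sketch on a small example tree of my own choosing; the vertex numbering is mine):

```python
# Root-path labeling of a tree: bit m of f(v) is 1 iff the root-to-v
# path passes through the vertex numbered m. Root is vertex 0.
parent = {1: 0, 2: 0, 3: 1, 4: 1, 5: 2}   # example tree with n = 5 edges

def label(v):
    bits = [0] * len(parent)   # one coordinate per non-root vertex
    while v != 0:              # walk up to the root, marking vertices passed
        bits[v - 1] = 1
        v = parent[v]
    return tuple(bits)

def hamming(x, y):
    return sum(a != b for a, b in zip(x, y))

verts = [0] + list(parent)
edges = {frozenset((v, parent[v])) for v in parent}

# Adjacent iff Hamming distance exactly 1: the tree is (n,1)Hamming-regular.
for i, v in enumerate(verts):
    for w in verts[i + 1:]:
        adjacent = frozenset((v, w)) in edges
        assert adjacent == (hamming(label(v), label(w)) == 1)
```

The check works because the Hamming distance between two labels is exactly the tree distance between the vertices.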

Some conjectures/questions:

1. Are all Hamming-regular graphs also Hamming-representable? Trees are (duh), but what about others?

2. Are all graphs Hamming-representable? (Obviously, this is a stronger statement in the affirmative than #1...)

(Edit: So, a thought. Assign an index i to each vertex; set the "bit" in the ith place to 1 if the vertex you're labelling is either vertex i or adjacent to i. Does this work? I'm too tired to prove/find a counterexample; any takers?

Edit^2: Grr, doesn't work. Certain trees, for example. So it's trickier than I thought.)

3. What's the growth rate of the smallest usable n, maximized over all Hamming-representable graphs on a given number of vertices? Can we construct an (n,k)Hamming-labelling of a tree where n is logarithmic in the number of vertices, or is our current construction (with n linear in the number of vertices) optimal?
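Regarding the Edit^2 above: one tree that seems to break the closed-neighborhood idea is the star K_{1,3} (my pick; the post doesn't say which trees fail). Its labels put an adjacent pair and a non-adjacent pair at the same Hamming distance, so no threshold k can separate them:

```python
from itertools import combinations

# Star K_{1,3}: center "c" adjacent to leaves "x", "y", "z". Label each
# vertex by the characteristic vector of its closed neighborhood.
verts = ["c", "x", "y", "z"]
edges = {frozenset(("c", u)) for u in "xyz"}

def closed_nbhd_label(v):
    return tuple(1 if u == v or frozenset((u, v)) in edges else 0
                 for u in verts)

def hamming(a, b):
    return sum(p != q for p, q in zip(a, b))

adj_dists = {hamming(closed_nbhd_label(v), closed_nbhd_label(w))
             for v, w in combinations(verts, 2) if frozenset((v, w)) in edges}
non_dists = {hamming(closed_nbhd_label(v), closed_nbhd_label(w))
             for v, w in combinations(verts, 2) if frozenset((v, w)) not in edges}

# Some non-adjacent pair is at least as close as some adjacent pair,
# so no cutoff k makes "adjacent iff distance <= k" hold.
assert min(non_dists) <= max(adj_dists)
```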

I think I had more. But I don't remember them. Yeah.

As always, any thoughts/comments/opinions are greatly valued. Comment, people!

## Monday, September 3, 2007

### Revenge of the Knapsack

Bram Cohen describes a modified knapsack system and suggests it could be used as a public-key cryptosystem. My thoughts:

1. Let's call the elements of the public key a_i, for 1 \leq i \leq k. Consider the sets B_n = {b_i : b_i \equiv a_i (mod n), 0 \leq b_i < n} for arbitrary n.

2. Note that, if gcd(p,n) is sufficiently large (i.e., > p/2k), then the elements of B_n will follow a very specific distribution. Specifically, each will fall into one of the intervals [c*gcd(p,n), c*gcd(p,n) + p/2k) for some integer c.

3. For p << n << kp with gcd(p,n) large, there will necessarily be more than 1 residue in at least some of the intervals, and, with high probability, each interval will contain at least one residue.

4. Trivially: gcd(p,n) = gcd(p,n+p).
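To sanity-check point 2 numerically, here's a quick experiment. Caveat: the a_i below are synthetic, engineered by me to have the structure assumed above (each a random multiple of a secret p plus an offset below p/2k); this is not Cohen's actual key generation.

```python
import random
from math import gcd

# Synthetic stand-in for a public key with the assumed structure:
# a_i = (random multiple of secret p) + (offset below p/2k).
random.seed(1)
p, k = 30030, 20   # p composite so gcd(p, n) can take many values
a = [random.randrange(1, 10**6) * p + random.randrange(p // (2 * k))
     for _ in range(k)]

def clump_fraction(n, g):
    """Fraction of residues a_i mod n lying in some interval
    [c*g, c*g + p/2k), c an integer."""
    return sum((ai % n) % g < p // (2 * k) for ai in a) / len(a)

n_good = 4 * 2310            # shares the factor 2310 = p/13 > p/2k with p
assert gcd(p, n_good) == 2310
assert clump_fraction(n_good, 2310) == 1.0   # every residue clumps

n_bad = n_good + 1           # coprime to p: no such structure
assert gcd(p, n_bad) == 1
assert clump_fraction(n_bad, 2310) < 1.0     # residues spread out
```

This only verifies the structural claim in point 2; turning a clumping score like this into the period-revealing f(n) is exactly the open part.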

So let's posit a hypothetical function f(n) that will give us a good probabilistic estimate of whether the residues of the a_i (mod n) really do "clump together" in the intervals of size p/2k. This should be close to 0 when gcd(p,n) is low, larger when gcd(p,n) is greater than about p/k, and close to 1 when gcd(p,n) = p. Then f(n) is periodic (or "close to it") with period p. So if we can construct f as a quantum function, we may be able to use a QFT to estimate its period, which would recover the private key.

This is shaky on several levels; first, I don't know whether you can use a QFT to estimate the period of functions that are only "close to periodic." However, this could perhaps be rectified by defining a new function g: N -> {0, 1/2, 1}, where g(n) is whichever of those three values f(n) is nearest to. Then g(n) might be genuinely periodic with positive probability.

More crucially: Can we actually devise a function f with the properties outlined above? I would venture to say "yes," since humans can (with high probability) tell whether 300 numbers fall within just three separate, small intervals; and what humans can do, computers can usually do too. The problem is actually exhibiting a function which we can then prove to work.

Any thoughts on the matter would be appreciated.

Harrison