Saturday, March 29, 2008

How do you quantify how "wiggly" a curve is? I've come across a way that some mathematicians do it. Using a result from integral geometry, the "wiggliness" is defined by the number of points at which the curve and a given random straight line intersect. Apparently, if X is the random variable describing the number N ∈ {1, 2, 3, ...} of intersection points, then the expectation value of X can be shown to be E(X) = 2L/H. The quantity L is the length of the curve, and H is the perimeter of the curve's convex hull. This appears to be due to a theorem of Steinhaus.
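This relation is easy to test numerically. Below is a minimal Monte Carlo sketch, entirely my own construction (the test curve, the line-sampling scheme, and all names are illustrative): sample random lines uniformly in (phi, p) from x cos(phi) + y sin(phi) = p, count crossings with a discretized curve, and compare the mean crossing count, over lines that hit the curve, against 2L/H.

import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(0)

# A wiggly test curve, discretized as a polyline and centered at the origin
t = np.linspace(0.0, 2.0 * np.pi, 400)
pts = np.column_stack([t, np.sin(3.0 * t)])
pts -= pts.mean(axis=0)

# Curve length L and convex-hull perimeter H
# (for 2-D input, ConvexHull.area is the hull's perimeter)
L = np.linalg.norm(np.diff(pts, axis=0), axis=1).sum()
H = ConvexHull(pts).area

# Random lines x cos(phi) + y sin(phi) = p, uniform in (phi, p)
# over a strip wide enough to cover the whole curve
R = np.linalg.norm(pts, axis=1).max()
n_lines = 20000
phi = rng.uniform(0.0, np.pi, n_lines)
p = rng.uniform(-R, R, n_lines)

counts = np.zeros(n_lines)
for i in range(n_lines):
    n_vec = np.array([np.cos(phi[i]), np.sin(phi[i])])
    d = pts @ n_vec - p[i]                            # signed distances of vertices from the line
    counts[i] = np.count_nonzero(d[:-1] * d[1:] < 0)  # sign changes = crossings

hits = counts > 0          # lines that meet the curve (and hence its hull)
print("mean intersections:", counts[hits].mean())
print("2L/H:              ", 2.0 * L / H)

Note that for a connected curve, a line passing through the interior of the convex hull must also cross the curve, which is why conditioning on at least one crossing is the same as conditioning on hitting the hull.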

Furthermore, based on the above, the "temperature" of a curve can be derived to be the quantity T = 1/ln[λ/(λ-1)], as quoted here, where I've defined the parameter λ = 2L/H. I assume that this relation in MathWorld comes from calculating the Shannon entropy of the random variable X, in which case the natural logarithm is used because the entropy is calculated in nats (not bits). Balestrino et al. quote Mendès France's work on the entropy of a curve as:

S = log(λ) + β/[exp(β)-1]

where β = log[λ/(λ-1)] and now the entropy is measured in bits. Note that λ = 2L/H ≥ 1, with equality for a straight segment; the second term acts to force S to zero as λ approaches 1, so a straight segment carries no entropy.
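As far as I can tell, this expression is exactly the Shannon entropy of the geometric distribution on n = 1, 2, 3, ... with mean λ, which is the maximum-entropy distribution for the intersection count given E(X) = λ. A quick numerical check of that reading (my own sketch, treating every log as a natural log):

import numpy as np

lam = 3.7                                   # any lambda = 2L/H > 1
beta = np.log(lam / (lam - 1.0))

# closed form quoted above
S_closed = np.log(lam) + beta / np.expm1(beta)

# direct Shannon entropy of the geometric distribution on n = 1, 2, 3, ...
# with mean lambda: p_n = (1/lam) * ((lam - 1)/lam)**(n - 1)
n = np.arange(1, 500)
logp = -np.log(lam) + (n - 1) * np.log((lam - 1.0) / lam)
S_direct = -np.sum(np.exp(logp) * logp)

print(S_closed, S_direct)                   # agree to machine precision

The two numbers agree, which supports the maximum-entropy reading; dividing by ln 2 converts nats to bits.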

Now, I'm not a mathematician, so the definitions above for the temperature and entropy of a curve in terms of its "convex hull" are not especially appealing. This is because I don't have a good understanding of convex hulls, and besides, defining the entropy of a curve in terms of its convex hull seems almost tautological: the longer a curve is relative to the perimeter of its convex hull, the more "wiggly" it obviously is (and hence the larger its temperature and entropy).

OK, so my mind now wanders to a wonderful paper recently published by two physicists I know (one of whom has been a colleague and friend of mine for many years) that studies the "shape" of a randomly lying cord. In this model, Lemons and Lipscombe derive expressions for the mean and variance of the squared distance between the cord ends. For example, if the cord has a length L and a "stiffness" defined by a parameter 1/κ, then the mean squared distance between the cord ends is given by <r²> = 2L/κ when κL >> 1 (a flexible cord), or <r²> = L² when κL << 1 (a stiff cord). The relevant random variable here is r, the distance between the cord ends.
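Here is a sketch of where both limits come from, assuming the Gaussian tangent-angle model described below, under which the tangent correlations decay as <cos[θ(s) − θ(s')]> = e^(−κ|s−s'|); this is my reconstruction, not necessarily the route taken in the paper. Writing r as the integral of the unit tangent along the cord,

\langle r^2 \rangle = \int_0^L \int_0^L \langle \cos[\theta(s) - \theta(s')] \rangle \, ds \, ds' = \int_0^L \int_0^L e^{-\kappa|s-s'|} \, ds \, ds' = \frac{2}{\kappa^2} \left( \kappa L - 1 + e^{-\kappa L} \right)

For κL >> 1 the right-hand side tends to 2L/κ, while expanding the exponential to second order for κL << 1 gives L², reproducing both limits.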

This is a very nice result, and is consistent with one's intuition. The Lemons-Lipscombe paper also draws an analogy between this problem and diffusion (Wiener processes), and with long-chain polymers.

This seems to suggest that the Lemons-Lipscombe random cord is parametrized by 2L/κ, in contrast to the parameter λ = 2L/H coming out of the work of Steinhaus and Mendès France. Is there a subtle connection between the flexibility κ of a cord and the convex-hull perimeter H of a curve?

Let's investigate further. Every point on the randomly lying cord can be characterized by its distance l from one end of the cord (an arbitrarily chosen end, at which l = 0), and an angle θ between the local tangent to the cord and the x-axis. The angle θ is then assumed to be a normal random variable with mean equal to the initial orientation angle θ0 and variance equal to 2κl:

θ(l) = N(θ0, 2κl)
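This model is straightforward to simulate: read θ(l) as a Wiener process in arc length (which is how the marginal N(θ0, 2κl) arises), give θ independent Gaussian increments of variance 2κ dl, and integrate the unit tangent. The sketch below is my own, with illustrative parameter choices, and checks <r²> against both limits quoted above:

import numpy as np

rng = np.random.default_rng(1)

def mean_sq_end_to_end(L, kappa, n_cords=5000, n_steps=400):
    # theta performs a Brownian motion in arc length: increments N(0, 2*kappa*dl)
    dl = L / n_steps
    dtheta = rng.normal(0.0, np.sqrt(2.0 * kappa * dl), (n_cords, n_steps))
    theta = np.cumsum(dtheta, axis=1)        # theta0 = 0; r^2 does not depend on it
    # end-to-end displacement = integral of the unit tangent along the cord
    x = dl * np.cos(theta).sum(axis=1)
    y = dl * np.sin(theta).sum(axis=1)
    return np.mean(x**2 + y**2)

L = 1.0
for kappa in (100.0, 0.01):                  # kappa*L >> 1, then kappa*L << 1
    r2 = mean_sq_end_to_end(L, kappa)
    print(f"kappa*L = {kappa * L:g}: <r^2> = {r2:.3g} "
          f"(2L/kappa = {2 * L / kappa:g}, L^2 = {L**2:g})")

The flexible case lands on 2L/κ and the stiff case on L², as advertised.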

All well and good. This method gives results in good agreement with experiment. So now here's what I'm thinking: can one define an entropy of a randomly lying cord in the same way the mathematicians define the entropy of an arbitrary curve? Well, the Shannon entropy S of a random variable X is well defined, and if X is normally distributed (like θ above) then one can easily calculate its (differential) entropy, in nats, to be:

S(θ) = (1/2) ln(4πe κl)
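This is just the standard Gaussian differential entropy (1/2) ln(2πeσ²) with σ² = 2κl substituted; a short check against scipy (my own sketch, with arbitrary parameter values):

import numpy as np
from scipy.stats import norm

kappa, l = 0.5, 3.0
sigma = np.sqrt(2.0 * kappa * l)                 # theta(l) = N(theta0, 2*kappa*l)

S_closed = 0.5 * np.log(4.0 * np.pi * np.e * kappa * l)
S_scipy = norm(loc=0.0, scale=sigma).entropy()   # differential entropy in nats

print(S_closed, float(S_scipy))                  # identical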

But how does this entropy relate to the entropy of a curve given above in terms of the parameter λ?

Update

I received a comment from the principal author of the Lemons-Lipscombe paper (Prof. Don Lemons), and I quote it here:

I am quite pleased that my friend Bill finds the Lemons and Lipscombe paper worthy. As one of its authors I can say that it is just about my favorite paper among the approximately 50 papers I’ve written. I do have a few responses to Bill’s suggestion that the Lemons and Lipscombe paper provides a metric for the entropy of a randomly lying cord. Of course, there is an entropy, a Shannon entropy, or a missing information (terms I use synonymously) associated with any well-defined random variable. Bill’s suggestion that we use the net random displacement θ(l) = N(θ0, 2κl) where l is the cord length and κ is the cord flexibility to determine the entropy of the randomly lying rope is a good one. Its associated entropy is [1 + ln 4π κl ]/2 and is associated with the “curviness” of the random cord.

However, I would be hard-pressed to say that this is the only or the best entropy of a randomly lying rope. There may be other entropies that quantify other aspects of its randomness. Some of these can be found in the Lemons and Lipscombe paper – I am thinking of the entropy associated with the random variable r2 – the squared distance between the ends of the rope. Others might be generated in other ways, say, from the convex hull.

If anyone has any thoughts about how to distinguish the entropy of a randomly lying rope from simply an entropy, I'd like to hear them.
