Sunday, April 20, 2008
With the help of a friend, I recently determined that my Erdős number is five, which was pretty surprising to me. For those who might not know, the Erdős number is a way of describing the collaborative distance (with regard to scientific papers) between a given author (i.e., me) and Paul Erdős, the famous and prolific Hungarian Jewish mathematician. Here are the explicit details of the "collaborative distance measure" between Erdős and me.
Sunday, March 30, 2008
Baseball season is almost underway, and "the boys are back in town" from their spring training in Arizona. Here is a picture of the Chicago Cubs working out on a cold, gray day at Wrigley Field in Chicago. They play the Milwaukee Brewers tomorrow (weather permitting). It would be a fine year for the Chicago National League Ballclub to win the World Series. As Jack Brickhouse, the TV announcer on WGN whom I loved in my childhood, used to say: "Anyone can have a bad century."
Labels: Baseball, Chicago Cubs, Wrigley Field
Saturday, March 29, 2008
How do you quantify how "wiggly" a curve is? I've come across a way that some mathematicians do it. Using methods from integral geometry (geometric probability), the "wiggliness" is defined by the number of points at which the curve and a given random straight line intersect. Apparently, if X is the random variable describing the number N ∈ {1, 2, 3, ...} of intersection points, then the expectation value of X can be shown to be E(X) = 2L/H. The quantity L is the length of the curve, and H is the perimeter of the curve's convex hull. This appears to be due to a theorem of Steinhaus.
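To convince myself of the Steinhaus result, here is a minimal Monte Carlo sketch of my own (not from any of the papers cited here). It samples random lines in the standard (p, φ) parametrization of integral geometry and counts their crossings with a semicircular arc, for which L = πr and the convex-hull perimeter is H = (π + 2)r, so 2L/H = 2π/(π + 2) ≈ 1.222:

```python
import numpy as np

rng = np.random.default_rng(0)

# Semicircular arc of radius r: curve length L = pi*r; its convex hull
# is the half-disk, with perimeter H = pi*r + 2r (arc plus diameter).
r = 1.0
t = np.linspace(0.0, np.pi, 1001)            # polyline approximation
x, y = r * np.cos(t), r * np.sin(t)
L, H = np.pi * r, (np.pi + 2.0) * r

# Random lines x*cos(phi) + y*sin(phi) = p, sampled uniformly in
# (p, phi) over a disk of radius R containing the curve; this is the
# rigid-motion-invariant measure on lines used in integral geometry.
R = 1.5 * r
counts = []
for _ in range(40_000):
    phi = rng.uniform(0.0, np.pi)
    p = rng.uniform(-R, R)
    d = x * np.cos(phi) + y * np.sin(phi) - p    # signed distances
    # All hull vertices lie on the arc, so the line meets the hull
    # exactly when the arc's points straddle it.
    if d.min() < 0.0 < d.max():
        counts.append(np.count_nonzero(d[:-1] * d[1:] < 0.0))

print("Monte Carlo E(X | line meets hull):", np.mean(counts))
print("Steinhaus 2L/H:                    ", 2.0 * L / H)
```

Both numbers should come out near 1.222.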
Furthermore, based on the above, the "temperature" of a curve can be derived to be the quantity T = 1/ln[λ/(λ−1)], as quoted here. I've defined the parameter λ = 2L/H. I assume that this relation in MathWorld comes from calculating the Shannon entropy of the random variable X, in which case the natural logarithm is used because the entropy is calculated in nats (not bits). Balestrino et al. quote Mendès France's work on the entropy of a curve as:
S = log(λ) + β/[exp(β)-1]
where β = log[λ/(λ−1)] and now the entropy is measured in bits. The second term acts to force S to zero as λ approaches 1, the value taken by a straight segment (for which the hull's perimeter is H = 2L).
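To get a feel for these formulas, here is a tiny sketch of mine (working in nats with natural logarithms throughout, so that T = 1/ln[λ/(λ−1)] as above) that evaluates T and S for a few values of λ:

```python
import numpy as np

def curve_temperature(lam):
    """Temperature T = 1/ln[lam/(lam - 1)] of a curve with lam = 2L/H."""
    return 1.0 / np.log(lam / (lam - 1.0))

def curve_entropy(lam):
    """Entropy S = ln(lam) + beta/(exp(beta) - 1), with
    beta = ln[lam/(lam - 1)]; in nats (divide by ln 2 for bits)."""
    beta = np.log(lam / (lam - 1.0))
    return np.log(lam) + beta / np.expm1(beta)

# lam = 1 is a straight segment (S -> 0, T -> 0); larger lam is "hotter".
for lam in [1.0001, 1.222, 2.0, 10.0]:
    print(f"lambda = {lam:7.4f}   T = {curve_temperature(lam):7.4f}"
          f"   S = {curve_entropy(lam):7.4f}")
```

The λ = 1.222 row corresponds to the semicircle example above.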
Now, I'm not a mathematician, so the definitions above for the temperature and entropy of a curve in terms of its "convex hull" are not especially appealing to me. This is partly because I don't have a good feel for convex hulls, and partly because defining the entropy of a curve this way seems almost circular: the longer a curve is relative to the perimeter of its convex hull, the more "wiggly" it obviously is (and hence the larger its temperature and entropy).
OK, so my mind now wanders to a wonderful paper recently published by two physicists I know (one of whom has been a colleague and friend of mine for many years) that studies the "shape" of a randomly lying cord. In this model, Lemons and Lipscombe derive expressions for the mean and variance of the squared distance between the cord's ends. For example, if the cord has a length L and a "stiffness" given by the parameter 1/κ, then the mean squared distance between the cord ends is ⟨r²⟩ = 2L/κ when κL >> 1 (a flexible cord), or ⟨r²⟩ = L² when κL << 1 (a stiff cord). The relevant random variable here is r, the distance between the cord ends.
This is a very nice result, and it is consistent with one's intuition. The Lemons-Lipscombe paper also draws analogies between this problem and diffusion (Wiener processes), and with long-chain polymers.
The above seems to suggest that the Lemons-Lipscombe random cord is parametrized by 2L/κ, in contrast to the parameter λ = 2L/H coming out of the work of Steinhaus and Mendès France. Is there a subtle connection between the flexibility κ of a cord and the convex-hull perimeter H of a curve?
Let's investigate further. Every point on the randomly lying cord can be characterized by its distance l from one end of the cord (an arbitrary end, for which l = 0) and by the angle θ between the local tangent to the cord and the x-axis. The angle θ is then assumed to be a normal random variable with mean equal to the initial orientation angle θ0 and variance equal to 2κl:
θ(l) ~ N(θ0, 2κl)
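Here is a quick numerical check of this model (my own sketch, not code from the Lemons-Lipscombe paper): build θ(l) as a Gaussian random walk whose increments have variance 2κ dl, integrate the unit tangent to get the cord's shape, and compare the Monte Carlo ⟨r²⟩ against the two limits quoted above:

```python
import numpy as np

rng = np.random.default_rng(1)

def mean_square_end_to_end(L, kappa, n_steps=2000, n_cords=2000):
    """Monte Carlo estimate of <r^2> for cords with theta(l) ~ N(0, 2*kappa*l)."""
    dl = L / n_steps
    # Independent angle increments N(0, 2*kappa*dl) accumulate so that
    # theta at arc length l has variance 2*kappa*l (theta0 = 0, WLOG).
    dtheta = rng.normal(0.0, np.sqrt(2.0 * kappa * dl), (n_cords, n_steps))
    theta = np.cumsum(dtheta, axis=1)
    x = dl * np.sum(np.cos(theta), axis=1)   # end-to-end displacement
    y = dl * np.sum(np.sin(theta), axis=1)
    return np.mean(x**2 + y**2)

L = 1.0
for kappa in [0.01, 100.0]:                  # stiff vs flexible cord
    r2 = mean_square_end_to_end(L, kappa)
    print(f"kappa*L = {kappa * L:7.2f}   <r^2> = {r2:8.5f}"
          f"   2L/kappa = {2 * L / kappa:8.5f}   L^2 = {L**2:8.5f}")
```

Up to modest discretization error, the κL >> 1 run should land close to 2L/κ and the κL << 1 run close to L², matching the limits above.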
All well and good; this model gives results that agree well with experiment. So now here's what I'm thinking: can one define an entropy of a randomly lying cord in the same way the mathematicians define the entropy of an arbitrary curve? Well, the Shannon (differential) entropy S of a random variable X is well known, and if X is normally distributed (like θ above), then one can easily calculate the entropy to be (in nats):
S(θ) = (1/2) ln(4πeκl)
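(This is just the standard differential entropy of a Gaussian evaluated at the cord model's variance; spelled out:)

```latex
S(\theta) \;=\; \tfrac{1}{2}\,\ln\!\bigl(2\pi e\,\sigma^{2}\bigr),
\qquad
\sigma^{2} = 2\kappa l
\;\;\Longrightarrow\;\;
S(\theta) \;=\; \tfrac{1}{2}\,\ln\!\bigl(4\pi e\,\kappa l\bigr).
```

Expanding the logarithm gives [1 + ln(4πκl)]/2, the form quoted in the update below.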
But how does this entropy relate to the entropy of a curve given above in terms of the parameter λ?
Update
I received a comment from the principal author of the Lemons-Lipscombe paper (Prof. Don Lemons), and I quote it here:
I am quite pleased that my friend Bill finds the Lemons and Lipscombe paper worthy. As one of its authors I can say that it is just about my favorite paper among the approximately 50 papers I’ve written. I do have a few responses to Bill’s suggestion that the Lemons and Lipscombe paper provides a metric for the entropy of a randomly lying cord. Of course, there is an entropy, a Shannon entropy, or a missing information (terms I use synonymously) associated with any well-defined random variable. Bill’s suggestion that we use the net random displacement θ(l) = N(θ0, 2κl), where l is the cord length and κ is the cord flexibility, to determine the entropy of the randomly lying rope is a good one. Its associated entropy is [1 + ln(4πκl)]/2 and is associated with the “curviness” of the random cord.
However, I would be hard-pressed to say that this is the only or the best entropy of a randomly lying rope. There may be other entropies that quantify other aspects of its randomness. Some of these can be found in the Lemons and Lipscombe paper – I am thinking of the entropy associated with the random variable r² – the squared distance between the ends of the rope. Others might be generated in other ways, say, from the convex hull.
If anyone has any thoughts about how to distinguish the entropy of a randomly lying rope from simply an entropy, I’d like to hear them.
Tuesday, March 18, 2008
Recently I've become entranced with Maximum Length Sequences (MLS). They first came to my attention when I was involved with some acoustic signal processing, where they are used to measure ocean (or room) reverberations. One interesting property, and useful application, of an MLS is that its circular autocorrelation is very nearly a unit impulse (a Kronecker delta). This feature allows one to extract the impulse response of a system by cross-correlating its measured output with the MLS input.
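As a concrete illustration (a sketch of my own, not production code), here is a tiny linear-feedback shift register generating one period of an MLS, together with its circular autocorrelation; after mapping bits {0, 1} to {+1, −1}, the autocorrelation is N at zero lag and −1 at every other lag:

```python
import numpy as np

def mls(n, taps, seed):
    """One period (2**n - 1 bits) of a maximum-length sequence from the
    linear recurrence a[k+n] = XOR of a[k+t] for t in taps (a primitive
    characteristic polynomial is assumed)."""
    a = list(seed)
    for k in range(2**n - 1 - n):
        bit = 0
        for t in taps:
            bit ^= a[k + t]
        a.append(bit)
    return np.array(a)

# x^3 + x^2 + 1 is primitive and reproduces the sequence 0011101 below.
seq = mls(3, taps=(0, 2), seed=(0, 0, 1))
print("sequence:", "".join(map(str, seq)))          # -> 0011101

b = 1 - 2 * seq                                     # {0,1} -> {+1,-1}
acf = [int(np.dot(b, np.roll(b, k))) for k in range(len(b))]
print("circular autocorrelation:", acf)             # -> [7, -1, -1, ...]
```

Strictly speaking the impulse is approximate because of those −1 sidelobes, but for long sequences they are negligible next to the N at zero lag.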
There are other "magical" properties. For example, consider the sequence I0 = 0011101. As Laxton and Anderson pointed out, six additional sequences (e.g., I1 = 0111010) can be obtained by cyclically shifting I0 to the left. Moreover, if we add (modulo 2) any two of these seven unique sequences, we obtain another sequence in this MLS set. This is a very rare property for a general n-bit sequence, and it is related to the fact that the zeros and ones in such a sequence must be distributed in a very special way. For example, the number of ones in the sequence must be one greater than the number of zeros.
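This closure is easy to verify by brute force; continuing the sketch above:

```python
# Shift-and-add property: the XOR (mod-2 sum) of any two distinct
# cyclic shifts of an MLS is itself a cyclic shift of the same MLS.
shifts = {tuple(np.roll(seq, -k)) for k in range(len(seq))}
for u in shifts:
    for v in shifts:
        if u != v:
            assert tuple(np.bitwise_xor(u, v)) in shifts
print(f"closure verified over all {len(shifts)} cyclic shifts")
```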
The most interesting property of an MLS to me is that although they are deterministic, they can serve as approximate pseudo-random numbers. This immediately makes me think of using them in Monte Carlo simulations. I haven't yet seen any published work using them for a practical problem, although a lot of work on their usefulness as pseudo-random numbers has been done by Compagner in Delft (what a beautiful city and university, by the way). Besides wanting to play with them in a real way by using them in Monte Carlo simulations, there is another interesting property: each MLS can be associated with a so-called companion polynomial. This brings up another wacky idea: can this somehow be used in Gaussian quadrature?
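Just to make the Monte Carlo idea concrete, here is a toy experiment of my own devising (emphatically a sketch, not a claim that an MLS is a good general-purpose generator): take one period of a long MLS built from the primitive trinomial x^15 + x + 1, pack non-overlapping 15-bit words into uniforms on [0, 1), and estimate π with the quarter-circle test:

```python
# Reuses mls() from the sketch above.  x^15 + x + 1 gives the recurrence
# a[k+15] = a[k+1] XOR a[k], with period 2**15 - 1 = 32767.
bits = mls(15, taps=(0, 1), seed=(1,) + (0,) * 14)

# Pack non-overlapping 15-bit words into uniforms on [0, 1).
words = bits[: (len(bits) // 15) * 15].reshape(-1, 15)
u = words.dot(1 << np.arange(15)) / float(1 << 15)

xs, ys = u[0::2], u[1::2]                 # ~1000 (x, y) pairs
m = min(len(xs), len(ys))
inside = np.count_nonzero(xs[:m]**2 + ys[:m]**2 < 1.0)
print("pi estimate from MLS bits:", 4.0 * inside / m)
```

With only about a thousand pairs the estimate is crude, but it shows the mechanics; whether MLS-derived streams have good enough equidistribution for serious Monte Carlo is exactly the question Compagner's work addresses.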
Sunday, March 09, 2008
I just wanted to publicly express my grief and sadness to my friend Prof. Michael Gedalin, whose mother was killed by a suicide bomber in southern Israel a few weeks ago. Michael's father was also injured in the attack, and is still in critical condition. I also want to express my condolences to the families of the many victims of the terrorist attack at the Merkaz ha'Rav seminary in Jerusalem that happened a couple of days ago.
Sunday, November 12, 2006
Last night I dreamt I saw my friend and colleague, the late Michael E. Jones, in some strange research facility I was visiting. I'm not sure where this research facility was, but it seemed to have a lot of physicists, and it looked like some kind of mixture of the Googleplex and Los Alamos. Anyway, it was sure sweet being able to see Mike again, and talking a little physics with him. I miss that guy. He was a great physicist and a great human being. It was a horrible tragedy how he died.
Sunday, January 09, 2005
I just ran across a cool quote from Chief Tecumseh of the Shawnee Nation:
So live your life that the fear of death can never enter your heart. Trouble no one about their religion; respect others in their view, and demand that they respect yours. Love your life, perfect your life, beautify all things in your life. Seek to make your life long and its purpose in the service of your people. Prepare a noble death song for the day when you go over the great divide. Always give a word or a sign of salute when meeting or passing a friend, even a stranger, when in a lonely place. Show respect to all people and grovel to none. When you arise in the morning give thanks for the food and for the joy of living. If you see no reason for giving thanks, the fault lies only in yourself. Abuse no one and no thing, for abuse turns the wise ones to fools and robs the spirit of its vision. When it comes your time to die, be not like those whose hearts are filled with the fear of death, so that when their time comes they weep and pray for a little more time to live their lives over again in a different way. Sing your death song and die like a hero going home.
For those of you who might remember, doesn't this almost remind you of the Desiderata -- that paragraph that everyone in the 1970s hung on their wall? Or the kinds of ethical sayings one would read in religious tracts like Ecclesiastes (Kohelet), or Sayings of the Fathers (Pirkei Avot)?