Robert Oldershaw
5 min read · Jun 13, 2017


Thanks for your comments, Tobias. Here is the beginning of a work in progress that you might enjoy.

NATURE’S STARTLING CLUE — CONFORMAL SYMMETRY

Key Words: Benford’s Law, Scale Invariance, Linear Recursion, Self-Similarity, Conformal Symmetry, Relative Scale

Abstract: Given the near universality of the Benford/Newcomb Law of First Digits, and the ubiquity of fractal self-similarity observed throughout nature from the smallest to largest scales, it is argued that nature’s most fundamental geometry may be conformal geometry, which is universally present in full, partial, and discrete/broken forms. As a consequence, the assumption of absolute scale may apply only in restricted contexts, whereas relative scale may be the more dominant principle for the cosmos as a whole.

I. Simon Newcomb

In the 1880s Simon Newcomb, a noted astronomer and mathematician, discovered something very odd. In the days before electronic calculators, scientists doing complex calculations relied on books of logarithm tables. Newcomb noticed that the first couple of pages (covering numbers beginning with 1 and 2) were quite dog-eared, while the last pages (covering numbers beginning with 8 and 9) were far less worn. This was a great mystery to Newcomb, since he, and apparently everyone before him, had assumed that the first digits of any sizeable collection of data would fall on the numbers 1 through 9 with equal probability (i.e., about 11.1% for each first digit). If that were true, then the logarithm table pages should be equally dog-eared. Clearly they were not! How could one explain such a bizarre phenomenon?

After considerable thought, and perhaps some checking of various data tables, Newcomb came up with the following hypothesis: the numbers were evenly spaced if one plotted them on a logarithmic scale, not on the usual arithmetic scale. On a log scale the x-axis interval devoted to numbers beginning with 1 is about six times wider than the interval devoted to numbers beginning with 9. This would explain the mystery of the dog-eared logarithm tables, and Newcomb published a short paper in the American Journal of Mathematics1 (where he was an editor) in which he concluded: “The law of probability of the occurrence of numbers is such that all mantissae [fractional parts] of their logarithms are equally probable.” This was a heuristic probability law. Hill notes2 that Newcomb “supplied neither a precise domain or meaning to this probability, a formal argument, nor numerical data.” Mathematically,

Prob (1st signif. digit = d) = log10 (1 + 1/d), d = 1, 2, 3, …, 9.

Numerically, this probability sequence turns out to be: 30.1%, 17.6%, 12.5%, 9.7%, 7.9%, 6.7%, 5.8%, 5.1% and 4.6%. In graphic form, it looks like this3:
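These percentages can be reproduced directly from the formula; here is a minimal Python sketch (the variable names are illustrative, not from any source):

```python
import math

# Benford/Newcomb first-digit probabilities: Prob(d) = log10(1 + 1/d)
benford = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

for d, p in benford.items():
    print(f"{d}: {p:.1%}")

# The nine probabilities sum to 1, because the factors
# (1 + 1/d) = (d + 1)/d telescope to 10/1.
assert abs(sum(benford.values()) - 1.0) < 1e-12
```

Note that the nine probabilities sum to exactly 1, so the law is a genuine probability distribution over the first digits.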

Unfortunately, no one knew what to make of Newcomb’s strange law and it was for the most part forgotten — until the 1930s.

II. Frank Benford

Frank Benford worked at General Electric as a physicist in the 1930s, and he apparently independently discovered the same oddity in logarithm tables that Newcomb had found. He was fascinated by this discovery and devoted years to researching it empirically. He examined roughly 20 data tables, comprising about 20,000 entries, drawn from many different sets of data covering a very diverse selection of physical phenomena. Testing the data for the logarithmic first-digit distribution, he rediscovered what Newcomb had intuited. In 1938 he published his results4, and this time many people were impressed with Benford’s efforts and curiously baffled by the fact that the Benford/Newcomb Law of First Digits is common in nature, mathematics and social phenomena. Benford’s somewhat philosophical interpretation of the First Digit Law was that Man counts arithmetically (1, 2, 3, 4, …) but Nature counts geometrically (e⁰, eˣ, e²ˣ, e³ˣ, …) “and builds and functions accordingly”5. However, this seems to beg the question of why nature should do this, i.e., what is the causal explanation for the heuristic law? How do logarithmic distributions come to be ubiquitous in the physical, biological, mathematical and social realms?

Since Benford’s paper was written, many examples of data sets that obey the First Digit Law have been identified. A partial list includes the following.

surface areas of rivers
molecular weights
sizes of stored computer files
atomic element/isotope masses
E1 atomic transition lines in plasmas
pulsar physical properties
universal physical constants
populations of 3,000 counties
surface areas of countries
full widths (lifetimes) of mesons and baryons
Dow Jones numbers
Fibonacci sequence
half-lives of radioactive nuclei
internet connections
exoplanet masses, radii, volumes, orbital periods, …
distances to galaxies
distances to stars in our galaxy
death rates
blackbody radiation
prime numbers
river lengths
sizes of bank accounts

Clearly there must be a scientific explanation for this remarkable ubiquity of log-scale distributions, and there have been many attempts to provide one. As yet, however, no single explanation has garnered wide acceptance. The hunt for the meaning of this nearly universal phenomenon is still on.

III. Toward An Understanding Of The Benford/Newcomb Law of First Digits

Since Benford’s paper was published there have been many attempts to understand the Benford/Newcomb Law, ranging from philosophical claims of a universal harmony to skeptical proposals that it is an artifact of human uses of numbers, base systems, the floating decimal point, etc. Benford argued that the first digit law might be explained if the data tables were comprised of data garnered from geometric sequences, and Prof. Roger Pinkham of Rutgers University argued that some form of relativity theory was probably involved5. There is now quite a large body of published work on the Benford/Newcomb Law, but the research papers of the mathematicians Raimi5,6 and Hill2,7,8 offer the clearest and most convincing discussions of this complicated subject. Several key points stand out.

1. Geometric sequences (2, 4, 8, 16, 32, 64, 128, …) will tend to conform to the logarithm law if they are continued out long enough.
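This tendency is easy to verify empirically. The sketch below (a plain-Python illustration; the sequence length is an arbitrary choice) tallies the first digits of the sequence 2, 4, 8, 16, … and compares them with the law:

```python
import math

def first_digit(n: int) -> int:
    """Leading decimal digit of a positive integer."""
    return int(str(n)[0])

# Tally the first digits of the geometric sequence 2, 4, 8, 16, 32, ...
N = 5_000
counts = [0] * 10
x = 1
for _ in range(N):
    x *= 2
    counts[first_digit(x)] += 1

for d in range(1, 10):
    print(f"{d}: observed {counts[d] / N:.3f}   "
          f"predicted {math.log10(1 + 1 / d):.3f}")
```

After a few thousand terms the observed frequencies lie very close to the predicted ones, with a leading 1 appearing roughly 30% of the time.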

2. Scale invariance, wherein physical objects/systems, laws, and processes are largely unaffected when lengths, energies and other variables are multiplied by constant factors, also tends to produce conformance to the logarithmic law. This also means that if a data table obeys the Benford/Newcomb Law in one set of units, it will obey the logarithmic law for any arbitrary choice of units.
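A quick numerical illustration of this unit independence (a sketch; the sample construction and the conversion factor 3.7 are arbitrary choices for demonstration, not from the text): numbers whose base-10 logarithms have uniformly distributed fractional parts follow the law exactly, and rescaling them all by a constant leaves the first-digit frequencies essentially unchanged.

```python
import math
import random

def first_digit(x: float) -> int:
    """First significant digit of a positive number."""
    while x >= 10:
        x /= 10
    while x < 1:
        x *= 10
    return int(x)

random.seed(42)
# Samples of the form 10**u with u uniform on [0, 6) satisfy the
# Benford/Newcomb Law exactly: their log-mantissas are uniform.
data = [10 ** random.uniform(0, 6) for _ in range(100_000)]

def digit_freqs(xs):
    counts = [0] * 10
    for x in xs:
        counts[first_digit(x)] += 1
    return [c / len(xs) for c in counts]

# A change of units (multiplying every value by an arbitrary
# factor, here 3.7) leaves the first-digit frequencies intact.
before = digit_freqs(data)
after = digit_freqs([3.7 * x for x in data])
for d in range(1, 10):
    print(f"{d}: {before[d]:.3f} -> {after[d]:.3f}")
```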

3. Linear recursion, wherein the same operation is applied in a series of iterations, also tends to conform to the logarithm law.
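The Fibonacci numbers, produced by the linear recursion F(n) = F(n-1) + F(n-2), are a standard illustration; a small sketch:

```python
import math

# First digits of the Fibonacci numbers, generated by the linear
# recursion F(n) = F(n-1) + F(n-2), versus the Benford prediction.
N = 5_000
counts = [0] * 10
a, b = 1, 1
for _ in range(N):
    counts[int(str(a)[0])] += 1
    a, b = b, a + b

for d in range(1, 10):
    print(f"{d}: observed {counts[d] / N:.3f}   "
          f"predicted {math.log10(1 + 1 / d):.3f}")
```

Since F(n) grows roughly geometrically (with ratio φ ≈ 1.618), this case is closely related to point 1.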

4. The Benford/Newcomb Law is virtually base invariant (i.e., base 10, base 2, base e, …).
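In base b the corresponding statement is Prob(d) = log_b(1 + 1/d) for d = 1, …, b-1. The sketch below checks this for the sequence 3, 9, 27, … written in base 7 (the choices of base 7 and powers of 3 are arbitrary illustrations); it uses logarithms rather than huge integers to read off the leading digit.

```python
import math

# Leading base-7 digit of 3**k: since 3**k = 7**(k * log_7(3)),
# the leading digit is int(7**frac), where frac = (k * log_7(3)) % 1.
base = 7
log_base_3 = math.log(3, base)
N = 20_000
counts = [0] * base
for k in range(1, N + 1):
    frac = (k * log_base_3) % 1
    counts[int(base ** frac)] += 1

for d in range(1, base):
    print(f"{d}: observed {counts[d] / N:.3f}   "
          f"predicted {math.log(1 + 1 / d, base):.3f}")
```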

5. Hill8 has published “a formal rigorous proof that the log law is the only probability distribution which is scale invariant, and the only one which is base invariant (excluding the constant 1).”

So it might seem that we have some solid evidence for explaining why the Benford/Newcomb Law is so common in nature and human endeavors. However, there are some caveats that we must consider.

References:

1. Newcomb, S., American Journal of Mathematics, 4, 39–40, 1881.

2. Hill, T.P., Proceedings of the American Mathematical Society, 123(3), 887–895, 1995.

3. By Gknor — Own work, Public Domain, https://commons.wikimedia.org/w/index.php?curid=4509760

4. Benford, F., Proceedings of the American Philosophical Society, 78, 551–572, 1938.

5. Raimi, R. A., The American Mathematical Monthly, 83(7), 521–538, 1976.

6. Raimi, R. A., Scientific American, 221, 109–129, 1969.

7. Hill, T. P., The American Mathematical Monthly, 102, 322–327, 1995.

8. Hill, T. P., American Scientist, 86, 358–363, 1998.

