Phys Rev E Stat Nonlin Soft Matter Phys. 2010 Jun;81(6 Pt 2):066119. Epub 2010 Jun 25.

Statistical mechanics of letters in words.

Author information

Joseph Henry Laboratories of Physics, Princeton University, Princeton, New Jersey 08544, USA.

Abstract

We consider words as a network of interacting letters, and approximate the probability distribution of states taken on by this network. Despite the intuition that the rules of English spelling are highly combinatorial and arbitrary, we find that maximum entropy models consistent with pairwise correlations among letters provide a surprisingly good approximation to the full statistics of words, capturing ∼92% of the multi-information in four-letter words and even "discovering" words that were not represented in the data. These maximum entropy models incorporate letter interactions through a set of pairwise potentials and thus define an energy landscape on the space of possible words. Guided by the large letter redundancy, we seek a lower-dimensional encoding of the letter distribution and show that distinctions between local minima in the landscape account for ∼68% of the four-letter entropy. We suggest that these states provide an effective vocabulary which is matched to the frequency of word use and much smaller than the full lexicon.
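The construction described in the abstract can be sketched in a few lines: estimate pairwise letter statistics from a word list, define an energy over four-letter strings, and call a word a "local minimum" when no single-letter substitution lowers its energy. Everything below is illustrative: the toy word list is invented, and the energy uses a simple product-of-pairwise-marginals form as a stand-in for the fitted maximum entropy potentials, not the paper's actual inference procedure.

```python
from collections import Counter
from itertools import combinations
import math

# Toy corpus of four-letter words (hypothetical; the paper uses a full lexicon).
words = ["time", "tide", "tile", "mile", "mine", "mind", "wind", "wine",
         "dine", "dime", "line", "lime", "like", "bike", "bake", "lake"]
L = 4
ALPHABET = sorted({c for w in words for c in w})

# Pairwise letter statistics p_ij(a, b) for each ordered position pair i < j.
pair_counts = {pair: Counter() for pair in combinations(range(L), 2)}
for w in words:
    for (i, j) in combinations(range(L), 2):
        pair_counts[(i, j)][(w[i], w[j])] += 1

def energy(w, eps=1e-6):
    """Energy under a product-of-pairwise-marginals sketch:
    E(w) = -sum_{i<j} log p_ij(w_i, w_j); lower energy = more word-like.
    (A stand-in for the true maximum entropy pairwise potentials.)"""
    e = 0.0
    for (i, j), counts in pair_counts.items():
        total = sum(counts.values())
        p = counts[(w[i], w[j])] / total
        e -= math.log(p + eps)  # eps guards against log(0) for unseen pairs
    return e

def is_local_minimum(w):
    """True if no single-letter substitution lowers the energy of w."""
    e0 = energy(w)
    for i in range(L):
        for c in ALPHABET:
            if c != w[i] and energy(w[:i] + c + w[i + 1:]) < e0:
                return False
    return True

# Real words should sit lower in the landscape than scrambled strings,
# and some of them will be local minima (the "effective vocabulary").
minima = [w for w in words if is_local_minimum(w)]
```

With this toy energy, a real word like "time" scores lower than a scrambled version like "tmie", and the `minima` list collects the words from which no single-letter change improves the energy, mirroring the paper's identification of local minima with an effective vocabulary.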

PMID: 20866490
PMCID: PMC3648583
DOI: 10.1103/PhysRevE.81.066119
[Indexed for MEDLINE]