Readings – Statistical Physics and Information Theory – Books

Tsallis on Non-extensive Entropy

Introduction to Nonextensive Statistical Mechanics, by Constantino Tsallis.

Tsallis, Constantino (2009). Introduction to Nonextensive Statistical Mechanics: Approaching a Complex World, New York: Springer. (Google Books)
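
For orientation, the standard definition at the heart of the book (this summary is mine, not a quotation): for a probability distribution {p_i}, the Tsallis entropy is

S_q = k (1 − Σ_i p_i^q) / (q − 1),

which recovers the Boltzmann–Gibbs entropy S_BG = −k Σ_i p_i ln p_i in the limit q → 1. For two independent systems A and B it is nonadditive, S_q(A+B) = S_q(A) + S_q(B) + ((1 − q)/k) S_q(A) S_q(B), which is the sense in which the statistics is “nonextensive.”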

Goldenfeld on Phase Transitions and Renormalization Group Theory

Phase Transitions and the Renormalization Group by N. Goldenfeld.

Goldenfeld, N. (1992). Lectures On Phase Transitions And The Renormalization Group (Frontiers in Physics) (Addison-Wesley).

Covering the elementary aspects of the physics of phase transitions and the renormalization group, this popular book is widely used both for core graduate statistical mechanics courses and for more specialized courses. Emphasizing understanding and clarity rather than technical manipulation, these lectures de-mystify the subject and show precisely “how things work.” Goldenfeld keeps in mind a reader who wants to understand why things are done, what the results are, and what in principle can go wrong. The book reaches experimentalists and theorists, students and active researchers alike, and assumes only a prior knowledge of statistical mechanics at the introductory graduate level. Advanced, never-before-printed topics on the applications of the renormalization group far from equilibrium and to partial differential equations add to the uniqueness of this book.

van Enter et al. on Renormalization Group Theory and Non-Gibbsian Measures

van Enter, A.C.D., Fernández, R., & Sokal, A.D. (1993). Regularity Properties and Pathologies of Position-Space Renormalization-Group Transformations, NY: Cornell University Library.

Abstract:

We reconsider the conceptual foundations of the renormalization-group (RG) formalism, and prove some rigorous theorems on the regularity properties and possible pathologies of the RG map. Regarding regularity, we show that the RG map, defined on a suitable space of interactions (= formal Hamiltonians), is always single-valued and Lipschitz continuous on its domain of definition. This rules out a recently proposed scenario for the RG description of first-order phase transitions. On the pathological side, we make rigorous some arguments of Griffiths, Pearce and Israel, and prove in several cases that the renormalized measure is not a Gibbs measure for any reasonable interaction. This means that the RG map is ill-defined, and that the conventional RG description of first-order phase transitions is not universally valid. For decimation or Kadanoff transformations applied to the Ising model in dimension d≥3, these pathologies occur in a full neighborhood {β>β0,|h|<ϵ(β)} of the low-temperature part of the first-order phase-transition surface. For block-averaging transformations applied to the Ising model in dimension d≥2, the pathologies occur at low temperatures for arbitrary magnetic-field strength. Pathologies may also occur in the critical region for Ising models in dimension d≥4. We discuss in detail the distinction between Gibbsian and non-Gibbsian measures, and give a rather complete catalogue of the known examples. Finally, we discuss the heuristic and numerical evidence on RG pathologies in the light of our rigorous theorems.

As this book may not be available in most local libraries, here’s an article that seems to be on a related theme:
van Enter, A.C.D., Fernández, R., & Kotecký, R. (1994). Pathological Behavior of Renormalization-Group Maps at High Fields and Above the Transition Temperature. Preprint.

Abstract

We show that decimation transformations applied to high-q Potts models result in non-Gibbsian measures even for temperatures higher than the transition temperature. We also show that majority transformations applied to the Ising model in a very strong field at low temperatures produce non-Gibbsian measures. This shows that pathological behavior of renormalization-group transformations is even more widespread than previous examples already suggested.

This is one of their points, and I think it might be important:

Together these claims imply that by changing block spins arbitrarily far away, one changes the phase of the internal spins, which in turn changes the value of block-spin averages close to the origin. For instance it modifies the (average) value of the block spin at the origin and that of one of its nearest neighbors (when these spins are “unfixed”; this part of the argument is almost identical to the corresponding argument for block-averaging transformations; see Step 3 in [30, pp. 1008–1009]). This modification takes place despite the fact that the intermediate block spins are fixed in the configuration ω′_special. This means that the direct influence of far-away block spins does not decrease with distance, hence the renormalized measure cannot be Gibbsian.
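
A sketch of why this observation matters (my paraphrase of the standard criterion, not a quotation from the paper): a measure is Gibbsian for a reasonable (uniformly summable) interaction only if its finite-volume conditional probabilities are quasilocal, i.e. the dependence of μ(σ_Λ | σ_{Λᶜ}) on the exterior configuration σ_{Λᶜ} must become negligible when σ_{Λᶜ} is varied only outside a sufficiently large box around Λ. The quoted argument exhibits configurations for which changing renormalized spins arbitrarily far away still shifts block-spin expectations near the origin by a fixed amount, so this continuity fails and no renormalized interaction of the usual kind can reproduce the measure.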

Crucial refs from the above:
[12] R. B. Griffiths and P. A. Pearce. Position-space renormalization-group transformations: Some proofs and some problems. Phys. Rev. Lett., 41:917–920, 1978.
[13] R. B. Griffiths and P. A. Pearce. Mathematical properties of position-space renormalization-group transformations. J. Stat. Phys., 20:499–545, 1979.
[20] R. B. Israel. Banach algebras and Kadanoff transformations. In J. Fritz, J. L. Lebowitz, and D. Szász, editors, Random Fields (Esztergom, 1979), Vol. II, pages 593–608. North-Holland, Amsterdam, 1981.

See the definition of Gibbsian variables in: Topic 2925: Variables: Gibbsian and Non-Gibbsian

Look also at:
Külske, C., Le Ny, A., & Redig, F. (2004). Relative Entropy and Variational Properties of Generalized Gibbsian Measures, The Annals of Probability, 32 (2), 1691–1726. DOI: 10.1214/009117904000000342

Information Theory, Inference, and Learning Algorithms

MacKay, D.J.C. (2003). Information Theory, Inference, and Learning Algorithms, Cambridge: Cambridge University Press.

Merhav, N. (2009). Statistical Physics and Information Theory, Foundations and Trends® in Communications and Information Theory, 6 (1–2), 1–212. (Google Books result for the keywords “information theory free energy equilibrium”.)
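
As a rough pointer to the bridge these two references build between the fields (standard equilibrium statistical mechanics, stated here for orientation rather than quoted from either book): for a system with energy levels E_i at inverse temperature β, the Gibbs distribution p_i = e^(−βE_i)/Z, with Z = Σ_i e^(−βE_i), is exactly the distribution minimizing the free-energy functional F[p] = Σ_i p_i E_i − (1/β) H[p], where H[p] = −Σ_i p_i ln p_i is the Shannon entropy in nats; the minimum value is F = −(1/β) ln Z. This variational identity is the equilibrium link between free energy and information-theoretic entropy that the keyword search above points at.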