
# Phase Transitions

**Aliases**

*This identifies the pattern and should be representative of the concept it describes. The name should be a noun that can be used easily within a sentence, so that practitioners can reference the pattern naturally in conversation.*

**Intent**

*Describes the meaning of the pattern in a single concise sentence.*

**Motivation**

*This section describes why this pattern is needed in practice. Other pattern languages label this as the Problem. In our pattern language, we express it as one or more questions and then provide further explanation behind each question.*

**Sketch**

*This section provides alternative descriptions of the pattern in the form of an illustration or an alternative formal expression. By looking at the sketch, a reader may quickly grasp the essence of the pattern.*

**Discussion**

*This is the main section of the pattern and goes into greater detail to explain it. We leverage a vocabulary that we describe in the theory section of this book. We don't go into intense detail providing proofs but rather reference their sources. This section expounds on how the motivation is addressed. We also include additional questions that may be interesting topics for future research.*

**Known Uses**

*Here we review several projects or papers that have used this pattern.*

**Related Patterns**
*In this section we describe in a diagram how this pattern is conceptually related to other patterns. The relationships may be precise or fuzzy, so we provide further explanation of the nature of each relationship. We also describe other patterns that may not be conceptually related but work well in combination with this pattern.*

*Relationship to Canonical Patterns*

*Relationship to other Patterns*

**Further Reading**

*We provide here some additional external material that will help in exploring this pattern in more detail.*

**References**

*To aid the reader, we include the sources that are referenced in the text of the pattern.*

https://people.maths.ox.ac.uk/tanner/papers/SPARS_PTRIP_BlCaTa.pdf Phase Transitions for Restricted Isometry Properties

https://www.quantamagazine.org/20141015-at-the-far-ends-of-a-new-universal-law/ At the Far Ends of a New Universal Law

http://arxiv.org/abs/1511.02476v4 Statistical physics of inference: Thresholds and algorithms

A growing body of work has shown that we can often understand and locate these fundamental barriers by thinking of them as phase transitions in the sense of statistical physics. Moreover, it turns out that we can use the gained physical insight to develop promising new algorithms. The connection between inference and statistical physics is currently witnessing an impressive renaissance, and we review here the current state of the art, with a pedagogical focus on the Ising model, which, formulated as an inference problem, we call the planted spin glass.

(1) Under what conditions is the information contained in the measurements sufficient for a satisfactory inference to be possible? (2) What are the most efficient algorithms for this task?

When the amount of information is too low, successful inference of the signal is not possible for any algorithm: the corresponding information is simply insufficient. On the other hand, for a large enough amount of data, inference is possible, and the two regimes are separated by a sharp phase transition. Finally, and perhaps most importantly, there is often (as in first-order transitions in physics) an intermediate regime where successful inference is in principle possible but algorithmically hard. A first-order phase transition is always linked to the appearance of computational hardness, whereas a second-order phase transition is not.
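As a toy illustration of such a sharp threshold (not the paper's exact setting), consider sparse recovery: the probability that a greedy solver exactly recovers a k-sparse signal from m Gaussian measurements jumps abruptly as m crosses a critical value. The sketch below uses orthogonal matching pursuit; the problem sizes and measurement counts are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick k columns of A to explain y."""
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

n, k, trials = 100, 5, 20
rates = {}
for m in (10, 25, 50, 80):                  # number of measurements
    successes = 0
    for _ in range(trials):
        x = np.zeros(n)
        idx = rng.choice(n, k, replace=False)
        x[idx] = rng.uniform(1, 2, k) * rng.choice([-1.0, 1.0], k)
        A = rng.standard_normal((m, n)) / np.sqrt(m)
        successes += np.linalg.norm(omp(A, A @ x, k) - x) < 1e-6
    rates[m] = successes / trials
    print(f"m={m:3d}  exact-recovery rate {rates[m]:.2f}")
```

Sweeping m more finely (and plotting the rate against m/n for several sparsity levels k/n) traces out the recovery phase diagram studied in the RIP reference above.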

http://arxiv.org/pdf/1607.08601v1.pdf Limit theorems for eigenvectors of the normalized Laplacian for random graphs

We prove a central limit theorem for the components of the eigenvectors corresponding to the d largest eigenvalues of the normalized Laplacian matrix of a finite dimensional random dot product graph.
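A minimal numpy sketch of the objects in this paper (graph size, latent dimension, and latent-position distribution here are arbitrary assumptions): sample a random dot product graph, form the normalized matrix D^{-1/2} A D^{-1/2}, and take the eigenvectors of its d largest eigenvalues, whose rows embed the vertices near a rotation of their latent positions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Random dot product graph: P(edge i~j) = <x_i, x_j> for latent positions x_i
n, d = 300, 2
X = rng.uniform(0.2, 0.8, size=(n, d)) / np.sqrt(d)   # latent positions in R^d
P = X @ X.T                                           # entries stay inside (0, 1) here
A = np.triu((rng.random((n, n)) < P).astype(float), 1)
A = A + A.T                                           # symmetric adjacency, zero diagonal

deg = A.sum(axis=1)
L_sym = A / np.sqrt(np.outer(deg, deg))               # D^{-1/2} A D^{-1/2}
vals, vecs = np.linalg.eigh(L_sym)                    # eigenvalues in ascending order
embedding = vecs[:, -d:]                              # eigenvectors of the d largest eigenvalues
print(vals[-d:], embedding.shape)
```

The largest eigenvalue is exactly 1 (with eigenvector D^{1/2}1), and the paper's central limit theorem concerns the fluctuations of the entries of `embedding` around the population quantities determined by X.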

https://www.math.ucdavis.edu/~tracy/talks/SITE7.pdf The Distributions of Random Matrix Theory and their Applications

http://arxiv.org/pdf/1607.00816v1.pdf Time for dithering: fast and quantized random embeddings via the restricted isometry property

In this work, we prove that many linear maps known to respect the restricted isometry property (RIP) can induce a quantized random embedding with controllable multiplicative and additive distortions with respect to the pairwise distances of the data points being considered. In other words, linear matrices having fast matrix-vector multiplication algorithms (e.g., based on partial Fourier ensembles or on the adjacency matrix of unbalanced expanders) can be readily used in the definition of fast quantized embeddings with small distortions. This implication is made possible by applying, right after the linear map, an additive random dither that stabilizes the impact of the uniform scalar quantization applied afterwards.
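The construction is easy to sketch numerically. Below, a dense Gaussian matrix stands in for the fast RIP maps discussed in the paper (an assumption for brevity), and a uniform dither is added before scalar quantization; pairwise distances survive the quantization with only a small additive distortion:

```python
import numpy as np

rng = np.random.default_rng(2)

n, m, delta = 50, 400, 0.1
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random linear map, RIP-style scaling
u = rng.uniform(0, delta, size=m)              # random dither, one offset per coordinate

def embed(x):
    """Uniform scalar quantization (bin width delta) after the dithered linear map."""
    return delta * np.floor((A @ x + u) / delta)

x1, x2 = rng.standard_normal(n), rng.standard_normal(n)
d_linear = np.linalg.norm(A @ (x1 - x2))            # distance under the linear map alone
d_quant = np.linalg.norm(embed(x1) - embed(x2))     # distance after dithered quantization
# Each coordinate is perturbed by less than delta, and the dither makes the
# quantization errors behave like independent noise, so d_quant stays close
# to d_linear.
print(f"linear: {d_linear:.3f}  quantized: {d_quant:.3f}")
```

Shrinking `delta` trades a finer quantizer (more bits per coordinate) for a smaller additive distortion, which is the trade-off the paper quantifies.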

https://arxiv.org/abs/1702.08039 Criticality and Deep Learning, Part I: Theory vs. Empirics

Motivated by the idea that criticality and universality of phase transitions might play a crucial role in achieving and sustaining learning and intelligent behaviour in biological and artificial networks, we analyse a theoretical and a pragmatic experimental set up for critical phenomena in deep learning. On the theoretical side, we use results from statistical physics to carry out critical point calculations in feed-forward/fully connected networks, while on the experimental side we set out to find traces of criticality in deep neural networks. This is our first step in a series of upcoming investigations to map out the relationship between criticality and learning in deep networks.

https://arxiv.org/abs/1703.02435v1 Unsupervised learning of phase transitions: from principal component analysis to variational autoencoders

We find that the predicted latent parameters correspond to the known order parameters. The latent representation of the states of the models in question are clustered, which makes it possible to identify phases without prior knowledge of their existence or the underlying Hamiltonian. Furthermore, we find that the reconstruction loss function can be used as a universal identifier for phase transitions.

The averaged reconstruction loss also changes drastically at Tc during a phase transition. While the latent parameter is different for each physical model, the reconstruction loss can be used as a universal parameter to identify phase transitions. To summarize, without any knowledge of the Ising model and its order parameter, given only sample configurations, we can find a good estimate for its order parameter and detect the occurrence of a phase transition.
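A minimal sketch of the idea in plain numpy (a Metropolis sampler and SVD-based PCA stand in for the paper's pipeline; lattice size, temperatures, and sample counts are illustrative assumptions): draw 2D Ising configurations below and above Tc ≈ 2.27 and check that the first principal component essentially recovers the magnetization, the known order parameter:

```python
import numpy as np

rng = np.random.default_rng(3)
L = 16  # lattice side

def sweep(s, beta, n_sweeps):
    """Metropolis single-spin-flip updates on an L x L periodic lattice."""
    for _ in range(n_sweeps * L * L):
        i, j = rng.integers(L), rng.integers(L)
        nb = s[(i + 1) % L, j] + s[(i - 1) % L, j] + s[i, (j + 1) % L] + s[i, (j - 1) % L]
        dE = 2 * s[i, j] * nb
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            s[i, j] = -s[i, j]

def sample_configs(T, n):
    s = np.ones((L, L), dtype=int)   # ordered start; the hot phase disorders quickly
    sweep(s, 1.0 / T, 200)           # equilibrate
    out = []
    for _ in range(n):
        sweep(s, 1.0 / T, 5)         # decorrelate between samples
        out.append(s.astype(float).ravel())
    return np.array(out)

# Configurations from the ordered (T < Tc) and disordered (T > Tc) phases
X = np.vstack([sample_configs(T, 20) for T in (1.5, 3.5)])
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Xc @ Vt[0]               # first principal component scores
mag = X.mean(axis=1)           # per-sample magnetization (the order parameter)
corr = abs(np.corrcoef(pc1, mag)[0, 1])
print(f"|corr(PC1, magnetization)| = {corr:.3f}")
```

The leading principal direction is essentially the uniform spin mode, so `pc1` separates the two phases along the magnetization axis without being told what the order parameter is.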

https://arxiv.org/pdf/1612.01717v2.pdf Statistical mechanics of unsupervised feature learning in a restricted Boltzmann machine with binary synapses