# Ising Model and Simulation

Computational Physics, Department of Physics, University of Basic Science, (Fall 2008). The Ising model is a famous and widely applicable model in statistical mechanics. In this project I consider the Ising model and its computational simulation. I first review the main idea of the Ising model and of the Metropolis algorithm, the standard computational approach to this problem, and then report the results of my own simulation and compare them with the analytical solution.

## Introduction

Statistical mechanics is a very active area of research with many open problems; in particular, the general phenomenon of phase transitions is still actively studied. An example of these phenomena is the ferromagnetic transition at the Curie temperature: ferromagnetic materials, when heated, gradually lose their magnetic properties, and the loss becomes complete above the Curie temperature, named after the French physicist Pierre Curie, who discovered it in 1895 [1-2].
Paramagnetism is the ordinary induced magnetization of a material placed in a magnetic field; it can be explained by assuming that all spins are independent. Ferromagnetism is the spontaneous magnetization occurring in certain metals such as iron and nickel; to explain this phenomenon we must take the interaction between spins into account.
The Ising model is an attempt to capture the structure of this phase transition. Originally it was invented for the phase transition of ferromagnets at the Curie temperature; in the course of time, however, it was realized that with only slight changes the model can also be applied to other phase transitions, such as order-disorder transitions in binary alloys. Furthermore, the model may be applied to several modern problems of many-particle physics, for instance the description of so-called spin glasses. These are metals with amorphous rather than crystalline structures, which have the interesting property of non-vanishing entropy at T = 0. More recently it has been realized that Ising's idea, in modified form, can also describe pattern recognition in schematic neural networks, so the model has gained increasing importance for models of the human brain [3-8].
The main idea of the Ising model is to place spins with a single orientation component at fixed points of a lattice. This simplification is motivated by the quantum-mechanical concept of the 'exchange force': only the outer (conduction) electrons of the material interact, and their interaction is taken to act along the z-direction only. On the other hand, the characteristics of the lattice, such as its dimension, size, morphology and structure, remain variables of the model, and we have to fix these quantities to specify a particular Ising system.
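With these simplifications, the energy of a spin configuration is given by the standard Ising Hamiltonian (written here without an external field, since no field appears in the simulations below):

```latex
H = -J \sum_{\langle i,j \rangle} s_i s_j , \qquad s_i = \pm 1 ,
```

where the sum runs over nearest-neighbor pairs and $J > 0$ favors aligned spins, i.e. the ferromagnetic case.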
The second assumption of the Ising model is that only nearest-neighbor interactions are kept, while all others can be neglected. The final assumption is that, in the statistical limit, the lattice is periodic.

###### Figure 1. Two-dimensional lattices: (a) square (4 nearest neighbors); (b) triangular (6 nearest neighbors); (c) hexagonal (3 nearest neighbors).
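The periodic-boundary assumption is straightforward to express in code. As a minimal Python sketch (the appendix subroutine itself is not reproduced here), the four nearest neighbors of a site on an L × L square lattice can be found by wrapping the indices with the modulo operator:

```python
def neighbors(i, j, L):
    """Four nearest neighbors of site (i, j) on an L x L square lattice,
    with periodic boundary conditions enforced by the modulo operator."""
    return [((i + 1) % L, j), ((i - 1) % L, j),
            (i, (j + 1) % L), (i, (j - 1) % L)]
```

Python's `%` operator maps `-1` to `L - 1`, so sites on the edge of the lattice correctly wrap around to the opposite edge.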

There are exact analytical solutions for the 1- and 2-dimensional Ising models. In this project I simulate the 2-dimensional Ising model on a square lattice. For the exact solution of this structure, with additional theoretical detail, refer to Reichl [7].
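For reference, the exact spontaneous magnetization of the 2-dimensional square-lattice Ising model, which serves as the analytical curve for the comparison later, is the Onsager-Yang result:

```latex
M(T) = \left[ 1 - \sinh^{-4}\!\left( \frac{2J}{k_B T} \right) \right]^{1/8}
\qquad (T < T_C),
```

and $M = 0$ for $T > T_C$, where the critical temperature satisfies $\sinh(2J/k_B T_C) = 1$, i.e. $k_B T_C = 2J/\ln(1+\sqrt{2}) \approx 2.269\,J$.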

## Computational Approach

The computational simulation of the Ising model rests on two facts: the statistical nature of the events and the Ising Hamiltonian. From statistical mechanics we know that the probability of a state A is proportional to the Boltzmann factor,

$$P(A) \propto e^{-E_A / k_B T},$$

so the transition probability of the A → B transition is

$$P(A \to B) = \min\left(1,\, e^{-(E_B - E_A)/k_B T}\right).$$

The computational algorithm that applies this factor to the problem is the Metropolis algorithm, which I need in order to simulate the Ising model. In this section I introduce the Metropolis algorithm and then present my own simulation algorithm.

### (a) Metropolis Algorithm Steps:

1. Choose an initial condition:
   - Disordered condition (hot start): start the program from a completely disordered state, in which the entropy is maximal.
   - Ordered condition (cold start): start the program from a completely ordered state, in which the entropy is minimal, corresponding to T = 0.
2. Generate a new state:
   1. Choose the ij-th element randomly (Monte Carlo).
   2. Reverse the ij-th spin direction to create a trial configuration.
   3. Calculate the energy of the trial configuration.
   4. If E_trial < E_{s_ij}, accept the change.
   5. Otherwise, accept the change with probability P = e^{-ΔE/k_B T} (a random process).
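The steps above can be sketched in code. The following is a minimal Python sketch (not the Fortran-style subroutine of the appendix), assuming an L × L square lattice of ±1 spins with periodic boundaries and units J = k_B = 1:

```python
import numpy as np

def metropolis_sweep(spins, T, rng):
    """One Monte Carlo sweep: L*L single-spin-flip Metropolis updates
    on an L x L lattice of +/-1 spins, periodic boundaries, J = kB = 1."""
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)          # pick a site at random
        # energy change if spin (i, j) is flipped: dE = 2 * s_ij * sum(neighbors)
        nn_sum = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                  + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * spins[i, j] * nn_sum
        # accept if the energy decreases, else with probability exp(-dE/T)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] = -spins[i, j]
    return spins
```

A cold start is `np.ones((L, L), dtype=int)`; a hot start is `rng.choice([-1, 1], size=(L, L))`. Note that flipping one spin changes only its four bonds, so the energy difference can be computed locally instead of re-evaluating the whole Hamiltonian.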

### (b) Metropolis Algorithm [Appendix: Subroutine MonteCarlo]:

###### Figure 4. The result structures in 40×40 lattice.

The resulting configurations of the simulation (on a 40×40 lattice) are shown in Figure 4. As we expect, the entropy of the system increases with temperature.
Figure 5 shows the results for 20×20, 40×40, 60×60 and 100×100 lattices compared with the analytical curve. As expected, the value of the average magnetization at high temperature decreases as the system size increases; this means the system approaches the statistical limit and the simulation behaves correctly as a statistical system. Another point is that the qualitative behavior of the system and the simulation results agree. Moreover, we know from the renormalization group method that at the critical point the system has a self-similarity property. The figure shows a critical point of T_C ≈ 2.1 J/k_B, reasonably close to the analytical Curie temperature (T_C ≈ 2.269 J/k_B).
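The analytical Curie temperature quoted above follows from Onsager's condition sinh(2J/k_B T_C) = 1; a two-line check, in units of J/k_B:

```python
import math

# Exact critical temperature of the 2D square-lattice Ising model:
# sinh(2J / (kB * Tc)) = 1  =>  kB * Tc = 2J / ln(1 + sqrt(2))
Tc = 2.0 / math.log(1.0 + math.sqrt(2.0))
print(round(Tc, 3))  # ~2.269 in units of J/kB
```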

## Acknowledgment

It is a pleasure to thank Dr. Esfarjani and Dr. Ghaleh, my professors of computational physics, and Dr. Langari, Dr. Sadat and Dr. Ketabi, my professors of statistical physics and thermodynamics.

## References

2. Dorlas, T. C., "Statistical Mechanics: Fundamentals and Model Solutions", IOP Publishing Ltd, (1999), Ch. 28, 160-173.
3. Greiner, W., Neise, L. and Stoecker, H., "Thermodynamics and Statistical Mechanics", Springer, (1995), Ch. 18, 436-456.
4. Nishimori, H., "Statistical Physics of Spin Glasses and Information Processing: An Introduction", Oxford University Press, (2001), Ch. 1, 1-10.
5. Pathria, R. K., "Statistical Mechanics", 2nd ed., Butterworth-Heinemann, (1996), Ch. 11, 314-319.
6. Huang, K., "Statistical Mechanics", 2nd ed., John Wiley & Sons, (1987), Ch. 14, 341-363.
7. Reichl, L. E., "A Modern Course in Statistical Physics", 2nd ed., John Wiley & Sons, (1998), Ch. 8, 462-485.
8. Dalvit, D. A. R., Frastai, J. and Lawrie, I. D., "Problems on Statistical Mechanics", IOP Publishing Ltd, (1999), Ch. 5, Prob. 14-28.
9. Landau, R. H. and Páez, M. J., "Computational Physics: Problem Solving with Computers", John Wiley & Sons, (1997), Ch. 22, 297-305.