'''Metropolis Monte Carlo (MMC)''' is a stochastic simulation technique that generates configurations of a system distributed according to the probability distribution of a chosen statistical ensemble.
== Main features ==
Metropolis Monte Carlo simulations can be carried out in different ensembles. For the case of one-component systems the usual ensembles are:
* [[Canonical ensemble]] (<math> NVT </math>)
* [[Isothermal-Isobaric ensemble]] (<math> NpT </math>)
* [[Grand canonical ensemble]] (<math> \mu V T </math>)
In the case of mixtures, it is useful to consider the so-called [[Semi-Grand ensembles|semi-grand ensembles]].
The purpose of these techniques is to sample representative configurations of the system at the corresponding thermodynamic conditions. The sampling techniques make use of so-called pseudo-[[Random numbers|random number]] generators. Metropolis Monte Carlo makes use of importance sampling techniques.
== Configuration ==
A configuration is a microscopic realisation of the ''thermodynamic state'' of the system.
To define a configuration (denoted as <math> \left. X \right. </math>) we usually require:
*The position coordinates of the particles
*Depending on the problem, other variables like volume, number of particles, etc.
The probability of a given configuration, denoted as <math> \Pi \left( X | k \right) </math>, depends on the parameters <math> k </math> (e.g. temperature, pressure).
Example: in the [[Canonical ensemble|canonical ensemble]] the probability of a configuration is given by the Boltzmann factor, <math> \Pi(X|N,V,T) \propto \exp \left[ - U(X)/k_B T \right] </math>, where <math> U(X) </math> is the potential energy of the configuration.
In most of the cases <math> \Pi \left( X | k \right) </math> exhibits the following features:
* It is a function of many variables
* Only for a very small fraction of the configurational space is the value of <math> \Pi \left( X | k \right) </math> not negligible
Due to these properties, Metropolis Monte Carlo requires the use of '''Importance Sampling''' techniques.
== Importance sampling ==
Importance sampling is useful to evaluate average values given by:
: <math> \langle A(X|k) \rangle = \int dX \, \Pi(X|k) A(X) </math>
where:
* <math> \left. X \right. </math> represents a set of many variables,
* <math> \left. \Pi \right. </math> is a probability distribution function which depends on <math> X </math> and on the constraints (parameters) <math> k </math>
* <math> \left. A \right. </math> is an observable which depends on <math> X </math>
Depending on the behavior of <math> \left. \Pi \right. </math> we can use different numerical methods to compute <math> \langle A(X|k) \rangle </math>:
* If <math> \left. \Pi \right. </math> is, roughly speaking, quite uniform, [[Monte Carlo Integration]] methods can be effective
* If <math> \left. \Pi \right. </math> has significant values only for a small part of the configurational space, importance sampling can be the appropriate technique
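To illustrate the difference, here is a minimal Python sketch (not part of the original article; the sharply peaked Gaussian <math> \Pi </math> and the observable <math> A(x) = x^2 </math> are illustrative assumptions). Both estimators target the same average, but when <math> \Pi </math> is peaked, almost every uniform draw lands where <math> \Pi(x) \approx 0 </math> and contributes nothing:

```python
import math
import random

rng = random.Random(0)

sigma = 0.01                      # Pi is sharply peaked: normal density N(0, sigma^2)
def pi(x):
    return math.exp(-0.5 * (x / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def A(x):                         # observable; the exact average is sigma^2 = 1e-4
    return x * x

n = 100_000

# (a) Plain Monte Carlo integration: x uniform on [-1, 1], weighted by Pi(x).
# Almost all draws fall where Pi(x) ~ 0, so the estimator is noisy.
plain = 0.0
for _ in range(n):
    x = rng.uniform(-1.0, 1.0)
    plain += 2.0 * pi(x) * A(x)   # factor 2 = width of the integration interval
plain /= n

# (b) Importance sampling: draw x directly from Pi and simply average A(x).
imp = 0.0
for _ in range(n):
    imp += A(rng.gauss(0.0, sigma))
imp /= n
```

Here <math> \Pi </math> can be sampled directly; the point of Metropolis Monte Carlo, sketched below, is to achieve the same effect when only ratios of <math> \Pi </math> can be evaluated.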
'''Sketches of the Method:'''
* Random walk over <math> \left. X \right. </math>:
: <math> \left. X_{i+1}^{test} = X_{i} + \delta X \right. </math>
From the configuration at the i-th step we build up a ''test'' configuration by slightly modifying (some of) the variables <math> X </math>
* The test configuration is accepted as the new (i+1)-th configuration according to a certain criterion (which depends basically on <math> \Pi </math>)
* If the test configuration is not accepted as the new configuration then: <math> \left. X_{i+1} = X_i \right. </math>
The procedure is based on the [[Markov Chain]] formalism, and on the [[Perron-Frobenius theorem]].
The acceptance criteria must be chosen to guarantee that after a certain equilibration ''time'' a given configuration appears with probability given by <math> \Pi(X|k) </math>. In the Metropolis scheme the test configuration is accepted with probability <math> \min \left[ 1, \Pi(X^{test}|k) / \Pi(X_i|k) \right] </math>.
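The steps above can be sketched in Python (an illustrative example, not code from the original article): the target distribution is assumed to be a standard normal, <math> \Pi(x) \propto e^{-x^2/2} </math>, and trial moves are accepted with the Metropolis probability <math> \min \left[ 1, \Pi(X^{test}) / \Pi(X_i) \right] </math>:

```python
import math
import random

def metropolis_sample(log_pi, x0, delta, n_steps, seed=42):
    """Random-walk Metropolis: generate configurations X with probability ~ Pi(X)."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_steps):
        # Trial move: X_test = X_i + dX, with dX uniform in [-delta, delta]
        x_test = x + delta * rng.uniform(-1.0, 1.0)
        # Metropolis criterion: accept with probability min(1, Pi(test)/Pi(old)),
        # evaluated in log form for numerical safety
        if rng.random() < math.exp(min(0.0, log_pi(x_test) - log_pi(x))):
            x = x_test            # accepted: X_{i+1} = X_test
        # otherwise rejected: X_{i+1} = X_i (x is left unchanged)
        samples.append(x)
    return samples

# Target: standard normal, log Pi(x) = -x^2/2 up to an additive constant
samples = metropolis_sample(lambda x: -0.5 * x * x, x0=5.0, delta=1.0, n_steps=200_000)
chain = samples[10_000:]          # discard the equilibration ("burn-in") part
mean = sum(chain) / len(chain)
var = sum(v * v for v in chain) / len(chain) - mean ** 2
```

Only the ratio <math> \Pi(X^{test}) / \Pi(X_i) </math> enters the criterion, so the normalisation of <math> \Pi </math> is never needed. The step size <math> \delta X </math> controls the acceptance rate: too small and the walk explores the space slowly, too large and most trial moves are rejected.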
== Temperature ==
The temperature is usually fixed in Metropolis Monte Carlo simulations, since in classical statistics the kinetic degrees of freedom (momenta) can, in general, be integrated out. However, it is possible to design procedures to perform Metropolis Monte Carlo simulations in the [[Microcanonical ensemble|microcanonical ensemble]] (NVE).
See [[Monte Carlo in the microcanonical ensemble]]
== Boundary Conditions ==
The simulation of homogeneous systems is usually carried out using periodic boundary conditions.
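As an illustration (a sketch, not from the original article), periodic boundary conditions in a box of side <math> L </math> amount to wrapping coordinates back into the central box and measuring separations through the nearest periodic image (the minimum image convention):

```python
def wrap(x, box):
    """Map a coordinate back into the central box [0, box)."""
    return x % box

def min_image(dx, box):
    """Minimum-image separation along one axis (result in [-box/2, box/2])."""
    return dx - box * round(dx / box)

box = 10.0
print(wrap(10.5, box))      # -> 0.5  (the particle re-enters from the other side)
print(min_image(9.0, box))  # -> -1.0 (the nearest periodic image is one box away)
```

The minimum-image separation is what enters the energy change <math> \Delta U </math> evaluated for each trial move.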
== Advanced techniques ==
* [[Configurational bias Monte Carlo]]
* [[Gibbs-Duhem integration]]
* [[Cluster algorithms]]
== References ==
# M. P. Allen and D. J. Tildesley, "Computer Simulation of Liquids", Oxford University Press
# [http://dx.doi.org/10.1063/1.1699114 Nicholas Metropolis, Arianna W. Rosenbluth, Marshall N. Rosenbluth, Augusta H. Teller and Edward Teller, "Equation of State Calculations by Fast Computing Machines", Journal of Chemical Physics '''21''' pp. 1087-1092 (1953)]
[[Category: Monte Carlo]]