The '''Metropolis Monte Carlo''' technique<ref>[http://dx.doi.org/10.1063/1.1699114 Nicholas Metropolis, Arianna W. Rosenbluth, Marshall N. Rosenbluth, Augusta H. Teller and Edward Teller, "Equation of State Calculations by Fast Computing Machines", Journal of Chemical Physics '''21''' pp. 1087-1092 (1953)]</ref> is a variant of the original [[Monte Carlo]] method proposed by [[Nicholas Metropolis]] and [[Stanislaw Ulam]] in 1949.<ref>[http://links.jstor.org/sici?sici=0162-1459%28194909%2944%3A247%3C335%3ATMCM%3E2.0.CO%3B2-3 Nicholas Metropolis and S. Ulam "The Monte Carlo Method", Journal of the American Statistical Association '''44''' pp. 335-341 (1949)]</ref>
== Main features ==
Metropolis Monte Carlo simulations can be carried out in different ensembles. For the case of one-component systems the usual ensembles are:

* [[Canonical ensemble]] (<math> NVT </math>)
* [[Isothermal-isobaric ensemble]] (<math> NpT </math>)
* [[Grand canonical ensemble]] (<math> \mu V T </math>)
In the case of mixtures, it is useful to consider the so-called [[Semi-grand ensembles]].

The purpose of these techniques is to sample representative configurations of the system at the corresponding thermodynamic conditions. The sampling techniques make use of so-called pseudo-[[Random numbers |random number]] generators.
== Configuration ==
A configuration is a microscopic realisation of the ''thermodynamic state'' of the system.

To define a configuration (denoted as <math> \left. X \right. </math>) we usually require:

* The position coordinates of the particles
* Depending on the problem, other variables such as the volume, number of particles, etc.

The probability of a given configuration, denoted as <math> \Pi \left( X | k \right) </math>, depends on the parameters <math> k </math> (e.g. [[temperature]], [[pressure]]).
Example: in the [[canonical ensemble]],

:<math> \Pi_{NVT}(X|T) \propto \exp \left[ - \frac{ U (X) }{k_B T} \right] </math>

In most cases <math> \Pi \left( X | k \right) </math> exhibits the following features:

* It is a function of many variables
* Its value is non-negligible only for a very small fraction of the configurational space

Due to these properties, Metropolis Monte Carlo requires the use of '''Importance Sampling''' techniques.
== Importance sampling ==
Importance sampling is useful to evaluate average values given by:

: <math> \langle A(X|k) \rangle = \int dX \, \Pi(X|k) A(X) </math>

where:

* <math> \left. X \right. </math> represents a set of many variables,
* <math> \left. \Pi \right. </math> is a probability distribution function which depends on <math> X </math> and on the constraints (parameters) <math> k </math>,
* <math> \left. A \right. </math> is an observable which depends on <math> X </math>.

Depending on the behaviour of <math> \left. \Pi \right. </math>, different numerical methods can be used to compute <math> \langle A(X|k) \rangle </math>:

* If <math> \left. \Pi \right. </math> is, roughly speaking, quite uniform, [[Monte Carlo Integration]] methods can be effective
* If <math> \left. \Pi \right. </math> has significant values only for a small part of the configurational space, importance sampling could be the appropriate technique
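The difference between these two regimes can be illustrated with a short Python sketch (our own example, not part of the original article; the narrow Gaussian <math>\Pi</math> and the observable <math>A(x) = x^2</math> are arbitrary choices). It estimates the same average both by plain Monte Carlo integration and by sampling directly from <math>\Pi</math>:

```python
import math
import random

random.seed(42)

sigma = 0.05                       # Pi is a narrow Gaussian: most of the space is negligible

def pi_dist(x):
    """Normalised probability density Pi(x)."""
    return math.exp(-x * x / (2.0 * sigma * sigma)) / (sigma * math.sqrt(2.0 * math.pi))

def A(x):
    """An arbitrary observable."""
    return x * x

n = 100_000

# Plain Monte Carlo integration: uniform samples over [-5, 5];
# almost all of them fall where Pi(x) * A(x) is essentially zero.
acc = 0.0
for _ in range(n):
    x = random.uniform(-5.0, 5.0)
    acc += pi_dist(x) * A(x)
plain_estimate = acc * 10.0 / n    # 10.0 = length of the sampling interval

# Importance sampling: draw x directly from Pi itself,
# so <A> reduces to a simple average of A over the samples.
importance_estimate = sum(A(random.gauss(0.0, sigma)) for _ in range(n)) / n

exact = sigma * sigma              # <x^2> = sigma^2 for a zero-mean Gaussian
print(plain_estimate, importance_estimate, exact)
```

Both estimators converge to the same average; the importance-sampling version does so by spending every sample in the region where <math>\Pi</math> is non-negligible, which is what makes it the practical choice in high-dimensional configurational spaces.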
==== Outline of the method ====

* Random walk over <math> \left. X \right. </math>:

: <math> \left. X_{i+1}^{test} = X_{i} + \delta X \right. </math>

: From the configuration at the i-th step one builds up a ''test'' configuration by slightly modifying some of the variables <math> X </math>
* The test configuration is accepted as the new (i+1)-th configuration according to a certain criterion (which depends basically on <math> \Pi </math>)
* If the test configuration is not accepted as the new configuration, then <math> \left. X_{i+1} = X_i \right. </math>

The procedure is based on the [[Markov chain]] formalism, and on the [[Perron-Frobenius theorem]]. The acceptance criterion must be chosen to guarantee that, after a certain equilibration ''time'', a given configuration appears with probability given by <math> \Pi(X|k) </math>.
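The steps above can be made concrete with a minimal Python sketch (our own illustrative example, not part of the original article). A single coordinate is sampled from the canonical weight <math>\exp(-U/k_B T)</math> for a one-dimensional harmonic well <math>U(x) = x^2/2</math>, with <math>k_B T = 1</math> chosen purely for illustration so that the exact result <math>\langle x^2 \rangle = 1</math> is known:

```python
import math
import random

random.seed(1)

def U(x):
    """Potential energy of configuration x (1D harmonic well)."""
    return 0.5 * x * x

beta = 1.0          # 1/(k_B T), with k_B T = 1 in reduced units
delta = 0.5         # maximum trial displacement
x = 0.0             # initial configuration X_0
samples = []

for step in range(200_000):
    x_test = x + random.uniform(-delta, delta)      # build a test configuration
    dU = U(x_test) - U(x)
    # Metropolis criterion: accept with probability min(1, exp(-beta * dU))
    if dU <= 0.0 or random.random() < math.exp(-beta * dU):
        x = x_test                                  # accept: X_{i+1} = X_test
    # otherwise reject: X_{i+1} = X_i (x is left unchanged)
    if step >= 10_000:                              # discard equilibration steps
        samples.append(x)

mean_x2 = sum(s * s for s in samples) / len(samples)
print(mean_x2)      # approaches <x^2> = k_B T = 1.0 as the walk lengthens
```

Note that rejected moves still contribute the ''old'' configuration to the averages; silently dropping them is a common implementation error that biases the sampling.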
== Temperature ==

The [[temperature]] is usually fixed in Metropolis Monte Carlo simulations, since in classical statistics the kinetic degrees of freedom (momenta) can generally be integrated out. However, it is possible to design procedures to perform Metropolis Monte Carlo simulations in the [[Microcanonical ensemble|microcanonical ensemble]] (NVE).

See [[Monte Carlo in the microcanonical ensemble]].
== Boundary Conditions ==

The simulation of homogeneous systems is usually carried out using [[periodic boundary conditions]].
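A minimal sketch (our own, not from the original article) of how periodic boundary conditions are typically applied in a cubic box of side <math>L</math>, together with the minimum-image convention used when computing particle separations:

```python
def wrap(x, L):
    """Map a coordinate back into the primary box [0, L)."""
    return x % L

def minimum_image(dx, L):
    """Nearest periodic image of the separation dx (|result| <= L/2)."""
    return dx - L * round(dx / L)

L = 10.0
print(wrap(11.5, L))            # particle leaving one face re-enters the opposite one
print(minimum_image(9.0, L))    # nearest image lies across the boundary
```

Interactions are then evaluated with the minimum-image separation, so each particle interacts with the closest copy of every other particle.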
== Initial configuration ==

The usual choices for the initial configuration in fluid simulations are:
* an equilibrated configuration under similar conditions (for example see <ref>[http://dx.doi.org/10.1016/j.cpc.2005.02.006 Carl McBride, Carlos Vega and Eduardo Sanz "Non-Markovian melting: a novel procedure to generate initial liquid like phases for small molecules for use in computer simulation studies", Computer Physics Communications '''170''' pp. 137-143 (2005)]</ref>)
* an ordered lattice structure. For details concerning the construction of such structures see: [[Lattice Structures | lattice structures]].
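For illustration, an ordered starting structure of the simplest kind can be generated as follows (our own sketch, not from the original article; the function name and the half-spacing offset are arbitrary choices):

```python
def simple_cubic_lattice(n_per_side, box_length):
    """Return (x, y, z) positions of n³ particles on a simple cubic lattice."""
    a = box_length / n_per_side          # lattice spacing
    positions = []
    for i in range(n_per_side):
        for j in range(n_per_side):
            for k in range(n_per_side):
                # offset by a/2 so no particle sits exactly on a box face
                positions.append(((i + 0.5) * a, (j + 0.5) * a, (k + 0.5) * a))
    return positions

config = simple_cubic_lattice(3, 9.0)    # 27 particles in a cubic box of side 9
print(len(config), config[0])
```

The lattice then melts into a fluid during the equilibration phase of the run.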
== Advanced techniques ==
:''Main article: [[Monte Carlo]]''

* [[Configurational bias Monte Carlo]]
* [[Gibbs-Duhem integration]]
* [[Cluster algorithms]]
== References ==
<references/>

'''Related reading'''
*[http://links.jstor.org/sici?sici=0162-1459%28194909%2944%3A247%3C335%3ATMCM%3E2.0.CO%3B2-3 Nicholas Metropolis and S. Ulam "The Monte Carlo Method", Journal of the American Statistical Association '''44''' pp. 335-341 (1949)]
*[http://library.lanl.gov/cgi-bin/getfile?00326886.pdf Herbert L. Anderson "Metropolis, Monte Carlo, and the MANIAC", Los Alamos Science '''14''' pp. 96-107 (1986)]
*[http://library.lanl.gov/cgi-bin/getfile?00326866.pdf N. Metropolis "The Beginning of the Monte Carlo Method", Los Alamos Science '''15''' pp. 125-130 (1987)]
*[http://cise.aip.org/getpdf/servlet/GetPDFServlet?filetype=pdf&id=CSENFA000002000001000065000001&idtype=cvips&prog=normal Isabel Beichl and Francis Sullivan "The Metropolis Algorithm", Computing in Science & Engineering '''2''' Issue 1 pp. 65-69 (2000)]
*[http://dx.doi.org/10.1063/1.1632112 Marshall N. Rosenbluth "Genesis of the Monte Carlo Algorithm for Statistical Mechanics", AIP Conference Proceedings '''690''' pp. 22-30 (2003)]
*[http://dx.doi.org/10.1063/1.1632114 Marshall N. Rosenbluth "Proof of Validity of Monte Carlo Method for Canonical Averaging", AIP Conference Proceedings '''690''' pp. 32-38 (2003)]
*[http://dx.doi.org/10.1063/1.1632115 William W. Wood "A Brief History of the Use of the Metropolis Method at LANL in the 1950s", AIP Conference Proceedings '''690''' pp. 39-44 (2003)]
*[http://dx.doi.org/10.1063/1.1632124 David P. Landau "The Metropolis Monte Carlo Method in Statistical Physics", AIP Conference Proceedings '''690''' pp. 134-146 (2003)]
*M. P. Allen and D. J. Tildesley "Computer Simulation of Liquids", Oxford University Press (1987)

[[Category: Monte Carlo]]