Lecture Notes on Advanced Physical Chemistry: Statistical Thermodynamics
Uploaded by Dr. Samuel Hunt, United Arab Emirates, Teacher. Published: 21-07-2017
Lecture Notes: Advanced Physical Chemistry, Statistical Thermodynamics
Gunnar Jeschke

Copyright © 2015 Gunnar Jeschke. Title image: "Billard" by No-w-ay in collaboration with H. Caps, own work, licensed under GFDL via Wikimedia Commons, https://commons.wikimedia.org/wiki/File:Billard.JPG. Chapter 2 word cloud: http://de.123rf.com/profile_radiantskies. Chapter 3 dice image: http://de.123rf.com/profile_whitetag. Chapter 4 matryoshka image: http://de.123rf.com/profile_mikewaters. Chapter 5 word cloud: http://de.123rf.com/profile_radiantskies. Chapter 7 dumbbell image: http://de.123rf.com/profile_filipobr. Chapter 8 spaghetti image: http://de.123rf.com/profile_winterbee.

Published by Gunnar Jeschke, http://www.epr.ethz.ch. Licensed under the Creative Commons Attribution-NonCommercial 3.0 Unported License (the "License"). You may not use this file except in compliance with the License. You may obtain a copy of the License at http://creativecommons.org/licenses/by-nc/3.0. Design and layout of the lecture notes are based on the Legrand Orange Book available at http://latextemplates.com/template/the-legrand-orange-book.

First printing, September 2015

Contents

1 Introduction
1.1 General Remarks
1.2 Suggested Reading
1.3 Acknowledgment

2 Basics of Statistical Mechanics
2.1 Basic Assumptions of Statistical Thermodynamics
2.1.1 Thermodynamics Based on Statistical Mechanics
2.1.2 The Markovian Postulate
2.2 Phase Space
2.2.1 Hamiltonian Equations of Motion
2.2.2 The Liouville Equation
2.2.3 Quantum Systems
2.3 Statistical Mechanics Based on Postulates
2.3.1 The Penrose Postulates
2.3.2 Implications of the Penrose Postulates

3 Probability Theory
3.1 Discrete Probability Theory
3.1.1 Discrete Random Variables
3.1.2 Multiple Discrete Random Variables
3.1.3 Functions of Discrete Random Variables
3.1.4 Discrete Probability Distributions
3.1.5 Probability Distribution of a Sum of Random Numbers
3.1.6 Binomial Distribution
3.1.7 Stirling's Formula
3.2 Continuous Probability Theory
3.2.1 Probability Density
3.2.2 Selective Integration of Probability Densities
3.2.3 Sum of Two Continuous Random Numbers

4 Classical Ensembles
4.1 Statistical Ensembles
4.1.1 Concept of an Ensemble
4.1.2 Ergodicity
4.2 Microcanonical Ensemble
4.3 Canonical Ensemble
4.3.1 Boltzmann Distribution
4.3.2 Equipartition Theorem
4.3.3 Internal Energy and Heat Capacity of the Canonical Ensemble
4.4 Grand Canonical Ensemble

5 Entropy
5.1 Swendsen's Postulates of Thermodynamics
5.1.1 Cautionary Remarks on Entropy
5.1.2 Swendsen's Postulates
5.1.3 Entropy in Phenomenological Thermodynamics
5.1.4 Boltzmann's Entropy Definition
5.2 The Relation of State Functions to the Partition Function
5.2.1 Entropy and the Partition Function
5.2.2 Helmholtz Free Energy
5.2.3 Gibbs Free Energy, Enthalpy, and Pressure
5.3 Irreversibility
5.3.1 Historical Discussion
5.3.2 Irreversibility as an Approximation
5.4 Entropy and Information
5.4.1 Gibbs Entropy
5.4.2 Von Neumann Entropy
5.4.3 Shannon Entropy

6 Quantum Ensembles
6.1 Quantum Canonical Ensemble
6.1.1 Density Matrix
6.1.2 Quantum Partition Function
6.2 Quantum and Classical Statistics
6.2.1 Types of Permutation Symmetry
6.2.2 Bose-Einstein Statistics
6.2.3 Fermi-Dirac Statistics
6.2.4 Maxwell-Boltzmann Statistics
6.3 Simple Quantum Systems
6.3.1 Spin S = 1/2
6.3.2 Harmonic Oscillator
6.3.3 Einstein and Debye Models of a Crystal

7 Partition Functions of Gases
7.1 Separation of Contributions
7.1.1 Collective Degrees of Freedom
7.1.2 Factorization of Energy Modes
7.2 Translational Partition Function
7.2.1 Density of States of an Ideal Gas
7.2.2 Partition Function and Accessible States
7.3 Nuclear Spin Partition Function
7.3.1 High-Temperature Limit
7.3.2 Symmetry Requirements
7.4 Rotational Partition Function
7.4.1 Rigid Rotor Assumption and Rotamers
7.4.2 Accessible States and Symmetry
7.5 Vibrational Partition Function
7.5.1 The Harmonic Oscillator Extended
7.5.2 Vibrational Contributions to U, C_V, and S
7.6 Electronic Partition Function
7.7 Equilibrium Constant for Gas Reactions

8 Macromolecules
8.1 Thermodynamics of Mixing
8.1.1 Entropy of Binary Mixing
8.1.2 Energy of Binary Mixing
8.2 Entropic Elasticity
8.2.1 Ideal Chain Model
8.2.2 Random Walk
8.2.3 Conformational Entropy and Free Energy

Bibliography
Books
Articles
Web Pages
Index

1 — Introduction

"Today," Mr. K. complained, "scores of people claim in public that they can type sizeable books entirely on their own, and this is universally accepted. The Chinese philosopher Chuang-Tzu, when already at his prime age, brushed a tome of one hundred thousand words, with nine out of ten being citations. Such books cannot be written anymore, since wit is missing. Hence, thoughts are produced only by the tools of a single man, whereas he feels lazy if their number is low. Indeed, no thought can be reused and no expression can be cited. How little they all need for their doings! A pen and some paper is all they have to show! And without any help, with only the puny material that a single man can carry in his arms, they set up their huts! They don't know of larger buildings than a loner can raise."

Freely translated from: Bertolt Brecht, Geschichten vom Herrn Keuner

1.1 General Remarks

The field of Statistical Thermodynamics is probably the branch of physical chemistry whose coverage in textbooks is most diverse. A canonical way of teaching this subject still appears to be missing, which is partially due to the fact that practitioners have not completely agreed on the interpretation of the probabilistic character, on the postulates, and on the way the mathematical apparatus is derived from a set of postulates. While this may appear troublesome, in practice there is rarely any problem in applying statistical thermodynamics.¹ Accordingly, textbooks usually ignore the more problematic aspects and try to give reasons why the interpretation and formulation used by the respective author should be the preferred one. This being an advanced lecture course, we shall not do so, but we shall still present an apparatus that is ready-made for application.
The basic idea of statistical thermodynamics is simple: on the one hand we have Newtonian and quantum mechanics, and we know that molecules should adhere to them; on the other hand we know that systems consisting of many molecules can be adequately described by phenomenological (or classical) thermodynamics. Now let's try to derive the latter theory from the former one. Some care will have to be taken for systems that are subject to quantum statistics, but we might expect that straightforward application of probability theory will provide the required connection. Chapter 2 will discuss this basic idea in some more detail and will present a set of postulates due to Oliver Penrose [Pen70]. The discussion of these postulates clarifies what the remaining mathematical problem is and how we avoid it in applications.

¹ The problematic cases are mostly those where the number of particles is too small for the approximations made in the statistical approach.

In this course we do not assume that students are already familiar with probability theory; rather, we will introduce its most important concepts in Chapter 3. We do assume that the concepts of phenomenological thermodynamics are known, although we shall briefly explain them on first use in these lecture notes. The most important new concept in this course is the one of an ensemble description, which will be introduced in Chapter 4, first only for classical particles. This will set the stage for discussing the concepts of irreversibility and entropy in Chapter 5. We will complete the foundations part with a discussion of quantum ensembles in Chapter 6. This chapter will also make the transition to applications, by treating first the harmonic oscillator and second the Einstein model of a crystal with the apparatus that we command at that point.

We shall then illustrate the relation to phenomenological thermodynamics by discussing the partition functions of gases and by computing thermodynamical state functions from these partition functions in Chapter 7.
The final Chapter 8 will briefly discuss the consequences of statistical thermodynamics for macromolecular systems and introduce the concepts of lattice models, random walks, and entropic elasticity.

The time available for this course does not permit treating all aspects of statistical thermodynamics and statistical mechanics that are important in physical chemistry, chemical physics, polymer physics, and biophysics, let alone in solid-state physics. The most important omissions are probably kinetic aspects of chemical reactions, which are treated in detail in a lecture course on Advanced Kinetics, and the topic of phase transitions, including the famous Ising chain model. We believe that the foundations laid in the present course will allow students to understand these topics from reading the textbooks listed in the following section.

1.2 Suggested Reading

General textbooks on physical chemistry, such as [AP13; ER06], provide some overview of the most important concepts of statistical thermodynamics as well as some of the key formulas, but they are not quite on the level of this advanced course. If you already own these books, it might still be useful to read what they write on the topic. If you have the choice, Engel/Reid [ER06] is better on this topic than Atkins [AP13]. The best introduction in a general physical chemistry textbook can be found in the German book by Wedler and Freund [WF12].

A rather good and modern introduction at an advanced level has been published in English by Swendsen [Swe12]. Swendsen introduces statistical thermodynamics together with phenomenological thermodynamics and covers more examples than we can treat in this course. He does not introduce some concepts that are widely used in the field, because he dislikes them. In this course we do introduce these concepts and discuss the problems associated with them.

A modern German-language introduction is the one by Schwabl [Sch06], which caters more to the physicist than to the physical chemist.
Schwabl is stronger on phase transitions and dynamic phenomena, but probably harder to read than Swendsen, even if German is your native language. For Chapter 7, but only for this chapter, Maczek's book from the Oxford Chemistry Primers series [Mac98] can be quite useful. Several topics that are not, or only superficially, covered in my lecture notes are treated in the notes by Cohen from Ben Gurion University [Coh11], which are strongly focused on quantum applications. Finally, I want to mention Penrose's book [Pen70], which is certainly not an introductory textbook and may be most appealing to the strongly mathematically or philosophically inclined. If you look for guidance on applying statistical thermodynamics to real systems this book is certainly deficient, but from an epistemological point of view it is probably the best one.

For many of the central concepts I have looked up (English) Wikipedia articles and have found that these articles are, on average, of rather good quality. They do differ quite strongly from each other in style and notation. When using only Wikipedia or other internet resources it is difficult to fit the pieces of information together. If, on the other hand, you already have a basic level of understanding, but some difficulties with a particular concept, such sources may provide just the missing piece of information. The NIST guide for computing thermodynamical state functions from the results of ab initio computations is a particularly good example of a useful internet resource [Iri98].

1.3 Acknowledgment

I am grateful to M. Schäfer, U. Hollenstein, and F. Merkt for making their lecture notes for this course available, and to Takuya Segawa for thorough proofreading of the first manuscript of these notes.
All remaining errors are my own sole responsibility.

2 — Basics of Statistical Mechanics

2.1 Basic Assumptions of Statistical Thermodynamics

2.1.1 Thermodynamics Based on Statistical Mechanics

Phenomenological thermodynamics describes relations between observable quantities that characterize macroscopic material objects. We know that these objects consist of a large number of small particles, molecules or atoms, and, for all we know, these small particles adhere to the laws of quantum mechanics and often, in good approximation, to the laws of Newtonian mechanics. Statistical mechanics is the theory that explains macroscopic properties, not only thermodynamic state functions, by applying probability theory to the mechanical equations of motion for a large ensemble of systems of particles. In this lecture course we are concerned with the part of statistical mechanics that relates to phenomenological thermodynamics.

In spite of its name, phenomenological (equilibrium) thermodynamics is essentially a static theory that provides an observational, macroscopic description of matter. The underlying mechanical description is dynamical and microscopic, but it is observational only for systems consisting of a small number of particles. To see this, we consider a system of N identical classical point particles that adhere to Newton's equations of motion.

Concept 2.1.1 — Newtonian equations of motion. With particle mass m, Cartesian coordinates q_i (i = 1, 2, ..., 3N), and velocity coordinates q̇_i, a system of N identical classical point particles evolves by

    m \frac{\mathrm{d}^2 q_i}{\mathrm{d}t^2} = -\frac{\partial V(q_1,\ldots,q_{3N})}{\partial q_i} ,    (2.1)

where V(q_1, ..., q_{3N}) is the potential energy function.

Notation 2.1.
The dynamical state or microstate of the system at any instant is defined by the 6N Cartesian and velocity coordinates, which span the dynamical space of the system. The curve of the system in dynamical space is called a trajectory.

The concept extends easily to atoms with different masses m_i. If we could, at any instant, precisely measure all 6N dynamical coordinates, i.e., spatial coordinates and velocities, we could precisely predict the future trajectory. The system as described by the Newtonian equations of motion behaves deterministically.

For any system that humans can see and handle directly, i.e., without complicated technical devices, the number N of particles is too large (at least of the order of 10^18) for such complete measurements to be possible. Furthermore, for such large systems even tiny measurement errors would make the trajectory prediction useless after a rather short time. In fact, atoms are quantum objects and the measurements are subject to the Heisenberg uncertainty principle, and even the small uncertainty introduced by that would make a deterministic description futile.

We can only hope for a theory that describes what we can observe. The number of observational states or macrostates that can be distinguished by the observer is much smaller than the number of dynamical states. Two classical systems in the same dynamical state are necessarily also in the same observational state, but the converse is not generally true. Furthermore, the observational state also evolves with time, but we have no equations of motion for this state (but see Section 2.2.2). In fact, we cannot have deterministic equations of motion for the observational state of an individual system, precisely because the same observational state may correspond to different dynamical states that will follow different trajectories.

Still we can make predictions, only these predictions are necessarily statistical in nature. If we consider a large ensemble of identical systems in the same observational state, we can even make fairly precise predictions about the outcome.
Penrose [Pen70] gives the example of a woman at a time when ultrasound diagnosis could detect pregnancy, but not the sex of the foetus. The observational state is pregnancy; the two possible dynamical states are on the path to a boy or to a girl. We have no idea what will happen in the individual case, but if the same diagnosis is performed on a million women, we know that about 51-52% will give birth to a boy.

How then can we derive stable predictions for an ensemble of systems of molecules? We need to consider probabilities of the outcome, and these probabilities will become exact numbers in the limit where the number N of particles (or molecules) tends to infinity. The theory required for computing such probabilities will be treated in Chapter 3.

Remark. Our current usage of the term ensemble is loose. We will devote the whole of Chapter 4 to clarifying what types of ensembles we use in computations and why.

2.1.2 The Markovian Postulate

There are different ways of defining and interpreting probabilities. For abstract discussions and mathematical derivations, the most convenient definition is the one of physical or frequentist probability.

Definition 2.1.1 — Physical probability. Given a reproducible trial T of which A is one of the possible outcomes, the physical probability P of the outcome A is defined as

    P(A \mid T) = \lim_{N \to \infty} \frac{n(A, N, T)}{N}    (2.2)

where n(A, N, T) is the number of times the outcome A is observed in the first N trials.

A trial T conforming to this definition is statistically regular, i.e., the limit exists and is the same for all infinite series of the same trial. If the physical probability is assumed to be a stable property of the system under study, it can be measured with some experimental error. This experimental error has two contributions: (i) the actual error of the measurement of the quantity A and (ii) the deviation of the experimental frequency of observing A from the limit defined in Eq. (2.2). Contribution (ii) arises from the experimental number of trials N not being infinite.

We need some criterion that tells us whether T is statistically regular.
For this we split the trial into a preparation period, an evolution period, and the observation itself. The evolution period is a waiting time during which the system is under controlled conditions. Together with the preparation period it needs to fulfill the Markovian postulate.

Concept 2.1.2 — Markovian postulate. A trial T that invariably ends up in the observational state O of the system after the preparation stage is called statistically regular. The start of the evolution period is assigned a time t = 0.

Note that the system can be in different observational states at the time of observation; otherwise the postulate would correspond to a trivial experiment. The Markovian postulate is related to the concept of a Markovian chain of events. In such a chain, the outcome of the next event depends only on the current state of the system, but not on states that were encountered earlier in the chain. Processes that lead to a Markovian chain of events can thus be considered as memoryless.

2.2 Phase Space

2.2.1 Hamiltonian Equations of Motion

The Newtonian equations of motion (2.1) are very convenient for atomistic molecular dynamics (MD) computations. Trajectories encountered during such MD simulations can be analyzed statistically in terms of thermodynamic quantities, such as free energy. However, for analyzing the evolution of the system in terms of spectroscopic properties, the Newtonian description is very inconvenient. Since spectroscopic measurements can provide the most stringent tests of theory, we shall use the Hamiltonian formulation of mechanics in the following. This formulation is particularly convenient for molecules that also have rotational degrees of freedom. For that, we replace the velocity coordinates by momentum coordinates p_j = m_j q̇_j, where index j runs over all atoms. Furthermore, we assume M identical molecules, each of them having f degrees of freedom, so that the total number of degrees of freedom is F = fM. Such a system can be described by 2F differential equations.

Concept 2.2.1 — Hamiltonian equations of motion.
With the single-molecule Hamiltonian H(p_i, q_i), the equations of motion for M non-interacting identical molecules with f degrees of freedom for each molecule read

    \frac{\mathrm{d}q_i}{\mathrm{d}t} = \frac{\partial H(p_i, q_i)}{\partial p_i}    (2.3)

    \frac{\mathrm{d}p_i}{\mathrm{d}t} = -\frac{\partial H(p_i, q_i)}{\partial q_i} ,    (2.4)

where i = 1 ... M. Each of the dynamical variables q_i and p_i is a vector of length f. The 2fM dynamical variables span the phase space.

Definition 2.2.1 — Phase space and state space. Phase space is the space where microstates of a system reside. Sometimes the term is used only for problems that can be described in spatial and momentum coordinates, sometimes for all problems where some type of Hamiltonian equation of motion applies. Sometimes the term state space is used for the space of microstates in problems that cannot be described by (only) spatial and momentum coordinates.

If the molecule is just a single atom, we have only f = 3 translational degrees of freedom and the Hamiltonian is given by

    H_i(p_i, q_i) = \frac{1}{2m}\left( p_{x,i}^2 + p_{y,i}^2 + p_{z,i}^2 \right) ,    (2.5)

describing translation. For molecules with n atoms, three of the f = 3n degrees of freedom are translational, two or three are rotational for linear and non-linear molecules, respectively, and the remaining 3n - 5 or 3n - 6 degrees of freedom are vibrational.

2.2.2 The Liouville Equation

Our observations do not allow us to specify phase space trajectories, i.e. the trajectory of microstates for a single system. Instead, we consider an ensemble of identical systems that all represent the same (observational) macrostate O but may be in different microstates. At a given time we can characterize such an ensemble by a probability density ρ(p, q, t) in phase space, where p and q are the vectors of all momentum and spatial coordinates in the system, respectively. We are interested in an equation of motion for this probability density, which corresponds to the full knowledge that we have on the system. This equation can be derived from an integral representation of ρ and the Hamiltonian equations of motion [Sch06].

Concept 2.2.2 — Liouville Equation.
The probability density ρ(p, q, t) in phase space evolves in time according to

    \frac{\partial \rho}{\partial t} = \sum_i \left( \frac{\partial \rho}{\partial p_i} \frac{\partial H}{\partial q_i} - \frac{\partial \rho}{\partial q_i} \frac{\partial H}{\partial p_i} \right) .    (2.6)

With the Poisson brackets

    \{u, v\} = \sum_i \left( \frac{\partial u}{\partial p_i} \frac{\partial v}{\partial q_i} - \frac{\partial u}{\partial q_i} \frac{\partial v}{\partial p_i} \right) ,    (2.7)

this Liouville equation can be expressed as

    \frac{\partial \rho}{\partial t} = -\{H, \rho\} .    (2.8)

For the probability density along a phase space trajectory, i.e., along a trajectory that is taken by microstates, we find

    \frac{\mathrm{d}}{\mathrm{d}t}\, \rho(q(t), p(t), t) = 0 .    (2.9)

If we consider a uniformly distributed number dN of ensemble members in a volume element dΓ_0 in phase space at time t = 0 and ask about the volume element dΓ in which these ensemble members are distributed at a later time, we find

    \mathrm{d}\Gamma = \mathrm{d}\Gamma_0 .    (2.10)

This is the Liouville theorem of mechanics.

2.2.3 Quantum Systems

Hamiltonian mechanics can be applied to quantum systems, with the Hamiltonian equations of motion being replaced by the time-dependent Schrödinger equation. The probability density in phase space is replaced by the density operator ρ̂ and the Liouville equation by the Liouville-von Neumann equation

    \frac{\partial \hat{\rho}}{\partial t} = -\frac{i}{\hbar} \left[ \hat{H}, \hat{\rho} \right] .    (2.11)

In quantum mechanics, observables are represented by operators Â. The expectation value of an observable can be computed from the density operator that represents the distribution of the ensemble in phase space,

    \langle \hat{A} \rangle = \mathrm{Trace}\left( \hat{\rho} \hat{A} \right) .    (2.12)

We note that the Heisenberg uncertainty relation does not introduce an additional complication in statistical mechanics. Determinism had been lost before, and the statistical character of the measurement on an individual system is unproblematic, as we seek only statistical predictions for a large ensemble. In the limit of an infinite ensemble, N → ∞, there is no uncertainty, and the expectation values of incompatible observables are well defined and can be measured simultaneously. Such an infinitely large system is not perturbed by the act of observing it. The only difference between the description of classical and quantum systems arises from their statistical behavior on permutation of the coordinates of two particles, see Section 6.2.
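As a small numerical illustration of Eq. (2.12), not part of the original notes and anticipating the spin S = 1/2 treatment of Section 6.3.1, the following Python sketch computes the expectation value ⟨S_z⟩ = Trace(ρ̂ S_z) for a hypothetical mixed state; the 75%/25% populations are an assumed example, not a derived result:

```python
import numpy as np

# z component of spin angular momentum for S = 1/2, in units of hbar:
# S_z = (1/2) * sigma_z with the Pauli matrix sigma_z
Sz = 0.5 * np.array([[1.0, 0.0], [0.0, -1.0]])

# Hypothetical mixed state: 75% population spin-up, 25% spin-down,
# written as a diagonal density matrix (no coherences)
rho = np.diag([0.75, 0.25])

# A density matrix has unit trace (total probability 1)
assert np.isclose(np.trace(rho), 1.0)

# Expectation value via Eq. (2.12): <A> = Trace(rho A)
Sz_expect = np.trace(rho @ Sz)
# 0.5 * (0.75 - 0.25) = 0.25 in units of hbar
```

For a pure state ρ̂ = |ψ⟩⟨ψ| the same trace formula reduces to the familiar ⟨ψ|Â|ψ⟩; the trace form is what generalizes to ensembles.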
2.3 Statistical Mechanics Based on Postulates

2.3.1 The Penrose Postulates

Penrose [Pen70] has made the attempt to strictly specify what results can be expected from statistical mechanics if the theory is based on a small number of plausible postulates.
1. Macroscopic physical systems are composed of molecules that obey classical or quantum mechanical equations of motion (dynamical description of matter).
2. An observation on such a macroscopic system can be idealized as an instantaneous, simultaneous measurement of a set of dynamical variables, each of which takes the values 1 or 0 only (observational description of matter).
3. A measurement on the system has no influence whatsoever on the outcome of a later measurement on the same system (compatibility).
4. The Markovian postulate (Concept 2.1.2).
5. Apart from the Bose and Fermi symmetry conditions for quantum systems, the whole phase space can, in principle, be accessed by the system (accessibility).

After the discussion above, only the second of these postulates may not immediately appear plausible. In the digital world of today it appears natural enough: measurements have resolution limits and their results are finally represented in a computer by binary numbers, which can be taken to be the dynamical variables in this postulate.

2.3.2 Implications of the Penrose Postulates

Entropy is one of the central quantities of thermodynamics, as it tells in which direction a spontaneous process in an isolated system will proceed. For closed systems that can exchange heat and work with their environment, such predictions on spontaneous processes are based on free energy, of which the entropy contribution is usually an important part. To keep such considerations consistent, entropy must have two fundamental properties:
1. If the system does not exchange energy with its environment, its entropy cannot decrease (non-decrease).
2. The entropy of two systems considered together is the sum of their separate entropies
(additivity).

Based on the Penrose postulates it can be shown [Pen70] that the definition of Boltzmann entropy (Chapter 5) ensures both properties, but that statistical expressions for entropy ensure only the non-decrease property, not in general the additivity property. This appears to leave us in an inconvenient situation. However, it can also be shown that for large systems, in the sense that the number of macrostates is much smaller than the number of microstates, the term that quantifies non-additivity is negligibly small compared to the total entropy [Pen70]. The problem is thus rather a mathematical beauty spot than a serious difficulty in the application of the theory.

3 — Probability Theory

3.1 Discrete Probability Theory

3.1.1 Discrete Random Variables

Consider a trial T where the observation is a measurement of the z component m_S of spin angular momentum of a spin S = 5/2. There are just six possible outcomes (events) that can be labeled with the magnetic spin quantum number m_S or indexed by integer numbers 1, 2, ... 6. In general, the probabilities of the six possible events will differ from each other. They will depend on preparation and may depend on evolution time before the observation. To describe such situations, we define a set of elementary events

    A = \{ a_j \} ,    (3.1)

where in our example index j runs from 1 to 6, whereas in general it runs from 1 to the number N_A of possible events. Each of the events is assigned a probability 0 ≤ P(a_j) ≤ 1. Impossible events (for a given preparation) have probability zero and a certain event has probability 1. Since one and only one of the events must happen in each trial, the probabilities are normalized, \sum_{j=1}^{N_A} P(a_j) = 1.
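The normalization of the elementary events and the frequentist limit of Eq. (2.2) can be illustrated numerically. The following sketch, given in Python rather than the notes' Matlab and not part of the original text, samples the six m_S outcomes from a purely hypothetical non-uniform distribution and checks that the observed frequencies approach the assigned probabilities as the number of trials grows:

```python
import random

rng = random.Random(42)  # fixed seed so the run is reproducible

def sample_event(probs):
    """Draw one event index from a discrete distribution by inverting the
    cumulative distribution with a uniform random number in (0, 1)."""
    u = rng.random()
    cumulative = 0.0
    for j, p in enumerate(probs):
        cumulative += p
        if u < cumulative:
            return j
    return len(probs) - 1  # guard against floating-point round-off

# Hypothetical probabilities for the six m_S outcomes (they need not be
# uniform; they depend on preparation and evolution time)
probs = [0.10, 0.15, 0.25, 0.25, 0.15, 0.10]
assert abs(sum(probs) - 1.0) < 1e-12  # normalization of Eq. (3.1) events

N = 100_000
counts = [0] * 6
for _ in range(N):
    counts[sample_event(probs)] += 1

# Relative frequencies approach P(a_j) in the limit of Eq. (2.2)
frequencies = [c / N for c in counts]
```

The inner function here is the "adjustment" of uniform pseudo-random numbers to prescribed event probabilities mentioned in Concept 3.1.1 below; with uniform probs = [1/6] * 6 it reduces to the fair die of Problem 3.1.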
A simplified model of our example trial is the rolling of a die. If the die is fair, we have the special situation of a uniform probability distribution, i.e., P_A(a_j) = 1/6 for all j.

Concept 3.1.1 — Random variable. A set of random events with their associated probabilities is called a random variable. If the number of random events is countable, the random variable is called discrete. In a computer, numbers can be assigned to the events, which makes the random variable a random number. A series of trials can then be simulated by generating a series of N pseudo-random numbers that assign the events observed in the N trials. Such simulations are called Monte Carlo simulations. Pseudo-random numbers obtained from a computer function need to be adjusted so that they reproduce the given or assumed probabilities of the events.

Problem 3.1 Using the Matlab function rand, which provides uniformly distributed random numbers in the open interval (0, 1), write a program that simulates throwing a die with six faces. The outer function should have trial number N as an input and a vector of the numbers of encountered ones, twos, ... and sixes as an output. It should be based on an inner function that simulates a single throw of the die. Test the program by determining the difference from the expectation P_A(a_j) = 1/6 for ever larger numbers of trials.

3.1.2 Multiple Discrete Random Variables

For two sets of events A and B and their probabilities, we define a joint probability P(a_j, b_k) that is the probability of observing both a_j and b_k in the same trial. An example is the throwing of two dice, one black and one red, and asking about the probability that the black die shows a 2 and the red die a 3. A slightly more complicated example is the measurement of the individual z components of spin angular momentum of two coupled spins S_A = 5/2 and S_B = 5/2. Like individual probabilities, joint probabilities fall in the closed interval [0, 1].
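The two-dice example can be checked by a small Monte Carlo simulation in the spirit of Problem 3.1. The sketch below is in Python rather than Matlab; the function names (roll_die, joint_frequency) and the seed are illustrative choices, not part of the notes.

```python
import random

def roll_die():
    """Simulate a single throw of a fair six-sided die."""
    return random.randint(1, 6)

def joint_frequency(n_trials, a, b, seed=42):
    """Estimate the joint probability P(a, b) that the black die shows a
    and the red die shows b, from n_trials independent double throws."""
    rng = random.Random(seed)  # reproducible pseudo-random sequence
    hits = 0
    for _ in range(n_trials):
        black = rng.randint(1, 6)
        red = rng.randint(1, 6)
        if black == a and red == b:
            hits += 1
    return hits / n_trials

# For independent fair dice the exact joint probability is 1/36; the
# estimate approaches this value for ever larger numbers of trials.
estimate = joint_frequency(100_000, 2, 3)
```

As in Problem 3.1, the instructive part is watching the deviation of the estimate from 1/36 shrink as n_trials grows.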
Joint probabilities are normalized,

\sum_a \sum_b P(a,b) = 1 . (3.2)

Note that we have introduced a brief notation that suppresses indices j and k. This notation is often encountered because of its convenience in writing.

If we know the probabilities P(a,b) for all N_A N_B possible combinations of the two events, we can compute the probability of a single event, for instance a,

P_A(a) = \sum_b P(a,b) , (3.3)

where P_A(a) is the marginal probability of event a.

The unfortunate term 'marginal' does not imply a small probability. Historically, these probabilities were calculated in the margins of probability tables [Swe12].

Another quantity of interest is the conditional probability P(a|b) of an event a, provided that b has happened. For instance, if we draw two cards from a full deck, the probability of the second card being a Queen is conditional on the first card having been a Queen. With the definition for the conditional probability we have

P(a,b) = P(a|b) P_B(b) (3.4)
       = P(b|a) P_A(a) . (3.5)

Theorem 3.1.2 — Bayes' theorem. If the marginal probability of event b is not zero, the conditional probability of event a given b is

P(a|b) = \frac{P(b|a) P_A(a)}{P_B(b)} . (3.6)

Bayes' theorem is the basis of Bayesian inference, where the probability of proposition a is sought given prior knowledge (short: the prior) b. Often Bayesian probability is interpreted subjectively, i.e., different persons, because they have different prior knowledge b, will come to different assessments for the probability of proposition a. This interpretation is incompatible with theoretical physics, where, quite successfully, an objective reality is assumed. Bayesian probability theory can also be applied with an objective interpretation in mind and is nowadays used, among other applications, in structural modeling of biomacromolecules to assess agreement of a model (the proposition) with experimental data (the prior).

In experimental physics, biophysics, and physical chemistry, Bayes' theorem can be used to assign experimentally informed probabilities to different models for reality.
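A minimal numerical sketch of Eq. (3.6) in Python; the probabilities below are invented purely for illustration and do not come from the notes.

```python
def bayes(p_b_given_a, p_a, p_b):
    """Bayes' theorem, Eq. (3.6): P(a|b) = P(b|a) * P_A(a) / P_B(b).

    p_b_given_a : conditional probability P(b|a)
    p_a         : marginal (prior) probability P_A(a)
    p_b         : marginal probability P_B(b), must be nonzero
    """
    if p_b == 0:
        raise ValueError("marginal probability P(b) must be nonzero")
    return p_b_given_a * p_a / p_b

# Hypothetical numbers: proposition a has prior probability 0.2, the
# measurement output b occurs with marginal probability 0.25, and b is
# observed with probability 0.5 when a is true.
p_posterior = bayes(p_b_given_a=0.5, p_a=0.2, p_b=0.25)
# The measurement doubles our belief in a: P(a|b) = 0.4.
```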
For example, assume that a theoretical modeling approach, for instance an MD simulation, has provided a set of conformations A = \{a_j\} of a protein molecule and associated probabilities P_A(a_j). The probabilities are related, via the Boltzmann distribution, to the free energies of the conformations (this point is discussed later in the lecture course). We further assume that we have a measurement B with output b_k and we know the marginal probability P_B(b_k) of encountering this output for a random set of conformations of the protein molecule. Then we need only a physical model that provides the conditional probabilities P(b_k|a_j) of measuring b_k given the conformations a_j and can compute the probability P(a_j|b_k) that the true conformation is a_j, given the result of our measurement, via Bayes' theorem, Eq. (3.6). This procedure can be generalized to multiple measurements. The required P(b_k|a_j) depend on measurement errors. The approach allows for combining possibly conflicting modeling and experimental results to arrive at a 'best estimate' for the distribution of conformations.

The events associated with two random variables can occur completely independently of each other. This is the case for throwing two dice: the number shown on the black die does not depend on the number shown on the red die. Hence, the probability to observe a 2 on the black and a 3 on the red die is (1/6)·(1/6) = 1/36. In general, joint probabilities of independent events factorize into the individual (or marginal) probabilities, which leads to huge simplifications in computations. In the example of two coupled spins S_A = 5/2 and S_B = 5/2 the two random variables m_{S,A} and m_{S,B} may or may not be independent. This is decided by the strength of the coupling, the preparation of trial T, and the evolution time t before observation.

Concept 3.1.3 — Independent variables.
If two random variables are independent, the joint probability of two associated events is the product of the two marginal probabilities,

P(a,b) = P_A(a) P_B(b) . (3.7)

As a consequence, the conditional probability P(a|b) equals the marginal probability of a (and vice versa),

P(a|b) = P_A(a) . (3.8)

For a set of more than two random variables two degrees of independence can be established, a weak type of pairwise independence and a strong type of mutual independence. The set is mutually independent if the marginal probability distribution in any subset, i.e. the set of marginal probabilities for all event combinations in this subset, is given by the product of the corresponding marginal distributions for the individual events.¹ This corresponds to complete independence. Weaker pairwise independence implies that the marginal distributions for any pair of random variables are given by the product of the two corresponding distributions. Note that even weaker independence can exist within the set, but not throughout the set: some, but not all, pairs or subsets of random variables can exhibit independence.

Another important concept for multiple random variables is whether or not they are distinguishable. In the example above we used a black and a red die to specify our events. If both dice were black, the event combinations (a_2, b_3) and (a_3, b_2) would be indistinguishable and the corresponding composite event of observing a 2 and a 3 would have a probability of 1/18, i.e. the product of the probability 1/36 of the basic composite event with its multiplicity 2. In general, if n random variables are indistinguishable, the multiplicity equals the number of permutations of the n variables, which is n! = 1·2·…·(n−1)·n.

¹ As the distributions are vectors and all combinations have to be considered, an outer product must be taken.

3.1.3 Functions of Discrete Random Variables

We consider an event g that depends on two other events a and b.
For example, we ask for the probability that the sum of the numbers shown by the black and red die is g, where g can range from 2 to 12, given that we know the probabilities P(a,b), which in our example all have the value 1/36. In general, the probability distribution of random variable G can be computed by

P_G(g) = \sum_a \sum_b \delta_{g,G(a,b)} P(a,b) , (3.9)

where G(a,b) is an arbitrary function of a and b and the Kronecker delta \delta_{g,G(a,b)} assumes the value one if g = G(a,b) and zero otherwise. In our example, g = G(a,b) = a + b will assume the value of 5 for the event combinations (1,4), (2,3), (3,2), (4,1) and no others. Hence, P_G(5) = 4/36 = 1/9. There is only a single combination for g = 2, hence P_G(2) = 1/36, and there are 6 combinations for g = 7, hence P_G(7) = 1/6. Although the probability distributions for the individual random numbers A and B are uniform, the one for G is not. It peaks at the value of g = 7 that has the most realizations.

Such peaking of probability distributions that depend on multiple random variables occurs very frequently in statistical mechanics. The peaks tend to become the sharper the larger the number of random variables that contribute to the sum. If this number N tends to infinity, the distribution of the sum g is so sharp that the distribution width (to be specified below) is smaller than the error in the measurement of the mean value g/N (see Section 3.1.5). This effect is the very essence of statistical thermodynamics: although quantities for a single molecule may be broadly distributed and unpredictable, the mean value for a large number of molecules, let's say 10^18 of them, is very well defined and perfectly predictable.

In a numerical computer program, Eq. (3.9) for only two random variables can be implemented very easily by a loop over all possible values of g with inner loops over all possible values of a and b. Inside the innermost loop, G(a,b) is computed and compared to loop index g to add or not add P(a,b) to the bin corresponding to value g.
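The loop scheme just described can be sketched in Python as follows (the notes pose the corresponding exercise, Problem 3.2, in Matlab; the function name sum_distribution is an illustrative choice):

```python
def sum_distribution():
    """Compute P_G(g) for the sum of two fair dice by the loop scheme of
    Eq. (3.9): for each target value g, accumulate P(a, b) = 1/36
    whenever G(a, b) = a + b equals g."""
    p_joint = 1.0 / 36.0  # uniform joint probability for two fair dice
    p_g = {}
    for g in range(2, 13):          # all possible values of the sum
        total = 0.0
        for a in range(1, 7):       # black die
            for b in range(1, 7):   # red die
                if a + b == g:      # Kronecker delta in Eq. (3.9)
                    total += p_joint
        p_g[g] = total
    return p_g

p_g = sum_distribution()
# P_G(2) = 1/36, P_G(5) = 1/9, P_G(7) = 1/6, and the distribution is
# normalized, reproducing the values derived in the text.
```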
Note however that such an approach does not carry over to large numbers of random variables, as the number of nested loops increases with the number of random variables and computation time thus increases exponentially. Analytical computations are simplified by the fact that \delta_{g,G(a,b)} usually deviates from zero only within certain ranges of the summation indices j (for a) and k (for b). The trick is then to find the proper combinations of index ranges.

Problem 3.2 Compute the probability distribution for the sum g of the numbers shown by two dice in two ways. First, write a computer program using the approach sketched above. Second, compute the probability distribution analytically by making use of the uniform distribution for the individual events (P(a,b) = 1/36 for all a, b). For this, consider index ranges that lead to a given value of the sum g.²

² The solution of the second task can be found in [Swe12].

3.1.4 Discrete Probability Distributions

In most cases random variables are compared by considering the mean values and widths of their probability distributions. As a measure of the width, the standard deviation σ of the values from the mean value is used, which is the square root of the variance σ². The concept can be generalized by considering functions f(A) of the random variable. In the following expressions, f(A) = A provides the mean value and standard deviation of the original random variable A.

Concept 3.1.4 — Mean value and standard deviation. For any function F(A) of a random
