Statistical Physics
Prof. Dirk Brockmann, WS 13/14

1 Introduction

The content of this lecture is contained in its title: we will (mostly) be talking about statistical physics. First, however, we have to learn a few fundamental concepts about random processes, and that in turn requires that we talk about random events, randomness in general, and probabilities.

We all have a more or less well established intuition about what random events and probabilities are. Yet our intuition often fails when we try to estimate probabilities for even very simple random events. This is why casinos work.

What is a random event? An event that one didn't expect to happen, maybe. That notion turns out to be insufficient to develop a theory. What we mean by a random event is connected to measurement. In physics we do measurements. That means that if we carry out an experiment it may have a certain outcome $O_1$. If we perform the exact same experiment a second time, the outcome is $O_2$, which may be different from $O_1$. Implicitly we assumed here that we can actually repeat an experiment under identical conditions. This is a fundamental assumption in science. And a bit silly, if you think about it.

Now, if we perform a sequence of identical experiments and the outcome is not the same every time, these outcomes are random events. We may argue that the experiment is ill designed. However, in the sequence of outcomes $O_i$ we may nevertheless see structure. In science we make this precise by connecting a measurement to the experiment. The measurement typically assigns a number to each outcome $O_i$ or to subsets of outcomes. Note that, in order to do this, we must first define the entire set of potential outcomes. In the simplest case, we map every outcome $O_i$ to either 1 or 0. For instance, the experiment could be rolling a regular octahedron; the potential outcomes are the faces $S_1, \ldots, S_8$, and the apparatus displays 1 for the outcomes $S_1, S_2, S_3, S_4, S_5$ and 0 for $S_6, S_7, S_8$.

Now let's assume we perform the experiment $N$ times. In well designed experiments we will find that, as we increase $N$, the fractions $f_1 = N_1/N$ and $f_0 = N_0/N$ in which we measure 1 and 0, respectively, each converge to a single number ($N_1$ and $N_0$ are the numbers of experiments that give 1 and 0, respectively). This is science, and nothing more. It is really the only requirement we make. In the case of the regular octahedron we would find $f_1 \to 0.625$ and $f_0 \to 0.375$. This requirement of convergence of the frequencies of experimental measurements is connected to the Law of Large Numbers, which we will discuss later in more detail. A small numerical sketch of this convergence is given after the list of mechanisms below.

In mathematical terms, the measurement is a random variable $X$ that maps from the space of possible outcomes to the rational numbers (because all we can measure are rational numbers). We will be more precise later on. The frequencies that we observe in an infinite sequence of experiments are probabilities. Because we cannot perform an infinite sequence of experiments, it is impossible to determine these quantities with arbitrary precision. Typically we therefore make a priori assumptions about these probabilities and then check whether our finite sequence of experiments is consistent with these assumptions.

Figure 1.1: Experiment and Measurement.

Figure 1.2: Stochasticity in systems.

Now, what are the possible reasons behind the uncertainty of experimental outcomes? There are basically three potential mechanisms that can lead to uncertainty:

1. Chaotic dynamics
2. Noise
3. Quantum mechanical effects
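The convergence of the frequencies $f_1$ and $f_0$ mentioned above can be illustrated with a minimal simulation. The following sketch is not part of the original notes; it assumes a fair octahedron simulated with Python's standard random module, and the sample sizes and the fixed seed are arbitrary illustrative choices.

```python
import random

def octahedron_frequencies(N, seed=0):
    """Roll a fair octahedron N times; the apparatus displays 1 for the
    faces S_1..S_5 and 0 for the faces S_6..S_8.  Returns (f_1, f_0)."""
    rng = random.Random(seed)
    n1 = sum(1 for _ in range(N) if rng.randint(1, 8) <= 5)  # faces S_1..S_5
    return n1 / N, (N - n1) / N

for N in (100, 10_000, 1_000_000):
    f1, f0 = octahedron_frequencies(N)
    print(f"N = {N:>9}:  f_1 = {f1:.4f}  f_0 = {f0:.4f}")
# As N grows, f_1 approaches 5/8 = 0.625 and f_0 approaches 3/8 = 0.375,
# illustrating the convergence of frequencies (Law of Large Numbers).
```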
1.1 Reasons for Randomness

Chaotic dynamics

The regular octahedron example above represents a large class of systems which evolve according to classical dynamics, for instance

$\dot{x} = f(x),$

in which the vector $x(t)$ describes the time evolution of the die. We would therefore think that the same initial condition (the same experiment), $x(0) = x_0$, yields the same outcome. The source of randomness here is the uncertainty in the initial condition, combined with a dynamics $f$ in which small deviations of the initial conditions diverge exponentially fast, so that the initial condition is very quickly "forgotten" and it does not matter in which initial condition we started. The "same" experiment in this case only refers to, for instance, using the same type of die or dropping it from the same (sufficiently large) height.

Problem 1.1. Consider the following (very famous) iterated map,

$x_{n+1} = \lambda x_n (1 - x_n),$

with a given starting value $x_0 \in (0, 1)$ and $\lambda = 4$. Let's assume we devise an experimental apparatus that iterates this map $N = 10000$ times and then produces the experimental outcome $x_N$. Start with the two values $x_0 = 0.3$ and $x_0 = 0.30001$. What is $x_N$ in both cases? Also compute the sequence $x_1, x_2, \ldots$ for, say, 100 iterations with $x_0 = 0.3$. Do you see any pattern? (A small numerical sketch of this experiment is given at the end of this section.)

Noise

Another source of randomness is noise acting on a system that, without the noise, would evolve according to deterministic dynamical laws. Noise means, for example, that a deterministic dynamical system such as

$x_{n+1} = x_n - \lambda x_n$ with $0 < \lambda < 1$,

which would predictably converge to $x_\infty = 0$, is subjected to small random impacts $\delta_n$ in each iteration,

$x_{n+1} = x_n - \lambda x_n + \delta_n.$

This sort of thing occurs in many physical, biological and other systems and will eventually lead us to stochastic differential equations. (A corresponding numerical sketch is also given at the end of this section.)

Quantum Mechanics

For the chaotic as well as the noisy systems one could argue that it is merely our lack of knowledge that forces us to view the systems with probabilistic eyes. The chaotic system is deterministic, and if we could specify the initial conditions with arbitrary precision we would no longer have a random evolution. In the noisy system we just have to figure out the source of the noise, which is most likely also just a chaotic system, so further study of the noise would again lead to a deterministic, non-random description. However, there is one fundamental, intrinsically probabilistic theory for the time evolution of things on a small scale, and that is quantum mechanics. Quantum systems evolve according to natural laws that are by definition probabilistic. A quantum mechanical system is described by a wave function that defines the probability of finding the system in a particular state. Quantum mechanics is a very odd theory. One consequence of it is that, no matter how hard we try, if we measure for example the position and momentum of a particle, we are limited by Heisenberg's uncertainty principle,

$\Delta x \, \Delta p \geq \hbar / 2,$

where $\Delta x$ and $\Delta p$ are the uncertainties of the position and momentum measurements, respectively. We can therefore not measure position and momentum with arbitrary precision.
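As a numerical sketch for Problem 1.1 (not part of the original notes), the following snippet iterates the logistic map with $\lambda = 4$ for the two starting values given in the problem; printing only the final value $x_N$ keeps the output short.

```python
def iterate_logistic(x0, n, lam=4.0):
    """Iterate x_{k+1} = lam * x_k * (1 - x_k) for n steps and return x_n."""
    x = x0
    for _ in range(n):
        x = lam * x * (1.0 - x)
    return x

N = 10_000
for x0 in (0.3, 0.30001):
    print(f"x_0 = {x0:<8}  ->  x_N = {iterate_logistic(x0, N):.6f}")
# Although the two initial conditions differ by only 1e-5, the two values
# of x_N bear no resemblance to each other: nearby trajectories of this
# chaotic map separate exponentially fast, so the outcome looks random.
```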
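In the same spirit, here is a minimal sketch of the noisy iteration $x_{n+1} = x_n - \lambda x_n + \delta_n$ from the Noise paragraph (again not part of the original notes). The decay rate $\lambda = 0.1$, the uniform noise amplitude $0.05$ and the seed are arbitrary illustrative choices.

```python
import random

def noisy_relaxation(x0=1.0, lam=0.1, amp=0.05, n=50, seed=1):
    """Iterate x_{k+1} = x_k - lam*x_k + delta_k with delta_k drawn
    uniformly from [-amp, amp].  With amp = 0 the sequence decays to 0."""
    rng = random.Random(seed)
    x, traj = x0, [x0]
    for _ in range(n):
        x = x - lam * x + rng.uniform(-amp, amp)
        traj.append(x)
    return traj

deterministic = noisy_relaxation(amp=0.0)  # smooth decay towards x = 0
noisy = noisy_relaxation(amp=0.05)         # fluctuates around x = 0 instead
print("deterministic tail:", [round(v, 4) for v in deterministic[-5:]])
print("noisy tail:        ", [round(v, 4) for v in noisy[-5:]])
```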