# umdberg / Probability


### Introducing probability

Probability is one of those words for which we all have an intuition, but which is surprisingly hard to define. For example, in our discussion of diffusion, we made the assumption that particles move either to the left or right with equal probability. But try to define what this means. Try, more concretely, to define what it means for a coin to have an "equal probability" of coming up heads or tails, but without using words in your definition that are synonymous with "probability" (such as "chance" and "likely"). It's very hard to do! In fact, entire branches of philosophy have been devoted to the question of how to define what is meant by "equal probability". So if you find yourself thinking hard about the probabilities we encounter, you are in good company!

The key theme in probability is lack of control. When we flip a coin, it's extremely hard to control which way the coin will come down. The outcome is very sensitive to the starting conditions you give it, at a level of sensitivity greater than you can control. Which, of course, is the point of flipping a coin.

One definition of "equal probability" might look something like this:

> As the number of tosses of a fair coin approaches infinity, the fraction of times that the coin lands heads and the fraction of times that it lands tails approach the same value.

Is that a useful definition? Maybe, but it doesn't seem to capture everything that we intuitively know to be true. We'd like to know what the chances are that the coin will land on "heads" when we toss it just once, without having to toss it an infinite number of times. And we all have the feeling that the answer is obvious (it's ½!) even if we have a difficult time expressing it rigorously.
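The frequency idea in that definition can be checked numerically. Here is a quick sketch (in Python, which this page does not otherwise use, so take it purely as an illustration): we toss a simulated fair coin more and more times and watch the fraction of heads settle toward ½. The function name and fixed seed are my own choices.

```python
import random

def heads_fraction(n_tosses, seed=0):
    """Toss a simulated fair coin n_tosses times; return the fraction of heads."""
    rng = random.Random(seed)  # fixed seed so the run is repeatable
    heads = sum(rng.random() < 0.5 for _ in range(n_tosses))
    return heads / n_tosses

# The fraction wanders for small n but hugs 1/2 as n grows.
for n in (10, 1000, 100_000):
    print(n, heads_fraction(n))
```

Note that no finite run ever *proves* the coin is fair; the fraction merely tends to wander less and less far from ½.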

### How would we know if it’s fair?

Of course, determining whether a coin is "fair" or not would require testing it an infinite number of times. And in the real world we expect that no real coin would be absolutely fair. It might be a tiny bit unbalanced so that it systematically, over many, many flips, comes up 0.1% more heads than tails. Would we accept that as a "fair" coin?

One of the interesting questions of probability is "how do you know" that a coin is fair, for example? Or better: how well do you know that a coin "appears to be fair"? This question carries us beyond the scope of this class into the realm of Bayesian statistics. We won't discuss that here, though we will note that Bayesian analyses play a large role in the modern approach to medical diagnosis, and both medical students and biological researchers will eventually have to master this subject!

### A simple model for thinking about probabilities: a fair coin

Rather, we will make a simplified model that we can analyze in detail mathematically. We will assume that we have a (mathematically) fair coin: one that, if we flipped it an infinite number of times, would come up heads and tails an equal number of times.

Now we can get back to our story. Let's see if we can make some interesting observations about probabilities by relying on just our intuitions. Suppose, for example, that I toss a (mathematically) fair coin ten times. How many times will it come up "heads"? The correct answer is: who knows! In ten flips, the coin may land on heads seven times, or it may land on heads only twice. We can't predict for sure. But what we do know is that if it is a fair coin, it is more likely to land on heads 5 times than to land on heads all 10 times.

But why do we feel that is the case? Why is the result of 5 heads and 5 tails more likely than the result of 10 heads and 0 tails? If each flip is equally likely to give heads as it is to give tails, why is the 5/5 result more likely than the 10/0 result?

The answer is that there are many more ways to arrive at the 5/5 result than there are ways to arrive at the 10/0 result. In fact, there is exactly one way to arrive at the 10/0 result. Note that in stating "5/5" we are assuming that we don't care in which order the heads and tails appear; we only care about the total number.

### If we only care about the totals: microstates and macrostates

If we only care about the totals, there is only ONE way in which you would arrive at the result that the series of tosses produced 10 heads: HHHHHHHHHH. You have a 50% chance that the first flip will be a head, a 50% chance that the second will be a head, and so on. Therefore the probability of 10 heads is 1/2¹⁰, or 1 in 1024.

On the other hand, here are just a few of the 252 ways of arriving at the 5/5 result: HHHHHTTTTT, HTHTHTHTHT, TTTTTHHHHH, HTTHHTTHTH. Each of these particular strings also has only a 1 in 1024 probability of coming up, since there is a 50-50 chance of a head or a tail on each flip. But since there are 252 ways of arriving at 5/5, the chance of finding 5/5 (in any order) is 252/1024, much greater than the chance of finding 10/0, and in fact greater than the chance of finding any other specific mix of heads and tails.
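The counting in the last two paragraphs is exactly what the binomial coefficient "10 choose k" computes: the number of ten-flip strings containing k heads. A small check, sketched in Python (`math.comb` assumes Python 3.8 or later):

```python
from math import comb

total = 2 ** 10              # 1024 equally likely ten-flip strings
print(comb(10, 10))          # ways to get 10 heads: 1
print(comb(10, 5))           # ways to get 5 heads and 5 tails: 252
print(comb(10, 5) / total)   # probability of the 5/5 split: 252/1024 = 0.246...

# 5 heads is more likely than any other head count
assert max(range(11), key=lambda k: comb(10, k)) == 5
```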

Another way of expressing the probabilistic intuition we have been describing is to say that a system is much more likely to be in a state that is consistent with many "arrangements" of the elements comprising the state than to be in a state consistent with only a few such "arrangements". An "arrangement" in the coin-toss discussion corresponds to a particular ten-toss result, say, HTTHTTHHHT. The 10/0 result is consistent with only one such arrangement, while the 5/5 result is consistent with 252 such arrangements.

The difference between a specific string of heads and tails and the total count (in any order) is an example of a very important distinction we use in our development of the 2nd Law of Thermodynamics. The specific string, in which every flip is identified, is called a microstate. The coarser description, in which we only specify the total number of heads and tails that result, is called a macrostate. In our mathematically fair coin model, what "fair" means is that every microstate has the same probability of appearing.
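The microstate/macrostate distinction can be made concrete by brute force: list every one of the 2¹⁰ microstates and group them by their macrostate, i.e. by the total number of heads. A sketch, again assuming Python:

```python
from itertools import product
from collections import Counter

# Every microstate: one of the 2**10 equally likely strings of H and T.
microstates = ["".join(s) for s in product("HT", repeat=10)]

# The macrostate records only the total number of heads.
macrostates = Counter(s.count("H") for s in microstates)

print(len(microstates))   # 1024 microstates in all
print(macrostates[10])    # 1   (only HHHHHHHHHH gives the 10/0 macrostate)
print(macrostates[5])     # 252 microstates give the 5/5 macrostate
```

Since every microstate is equally likely, a macrostate's probability is just its count divided by 1024, which recovers the 1/1024 and 252/1024 figures above.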

What happens as the number of tosses increases from 10 to, say, 1000? As you might guess, it becomes even more likely to obtain a result near 500/500 than to obtain a result near 1000/0. In the jargon of statistics, the probability distribution gets "sharper." In chemical and biological systems we often deal with HUGE numbers of particles, often on the scale of moles (one mole of molecules contains more than 10²³ particles!), so one can imagine what the probability distribution looks like in such cases. It is incredibly sharp. The only macrostate that we ever see is the most probable one. Regularities emerge from the probabilities that are (as long as we are talking about many particles) as strong as laws we consider to be "absolute" rather than "probabilistic".
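One way to see this "sharpening" numerically is to ask how much of the total probability lies within a fixed percentage of the 50/50 split as the number of tosses grows. A rough Python sketch (the ±5% window is an arbitrary choice for illustration, and exact integer arithmetic via `math.comb` keeps the large binomials honest):

```python
from math import comb

def prob_within(n, frac=0.05):
    """Probability that the heads count of n fair tosses lies within
    a fraction frac of the 50/50 split."""
    lo = int(n * (0.5 - frac))
    hi = int(n * (0.5 + frac))
    return sum(comb(n, k) for k in range(lo, hi + 1)) / 2 ** n

# The probability of landing near 50/50 climbs toward 1 as n grows.
for n in (10, 100, 1000):
    print(n, prob_within(n))
```

For 1000 tosses, well over 99% of the probability already sits within ±5% of 500 heads; with a mole of particles the distribution is, for all practical purposes, a spike.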