Maths in a minute: The binomial distribution

    7 January, 2022

In our brief introduction to probability distributions we talked about rolling dice, so let's stick with that example. Imagine I roll a die three times and each time you try to guess what the outcome will be. What's the probability of you guessing exactly $k$ rolls right, where $k$ is 0, 1, 2 or 3?

    More generally, imagine you perform an experiment (eg roll a die) $N$ times, and each time the result can be success or failure. What's the probability you get exactly $k$ successes, where $k$ can be any integer from $0$ to $N$?

In our example, as long as the die is fair, you have a probability of $p=1/6$ of guessing right on any given roll. Since the probability of several independent events all occurring (here, guessing correctly on each of the three rolls) is the product of the individual probabilities, your probability of three correct guesses is $$P(Correct = 3) = 1/6 \times 1/6 \times 1/6 = (1/6)^3\approx 0.005.$$ Your probability of guessing a single roll wrong is $1-p=1-1/6=5/6$. By the same reasoning as above, the probability of getting no guess right is $$P(Correct = 0) = 5/6 \times 5/6 \times 5/6 = (5/6)^3\approx 0.579.$$

    What about the probability of guessing one roll right, so $k=1$? There are three ways in which this could happen:

    • You get the 1st roll right and the other two wrong
    • You get the 2nd roll right and the other two wrong
    • You get the 3rd roll right and the other two wrong

Since each involves one correct and two incorrect guesses, the probability of each of the three scenarios is $1/6 \times (5/6)^2.$ And since the probability that exactly one of several mutually exclusive events occurs is the sum of the individual probabilities, the probability of getting exactly one guess right is $$P(Correct=1)=3\times 1/6 \times (5/6)^2 \approx 0.347. $$ Finally, we look at the probability of two correct guesses. Again this can happen in three ways (we leave it up to you to work these out). Each individual way has a probability of $(1/6)^2\times 5/6$, so the overall probability is $$P(Correct=2)=3\times (1/6)^2 \times 5/6 \approx 0.069. $$
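If you'd like to check these numbers yourself, here is a small Python sketch (purely an illustration, not part of the original article) that lists all eight patterns of correct and incorrect guesses across the three rolls and adds up the probability of each pattern according to how many guesses it gets right.

```python
from itertools import product

p = 1 / 6          # probability of guessing a single roll correctly
q = 1 - p          # probability of guessing a single roll wrong

# Enumerate all 2^3 patterns of correct/incorrect guesses over three rolls
# and accumulate each pattern's probability by its number of correct guesses.
prob = {k: 0.0 for k in range(4)}
for pattern in product([True, False], repeat=3):
    k = sum(pattern)                      # number of correct guesses in this pattern
    prob[k] += (p ** k) * (q ** (3 - k))  # independent rolls, so multiply

for k in range(4):
    print(f"P(Correct = {k}) = {prob[k]:.3f}")
# Prints (to three decimals): 0.579, 0.347, 0.069, 0.005
```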

    Here's the histogram displaying the distribution.

[Histogram of the distribution $P(Correct=k)$ for $k=0, 1, 2, 3$]

Now let's look at the general set-up. You're doing $N$ experiments that can each end in success or failure, and you're asking for the probability that there are exactly $k$ successes among the $N$ experiments. Write $p$ for the probability of success, so $1-p$ is the probability of failure. By the same reasoning as above, a particular sequence of $k$ successes and $N-k$ failures has probability $$p^k(1-p)^{N-k}.$$ But, also as above, such a sequence can occur in several ways, each way defined by how the successes are sprinkled in among the failures. It turns out that the number of ways you can sprinkle $k$ objects in among a sequence of $N$ objects, denoted by ${N\choose k}$, is given by $${N\choose k} = \frac{N!}{k!(N-k)!}.$$ Here the notation $i!$, where $i$ is a positive integer, stands for $$i!=i\times (i-1)\times (i-2)\times ... \times 2 \times 1$$ (and $0!$ is defined to equal 1). We now have a neat way of writing the probability of $k$ successes: $$P(Correct=k) = {N\choose k}p^k(1-p)^{N-k}.$$ That's the binomial distribution.
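The formula translates directly into a few lines of code. Here is a minimal Python sketch, again just an illustration, using the standard library function math.comb for the binomial coefficient ${N\choose k}$.

```python
from math import comb

def binomial_pmf(k: int, n: int, p: float) -> float:
    """Probability of exactly k successes in n independent trials,
    each of which succeeds with probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# The three-roll guessing game: n = 3 trials, success probability p = 1/6.
for k in range(4):
    print(f"P(Correct = {k}) = {binomial_pmf(k, 3, 1/6):.3f}")
```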

The mean of this distribution, also known as the expectation, is $Np.$ So in our example above, where $N=3$ and $p=1/6$, the mean is $$Np=3/6=1/2.$$ Loosely speaking, this means that if we played our game of guessing three rolls lots and lots of times, then on average you could expect to get half a roll per game right. Or, to phrase it in a way that uses whole numbers, on average you could expect to get one roll right every two games.
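One way to see this "on average" statement in action is a quick simulation. The sketch below (an illustrative assumption, not something from the article) models each guess as a uniformly random number from 1 to 6, which gives the same probability $p=1/6$ of being right, and plays the three-roll game many times.

```python
import random

def play_game(n_rolls: int = 3) -> int:
    """Play one game: guess each of n_rolls die rolls, return how many guesses were right."""
    correct = 0
    for _ in range(n_rolls):
        roll = random.randint(1, 6)
        guess = random.randint(1, 6)   # a blind, uniformly random guess
        if guess == roll:
            correct += 1
    return correct

random.seed(0)                         # for reproducibility
n_games = 100_000
total_correct = sum(play_game() for _ in range(n_games))
print(f"Average correct guesses per game: {total_correct / n_games:.3f}")
# Should come out close to the mean Np = 0.5
```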

    The variance of the binomial distribution, which measures how spread out the probabilities are, is $$Np(1-p).$$ So in our example above it is $$Np(1-p)=\frac{3}{6}\times \frac{5}{6}=\frac{5}{12}.$$
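Both the mean and the variance can be checked directly against the distribution itself. The following self-contained Python sketch (again only an illustration) computes them from the probabilities $P(Correct=k)$ for $N=3$ and $p=1/6$.

```python
from math import comb

def binomial_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 3, 1/6
mean = sum(k * binomial_pmf(k, n, p) for k in range(n + 1))
variance = sum((k - mean)**2 * binomial_pmf(k, n, p) for k in range(n + 1))

print(f"mean     = {mean:.4f}   (Np       = {n * p:.4f})")
print(f"variance = {variance:.4f}   (Np(1-p)  = {n * p * (1 - p):.4f})")
# Both lines agree: mean = 0.5 and variance = 5/12 ≈ 0.4167.
```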

    The shape of the binomial distribution depends on the value of the mean and the number of experiments. Here are some more examples:

[Histograms of binomial distributions for various values of $N$ and $p$]
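If you'd like to draw pictures like these yourself, here is a rough Python sketch, assuming the matplotlib library is available. The particular values of $N$ and $p$ are chosen purely for illustration and aren't necessarily the ones shown in the figure above.

```python
import matplotlib.pyplot as plt
from math import comb

def binomial_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

# A few (N, p) pairs to compare the shapes of the distributions.
examples = [(3, 1/6), (10, 0.5), (20, 0.2), (50, 0.8)]

fig, axes = plt.subplots(1, len(examples), figsize=(14, 3))
for ax, (n, p) in zip(axes, examples):
    ks = list(range(n + 1))
    ax.bar(ks, [binomial_pmf(k, n, p) for k in ks])
    ax.set_title(f"N = {n}, p = {p}")
    ax.set_xlabel("k")
axes[0].set_ylabel("probability")
plt.tight_layout()
plt.show()
```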


    Read more about...

    probability
    probability distribution
    binomial distribution
    statistics
    Maths in a minute
    statistical distribution
