## Bayes’ Theorem


One of the most interesting applications of the results of probability theory involves estimating unknown probabilities and making decisions on the basis of new (sample) information. Bayes’ rule deals with the probability of some event, A, given that another event, B, has been (or will be) observed, i.e., with determining the value of P(A/B). The event B is usually thought of as sample information, so Bayes’ rule is concerned with determining the probability of an event given certain sample information. Bayes’ Theorem is based on the formula for conditional probability explained earlier.

Let:

$A_1$ and $A_2$ = a set of events which are mutually exclusive (the two events cannot occur together) and exhaustive (together the two events make up the entire experiment), and

$B$ = a simple event which intersects each of the $A$ events, as shown in the diagram below:

Observe the diagram. The part of $B$ which is within $A_1$ represents the area “$A_1$ and $B$”, and the part of $B$ within $A_2$ represents the area “$A_2$ and $B$”.

Then the probability of event $A_1$, given $B$, is

$$P(A_1/B) = \frac{P(A_1 \text{ and } B)}{P(B)}$$

And, similarly, the probability of event $A_2$, given $B$, is

$$P(A_2/B) = \frac{P(A_2 \text{ and } B)}{P(B)}$$

where

$$P(B) = P(A_1 \text{ and } B) + P(A_2 \text{ and } B),$$

$$P(A_1 \text{ and } B) = P(A_1) \times P(B/A_1), \text{ and}$$

$$P(A_2 \text{ and } B) = P(A_2) \times P(B/A_2)$$
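As a quick numerical sketch of the two-event case, the steps above can be traced in Python. The prior and conditional probabilities here are made-up illustrative values, not figures from the text:

```python
# Two-event Bayes' rule: A1 and A2 are mutually exclusive and exhaustive
# states; B is the observed sample event. All numbers are hypothetical.
p_a1 = 0.6          # P(A1), prior probability of state A1
p_a2 = 0.4          # P(A2), prior probability of state A2 (= 1 - P(A1))
p_b_given_a1 = 0.7  # P(B/A1), probability of the sample result under A1
p_b_given_a2 = 0.2  # P(B/A2), probability of the sample result under A2

# Joint probabilities: P(Ai and B) = P(Ai) * P(B/Ai)
p_a1_and_b = p_a1 * p_b_given_a1   # ~0.42
p_a2_and_b = p_a2 * p_b_given_a2   # ~0.08

# Total probability of B (the denominator in Bayes' rule)
p_b = p_a1_and_b + p_a2_and_b      # ~0.50

# Posterior (revised) probabilities
p_a1_given_b = p_a1_and_b / p_b    # ~0.84
p_a2_given_b = p_a2_and_b / p_b    # ~0.16
print(p_a1_given_b, p_a2_given_b)
```

Note how observing $B$ shifts belief toward $A_1$: the prior 0.6 is revised upward to roughly 0.84, because $B$ is much more likely under $A_1$ than under $A_2$.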

In general, let $A_1, A_2, A_3, \ldots, A_i, \ldots, A_n$ be a set of $n$ mutually exclusive and collectively exhaustive events. If $B$ is another event such that $P(B)$ is not zero, then

$$P(A_i/B) = \frac{P(B/A_i)\,P(A_i)}{\sum_{k=1}^{n} P(B/A_k)\,P(A_k)}$$
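The general $n$-event formula can be sketched as a small Python function; the three-state priors and likelihoods in the usage line are hypothetical illustrative numbers:

```python
def bayes(priors, likelihoods):
    """Posterior probabilities P(Ai/B) for mutually exclusive,
    collectively exhaustive events A1..An.

    priors      -- [P(A1), ..., P(An)]
    likelihoods -- [P(B/A1), ..., P(B/An)]
    """
    # Numerators: joint probabilities P(Ai and B) = P(Ai) * P(B/Ai)
    joints = [p * l for p, l in zip(priors, likelihoods)]
    # Denominator: total probability P(B), which must be nonzero
    p_b = sum(joints)
    if p_b == 0:
        raise ValueError("P(B) must be nonzero")
    return [j / p_b for j in joints]

# Three hypothetical states of nature and one sample event B
posterior = bayes([0.5, 0.3, 0.2], [0.1, 0.4, 0.5])
```

By construction the posteriors sum to 1, since each joint probability is divided by their common total $P(B)$.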

Probabilities before revision by Bayes’ rule are called a priori or simply prior probabilities, because they are determined before the sample information is taken into account. A probability that has undergone revision in the light of sample information (via Bayes’ rule) is called a posterior probability. Posterior probabilities are also called revised probabilities because they are obtained by revising the prior probabilities in the light of the additional information gained. Posterior probabilities are always conditional probabilities, the conditional event being the sample information.

Some interesting points worth noting about Bayes’ Theorem are:

1. Though it deals with a conditional probability, its interpretation is different from that of the general conditional probability theorem discussed earlier in the chapter. Bayes’ theorem asks, “What is the probability of the state value given the sample or experimental result?”

2. When we talk of Bayes’ Theorem, different decision-makers may assign different probabilities to the same set of states of nature. Also, we may conduct a new experiment by using the posterior probabilities of the preceding experiment as prior probabilities. As we proceed with repeated experiments, evidence accumulates and modifies the initial prior probabilities, thereby modifying the intensity of a decision-maker’s belief in various states of nature. In other words, the more evidence we accumulate, the less important the prior probabilities become.

3. The notions of “prior” and “posterior” in Bayes’ theorem are relative to a given sample outcome. That is, if a posterior distribution has been determined from a particular sample, this posterior distribution would be considered the prior distribution relative to a new sample.
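The repeated-updating idea in points 2 and 3 can be sketched in Python: each experiment’s posterior becomes the prior for the next. The two states of nature and the likelihood of the same evidence under each state are assumed illustrative values:

```python
def update(priors, likelihoods):
    """One application of Bayes' rule: revise priors given new evidence."""
    joints = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joints)  # P(B)
    return [j / total for j in joints]

beliefs = [0.5, 0.5]        # initial priors for two states of nature
evidence = [0.8, 0.3]       # assumed P(B/A1), P(B/A2) for the observed result

# Observe the same evidence in five successive experiments;
# yesterday's posterior is today's prior.
for _ in range(5):
    beliefs = update(beliefs, evidence)
```

After a few repetitions the belief in the first state approaches 1: the accumulated evidence dominates, and the initial 0.5/0.5 prior matters less and less, exactly as point 2 describes.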
