Thursday, September 6, 2018

Dempster-Shafer Theory

Dempster-Shafer Theory:
Dempster-Shafer theory deals with the distinction between uncertainty and ignorance. Rather than computing the probability of a proposition, it computes the probability that the evidence supports the proposition. This measure of belief is called a belief function, Bel(X).
In the Bayesian approach, the degree of belief assigned to a proposition given the evidence is a single point. In D-S theory, by contrast, we consider sets of propositions and assign to each set an interval [Belief, Plausibility] within which the degree of belief must lie.
We start with an exhaustive universe of mutually exclusive hypotheses Θ, called the frame of discernment, and define a probability density function m over all subsets of Θ. Belief (written Bel) measures the strength of the evidence in favour of a set of propositions. It ranges from 0 to 1: no evidence means a belief of 0, and certainty means a belief of 1.
Plausibility (Pl) is given by
Pl(S) = 1 − Bel(¬S)
where S is a set of propositions.
In particular, if we have certain evidence in favour of ¬S, then Bel(¬S) is 1,
Pl(S) is zero,
and Bel(S) is also zero.
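As a minimal sketch (in Python; the frozenset representation of subsets is my own choice, not part of the theory), belief and plausibility can be computed directly from a mass function:

```python
# A mass function m maps subsets (frozensets) of the frame of
# discernment to belief mass; the masses sum to 1.

def bel(m, s):
    """Bel(S): total mass committed to subsets of S."""
    return sum((mass for subset, mass in m.items() if subset <= s), 0.0)

def pl(m, s, frame):
    """Pl(S) = 1 - Bel(~S), where ~S is the complement within the frame."""
    return 1.0 - bel(m, frame - s)

# Total ignorance: all mass sits on the whole frame.
frame = frozenset({"Heads", "Tails"})
m = {frame: 1.0}
s = frozenset({"Heads"})
print(bel(m, s), pl(m, s, frame))   # 0.0 1.0 -> the interval [0, 1]
```

With no evidence committed to any proper subset, every set gets the widest possible interval [0, 1], which is exactly the "ignorance" case described above.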
The belief-plausibility interval defined above measures not only our level of belief in some set of propositions but also the amount of information we have. This can be illustrated with an example.
Suppose a person of shady character approaches you with a bet of Rs. 500 that heads will appear on the next flip of his coin. You suspect that the coin may not be fair, but you have no evidence either way on its fairness. Then, as per D-S theory, with no evidence whatsoever (on the coin being fair or unfair):
Bel(Heads) = 0
Bel(¬Heads) = 0
That is, in the absence of evidence, D-S theory commits no belief to either outcome.
Now suppose you consult a coin expert, and he asserts with 90% certainty that the coin is fair; so you become 90% sure that P(Heads) = 0.5.
Now D-S theory gives Bel(Heads) = 0.9 × 0.5 = 0.45
and similarly Bel(¬Heads) = 0.45.
There is still a 10-point gap that is not accounted for by the evidence. Dempster's rule shows how to combine evidence to give new values of belief, and Shafer's work extends this into a complete computational model.
Since D-S theory deals not with point probabilities but with probability intervals, the width of the interval can be an aid in deciding when we need to acquire more evidence.
In the present example, before the expert's testimony the probability interval for Heads is [0, 1]; after the testimony about the coin's fairness is received, it narrows to [0.45, 0.55].
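That narrowed interval can be checked with a short sketch (Python; the translation of the expert's 90% testimony into the masses m({Heads}) = m({Tails}) = 0.45 and m(Θ) = 0.1 follows the arithmetic in the text):

```python
def bel(m, s):
    """Bel(S): total mass committed to subsets of S."""
    return sum((mass for subset, mass in m.items() if subset <= s), 0.0)

frame = frozenset({"Heads", "Tails"})
# 90%-certain testimony that the coin is fair: 0.9 of the mass splits
# evenly between Heads and Tails; the unaccounted 0.1 stays on the frame.
m = {frozenset({"Heads"}): 0.45, frozenset({"Tails"}): 0.45, frame: 0.1}

heads = frozenset({"Heads"})
lower = bel(m, heads)                   # belief: 0.45
upper = 1.0 - bel(m, frame - heads)     # plausibility: 1 - Bel(~Heads)
print(f"[{lower:.2f}, {upper:.2f}]")    # [0.45, 0.55]
```

The 0.1 mass left on the whole frame is exactly the 10-point gap between belief and plausibility.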
However, there are no clear guidelines on how to do this, and no clear meaning for what the width of an interval signifies. For example, knowing whether the coin is fair would have a significant impact on the belief that it will come up heads, and detecting an asymmetric weight would have an impact on the belief that the coin is fair.
Consider another example:
Consider a diagnosis problem as a case of an exhaustive universe of mutually exclusive hypotheses. This is the frame of discernment, written Θ.
Here it consists of the set {Alg, Flu, Col, Pne} of diagnoses, where:
Alg – Allergy
Flu – Influenza
Col – Cold
Pne – Pneumonia
The probability density function m is defined not for the elements of Θ but for all of its subsets. For any subset p, m(p) is the amount of belief currently committed to p. When there is no prior evidence, m(Θ) is 1.0.
But when it becomes known through evidence (at level 0.6) that the correct diagnosis is in the set {Flu, Col, Pne},
then m gets updated as
{Flu, Col, Pne} (0.6)
Θ (0.4)
That is, belief 0.6 is assigned to the set {Flu, Col, Pne}, while the remainder of the belief still resides in the larger set Θ.
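In code form (a Python sketch, using the set names from the text), the update replaces the single-entry mass function of total ignorance with a two-entry one:

```python
frame = frozenset({"Alg", "Flu", "Col", "Pne"})

# No prior evidence: the entire unit of belief sits on the frame itself.
m = {frame: 1.0}

# Evidence at level 0.6 that the diagnosis is in {Flu, Col, Pne}:
# 0.6 moves to that subset; the remaining 0.4 stays with the frame.
evidence = frozenset({"Flu", "Col", "Pne"})
m = {evidence: 0.6, frame: 0.4}

assert abs(sum(m.values()) - 1.0) < 1e-9   # masses must still sum to 1
```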
Thus, in order to use m, belief, and plausibility in D-S theory, we define functions that enable us to combine m's obtained from multiple sources of evidence.
Our goal is to attach some measure of belief, m, to the various subsets Z of Θ. m is sometimes called the probability density function for a subset of Θ. Realistically, not all evidence directly supports individual elements of Θ; in fact, evidence most often supports different subsets Z of Θ.
In addition, since the elements of Θ are assumed mutually exclusive, evidence in favour of some may affect our belief in others. In a purely Bayesian system, we address both of these situations by listing all the combinations of conditional probabilities. In D-S theory, we handle these interactions by directly manipulating the sets of hypotheses.
For n hypotheses there are 2^n subsets of Θ, and we must assign m so that the sum of all the m values over the subsets of Θ is 1. Although dealing with 2^n values may appear intractable, it usually turns out that many of the subsets never need to be considered, because they have no significance in the problem domain; their associated m values remain zero.
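Dempster's rule of combination mentioned above can be sketched as follows (Python; the first source is the diagnosis evidence from the text, while the second source and its 0.7 mass are hypothetical values of my own, added only to exercise the rule):

```python
def combine(m1, m2):
    """Dempster's rule: the combined mass of Z sums m1(X)*m2(Y) over all
    pairs with X intersect Y = Z, then normalizes by 1 - K, where K is the
    mass that fell on the empty intersection (the conflict between sources)."""
    combined, conflict = {}, 0.0
    for x, mx in m1.items():
        for y, my in m2.items():
            z = x & y
            if z:
                combined[z] = combined.get(z, 0.0) + mx * my
            else:
                conflict += mx * my
    if conflict >= 1.0:
        raise ValueError("sources are in total conflict")
    return {z: mass / (1.0 - conflict) for z, mass in combined.items()}

frame = frozenset({"Alg", "Flu", "Col", "Pne"})
m1 = {frozenset({"Flu", "Col", "Pne"}): 0.6, frame: 0.4}   # from the text
m2 = {frozenset({"Flu", "Pne"}): 0.7, frame: 0.3}          # hypothetical
m12 = combine(m1, m2)
for subset, mass in sorted(m12.items(), key=lambda kv: -kv[1]):
    print(sorted(subset), round(mass, 2))
```

With these inputs the rule concentrates 0.70 of the mass on {Flu, Pne}, leaves 0.18 on {Flu, Col, Pne}, and keeps 0.12 on the whole frame, illustrating how agreeing evidence narrows belief toward smaller sets.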
D-S theory is an example of an algebra supporting the use of subjective probabilities in reasoning, as compared with the objective probabilities of Bayes. In subjective probability theory, we build a reasoning algebra, often by relaxing some of the constraints of Bayes. It is sometimes felt that subjective probabilities better reflect human expert reasoning.
So we conclude uncertain reasoning by saying that D-S theory allows us to combine:
(i) Multiple sources of evidence for a single hypothesis.
(ii) Multiple sources of evidence for different hypotheses.
