Answer to Question #215670 in Python for Nayshafatima

Question #215670
Write Python code for Bayesian learning on the dataset given in the Q1 theory paper. Assume the test data is 33% and the dataset contains integer values.
b) Classify the following example as Fast (yes or no) using the Bayesian classification algorithm:
Engine = Small, SC/Turbo = Yes, Weight = Light, Fuel Eco = G
Expert's answer
2021-07-11T14:34:58-0400

Bayesian Classification


Naive Bayes classifiers are built on Bayesian classification methods. These rely on Bayes's theorem, which is an equation describing the relationship of conditional probabilities of statistical quantities. In Bayesian classification, we're interested in finding the probability of a label given some observed features, which we can write as P(L | features). Bayes's theorem tells us how to express this in terms of quantities we can compute more directly:


"P(L~|~features)=\\dfrac{P(features~|~L)~P(L)}{P(features)}"


If we are trying to decide between two labels, call them L_1 and L_2, then one way to make this decision is to compute the ratio of the posterior probabilities for each label:


"\\dfrac{P(L_1~|~features)}{P(L_2~|~features)}=\\dfrac{P(features~|~L_1)P(L_1)}{P(features~|~L_2)P(L_2)}"


All we need now is some model by which we can compute P(features | L_i) for each label. Such a model is called a generative model because it specifies the hypothetical random process that generates the data. Specifying this generative model for each label is the main piece of training such a Bayesian classifier. In general this training step is very difficult, but it becomes tractable once we make simplifying assumptions about the form of the model. The "naive" in naive Bayes refers to the simplest such assumption: the features are treated as conditionally independent given the label, so P(features | L_i) factors into a product of per-feature probabilities.
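
The question asks for Python code with a 33% test split and integer-valued data. The dataset from the Q1 theory paper is not reproduced here, so the rows below are assumed placeholders using the attribute names from part (b); swap in the actual table. The sketch integer-encodes the categorical attributes, holds out 33% of the rows as test data, fits scikit-learn's CategoricalNB (a naive Bayes model for categorical features), and then classifies the example Engine = Small, SC/Turbo = Yes, Weight = Light, Fuel Eco = G:

import pandas as pd
from sklearn.preprocessing import LabelEncoder
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import CategoricalNB

# Placeholder rows standing in for the Q1 dataset (replace with the real table)
data = pd.DataFrame({
    'Engine':   ['Small', 'Small', 'Medium', 'Large', 'Medium', 'Small', 'Large', 'Large'],
    'SC/Turbo': ['Yes',   'No',    'Yes',    'No',    'No',     'Yes',   'Yes',   'No'],
    'Weight':   ['Light', 'Light', 'Heavy',  'Heavy', 'Light',  'Heavy', 'Light', 'Heavy'],
    'Fuel Eco': ['G',     'A',     'B',      'B',     'G',      'A',     'G',     'B'],
    'Fast':     ['Yes',   'No',    'Yes',    'No',    'No',     'Yes',   'Yes',   'No'],
})

# Encode every categorical column to integers, keeping one encoder per column
encoders = {col: LabelEncoder().fit(data[col]) for col in data.columns}
encoded = pd.DataFrame({col: encoders[col].transform(data[col]) for col in data.columns})

X = encoded.drop(columns='Fast')
y = encoded['Fast']

# Hold out 33% of the rows as test data
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.33, random_state=0, stratify=y)

# min_categories keeps the model aware of categories missing from the training split
model = CategoricalNB(min_categories=[len(encoders[c].classes_) for c in X.columns])
model.fit(X_train, y_train)
print('Test accuracy:', model.score(X_test, y_test))

# Part (b): classify Engine = Small, SC/Turbo = Yes, Weight = Light, Fuel Eco = G
example = pd.DataFrame({
    'Engine':   encoders['Engine'].transform(['Small']),
    'SC/Turbo': encoders['SC/Turbo'].transform(['Yes']),
    'Weight':   encoders['Weight'].transform(['Light']),
    'Fuel Eco': encoders['Fuel Eco'].transform(['G']),
})
prediction = model.predict(example)[0]
print('Fast =', encoders['Fast'].inverse_transform([prediction])[0])

CategoricalNB is used here because the encoded attributes are categories rather than true numeric measurements; if the dataset really contained continuous values, GaussianNB would be the more appropriate naive Bayes variant. The printed prediction for part (b) will, of course, depend on the actual rows from the Q1 dataset.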

