Naive Bayes

Naive Bayes is a classification and prediction technique based on Bayes' Theorem. It computes the probability that a data point belongs to each class and predicts the class with the highest probability as the most likely one.
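
Concretely, for a class C and a feature vector x, Bayes' Theorem gives the posterior probability

    P(C | x) = P(x | C) · P(C) / P(x)

and the predicted class is the one with the largest posterior. Since P(x) is the same for every class, it can be ignored when comparing classes.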

Notes:

  • It's called "naive" because it assumes independence between the attributes of a data point: the occurrence of a certain feature is assumed to be independent of the occurrence of the other features (the code sketch below illustrates this).

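To make the independence assumption concrete, here is a minimal from-scratch sketch of a Gaussian Naive Bayes classifier in Python (the class and variable names are illustrative, not taken from any library). Each class is scored by its log-prior plus the sum of per-feature Gaussian log-likelihoods, and the highest-scoring class is predicted.

```python
import numpy as np

class SimpleGaussianNB:
    """Minimal Gaussian Naive Bayes sketch (illustrative, not a library API)."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.priors_, self.means_, self.vars_ = {}, {}, {}
        for c in self.classes_:
            Xc = X[y == c]
            self.priors_[c] = len(Xc) / len(X)       # P(class)
            self.means_[c] = Xc.mean(axis=0)         # per-feature mean
            self.vars_[c] = Xc.var(axis=0) + 1e-9    # per-feature variance (small constant avoids /0)
        return self

    def predict(self, X):
        preds = []
        for x in X:
            scores = {}
            for c in self.classes_:
                # log P(class) + sum_i log P(x_i | class) -- the "naive" independence assumption
                log_lik = -0.5 * np.sum(
                    np.log(2 * np.pi * self.vars_[c])
                    + (x - self.means_[c]) ** 2 / self.vars_[c]
                )
                scores[c] = np.log(self.priors_[c]) + log_lik
            preds.append(max(scores, key=scores.get))
        return np.array(preds)

# Illustrative usage on a tiny toy dataset
X = np.array([[1.0, 2.1], [1.2, 1.9], [3.0, 3.5], [3.2, 3.7]])
y = np.array([0, 0, 1, 1])
model = SimpleGaussianNB().fit(X, y)
print(model.predict(np.array([[1.1, 2.0], [3.1, 3.6]])))  # expected: [0 1]
```
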
Applications:

  • Spam filtering and other text classification tasks.
  • Sentiment analysis.
  • Document categorization.
  • Real-time prediction, since training and classification are fast.

Types:

  • Gaussian Naive Bayes: assumes continuous features follow a normal (Gaussian) distribution within each class.
  • Multinomial Naive Bayes: suited to discrete count features, such as word frequencies in text.
  • Bernoulli Naive Bayes: suited to binary (present/absent) features.

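As a minimal sketch (assuming scikit-learn is available, which this document does not require), the three variants correspond to ready-made classes:

```python
from sklearn.datasets import load_iris
from sklearn.naive_bayes import GaussianNB, MultinomialNB, BernoulliNB  # the three common variants

# GaussianNB suits continuous features such as the iris measurements.
X, y = load_iris(return_X_y=True)
gnb = GaussianNB().fit(X, y)
print(gnb.predict(X[:3]))  # class predictions for the first three samples

# MultinomialNB is typically used with count features (e.g. word counts),
# and BernoulliNB with binary presence/absence features.
```
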
Advantages:

  1. It's simple and easy to implement.
  2. It works well with high-dimensional datasets.
  3. It can handle both continuous and discrete data.
  4. It's good at handling missing data.
  5. It performs both binary and multi-class classification.
  6. It performs well even with a small amount of training data.

Disadvantages:

  1. The algorithm is sensitive to the quality of the input data.
  2. It cannot capture correlations or interactions between input features, since each feature is modelled independently.
  3. The assumption of independence among input features may not always hold true.
  4. It requires storing conditional probabilities for each input feature, which can use a large amount of memory when features take many distinct values.