"No one is harder on a talented person than the person themselves" - Linda Wilkinson ; "Trust your guts and don't follow the herd" ; "Validate direction not destination" ;

October 08, 2016

Day #34 - What is the difference between Logistic Regression and Naive Bayes?

Both are probabilistic classifiers
Logistic Regression
  • Discriminative (the approach models the decision boundary directly)
  • Models P(Y|X)
  • Final value lies between 0 and 1
  • Formula: exp(w0 + w1x) / (exp(w0 + w1x) + 1)
  • Equivalently: 1 / (1 + exp(-(w0 + w1x)))
Binary Logistic Regression - 2 classes
Multinomial Logistic Regression - more than 2 classes
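The two formulas above are algebraically the same sigmoid; a minimal sketch (the weights `w0`, `w1` here are illustrative, not fitted values):

```python
import math

def sigmoid(z):
    """Logistic (sigmoid) function: 1 / (1 + e^-z)."""
    return 1.0 / (1.0 + math.exp(-z))

def logistic_prob(x, w0, w1):
    """P(y=1 | x) for a one-feature logistic model."""
    return sigmoid(w0 + w1 * x)

# Both forms from the notes give the same value:
z = 0.5 + 2.0 * 1.3  # w0 + w1*x with illustrative weights
form1 = math.exp(z) / (math.exp(z) + 1.0)   # exp(z) / (exp(z) + 1)
form2 = 1.0 / (1.0 + math.exp(-z))          # 1 / (1 + exp(-z))
assert abs(form1 - form2) < 1e-12
```

Whatever the input, the output always lies strictly between 0 and 1, which is what lets it be read as a probability.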

Logistic Regression
  • Classification model
  • Models the probability of success as a sigmoid function of a linear combination of features
  • y belongs to {0, 1} - 2-class problem
  • p(yi) = 1 / (1 + e^-(w1x1 + w2x2))
  • Linear combination of features: w1x1 + w2x2
  • w can be found by maximum likelihood estimation
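The maximum likelihood estimate has no closed form, so in practice the weights are found iteratively. A minimal gradient-ascent sketch for the one-feature case (the toy data and learning rate are illustrative assumptions; real code would use scikit-learn or scipy):

```python
import math

def fit_logistic(xs, ys, lr=0.1, steps=2000):
    """Fit w0, w1 by maximizing the log-likelihood with gradient ascent.

    For logistic regression the log-likelihood gradient is simply
    sum over examples of (y - p) times the feature value.
    """
    w0, w1 = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w0 + w1 * x)))  # current P(y=1|x)
            g0 += (y - p)        # gradient w.r.t. the intercept w0
            g1 += (y - p) * x    # gradient w.r.t. the slope w1
        w0 += lr * g0 / n
        w1 += lr * g1 / n
    return w0, w1

# Toy separable data: y = 1 whenever x > 0
xs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
ys = [0, 0, 0, 1, 1, 1]
w0, w1 = fit_logistic(xs, ys)
```

After fitting, the learned slope is positive, so larger x pushes P(y=1|x) toward 1, matching the pattern in the data.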
Naive Bayes
  • Generative model
  • Models P(X|Y); the Naive Bayes assumption is that the features in X are conditionally independent given the class Y
  • Fits a distribution for each class, then classifies with Bayes' rule
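The generative recipe above can be sketched end to end. A minimal Gaussian Naive Bayes, assuming continuous features and using tiny made-up data (the helper names `fit_gnb`/`predict_gnb` are hypothetical, not a library API):

```python
import math

def fit_gnb(X, y):
    """Estimate a class prior plus per-feature Gaussian (mean, variance)
    for each class - one distribution per class, as in the notes."""
    stats = {}
    for c in set(y):
        rows = [x for x, label in zip(X, y) if label == c]
        prior = len(rows) / len(X)
        means = [sum(col) / len(rows) for col in zip(*rows)]
        vars_ = [sum((v - m) ** 2 for v in col) / len(rows) + 1e-9
                 for col, m in zip(zip(*rows), means)]  # small floor avoids /0
        stats[c] = (prior, means, vars_)
    return stats

def predict_gnb(stats, x):
    """argmax over classes of log P(c) + sum_j log N(x_j | mu, var).
    The sum over features IS the naive conditional-independence assumption."""
    best, best_ll = None, -math.inf
    for c, (prior, means, vars_) in stats.items():
        ll = math.log(prior)
        for xj, m, v in zip(x, means, vars_):
            ll += -0.5 * math.log(2 * math.pi * v) - (xj - m) ** 2 / (2 * v)
        if ll > best_ll:
            best, best_ll = c, ll
    return best

X = [[1.0, 2.0], [1.2, 1.9], [4.0, 5.0], [4.2, 5.1]]  # illustrative data
y = [0, 0, 1, 1]
stats = fit_gnb(X, y)
```

This is the discriminative/generative contrast in code: logistic regression learns P(Y|X) directly, while Naive Bayes learns P(X|Y) and P(Y) and combines them at prediction time.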
Happy Learning
