A comparison of various machine learning algorithms, including Naive Bayes, logistic regression, the perceptron, LDA (linear discriminant analysis), decision trees, and nearest neighbor. The comparison covers their flexibility, robustness, and computational efficiency, including interpretability, handling of missing values and noise/outliers, sensitivity to irrelevant features, and behavior under monotone transformations. The document also discusses trees and neural networks and provides a summary of each model.
Criterion             Naïve Bayes   Logistic regression   LDA        Perceptron
Model                 P(x|y)        P(y|x)                P(x|y)     LTU
Interpretable         Yes           Yes                   Somewhat   Yes
Data                  Mixed         Numeric               Numeric    Numeric
Outliers              Fair/poor     Ok                    Bad        Fatal
Missing values        Yes           No                    Yes        No
Irrelevant features   Some          Bad                   Bad        Bad
Compute time          Good          Good                  Good       Good
Monotone transform    Maybe         No                    No         No
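The four models in the table can be tried side by side. A minimal sketch using scikit-learn (assuming it is available; the synthetic dataset and default hyperparameters are illustrative, not from the notes):

```python
# Fit the four classifiers compared above on one synthetic dataset.
# Dataset and settings are arbitrary; only the model lineup follows the notes.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression, Perceptron
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "Naive Bayes  P(x|y)": GaussianNB(),                  # generative
    "Logistic reg P(y|x)": LogisticRegression(),          # discriminative
    "LDA          P(x|y)": LinearDiscriminantAnalysis(),  # generative, shared covariance
    "Perceptron   LTU":    Perceptron(),                  # linear threshold unit
}
for name, m in models.items():
    m.fit(X_tr, y_tr)
    print(f"{name}: test accuracy {m.score(X_te, y_te):.2f}")
```

All four are linear (or near-linear) classifiers, so on linearly separable-ish data their accuracies are typically close; the table's real distinctions show up with outliers, missing values, and mixed feature types.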
Criterion             Decision tree       Nearest neighbor
Model                 Trees - flexible    Instance-based, flexible
Data                  Mixed               Usually numeric
Interpretable         If small tree       Only in 1 or 2 dimensions
Noise/outliers        Good with pruning   Good with kNN
Missing values        Tricks              Training set no, but ok for test points
Compute time          OK                  Lazy - expensive
Irrelevant features   Fair                Very bad
Monotone transform    Great               Very bad
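The "monotone transform" row above can be checked directly: a decision tree splits on thresholds, so any strictly increasing transform of a feature preserves the ordering the tree uses, while nearest neighbor depends on raw distances, which the transform changes. A sketch with scikit-learn (the data and the cubing transform are illustrative assumptions):

```python
# Decision trees are invariant to monotone feature transforms; k-NN is not.
# Assumes scikit-learn; dataset and transform chosen for illustration only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=300, n_features=5, random_state=1)
X_mono = X.copy()
X_mono[:, 0] = X_mono[:, 0] ** 3   # strictly increasing transform of one feature

# Tree: same training predictions before and after the transform.
tree_same = np.array_equal(
    DecisionTreeClassifier(random_state=1).fit(X, y).predict(X),
    DecisionTreeClassifier(random_state=1).fit(X_mono, y).predict(X_mono),
)

# k-NN: distances (and hence neighbors) generally change under the transform.
knn_same = np.array_equal(
    KNeighborsClassifier(n_neighbors=3).fit(X, y).predict(X),
    KNeighborsClassifier(n_neighbors=3).fit(X_mono, y).predict(X_mono),
)
print("tree predictions unchanged:", tree_same)
print("knn predictions unchanged:", knn_same)
```

This is why the table rates trees "Great" and nearest neighbor "Very bad" on monotone transformations: kNN's notion of similarity is only as good as the feature scaling it is given.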