Linear Discriminant Analysis (LDA), also called Fisher's Linear Discriminant, reduces dimensionality (like PCA) but focuses on maximizing separability among known categories.

💡 Idea: create a new axis and project the data onto it so that the separation between the two categories is maximized.

How does it work?
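A minimal NumPy sketch of this idea for the two-class case, assuming Fisher's criterion: the projection direction is $\mathbf{w} \propto S_W^{-1}(\mathbf{m}_1 - \mathbf{m}_2)$, where $S_W$ is the within-class scatter matrix. The data here is synthetic, made up only for illustration.

```python
import numpy as np

# Synthetic 2-D data for two classes (illustrative only, not from the notes)
rng = np.random.default_rng(0)
X1 = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(50, 2))  # class 1
X2 = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(50, 2))  # class 2

# Class means
m1, m2 = X1.mean(axis=0), X2.mean(axis=0)

# Within-class scatter matrix S_W = S_1 + S_2
S_W = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)

# Fisher's direction: w ∝ S_W^{-1} (m1 - m2) maximizes class separability
w = np.linalg.solve(S_W, m1 - m2)

# Project the data onto the new 1-D axis
z1, z2 = X1 @ w, X2 @ w

# The projected class means should be far apart relative to the spread
print(abs(z1.mean() - z2.mean()) / (z1.std() + z2.std()))
```

On this toy data the gap between the projected class means is several times the within-class spread, which is exactly what maximizing separability means.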
2020-11-07
No assumptions about distributions → non-parametric. Linear decision surfaces. Begins with supervised training (the class of each training sample is given).

Linear Discriminant Functions and Decision Surfaces

A discriminant function that is a linear combination of the components of $\mathbf{x}$ can be written as

$$ g(\mathbf{x})=\mathbf{w}^{T} \mathbf{x}+w_{0} $$

- $\mathbf{x}$: feature vector
- $\mathbf{w}$: weight vector
- $w_{0}$: bias or threshold weight

The two-category case. Decision rule: decide $\omega_1$ if $g(\mathbf{x})>0$ and $\omega_2$ if $g(\mathbf{x})<0$.
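The linear discriminant function and the two-category decision rule can be sketched directly; the weight values below are hypothetical, chosen only to illustrate the mechanics.

```python
import numpy as np

def g(x, w, w0):
    """Linear discriminant g(x) = w^T x + w0."""
    return w @ x + w0

def classify(x, w, w0):
    """Two-category rule: class 1 if g(x) > 0, else class 2."""
    return 1 if g(x, w, w0) > 0 else 2

# Hypothetical weight vector and bias for illustration
w = np.array([1.0, -1.0])
w0 = 0.5

print(classify(np.array([2.0, 0.0]), w, w0))  # g = 2.5 > 0  -> 1
print(classify(np.array([0.0, 2.0]), w, w0))  # g = -1.5 < 0 -> 2
```

The decision surface $g(\mathbf{x})=0$ is a hyperplane; $\mathbf{w}$ fixes its orientation and $w_0$ shifts it away from the origin.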
Non-parametric methods do not make strong assumptions about the form of the mapping function or the data distribution.
2020-09-07