This slide shows the switch from a model's "primal form" (using w) to its "dual form" (using α), which is the key to using the kernel trick. Let's break down your questions intuitively.
Intuitively, α is a vector of weights, where each weight αᵢ tells you the importance of the i-th training example in defining the decision boundary.
This is a fundamental shift in perspective:
In an SVM, most of the values in α will be zero. The only non-zero values αᵢ belong to the support vectors. This means the model is defined only by the most critical points on the margin.
So, to answer your question, α does not replace w directly, but it provides an alternative way to construct it. The mathematical relationship is:

w = Xᵀα = Σᵢ αᵢ xᵢ

This formula says that the feature-weight vector w is simply a linear combination of the training data points xᵢ, where the coefficients of that combination are the importance weights αᵢ.
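The relationship above is easy to check numerically. Here is a minimal sketch with made-up toy data (the specific matrix, the α values, and the sparsity pattern are all assumptions for illustration), showing that the matrix form Xᵀα and the explicit sum Σᵢ αᵢ xᵢ produce the same w:

```python
import numpy as np

# Hypothetical toy data: 4 training points in 2 dimensions, one row each.
X = np.array([[1.0, 2.0],
              [2.0, 0.0],
              [0.0, 1.0],
              [3.0, 1.0]])

# Hypothetical dual weights; the zeros mimic non-support-vectors in an SVM.
alpha = np.array([0.5, 0.0, -0.5, 0.0])

# Reconstruct w as a linear combination of the training points (matrix form).
w = X.T @ alpha

# The same reconstruction written as an explicit sum over examples.
w_sum = sum(alpha[i] * X[i] for i in range(len(alpha)))

print(w)      # both prints show the same vector
print(w_sum)
```

Note that only the two rows with non-zero αᵢ contribute anything to w, which is exactly why an SVM's weight vector depends only on its support vectors.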
This formula looks complex, but it's just the original prediction formula, f(x) = wᵀx, rewritten using the relationship we just defined. Let's walk through it.
1. Start with the original prediction formula: To predict a new point x, we use: f(x) = wᵀx
2. Substitute the dual form of w: We know that w = Xᵀα. Let's plug that in: f(x) = (Xᵀα)ᵀx
3. Rearrange using linear algebra: Because of the properties of transpose, (Xᵀα)ᵀ = αᵀX, so we can re-group the terms: f(x) = αᵀ(Xx)
4. Define k: The slide defines the term k = Xx. Let's see what that actually is.
So, k = Xx is a vector containing the dot products of your new point (x) with every single point in the training set: kᵢ = xᵢᵀx. It's a measure of similarity between the new point and all the old points.
5. The Final Formula: By substituting k back into the equation from step 3, we get: f(x) = αᵀk. The reason it's αᵀ (transpose) is a notational convention for the dot product between two column vectors. The formula simply calculates a weighted sum, where you are summing the "similarity scores" in k, weighted by the "importance scores" in α.
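The five steps above can be sketched end to end in a few lines. This toy example (the data and α values are assumptions, not anything from the slide) confirms that the primal route, building w first, and the dual route, building k first, give the same prediction:

```python
import numpy as np

# Hypothetical toy data: 3 training points in 2 dimensions.
X = np.array([[1.0, 2.0],
              [2.0, 0.0],
              [0.0, 1.0]])
alpha = np.array([0.5, -1.0, 2.0])  # assumed dual weights
x_new = np.array([1.0, 1.0])        # the new point to predict

# Primal route: reconstruct w, then take one dot product with x_new.
w = X.T @ alpha
f_primal = w @ x_new

# Dual route: build k, the similarity of x_new to every training point,
# then weight those similarities by the importance scores in alpha.
k = X @ x_new        # k[i] = x_i . x_new
f_dual = alpha @ k

print(f_primal, f_dual)  # the two routes agree
```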
This is powerful because the entire process, both training and prediction, now depends only on inner products (dot products), which allows us to swap them out for kernels.
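To make the swap concrete, here is a sketch of the kernel trick applied to the dual prediction formula. The data, the α values, and the RBF kernel with its gamma parameter are all assumptions chosen for illustration; the point is only that the prediction formula αᵀk is unchanged, and only the way each similarity score kᵢ is computed changes:

```python
import numpy as np

# An RBF (Gaussian) kernel: one common drop-in replacement for the
# dot product. gamma is an assumed hyperparameter, not from the slide.
def rbf(a, b, gamma=0.5):
    return np.exp(-gamma * np.sum((a - b) ** 2))

# Hypothetical toy data, as before.
X = np.array([[1.0, 2.0],
              [2.0, 0.0],
              [0.0, 1.0]])
alpha = np.array([0.5, -1.0, 2.0])
x_new = np.array([1.0, 1.0])

# Linear version: similarities are plain dot products.
k_linear = X @ x_new
f_linear = alpha @ k_linear

# Kernelized version: the same weighted sum, with each dot product
# replaced by a kernel evaluation k(x_i, x_new).
k_rbf = np.array([rbf(x_i, x_new) for x_i in X])
f_rbf = alpha @ k_rbf

print(f_linear, f_rbf)
```

Because nothing outside the similarity computation changes, the model can behave as if it operated in the kernel's implicit feature space without ever constructing that space explicitly.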
Β© 2025 James Yap