Do you know the Trick..?

Sri Dedeepya Nandamuri
2 min read · Nov 21, 2019

In this blog I'm going to explain one of the concepts in machine learning that I learned today: the kernel trick.

What is the kernel trick?

Data that is non-linear (not linearly separable) in its original, lower-dimensional space is projected into a higher-dimensional space where it becomes linear, so that classification can be performed.

[Image: Example of the kernel trick]

The picture above shows the difference this transformation makes.

Let me explain in simpler terms.

In the first picture, the points lie in a 2-dimensional plane where no straight line can divide the two classes, so the data is non-linear. This original space, in which the classes cannot be divided, is the low-dimensional space.

In the second picture, to move the data into a high-dimensional space we add extra columns by squaring (or cubing) the existing variables. By increasing the columns in this way, the data is lifted into a high-dimensional space where it becomes linearly separable, and classification can then divide the space with a plane.
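The idea above can be sketched with a tiny example. Two classes that form concentric rings cannot be split by a straight line in 2D, but adding one new column computed from the existing ones (z = x² + y²) makes a flat plane separate them. This is an illustrative sketch using NumPy; the data and the cutoff value z = 4 are made up for the demonstration.

```python
import numpy as np

# Class 0: points on a small circle; class 1: points on a circle of radius 3.
# In the 2D plane, no straight line separates the two rings.
theta = np.linspace(0, 2 * np.pi, 20, endpoint=False)
inner = np.column_stack([0.8 * np.cos(theta), 0.8 * np.sin(theta)])
outer = np.column_stack([3.0 * np.cos(theta), 3.0 * np.sin(theta)])

def lift(points):
    """Add a new column built from the existing ones: z = x**2 + y**2."""
    z = (points ** 2).sum(axis=1)
    return np.column_stack([points, z])

inner3, outer3 = lift(inner), lift(outer)

# In the lifted 3D space, a flat plane (for example z = 4) cleanly
# separates the two classes: every inner z is below it, every outer z above.
print(inner3[:, 2].max() < 4 < outer3[:, 2].min())  # True
```

Here the inner points all get z = 0.64 and the outer points z = 9, so the plane z = 4 divides them perfectly, even though no line could in the original plane.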

Note: the new columns should be non-linear functions (different powers) of the existing columns, created only from the existing columns.
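The actual "trick" is that we never need to build those new columns explicitly: a kernel function computes the dot product in the lifted space directly from the original columns. A minimal sketch, assuming the standard degree-2 polynomial feature map φ(x₁, x₂) = (x₁², √2·x₁x₂, x₂²), whose lifted dot product equals (x·y)²:

```python
import numpy as np

def phi(v):
    """Explicit degree-2 feature map: lifts a 2D point to 3D."""
    x1, x2 = v
    return np.array([x1 ** 2, np.sqrt(2) * x1 * x2, x2 ** 2])

x = np.array([1.0, 2.0])
y = np.array([3.0, 4.0])

lifted = phi(x) @ phi(y)   # dot product computed in the high-dimensional space
kernel = (x @ y) ** 2      # polynomial kernel, computed entirely in 2D

print(np.isclose(lifted, kernel))  # True: both equal 121.0
```

Because the two sides always agree, an algorithm that only needs dot products (like an SVM) can work in the high-dimensional space while paying only the low-dimensional cost.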

So, using this kernel trick, we can apply linear regression or logistic regression (by calculating the errors and the residual sum of squares) or an SVM model to non-linear data and solve the classification problem.
