I know the Nadaraya-Watson kernel regression. What is new to me is kernel ridge regression, as in scikit-learn's `KernelRidge` with `kernel='rbf'`. The documentation mentions that the kernel trick is used, which is not done in Nadaraya-Watson kernel regression, so it seems to me they are different concepts. Am I right, or are they the same after all?


#### Best Answer

Yeah, you are right, they are different methods. Kernel ridge regression is ordinary ridge regression with the kernel trick: instead of solving $\hat{\beta} = (X^TX + \lambda I)^{-1}X^Ty$ in feature space, you solve $\hat{\alpha} = (K + \lambda I)^{-1}y$ with the Gram matrix $K$, and predict with $\hat{f}(x) = \sum_i \hat{\alpha}_i \, k(x, x_i)$. Nadaraya-Watson, by contrast, is not a regression fit at all but a locally weighted average of the training targets, $\hat{f}(x) = \sum_i K_h(x - x_i)\, y_i \,/\, \sum_j K_h(x - x_j)$. Both use a kernel, but in quite different roles: as an inner product in a feature space versus as a local smoothing weight.
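To make the contrast concrete, here is a minimal sketch comparing the two on a toy 1-D problem. The Nadaraya-Watson estimator is implemented by hand (the helper `nw_predict` and the choice of `gamma`, `alpha`, and the sine data are all illustrative, not from the question), while kernel ridge regression uses scikit-learn's `KernelRidge`:

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
X = np.linspace(0, 2 * np.pi, 50).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(50)

gamma = 1.0  # shared RBF bandwidth parameter, chosen arbitrarily here

def nw_predict(X_train, y_train, X_query, gamma):
    """Nadaraya-Watson: locally weighted average of the training targets."""
    # Gaussian kernel weights K(x, x_i) = exp(-gamma * ||x - x_i||^2)
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    W = np.exp(-gamma * d2)
    # Weighted average: sum_i W_i * y_i / sum_j W_j
    return (W @ y_train) / W.sum(axis=1)

# Kernel ridge: solve (K + alpha*I) c = y, then predict with k(x)^T c
krr = KernelRidge(alpha=0.1, kernel='rbf', gamma=gamma).fit(X, y)

X_test = np.array([[np.pi / 2]])
print(nw_predict(X, y, X_test, gamma))  # smoothed estimate near sin(pi/2)
print(krr.predict(X_test))              # regularized fit near sin(pi/2)
```

Both predictions land near $\sin(\pi/2) = 1$, but they get there differently: Nadaraya-Watson never solves a linear system, it only averages nearby targets, whereas `KernelRidge` inverts the regularized Gram matrix to obtain dual coefficients.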
