1. Kernel Regression

Given training data $\{x_i, y_i\}_{i=1}^n$, kernel regression approximates the unknown nonlinear relation between $x$ and $y$ with a function of the form
\[
y \approx f(x; w) = \sum_{i=1}^n w_i k(x, x_i),
\]
where $k(x, x')$ is a positive definite kernel specified by the user, and $\{w_i\}$ is a set of weights estimated by minimizing a regularized mean squared error:
\[
\min_{w} \; \sum_{i=1}^n \bigl(y_i - f(x_i; w)\bigr)^2 + \beta\, w^\top K w,
\]
where $w = [w_i]_{i=1}^n$ is the column vector of weights, $K = [k(x_i, x_j)]_{i,j=1}^n$ is the $n \times n$ kernel matrix, and $\beta$ is a positive regularization parameter. We use a simple Gaussian radial basis function (RBF) kernel,
\[
k(x, x') = \exp\!\left(-\frac{\|x - x'\|^2}{2h^2}\right),
\]
where $h$ is a bandwidth parameter. A common way to set $h$ in practice is the so-called "median trick", which sets $h$ to the median of the pairwise distances on the training data, that is,
\[
\hat{h}_{\mathrm{med}} = \mathrm{median}\bigl(\{\|x_i - x_j\| : i \neq j,\ i, j = 1, \dots, n\}\bigr).
\]

(1) [10 points] Complete the code of kernel regression following the instructions in the attached Python notebook. Specifically, you need to complete all the code necessary for the function kernel_regression_fit_and_predict to run.

(2) [10 points] Run the algorithm with $\beta = 1$ and $h \in \{0.1\hat{h}_{\mathrm{med}},\ \hat{h}_{\mathrm{med}},\ 10\hat{h}_{\mathrm{med}}\}$. Show the curves learned with the different values of $h$ in the notebook and comment on how $h$ influences the smoothness of the curve.

(3) [10 points] Use 5-fold cross-validation to find the optimal combination of $h$ and $\beta$ within $h \in \{0.1\hat{h}_{\mathrm{med}},\ \hat{h}_{\mathrm{med}},\ 10\hat{h}_{\mathrm{med}}\}$ and $\beta \in \{0.1, 1\}$.
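For task (1), the sketch below shows one minimal NumPy implementation of the pieces kernel_regression_fit_and_predict needs: the RBF kernel, the median-trick bandwidth, and the regularized least-squares fit. This is not the notebook's reference solution; in particular, the argument list (X_train, y_train, X_test, h, beta) and the helper names median_bandwidth and rbf_kernel are assumptions, since the notebook's actual interface is not reproduced here. The weights use the closed form $w = (K + \beta I)^{-1} y$, obtained by setting the gradient of the objective above to zero.

```python
import numpy as np


def _as_2d(X):
    """Treat a 1-D array of scalar inputs as an (n, 1) design matrix."""
    X = np.asarray(X, dtype=float)
    return X[:, None] if X.ndim == 1 else X


def rbf_kernel(X, Z, h):
    """Gaussian RBF kernel matrix with entries exp(-||x_i - z_j||^2 / (2 h^2))."""
    X, Z = _as_2d(X), _as_2d(Z)
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * h ** 2))


def median_bandwidth(X):
    """Median trick: the median of the pairwise distances ||x_i - x_j||, i != j."""
    X = _as_2d(X)
    dists = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1))
    return np.median(dists[np.triu_indices_from(dists, k=1)])


def kernel_regression_fit_and_predict(X_train, y_train, X_test, h, beta):
    """Fit w = (K + beta I)^{-1} y on the training data, then predict K(X_test, X_train) w."""
    K = rbf_kernel(X_train, X_train, h)
    n = K.shape[0]
    w = np.linalg.solve(K + beta * np.eye(n), np.asarray(y_train, dtype=float))
    return rbf_kernel(X_test, X_train, h) @ w
```

Task (2) then amounts to calling this function three times, e.g. for h in (0.1 * h_med, h_med, 10 * h_med) with beta = 1 and h_med = median_bandwidth(X_train), and plotting each predicted curve against the training points.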
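For task (3), here is a sketch of a 5-fold cross-validation grid search over the six $(h, \beta)$ combinations, reusing kernel_regression_fit_and_predict from above. The random shuffling, the fold construction via np.array_split, and the helper name cross_validate_kernel_regression are assumptions; the notebook may prescribe its own splitting scheme.

```python
def cross_validate_kernel_regression(X, y, h_grid, beta_grid, n_folds=5, seed=0):
    """Return the (h, beta) pair with the lowest mean validation MSE over n_folds folds."""
    X, y = _as_2d(X), np.asarray(y, dtype=float)
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(X)), n_folds)

    best_score, best_params = np.inf, None
    for h in h_grid:
        for beta in beta_grid:
            fold_mse = []
            for k in range(n_folds):
                val_idx = folds[k]
                trn_idx = np.concatenate([folds[j] for j in range(n_folds) if j != k])
                pred = kernel_regression_fit_and_predict(
                    X[trn_idx], y[trn_idx], X[val_idx], h, beta
                )
                fold_mse.append(np.mean((pred - y[val_idx]) ** 2))
            score = np.mean(fold_mse)
            if score < best_score:
                best_score, best_params = score, (h, beta)
    return best_params, best_score


# Example grid, as specified in task (3):
# h_med = median_bandwidth(X_train)
# best, mse = cross_validate_kernel_regression(
#     X_train, y_train,
#     h_grid=[0.1 * h_med, h_med, 10 * h_med],
#     beta_grid=[0.1, 1.0],
# )
```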