3.4 Outro

3.4.1 Remarks


Note that K-NN is suitable for multiclass classification of any kind: a majority vote amongst the K nearest neighbours works for any number of class labels.

Some algorithms that we are going to discuss in the next part are restricted to binary (0/1) outputs. They will have to be extended somehow (e.g., via a one-vs-rest scheme) to allow for more than two classes.

In the next part we will try to answer the question of how to choose the best \(K\) and, more generally, how to evaluate and pick the best model.

We will also discuss some other noteworthy classifiers:

  • Decision trees
  • Logistic regression

3.4.2 Side Note: K-NN Regression


The K-Nearest Neighbour scheme is intuitively pleasing.

No wonder it has inspired an analogous approach to solving regression tasks.

In order to make a prediction for a new point \(\mathbf{x}'\):

  1. find the K nearest neighbours of \(\mathbf{x}'\) amongst the points in the training set, denoted \(\mathbf{x}_{i_1,\cdot}, \dots, \mathbf{x}_{i_K,\cdot}\),
  2. fetch the corresponding reference outputs \(y_{i_1}, \dots, y_{i_K}\),
  3. return their arithmetic mean as a result, \[\hat{y}=\frac{1}{K} \sum_{j=1}^K y_{i_j}.\]
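
This procedure is easy to implement from scratch. Here is a minimal sketch in R (a sketch only: the function name `knn_reg` is ours, we assume the Euclidean metric, and ties between equidistant neighbours are broken arbitrarily):

```r
# K-NN regression, a bare-bones sketch (Euclidean metric assumed);
# X -- a numeric matrix with one training point per row,
# y -- the vector of the corresponding reference outputs.
knn_reg <- function(X, y, x_new, K = 5) {
    stopifnot(is.matrix(X), length(y) == nrow(X), K <= nrow(X))
    # 1. distances from x_new to all the points in the training set
    d <- sqrt(rowSums((X - matrix(x_new, nrow(X), ncol(X), byrow = TRUE))^2))
    # 2. indices of the K nearest neighbours (ties broken arbitrarily)
    nn <- order(d)[1:K]
    # 3. the arithmetic mean of the corresponding reference outputs
    mean(y[nn])
}
```

For instance, `knn_reg(as.matrix(1:10), (1:10)^2, 5.5, K = 2)` averages the outputs at the two nearest points, giving \((25+36)/2 = 30.5\).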

Recall our modelling of the Credit Rating (\(Y\)) as a function of the average Credit Card Balance (\(X\)) based on the ISLR::Credit data set.


(Figure: Rating as a function of Balance in the ISLR::Credit data set, with the fitted K-NN regression curve.)
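
A fit like the one depicted above can be recreated using our `knn_reg` sketch (assuming the ISLR package is installed; the settings `K = 10` and a grid of 101 equidistant points are arbitrary choices):

```r
library("ISLR")                  # provides the Credit data set
X <- as.matrix(Credit$Balance)   # the single predictor
y <- Credit$Rating               # the reference outputs
# evaluate the model on a grid of Balance values and plot the fitted curve
x_grid <- seq(min(X), max(X), length.out = 101)
y_pred <- sapply(x_grid, function(x) knn_reg(X, y, x, K = 10))
plot(X, y, col = "grey", xlab = "Balance", ylab = "Rating")
lines(x_grid, y_pred, lwd = 2)
```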

3.4.3 Further Reading

Recommended further reading:

  • (Hastie et al. 2017: Section 13.3)