K Nearest Neighbors
Create a KNN distance computation according to these specs
Answer
- Initialize distances
- Calculate the L2 distance $\sqrt{\sum_i (x_i - y_i)^2}$, keeping shapes/broadcasting in mind (see the sketch below)
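A minimal vectorized sketch, assuming `X` is the `(num_test, D)` test matrix and `X_train` the `(num_train, D)` training matrix; the function name and signature are illustrative, not the assignment's exact interface:

```python
import numpy as np

def compute_distances(X, X_train):
    # Squared terms, shaped for broadcasting: (num_test, 1) and (num_train,).
    test_sq = np.sum(X ** 2, axis=1, keepdims=True)
    train_sq = np.sum(X_train ** 2, axis=1)
    cross = X @ X_train.T                    # (num_test, num_train)
    # Expand (x - y)^2 = x^2 - 2xy + y^2; clip tiny negatives from float error.
    sq_dists = np.maximum(test_sq - 2.0 * cross + train_sq, 0.0)
    return np.sqrt(sq_dists)
```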
Implement cross validation with the following parameters
Answer
- Set up the folds
- Loop through choices
- Loop through fold combinations (use `concatenate`, `compress`, and `arange` for the operation, as in the sketch after this list)
- Train on the respective fold and store the accuracies
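A sketch of the fold loop, assuming a hypothetical `classifier` object with `train(X, y)` and `predict(X, k=...)` methods (that interface is an assumption, not the assignment's actual API):

```python
import numpy as np

def cross_validate(classifier, X_train, y_train, k_choices, num_folds=5):
    X_folds = np.array_split(X_train, num_folds)
    y_folds = np.array_split(y_train, num_folds)
    accuracies = {k: [] for k in k_choices}
    for k in k_choices:
        for fold in range(num_folds):
            # Indices of every fold except the held-out one.
            keep = np.compress(np.arange(num_folds) != fold, np.arange(num_folds))
            X_tr = np.concatenate([X_folds[i] for i in keep])
            y_tr = np.concatenate([y_folds[i] for i in keep])
            classifier.train(X_tr, y_tr)
            y_pred = classifier.predict(X_folds[fold], k=k)
            accuracies[k].append(np.mean(y_pred == y_folds[fold]))
    return accuracies
```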
Support Vector Machine Classifier
Calculate the SVM gradient and loss according to specs
Answer
- Initialize loss and gradient
- Get the raw scores
- Pull out each example's correct-class score, reshaped so it broadcasts against the score matrix (watch the axes)
- Compute the margins with the hinge: max(0, s_j − s_y + Δ), where the margin Δ is 1 here
- Get the total loss with regularization
- Calculate the gradient
  - Build a matrix of 1s/0s marking where the margin is violated
  - Update the correct-label entries to minus the row's violation count (remember the equation)
  - Compute the gradient using this matrix
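A sketch of how the spec above might come together, assuming `W` is `(D, C)`, `X` is `(N, D)`, `y` holds integer labels, and `reg` is the L2 regularization strength:

```python
import numpy as np

def svm_loss_vectorized(W, X, y, reg):
    N = X.shape[0]
    scores = X @ W                                   # raw scores, (N, C)
    correct = scores[np.arange(N), y][:, None]       # correct-class scores, (N, 1)
    margins = np.maximum(0, scores - correct + 1.0)  # hinge with margin 1
    margins[np.arange(N), y] = 0                     # correct class contributes no loss
    loss = margins.sum() / N + reg * np.sum(W * W)

    mask = (margins > 0).astype(float)               # 1/0 matrix of violated margins
    mask[np.arange(N), y] = -mask.sum(axis=1)        # correct labels get minus the count
    dW = X.T @ mask / N + 2 * reg * W
    return loss, dW
```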
Given the above loss function, implement training with SGD according to these specs
Answer
- Initialize weights
- For each iteration:
  - Get random indices for the minibatch
  - Compute the loss and gradient
  - Append the loss to the history
  - Update the weights with the gradient (see the sketch below)
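A minimal SGD loop under the same shape assumptions; `loss_fn` stands in for a loss like the SVM one above, and the hyperparameter defaults are illustrative:

```python
import numpy as np

def train_sgd(loss_fn, X, y, num_classes, learning_rate=1e-3, reg=1e-5,
              num_iters=1000, batch_size=200, seed=0):
    rng = np.random.default_rng(seed)
    # Small random initialization of the weights.
    W = 0.001 * rng.standard_normal((X.shape[1], num_classes))
    loss_history = []
    for _ in range(num_iters):
        # Sample a random minibatch (with replacement, as is common).
        idx = rng.integers(0, X.shape[0], batch_size)
        loss, dW = loss_fn(W, X[idx], y[idx], reg)
        loss_history.append(loss)
        W -= learning_rate * dW
    return W, loss_history
```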
Softmax Classifier
Implement gradient and loss for softmax classifier given these specs
Answer
Remember that the softmax classifier shares the SVM's linear scoring; it just swaps the hinge loss for cross-entropy. This means that you:
1. Get the raw outputs (logits)
2. Normalize (subtract each row's max for numerical stability)
3. Exponentiate
4. Get the softmax values (softmax equation)
Then, to get the loss:
5. Take the negative log of the softmax value at the correct class's index
6. Average and add regularization
To get the gradient:
7. Update the correct-class entries (subtract 1)
8. Calculate the gradient
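A sketch under the same conventions as the SVM example, with comments keyed to the numbered steps above:

```python
import numpy as np

def softmax_loss_vectorized(W, X, y, reg):
    N = X.shape[0]
    logits = X @ W                                   # 1. raw outputs
    logits -= logits.max(axis=1, keepdims=True)      # 2. normalize for stability
    exp = np.exp(logits)                             # 3. exponentiate
    probs = exp / exp.sum(axis=1, keepdims=True)     # 4. softmax values

    # 5-6. Negative log of the correct class, averaged, plus regularization.
    loss = -np.log(probs[np.arange(N), y]).mean() + reg * np.sum(W * W)

    dscores = probs.copy()
    dscores[np.arange(N), y] -= 1                    # 7. subtract 1 at the correct class
    dW = X.T @ dscores / N + 2 * reg * W             # 8. gradient
    return loss, dW
```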
Fully Connected Neural Network
Create an `affine_forward` function according to these specs
Answer
- Reshape the input
- Compute output
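A sketch of the standard layout, assuming `x` has shape `(N, d_1, ..., d_k)`, `w` is `(D, M)` with `D` the flattened input dimension, and `b` is `(M,)`:

```python
import numpy as np

def affine_forward(x, w, b):
    # Flatten each example to a row vector, then apply the affine map.
    out = x.reshape(x.shape[0], -1) @ w + b
    cache = (x, w, b)        # keep inputs around for the backward pass
    return out, cache
```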
Create an `affine_backward` function according to these specs
Answer
- Reshape the input data
- Compute gradients wrt the:
  - Input
  - Weight
  - Bias
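A matching backward sketch, using the cache from the forward pass above:

```python
import numpy as np

def affine_backward(dout, cache):
    x, w, b = cache
    x2d = x.reshape(x.shape[0], -1)       # same flattening as the forward pass
    dx = (dout @ w.T).reshape(x.shape)    # gradient wrt input, back to original shape
    dw = x2d.T @ dout                     # gradient wrt weights
    db = dout.sum(axis=0)                 # gradient wrt bias
    return dx, dw, db
```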
Create a forward pass for the `ReLU` activation function according to these specs
Answer
- Calculate ReLU: max(0, x)
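A minimal sketch, following the same `(out, cache)` convention as the affine layers:

```python
import numpy as np

def relu_forward(x):
    out = np.maximum(0, x)   # elementwise max(0, x)
    cache = x                # the input decides which gradients pass back
    return out, cache
```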
Create a backward pass for the `ReLU` activation function according to these specs
Answer
- Calculate the gradient (pass `dout` through only where the input was positive)
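A one-line sketch, consuming the cache from `relu_forward`:

```python
def relu_backward(dout, cache):
    x = cache
    # Upstream gradient flows only where the input was positive.
    return dout * (x > 0)
```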
Implement `svm_loss` according to these specs
Answer
- Get the scores assigned to the correct classes (ensure the shape is correct for broadcasting)
- Calculate the margins
- Use margins to calculate loss
- Calculate the gradient of loss wrt x (remember the SVM loss equation)
- Remember the update for the correct class's entry (minus the number of positive margins in its row)
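A sketch of the layer-style version, which takes raw scores `x` of shape `(N, C)` directly rather than a weight matrix:

```python
import numpy as np

def svm_loss(x, y):
    N = x.shape[0]
    correct = x[np.arange(N), y][:, None]       # (N, 1) so it broadcasts over classes
    margins = np.maximum(0, x - correct + 1.0)  # hinge with margin 1
    margins[np.arange(N), y] = 0                # correct class contributes no loss
    loss = margins.sum() / N

    dx = (margins > 0).astype(x.dtype)          # 1 where the margin is violated
    dx[np.arange(N), y] -= dx.sum(axis=1)       # correct entry: minus the violation count
    dx /= N
    return loss, dx
```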
Implement `softmax_loss` according to these specs
Answer
- Compute the raw logit outputs
- Normalize the outputs
- Turn outputs into softmax values
- Calculate the loss (negative log likelihood of the correct class)
- Calculate the gradient
  - Remember to update the probabilities first (subtract 1 at the correct class)
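A sketch in the same layer style, taking logits `x` of shape `(N, C)`:

```python
import numpy as np

def softmax_loss(x, y):
    N = x.shape[0]
    shifted = x - x.max(axis=1, keepdims=True)     # subtract row max for stability
    probs = np.exp(shifted)
    probs /= probs.sum(axis=1, keepdims=True)
    loss = -np.log(probs[np.arange(N), y]).mean()  # negative log likelihood
    dx = probs.copy()
    dx[np.arange(N), y] -= 1                       # update the probabilities first
    dx /= N
    return loss, dx
```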
Build out a full Two Layer Neural Network according to these specs
Answer
- Initialize:
  - Weights
  - Biases
- Loss function:
  - Run `affine_forward`, `relu_forward`, then `affine_forward` to get the scores
  - Get the loss and dloss with `softmax_loss`
  - Run `affine_backward`, `relu_backward`, then `affine_backward` to get the gradients wrt all weights, biases, etc.
- Update weights (as sketched below)
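A sketch wiring the layer functions above into a two-layer network; the class name and hyperparameters (`weight_scale`, `reg`) are illustrative, not the assignment's exact interface:

```python
import numpy as np

class TwoLayerNet:
    """Affine -> ReLU -> affine -> softmax, built from the layer sketches above."""

    def __init__(self, input_dim, hidden_dim, num_classes,
                 weight_scale=1e-3, reg=0.0, seed=0):
        rng = np.random.default_rng(seed)
        # Small random weights, zero biases.
        self.params = {
            'W1': weight_scale * rng.standard_normal((input_dim, hidden_dim)),
            'b1': np.zeros(hidden_dim),
            'W2': weight_scale * rng.standard_normal((hidden_dim, num_classes)),
            'b2': np.zeros(num_classes),
        }
        self.reg = reg

    def loss(self, X, y):
        W1, b1 = self.params['W1'], self.params['b1']
        W2, b2 = self.params['W2'], self.params['b2']

        # Forward: affine -> ReLU -> affine.
        h, fc1_cache = affine_forward(X, W1, b1)
        a, relu_cache = relu_forward(h)
        scores, fc2_cache = affine_forward(a, W2, b2)

        # Loss from softmax, plus L2 regularization on the weights.
        loss, dscores = softmax_loss(scores, y)
        loss += 0.5 * self.reg * (np.sum(W1 * W1) + np.sum(W2 * W2))

        # Backward: same layers in reverse order.
        grads = {}
        da, grads['W2'], grads['b2'] = affine_backward(dscores, fc2_cache)
        dh = relu_backward(da, relu_cache)
        _, grads['W1'], grads['b1'] = affine_backward(dh, fc1_cache)
        grads['W1'] += self.reg * W1
        grads['W2'] += self.reg * W2
        return loss, grads
```

The actual weight updates would live in an outer training loop like the SGD sketch earlier, stepping each entry of `params` by its gradient.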