Question:

Last updated: 10/15/2023

In class we considered optimizing the Mean Squared Error (MSE); in practice there are other choices of loss function as well. For this problem we will consider linear regression using the Mean Absolute Error (MAE) as our loss function. Specifically, the MAE loss is given as

J_{MAE}(\theta) = \frac{1}{n} \sum_{i=1}^{n} \left| \theta^T x^{(i)} - y^{(i)} \right|

(a) (4 points) Derive the partial derivative \frac{\partial J_{MAE}(\theta)}{\partial \theta_j}. Hint: for this question you don't need to worry about the points where J_{MAE}(\theta) is not differentiable; you can use the sign function, defined as \mathrm{sign}(z) = 1 if z \geq 0 and -1 otherwise.

Solution:

\frac{\partial J_{MAE}(\theta)}{\partial \theta_j} = \frac{1}{n} \sum_{i=1}^{n} \mathrm{sign}\left( \theta^T x^{(i)} - y^{(i)} \right) x_j^{(i)}

(b) (2 points) Write the vectorized solution for the gradient of the loss function, i.e. \nabla_\theta J_{MAE}(\theta).

Solution:

\nabla_\theta J_{MAE}(\theta) = \frac{1}{n} \sum_{i=1}^{n} \mathrm{sign}\left( \theta^T x^{(i)} - y^{(i)} \right) x^{(i)}

(c) (4 points) Given the following dataset of 8 points, what is the value of J_{MAE}(\theta) and J_{MSE}(\theta) at \theta = [1, 0, 1, 0]? How about their gradients \nabla_\theta J_{MAE}(\theta) and \nabla_\theta J_{MSE}(\theta)?

[The 8-point dataset table and the gradient values in the solution were garbled in extraction; only the loss values below are recoverable.]

Solution: J_{MAE}(\theta) = 3.375, J_{MSE}(\theta) = 10.3125.
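The derivation above can be sketched in NumPy. This is a minimal illustration, not the course's code: the toy dataset, feature dimension, and value of theta below are made up (the 8-point dataset from part (c) is not reproduced here). It computes the MAE loss and the vectorized (sub)gradient from part (b), and checks the analytic gradient against a central finite difference, which is valid here because no residual sits near a kink of the absolute value.

```python
import numpy as np

def mae_loss(theta, X, y):
    """J_MAE(theta) = (1/n) * sum_i |theta^T x_i - y_i|."""
    return np.mean(np.abs(X @ theta - y))

def mae_grad(theta, X, y):
    """Vectorized (sub)gradient from part (b):
    (1/n) * sum_i sign(theta^T x_i - y_i) * x_i."""
    s = np.sign(X @ theta - y)        # sign of each residual
    return X.T @ s / len(y)

# Hypothetical toy data (NOT the 8-point dataset from part (c));
# the first column is a bias feature of ones.
X = np.array([[1.0,  2.0],
              [1.0, -1.0],
              [1.0,  0.5]])
y = np.array([2.0, 1.0, 2.0])
theta = np.array([1.0, 1.0])

loss = mae_loss(theta, X, y)          # residuals are [1, -1, -0.5]
grad = mae_grad(theta, X, y)

# Central-difference check of the analytic gradient; reliable here
# because every |residual| is far from zero relative to eps.
eps = 1e-6
num = np.array([
    (mae_loss(theta + eps * e, X, y) - mae_loss(theta - eps * e, X, y)) / (2 * eps)
    for e in np.eye(len(theta))
])

print(loss)   # (1 + 1 + 0.5) / 3 = 0.8333...
print(grad)   # [-1/3, 5/6]
```

Replacing `np.sign` with the residual itself (and dividing by an extra factor from the square) gives the familiar MSE gradient, which is a useful sanity check when comparing the two losses as in part (c).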