UFLDL-Softmax

These are my solutions to the UFLDL programming exercise on softmax regression.

Cost function and gradient after adding weight decay (softmax regression has an unusual property: its parameter set is "redundant", i.e. over-parameterized, which is why the weight-decay term is added):
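The cost function and gradient with weight decay, as given in the UFLDL notes (with $m$ examples, $k$ classes, $1\{\cdot\}$ the indicator function):

$$
J(\theta) = -\frac{1}{m}\left[\sum_{i=1}^{m}\sum_{j=1}^{k} 1\{y^{(i)}=j\}\log\frac{e^{\theta_j^\top x^{(i)}}}{\sum_{l=1}^{k} e^{\theta_l^\top x^{(i)}}}\right] + \frac{\lambda}{2}\sum_{i=1}^{k}\sum_{j=0}^{n}\theta_{ij}^2
$$

$$
\nabla_{\theta_j} J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\left[x^{(i)}\left(1\{y^{(i)}=j\} - p\bigl(y^{(i)}=j \mid x^{(i)};\theta\bigr)\right)\right] + \lambda\,\theta_j
$$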

Usage of the `bsxfun` function (it applies an element-wise binary operation to two arrays, expanding singleton dimensions so that, e.g., a row vector can be subtracted from every row of a matrix):
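As a sketch of what the two `bsxfun` calls in the code below do, here is a NumPy equivalent (NumPy broadcasts shapes implicitly, which plays the role of `bsxfun`; the matrix `M` is a made-up 3-class, 4-example score matrix standing in for `theta * data`):

```python
import numpy as np

# Hypothetical score matrix (num_classes x num_cases).
M = np.array([[1.0, 2.0, 3.0, 4.0],
              [2.0, 0.0, 1.0, 3.0],
              [0.0, 1.0, 2.0, 1.0]])

# bsxfun(@minus, M, max(M, [], 1)): subtract each column's max
# (the (1, n) row broadcasts against the (k, n) matrix).
M_shifted = M - M.max(axis=0, keepdims=True)

# bsxfun(@rdivide, h, sum(h)): divide each column by its sum,
# so each column becomes a probability distribution over classes.
h = np.exp(M_shifted)
h = h / h.sum(axis=0, keepdims=True)

print(h.sum(axis=0))  # each column now sums to 1
```

Subtracting the column-wise maximum before exponentiating does not change the final ratios but prevents `exp` from overflowing on large scores.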

Exercise answers (try to complete them yourself first, then check against these):

  • softmaxCost.m:
M = theta * data;                     % M(l, i) = theta(l, :) * x(i)
M = bsxfun(@minus, M, max(M, [], 1)); % subtract each column's max for numerical stability
h = exp(M);
h = bsxfun(@rdivide, h, sum(h));      % normalize columns: h(l, i) = p(y(i) = l | x(i))
cost = -1/numCases * sum(sum(groundTruth .* log(h))) + lambda/2 * sum(sum(theta.^2));
thetagrad = -1/numCases * ((groundTruth - h) * data') + lambda * theta;
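For readers working outside MATLAB, the same cost and gradient can be sketched in NumPy. The function and variable names below are illustrative, not part of the UFLDL starter code:

```python
import numpy as np

def softmax_cost(theta, data, labels, lam):
    """Softmax cost and gradient with weight decay (NumPy sketch).

    theta  : (num_classes, input_size) parameter matrix
    data   : (input_size, num_cases) examples, one per column
    labels : (num_cases,) integer class labels in [0, num_classes)
    lam    : weight-decay coefficient lambda
    """
    num_cases = data.shape[1]
    num_classes = theta.shape[0]

    # Indicator matrix: ground_truth[c, i] = 1 iff example i has label c.
    ground_truth = np.zeros((num_classes, num_cases))
    ground_truth[labels, np.arange(num_cases)] = 1.0

    M = theta @ data
    M -= M.max(axis=0, keepdims=True)   # numerical stability
    h = np.exp(M)
    h /= h.sum(axis=0, keepdims=True)   # class probabilities per column

    cost = (-np.sum(ground_truth * np.log(h)) / num_cases
            + lam / 2 * np.sum(theta ** 2))
    grad = -(ground_truth - h) @ data.T / num_cases + lam * theta
    return cost, grad
```

A finite-difference gradient check (as the exercise recommends) is a good way to confirm `grad` matches the numerical derivative of `cost` before plugging it into the optimizer.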
  • softmaxPredict.m:
[~, pred] = max(theta * data, [], 1); % predicted class = row index of the largest score
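The NumPy analogue of this prediction step is a single `argmax` over the class axis (a sketch, with an illustrative function name):

```python
import numpy as np

def softmax_predict(theta, data):
    """Return the highest-scoring class index for each column of data."""
    return np.argmax(theta @ data, axis=0)
```

Note that normalizing the scores into probabilities is unnecessary for prediction: softmax is monotonic, so the class with the largest raw score `theta @ data` already has the largest probability.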