Fully Connected Neural Network Lab


November 11, 2018

by Zongcai Feng


Figure 1: (a) Display of input vectors, (b) Corresponding labeling.


Figure 2: (a) FNN convergence plot, (b) observed classes, (c) predicted classes.

1 Objective

Generate a training set of vectors that contain 1s and 0s. Classify which vectors contain only 0s (class=1) and which do not (class=0). Use a multilayer neural network that takes input parameters to set up the FNN structure.
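The lab code itself is in MATLAB; as an illustrative sketch only (the sizes N and M and the variable names are assumptions, not taken from the lab code), such a training set can be generated like this in NumPy:

```python
import numpy as np

# Illustrative sketch: generate M random binary vectors of length N
# (one vector per column), then label each vector class 1 if it
# contains only 0s, class 0 otherwise.
rng = np.random.default_rng(0)
N, M = 4, 1000                         # vector length, number of samples
X = rng.integers(0, 2, size=(N, M))    # columns are vectors of 1s and 0s
y = (X.sum(axis=0) == 0).astype(int)   # class 1 iff every entry is 0
```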

2 Prerequisites


Figure 3: Softmax function and its gradient.
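The figure itself is not reproduced here; for reference, the softmax function and its gradient, reconstructed to be consistent with the code in Section 4, are

g_i(z) = exp(z_i) / sum_k exp(z_k),

dg_i/dz_j = g_i (delta_ij - g_j),

i.e. the Jacobian of the softmax is diag(g) - g g^T, which is exactly what the line dg(:,:,i)=diag(g(:,i))-g(:,i)*g(:,i)' computes for each sample.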

3 Procedure

  1. Download the code ./FNN_multiclass_code.zip and unzip it.
  2. Change your MATLAB working directory, type "NNode_multiclass" in MATLAB, and follow the instructions to input your parameter settings. The predicted classes will be displayed.
  3. What is the effect on convergence rate and accuracy when increasing the number of nodes in a layer? Explain the reason for this behavior.
  4. What is the effect on convergence rate for increasing the number of layers in the neural network? Explain the reason for this behavior.
  5. Add some mislabeling errors to the labels. How sensitive is the network to mislabeled data? Is the percentage of prediction errors proportional to the percentage of labeling errors?
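For step 5, one way to inject labeling errors is to flip a fixed fraction of the binary labels at random. This NumPy sketch is illustrative only (the helper name and the fraction p are assumptions, not part of the lab code):

```python
import numpy as np

def corrupt_labels(y, p, rng):
    """Flip a fraction p of the binary labels in y at random (0 <-> 1)."""
    y_noisy = y.copy()
    n_flip = int(round(p * y.size))
    idx = rng.choice(y.size, size=n_flip, replace=False)
    y_noisy[idx] = 1 - y_noisy[idx]
    return y_noisy

rng = np.random.default_rng(0)
y = np.zeros(100, dtype=int)            # toy labels, all class 0
y_noisy = corrupt_labels(y, 0.10, rng)  # mislabel 10% of them
```

Training on y_noisy while measuring prediction error against the clean y lets you compare the error percentage to the labeling-error percentage directly.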

4 Update of the code

function [ g,dg ] = activation( z,type )
%activation function (softmax)
%input:
%  z: input to the activation function (one column per sample)
%  type: activation type (unused in this softmax snippet)
%output:
%  g is the value of the activation function
%  dg is the gradient of g with respect to z
[n1,n2]=size(z);
ez=exp(z);
ezsum=sum(ez,1);                 % column sums of exp(z)
g=ez./repmat(ezsum,[n1,1]);      % softmax of each column
dg=zeros(n1,n1,n2);
for i=1:n2
    dg(:,:,i)=diag(g(:,i))-g(:,i)*g(:,i)';  % Jacobian for sample i
end
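For comparison, here is a NumPy sketch of the same softmax activation. The max-shift before exponentiation is an added numerical-stability tweak not present in the MATLAB code above; it does not change the result:

```python
import numpy as np

def softmax_activation(z):
    """Softmax of each column of z and the per-sample Jacobians.

    Returns g (n1, n2) and dg (n1, n1, n2), matching the MATLAB
    activation function above.
    """
    ez = np.exp(z - z.max(axis=0))      # shift for numerical stability
    g = ez / ez.sum(axis=0)             # softmax of each column
    n1, n2 = z.shape
    dg = np.empty((n1, n1, n2))
    for i in range(n2):
        # Jacobian: diag(g) - g g^T for each sample
        dg[:, :, i] = np.diag(g[:, i]) - np.outer(g[:, i], g[:, i])
    return g, dg
```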

Backpropagation in gradientnn.m

[~,dg]=activation(zz{iter});  %%%----- in=dg[i].*in (in book) ------%%%
if iter ==layer_num-1 %output layer use softmax
    for i=1:M
        in(:,i)=dg(:,:,i)'*in(:,i);
    end
end
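The loop above multiplies the incoming sensitivity of each sample by the transposed softmax Jacobian. A NumPy sketch of the same step (the function name is an assumption for illustration):

```python
import numpy as np

def apply_softmax_jacobian(dg, delta):
    """Multiply each column of delta by the transposed per-sample Jacobian.

    dg:    (n1, n1, M) stack of Jacobians, one per sample
    delta: (n1, M) incoming sensitivities, one column per sample
    """
    out = np.empty_like(delta)
    for i in range(delta.shape[1]):
        out[:, i] = dg[:, :, i].T @ delta[:, i]
    return out
```

Since the softmax Jacobian diag(g) - g g^T is symmetric, the transpose here is a no-op for softmax, but it keeps the code correct for any activation whose Jacobian is not symmetric.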