Model Feed forward and Back propagation


I'm gonna explain in detail how model_tensor_input_ff and model_tensor_input_bp work!

Model_tensor_input_ff:

The function:

void model_tensor_input_ff(model* m, int tensor_depth, int tensor_i, int tensor_j, float* input)

Description:

/* This function computes the feed forward for a model m. Each layer at index l computes its feed forward
 * taking as input the first layer at index l-1. If the input is a 1d array, you should split its dimensions
 * into 3 dimensions to turn the input into a tensor, for example:
 * given an input array of length 59, it can be split into 3 dimensions: depth = 1, rows = 1, cols = 59
 * 
 * Input:
 *             
 *             @ model* m:= the model with the layers
 *             @ int tensor_depth:= the depth of the input tensor
 *             @ int tensor_i:= the number of rows of the tensor
 *             @ int tensor_j:= the number of columns of the tensor
 *             @ float* input:= your input array
 * 
 * */
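For example, a minimal usage sketch (assuming the model m and the input values have already been set up elsewhere with the library's own functions):

    float input[59]; /* flat 1d input of length 59, filled with your data */
    /* pass it as a 1 x 1 x 59 tensor: depth = 1, rows = 1, cols = 59 */
    model_tensor_input_ff(m, 1, 1, 59, input);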

Line By Line description:

If no model is passed as a parameter, the function returns without doing anything.

if(m == NULL)
    return;

There are 4 fundamental functions used during the feed forward: fcl_fcl_ff, fcl_cl_ff, cl_fcl_ff, cl_cl_ff. These functions compute respectively the feed forward between fully-connected and fully-connected layers, fully-connected and convolutional layers, convolutional and fully-connected layers, and convolutional and convolutional layers, handling all the cases where there is DROPOUT, NO_DROPOUT, DROPOUT_TEST, CONVOLUTION, NO_CONVOLUTION, PADDING, POOLING, NORMALIZATION. So, to make things easier, the input is put inside a convolutional layer (temp), where it is copied into temp->post_activation. The activation flag is set to SIGMOID just to tell the next layer during the feed forward: "Hey, there is an activation in this convolutional layer, so look inside the temp->post_activation array for your input."

/* Setting the input inside a convolutional structure*/
    cl* temp = (cl*)malloc(sizeof(cl));
    temp->post_activation = (float*)malloc(sizeof(float)*tensor_depth*tensor_i*tensor_j);
    temp->normalization_flag = NO_NORMALIZATION; /* no normalization on this fake output */
    temp->pooling_flag = NO_POOLING; /* no pooling either */
    temp->activation_flag = SIGMOID; /* tells the next layer to read post_activation */
    temp->n_kernels = tensor_depth; /* the input tensor dimensions become the layer's output dimensions */
    temp->rows1 = tensor_i;
    temp->cols1 = tensor_j;
    copy_array(input,temp->post_activation,tensor_depth*tensor_i*tensor_j); /* copy the user input into post_activation */
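
Since temp is heap-allocated only to carry the input, it also has to be released once the feed forward is done. A minimal sketch of that cleanup (this excerpt does not show where the function actually performs it):

    /* free the temporary wrapper once the layers no longer need it */
    free(temp->post_activation);
    free(temp);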

There is a double "for" cycle, but the running time is still O(m->layers). The double cycle exists only because you can have multiple layers at the same level, for example 2 layers with the layer param set to the same number. Pay attention to this case, because the feed forward is computed considering as input only the first layer before the current one. Having multiple layers at the same level is not recommended: if you want a ramification of the feed forward and backpropagation, it is recommended to use multiple models. The sla matrix is a matrix of m->layers*m->layers; a generic sla[i][j] is set to 0 if there is no layer there, otherwise it is set to the FCLS, CLS or RLS flag. Here is an example:

model* m is a model with 6 total layers: 2 convolutional layers inside a residual layer, 2 convolutional layers and 2 fully-connected layers. Then sla looks like this:

    sla[0] = { RLS,  0, 0, 0, 0, 0 }
    sla[1] = { RLS,  0, 0, 0, 0, 0 }
    sla[2] = { CLS,  0, 0, 0, 0, 0 }
    sla[3] = { CLS,  0, 0, 0, 0, 0 }
    sla[4] = { FCLS, 0, 0, 0, 0, 0 }
    sla[5] = { FCLS, 0, 0, 0, 0, 0 }
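To make this concrete, here is a hedged sketch that walks the sla matrix of the example above exactly like the feed-forward cycle below does (only m->sla, m->layers and the FCLS/CLS/RLS flags come from the library; the printing is mine and needs #include <stdio.h>):

    int i,j;
    for(i = 0; i < m->layers; i++){
        for(j = 0; j < m->layers && m->sla[i][j] != 0; j++){
            if(m->sla[i][j] == RLS)
                printf("level %d, position %d: residual\n", i, j);
            else if(m->sla[i][j] == CLS)
                printf("level %d, position %d: convolutional\n", i, j);
            else if(m->sla[i][j] == FCLS)
                printf("level %d, position %d: fully-connected\n", i, j);
        }
    }

For the 6-layer example this prints two residual entries (the 2 convolutional layers wrapped in the residual layer), then two convolutional and two fully-connected ones, all at position 0 of their level.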

/* apply the feed forward to the model*/
    for(i = 0; i < m->layers; i++){
        for(j = 0; j < m->layers && m->sla[i][j] != 0; j++){