This will impact the `forward` and `backward` methods of the following types:

* `network`
* `layer`
* `dense_layer`
* `conv2d_layer`
Effectively, rather than looping over the samples in a batch inside of `network % train`, we will pass batches of data all the way down to the lowest level, that is, to the `forward` and `backward` methods of the `dense_layer` and `conv2d_layer` types. Pushing the loop over samples down the call stack will also allow implementing a `batchnorm_layer`.
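For concreteness, here is a minimal self-contained sketch of what a batched `forward` could look like for a dense layer, with the input gaining a trailing batch dimension. The standalone module and the component names (`weights`, `biases`, `output`) are assumptions for illustration, not the final API:

```fortran
! Hypothetical sketch of a batched dense-layer forward pass; not the final API.
module batched_dense_sketch
  implicit none

  ! Minimal stand-in for the dense_layer type; component names are assumptions.
  type :: dense_layer
    real, allocatable :: weights(:,:)  ! (output_size, input_size)
    real, allocatable :: biases(:)     ! (output_size)
    real, allocatable :: output(:,:)   ! (output_size, batch_size)
  contains
    procedure :: forward
  end type dense_layer

contains

  pure subroutine forward(self, input)
    ! input now carries a trailing batch dimension: (input_size, batch_size)
    class(dense_layer), intent(in out) :: self
    real, intent(in) :: input(:,:)
    integer :: n
    ! One matmul over the whole batch replaces the per-sample loop
    ! that previously lived in network % train.
    self % output = matmul(self % weights, input)
    ! Add the biases to each sample's pre-activations.
    do concurrent (n = 1:size(input, dim=2))
      self % output(:,n) = self % output(:,n) + self % biases
    end do
    ! The activation function would be applied elementally here.
  end subroutine forward

end module batched_dense_sketch
```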
It will also potentially allow more efficient matmuls in the dense and conv layers if we replace the intrinsic `matmul` with a more specialized and efficient `sgemm` (or similar) from a BLAS implementation such as OpenBLAS or MKL.
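As a rough illustration of the BLAS route, the batched matmul in the sketch above could map onto `sgemm` like this. The wrapper routine and its argument layout are assumptions; `sgemm` itself is the standard BLAS single-precision GEMM, computing `C := alpha*op(A)*op(B) + beta*C`:

```fortran
! Hypothetical drop-in for the batched matmul using BLAS sgemm;
! link against an implementation, e.g. -lopenblas or MKL.
subroutine dense_forward_sgemm(weights, input, output)
  implicit none
  real, intent(in) :: weights(:,:)  ! (output_size, input_size)
  real, intent(in) :: input(:,:)    ! (input_size, batch_size)
  real, intent(out) :: output(:,:)  ! (output_size, batch_size)
  integer :: m, n, k
  external :: sgemm  ! standard BLAS single-precision GEMM
  m = size(weights, 1)  ! output_size
  k = size(weights, 2)  ! input_size
  n = size(input, 2)    ! batch_size
  ! output := 1.0 * weights * input + 0.0 * output
  call sgemm('N', 'N', m, n, k, 1.0, weights, m, input, k, 0.0, output, m)
end subroutine dense_forward_sgemm
```

One motivation for batching first is that `sgemm` amortizes its overhead over the whole `(input_size, batch_size)` operand, whereas per-sample matrix-vector products leave most of that performance on the table.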