Tags: oonisim/python-programs
Added standardization layer; removed BN (batch normalization) from the network objective stack.
NLP NN standardization layer implemented and tested.
NLP NN Sequential layer implemented and tested.
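The repository's standardization layer is not reproduced here; as a rough sketch of the idea (the class name `Standardization` and the `eps` guard are assumptions for illustration, not the repository's actual API), a z-score layer normalizes each feature to zero mean and unit variance over the batch:

```python
import numpy as np

class Standardization:
    """Z-score standardization layer: Y = (X - mean) / sd per feature."""
    def __init__(self, eps: float = 1e-8):
        self.eps = eps  # guard against division by zero for constant features

    def function(self, X: np.ndarray) -> np.ndarray:
        self.mean = X.mean(axis=0)          # per-feature mean over the batch
        self.sd = X.std(axis=0) + self.eps  # per-feature standard deviation
        return (X - self.mean) / self.sd

X = np.array([[1.0, 10.0], [3.0, 30.0], [5.0, 50.0]])
Y = Standardization().function(X)
# Each column of Y now has (approximately) zero mean and unit variance.
```

Unlike BN, a layer like this keeps no learnable gamma/beta parameters, which is consistent with removing BN from the objective stack.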
Test the effect of zero-clearing the bias weight for zero-centered data.
Because the data is almost zero-centered, the bias x0 is not required. Hence set the bias weight w0 to zero to short-cut the training. Without, the
import copy  # needed for deepcopy below

W1_bias_0 = copy.deepcopy(W1)  # np.copy() is sufficient without deepcopy.
W2_bias_0 = copy.deepcopy(W2)
W1_bias_0[::, 0] = 0  # zero-clear the bias weights w0 (column 0)
W2_bias_0[::, 0] = 0
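As a self-contained check of the snippet above (the W1 here is a dummy random matrix, an assumption for illustration only), zero-clearing column 0 disables the bias term while leaving every other weight untouched:

```python
import copy
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))  # dummy weight matrix; column 0 holds the bias w0
W1_bias_0 = copy.deepcopy(W1)     # np.copy() would also suffice here
W1_bias_0[::, 0] = 0              # zero-clear the bias weights

assert np.all(W1_bias_0[::, 0] == 0)                  # bias column cleared
assert np.array_equal(W1_bias_0[::, 1:], W1[::, 1:])  # other weights intact
```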
Two-layer ((matmul + ReLU) * 2) + log loss output training on non-linear circles of A NOT B.
Softmax cross entropy log loss tested. Tag before modifying the cross_entropy_log_loss() for sigmoid/binary label.
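For reference, a minimal numerically stable softmax cross entropy log loss for integer class labels might look like the following. The function name matches the tag's `cross_entropy_log_loss()`, but the body is a sketch under that assumption, not the repository's implementation (which the tag notes was about to change to also handle sigmoid/binary labels):

```python
import numpy as np

def cross_entropy_log_loss(X: np.ndarray, T: np.ndarray, eps: float = 1e-15) -> np.ndarray:
    """Softmax + cross entropy log loss.
    X: (N, M) activations. T: (N,) integer class labels in [0, M).
    Returns per-sample loss -log(p_t) of shape (N,)."""
    X = X - X.max(axis=-1, keepdims=True)                   # shift for numerical stability
    P = np.exp(X) / np.exp(X).sum(axis=-1, keepdims=True)   # softmax probabilities
    rows = np.arange(T.shape[0])
    return -np.log(np.clip(P[rows, T], eps, 1.0))           # clip to avoid log(0)

X = np.array([[2.0, 1.0, 0.1]])
loss = cross_entropy_log_loss(X, np.array([0]))
```

Subtracting the row-wise max before `exp` leaves the softmax output unchanged but prevents overflow for large activations.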