
Application of composite LRP  #255

@HugoTex98

Description

Hello everyone.

I was trying to implement the composite LRP like the one presented in G. Montavon, A. Binder, S. Lapuschkin, W. Samek, and K.-R. Müller, "Layer-Wise Relevance Propagation: An Overview", in Explainable AI: Interpreting, Explaining and Visualizing Deep Learning, Springer LNCS, vol. 11700, 2019, but without success so far.

Does anyone know how I can implement this?

Here is my model:

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras.layers import Conv2D, Dense, Dropout, InputLayer, Reshape


def ConnectomeCNN(input_shape, keep_pr=0.65, n_filter=32, n_dense1=64, n_classes=2):
    bias_init = tf.constant_initializer(value=0.001)
    input_1 = InputLayer(input_shape=input_shape, name="input")
    conv1 = Conv2D(
        filters=n_filter,
        kernel_size=(1, input_shape[1]),
        strides=(1, 1),
        padding="valid",
        activation="selu",
        kernel_initializer="glorot_uniform",
        bias_initializer=bias_init,
        name="conv1",
    )
    # Note: Keras Dropout takes the *drop* rate, so a keep probability
    # keep_pr corresponds to Dropout(1 - keep_pr).
    dropout1 = Dropout(1 - keep_pr, name="dropout1")
    conv2 = Conv2D(
        filters=n_filter * 2,
        kernel_size=(input_shape[1], 1),
        strides=(1, 1),
        padding="valid",
        activation="selu",
        kernel_initializer="glorot_uniform",
        bias_initializer=bias_init,
        name="conv2",
    )
    dropout2 = Dropout(1 - keep_pr, name="dropout2")
    reshape = Reshape((n_filter * 2,), name="reshape")
    dense1 = Dense(
        n_dense1, activation="selu", name="dense1", kernel_regularizer="l1_l2"
    )  # alternatively: kernel_regularizer=regularizers.l1(0.0001)
    activation = "sigmoid" if n_classes == 1 else "softmax"
    output = Dense(n_classes, activation=activation, name="output")

    model = keras.models.Sequential(
        [input_1, conv1, dropout1, conv2, dropout2, reshape, dense1, output]
    )
    return model
```
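For concreteness, here is a minimal NumPy sketch of the composite idea from the Montavon et al. chapter: apply different propagation rules at different depths (LRP-γ near the input, LRP-ε in upper layers), each implemented as the same four-step forward/divide/backward/multiply pattern. The function name `lrp_dense`, the network sizes, and the parameter values are all illustrative, not from the original issue; for the conv layers the same four steps apply with convolutions in place of matrix products.

```python
import numpy as np

def lrp_dense(a, W, b, R_out, rule="epsilon", eps=1e-6, gamma=0.25):
    """Redistribute relevance R_out through one dense layer (input
    activations a, weights W, biases b) using the LRP-epsilon or
    LRP-gamma rule from Montavon et al. (2019)."""
    if rule == "gamma":
        # LRP-gamma: tilt weights/biases toward positive contributions.
        W = W + gamma * np.clip(W, 0.0, None)
        b = b + gamma * np.clip(b, 0.0, None)
    z = a @ W + b                        # step 1: forward pass
    z = z + np.where(z >= 0, eps, -eps)  # stabilizer against z near 0
    s = R_out / z                        # step 2: element-wise division
    c = s @ W.T                          # step 3: backward pass
    return a * c                         # step 4: element-wise product

# Tiny two-layer ReLU network (random weights, zero biases for clarity).
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((8, 16)), np.zeros(16)
W2, b2 = rng.standard_normal((16, 4)), np.zeros(4)

x = rng.standard_normal((1, 8))
a1 = np.maximum(0.0, x @ W1 + b1)  # hidden activations
logits = a1 @ W2 + b2

# Start propagation from the logit of the predicted class only.
R2 = np.zeros_like(logits)
k = int(np.argmax(logits))
R2[0, k] = logits[0, k]

# Composite LRP: epsilon rule near the output, gamma rule near the input.
R1 = lrp_dense(a1, W2, b2, R2, rule="epsilon")
R0 = lrp_dense(x, W1, b1, R1, rule="gamma")
print(R0.shape)  # one relevance score per input feature
```

With zero biases and a small stabilizer, the total relevance is approximately conserved across layers, so `R0.sum()` stays close to the propagated logit, which is a useful sanity check when wiring up the rules.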

Thank you!
