LRP with pretrained ResNet returns weird heatmaps #1035

@razla

Description

Hey,

I’m trying to use your LRP implementation on pretrained models: resnet34 (ImageNet) and resnet32 (CIFAR-10/100).

I needed to convert the ReLU layers to inplace=False to make it work. When I execute this, it runs, but the output is weird and inconsistent…
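
For reference, here's a minimal sketch of this setup (assuming torchvision's pretrained resnet34 and Captum's LRP; the ReLU conversion loop is just an illustration of disabling inplace activations, not my exact code):

```python
import torch
import torchvision.models as models
from captum.attr import LRP

model = models.resnet34(pretrained=True).eval()

# Convert every ReLU to inplace=False so LRP's hooks see the
# original (unoverwritten) activations.
for module in model.modules():
    if isinstance(module, torch.nn.ReLU):
        module.inplace = False

lrp = LRP(model)
x = torch.randn(1, 3, 224, 224, requires_grad=True)  # stand-in for a preprocessed image
attribution = lrp.attribute(x, target=0)  # relevance map with the same shape as x
```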

These are the example input images:

[image: example input images]

And these are the explanations (heatmaps):

[image: LRP heatmap outputs]

This doesn't make sense, since LRP output tends to be much more readable and coherent.

Thanks!
