about gat_layer #30

@adverbial03

Description

Thank you for your excellent work.
I have a question about the GAT layer. Your graph attention is implemented with the function make_attention_input, but it appears to just copy and concatenate x in various ways. I can't see how this part implements graph attention. Could you explain it in detail?
Also, is it possible to build a graph that is not fully connected (i.e., where each node has a fixed number of edges)?
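For context on what the question is asking about: in a standard GAT layer, the attention score for an edge (i, j) is computed from the concatenation of the two node feature vectors, e_ij = LeakyReLU(aᵀ[h_i ‖ h_j]). A common way to build all N×N such pairs at once is exactly the "copy and splice" pattern described above: repeat x along one axis, tile it along another, and concatenate. The sketch below is a hypothetical NumPy re-implementation of that trick (not the repository's actual code), plus a masking step that shows one way a non-fully-connected graph could be handled, by zeroing out attention for absent edges before the softmax.

```python
import numpy as np

def make_attention_input(x):
    # Hypothetical sketch: build every pairwise concatenation
    # [h_i || h_j] for an n-node graph in one shot.
    # x: (n, d) node features -> output: (n, n, 2d)
    n, d = x.shape
    left = np.repeat(x, n, axis=0)   # h_i repeated per target: (n*n, d)
    right = np.tile(x, (n, 1))       # h_j cycled over targets: (n*n, d)
    pairs = np.concatenate([left, right], axis=1)  # (n*n, 2d)
    return pairs.reshape(n, n, 2 * d)

def mask_scores(e, adj):
    # Hypothetical adjacency masking for a sparse graph: scores for
    # non-edges are set to -inf so softmax assigns them zero weight.
    return np.where(adj > 0, e, -np.inf)

x = np.arange(6, dtype=float).reshape(3, 2)  # 3 nodes, 2 features each
a = make_attention_input(x)
# a[i, j] holds the concatenation [h_i || h_j]
```

In a fully connected formulation, attention itself learns which neighbors matter; with the mask above, only declared edges participate, which is one route to a fixed-degree graph.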
