🚚🤖 Add Transformer Interaction #714
Conversation
Is the concept of the transformer interaction an idea proposed by Galkin et al. (2020)? Wondering if I should add the citation in.
Is there a blocker for adding a StarE model appropriate for regular KGs?
StarE collapses to CompGCN if you remove all the hyper-relational parts 🙂 (StarE is "only" the encoder part of what is described in the publication; the decoder is this interaction function.)
So then it's fine to finish this PR without adding a new model, but it would be important to write some narrative documentation explaining how one might use this interaction to construct a new model.
This should be the place to put it? Slightly modify the example there and create an "anonymous" model instead of declaring a new class:

```python
from pykeen.models import ERModel
from pykeen.nn import EmbeddingSpecification
from pykeen.losses import BCEWithLogitsLoss

model = ERModel(
    triples_factory=...,
    loss=BCEWithLogitsLoss(),  # TODO: we should support passing a string here, and use the loss resolver to resolve it
    interaction="transformer",
    entity_representations=EmbeddingSpecification(embedding_dim=64),
    relation_representations=EmbeddingSpecification(embedding_dim=64),
)
```
Thanks Max! A few minor comments:
I changed the init to Xavier with 08ae09a.
This PR only adds the interaction function, not a full model configuration with a loss, etc. I see that we lose the geometric interpretation of margin loss, but is there any other reason why the dot product should not work with a margin? After all, a margin encourages a relative ordering of scores, which could also be achieved by a dot-product-based interaction, if I am not mistaken.
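To make that argument concrete, here is a minimal sketch (plain Python, with toy vectors and a margin value chosen purely for illustration, not taken from PyKEEN): a margin-based ranking loss only needs the positive triple's score to exceed the negative's by the margin, and a dot-product interaction can provide exactly that relative ordering.

```python
# Toy sketch: margin ranking with dot-product scores.
# All vectors and the margin here are hypothetical, not PyKEEN defaults.

def dot(u, v):
    """Dot-product interaction: score from two composed embedding vectors."""
    return sum(a * b for a, b in zip(u, v))

def margin_ranking_loss(pos_score, neg_score, margin=1.0):
    """Hinge loss: zero once the positive outscores the negative by >= margin."""
    return max(0.0, margin - (pos_score - neg_score))

# Hypothetical composed embeddings for a positive and a corrupted triple.
pos = dot([0.9, 0.1, 0.3], [1.0, 0.2, 0.1])    # similar direction -> high score
neg = dot([0.9, 0.1, 0.3], [-0.5, 0.8, -0.2])  # dissimilar       -> low score

loss = margin_ranking_loss(pos, neg)  # positive already outscores by > margin
```

Here `pos` ≈ 0.95 and `neg` ≈ -0.43, so the gap exceeds the margin of 1.0 and the loss is zero; the loss only depends on the score difference, not on any geometric distance interpretation.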
@PyKEEN-bot test |
Sorry, didn't mean to approve - I still want to see some added tutorial functions like you mentioned.
trigger ci
This PR adds the Transformer-based interaction from the StarE paper.
cf.
Note: this PR doesn't add the citation since it's already in references.rst, because we referenced it before.