Hi,
I have a quick question with respect to the relative shift operation:
```python
def _rel_shift(self, x, zero_triu=False):
    zero_pad = torch.zeros((x.size(0), 1, *x.size()[2:]),
                           device=x.device, dtype=x.dtype)
    x_padded = torch.cat([zero_pad, x], dim=1)

    x_padded = x_padded.view(x.size(1) + 1, x.size(0), *x.size()[2:])

    x = x_padded[1:].view_as(x)

    if zero_triu:
        ones = torch.ones((x.size(0), x.size(1)))
        x = x * torch.tril(ones, x.size(1) - x.size(0))[:, :, None, None]

    return x
```
In the Transformer-XL paper, Appendix B (https://arxiv.org/pdf/1901.02860.pdf), the upper-right triangle of matrix B consists of zeros. In the code above, and throughout the model implementation, `zero_triu` is always `False`, so after the relative shift the upper-right triangle is not filled with zeros as described in the paper. In the Hugging Face implementation of this function, the unused parameter is removed entirely (see https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/modeling_transfo_xl.py#L275).
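For illustration, here is a minimal standalone sketch of what I mean (square case with `qlen == klen == 3` and a single batch/head dimension; the free function below just mirrors the `_rel_shift` method above):

```python
import torch

def rel_shift(x, zero_triu=False):
    # Same logic as the _rel_shift method above, written as a free function for a quick test.
    zero_pad = torch.zeros((x.size(0), 1, *x.size()[2:]), device=x.device, dtype=x.dtype)
    x_padded = torch.cat([zero_pad, x], dim=1)
    x_padded = x_padded.view(x.size(1) + 1, x.size(0), *x.size()[2:])
    x = x_padded[1:].view_as(x)
    if zero_triu:
        ones = torch.ones((x.size(0), x.size(1)))
        x = x * torch.tril(ones, x.size(1) - x.size(0))[:, :, None, None]
    return x

# 3x3 score matrix with a single batch and head so it is easy to read:
# [[1, 2, 3],
#  [4, 5, 6],
#  [7, 8, 9]]
x = torch.arange(1., 10.).view(3, 3, 1, 1)

print(rel_shift(x, zero_triu=False)[:, :, 0, 0])
# tensor([[3., 0., 4.],
#         [5., 6., 0.],
#         [7., 8., 9.]])  <- upper-right triangle holds leftover pad/wrap values, not zeros

print(rel_shift(x, zero_triu=True)[:, :, 0, 0])
# tensor([[3., 0., 0.],
#         [5., 6., 0.],
#         [7., 8., 9.]])  <- upper-right triangle explicitly zeroed, as in Appendix B
```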
Is the upper-right triangle masked at a later point anyway, or why can `zero_triu` be neglected?
I was confused about this too, but I think the `attn_mask` functionality applied here and initialized here does the job.
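For anyone else landing here, a quick self-contained sketch of why the mask should make `zero_triu` redundant; the mask construction below reflects my reading of `mem_transformer.py`, and the concrete score values are just placeholders:

```python
import torch

qlen, klen, mlen = 3, 3, 0  # square case without memory, matching the example above

# Causal mask (my reading of how mem_transformer.py builds it): key position j is
# masked for query i whenever j > mlen + i, which is exactly the upper-right
# triangle that zero_triu would zero out after the relative shift.
dec_attn_mask = torch.triu(torch.ones(qlen, klen), diagonal=1 + mlen).bool()
# tensor([[False,  True,  True],
#         [False, False,  True],
#         [False, False, False]])

# Placeholder shifted BD scores: -1, -2, -3 stand in for whatever junk the shift
# leaves in the (un-zeroed) upper-right triangle.
bd = torch.tensor([[3., -1., -2.],
                   [5.,  6., -3.],
                   [7.,  8.,  9.]])

attn_score = bd.masked_fill(dec_attn_mask, float('-inf'))
attn_prob = torch.softmax(attn_score, dim=1)
# The masked columns come out as exactly 0 in every row of attn_prob, so the junk
# values (and hence zero_triu) never influence the attention distribution.
```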