
Update med.py: Fix `BertEncoder.forward()` not returning cross-attentions when requested #171

Open
wants to merge 1 commit into main
Conversation

programmingLearner


In the `BertEncoder.forward()` method, `all_cross_attentions` is defined in Line 409 but never updated, which causes `None` to be returned when cross-attentions are requested. In this revision, `all_cross_attentions` is properly updated and maintained in Line 461. The maintenance code is adapted from the original Hugging Face Transformers library, https://github.com/huggingface/transformers/blob/v4.15.0/src/transformers/models/bert/modeling_bert.py Line 600, and has been tested to work.
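For illustration, here is a minimal, self-contained sketch of the accumulation pattern the fix restores. Variable names mirror `modeling_bert.py` in Transformers v4.15.0, but the layer outputs below are dummy placeholders rather than real attention tensors, and the function is a simplified stand-in for the actual `BertEncoder.forward()`:

```python
def encoder_forward(layer_outputs_per_layer, output_attentions=True):
    # Both tuples are initialized up front (as in Line 409 of med.py) ...
    all_self_attentions = () if output_attentions else None
    all_cross_attentions = () if output_attentions else None

    for layer_outputs in layer_outputs_per_layer:
        if output_attentions:
            all_self_attentions = all_self_attentions + (layer_outputs[1],)
            # ... but without this line the cross-attention tuple stays
            # empty/None. The fix accumulates each layer's cross-attention
            # weights, matching the upstream Transformers implementation.
            all_cross_attentions = all_cross_attentions + (layer_outputs[2],)

    return all_self_attentions, all_cross_attentions


# Dummy per-layer outputs: (hidden_states, self_attn, cross_attn)
layers = [("h0", "sa0", "ca0"), ("h1", "sa1", "ca1")]
self_attns, cross_attns = encoder_forward(layers)
```

With the fix applied, `cross_attns` collects one entry per layer instead of remaining `None`.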

@salesforce-cla

Thanks for the contribution! Before we can merge this, we need @programmingLearner to sign the Salesforce Inc. Contributor License Agreement.

1 participant