
fixed issue number 83:84 Error for attention state missing #91

Open

chahalinder0007 wants to merge 1 commit into bgshih:master from officework1993:master
Conversation


@chahalinder0007 chahalinder0007 commented Nov 16, 2019

The problem is in the call to the _compute_attention() function at line 52.

The attention state required for the computations in the attention wrapper is missing: attention_state is not passed when the function is called, so the next_attention_state it produces is never propagated back to the AttentionWrapper.
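The repository diff is not shown here, but based on the description the fix amounts to passing attention_state into _compute_attention() and carrying the returned next_attention_state forward in the wrapper's state. A minimal, framework-free sketch of that threading pattern (all names below are illustrative stand-ins, not the actual ASTER or TensorFlow code):

```python
def _compute_attention(attention_mechanism, cell_output, attention_state):
    """Returns (attention, alignments, next_attention_state).

    Mirrors the TF 1.x convention where the current attention_state goes in
    and a next_attention_state comes out; the wrapper must store the latter.
    """
    alignments, next_attention_state = attention_mechanism(
        cell_output, state=attention_state)
    attention = alignments  # placeholder for the real context computation
    return attention, alignments, next_attention_state


class ToyAttentionMechanism:
    """Hypothetical stand-in whose state simply counts decode steps."""

    def __call__(self, query, state):
        alignments = [query]    # dummy alignment over a single "position"
        next_state = state + 1  # dummy evolving attention state
        return alignments, next_state


def wrapper_step(mechanism, cell_output, attention_state):
    # The buggy version omitted attention_state and discarded
    # next_attention_state; the fix threads both through each step.
    attention, alignments, next_attention_state = _compute_attention(
        mechanism, cell_output, attention_state)
    return attention, next_attention_state


mechanism = ToyAttentionMechanism()
state = 0
for output in [0.1, 0.2, 0.3]:
    attention, state = wrapper_step(mechanism, output, state)
print(state)  # the state advanced once per decode step
```

Without the fix, every step would receive the same stale (or missing) state instead of the evolving one, which is exactly the error reported in issues 83 and 84.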

@chahalinder0007 (Author)

Hi,
Kindly look at the fixes and let me know if any improvements are required.

Regards
