In cases where the input sequence is a list of sentences,
we concatenate the sentences into a long list of word
tokens, inserting after each sentence an end-of-sentence token.
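The concatenation step the paper describes can be sketched as follows. This is only an illustrative reconstruction, not code from the repository; the `<EOS>` token string and whitespace tokenization are assumptions made for the example.

```python
# Minimal sketch of the input-module preprocessing described above:
# flatten a list of sentences into one long token sequence, inserting
# an end-of-sentence marker after each sentence.
def concat_with_eos(sentences, eos="<EOS>"):
    tokens = []
    for sent in sentences:
        tokens.extend(sent.split())  # whitespace tokenizer (assumption)
        tokens.append(eos)           # EOS after every sentence
    return tokens

print(concat_with_eos(["John went home", "Mary is here"]))
# ['John', 'went', 'home', '<EOS>', 'Mary', 'is', 'here', '<EOS>']
```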
I went through the code, and it's a bit different here. What is actually happening? Is each sentence fed separately through an RNN?
shamanez changed the title from "Can you explain bit about input module of the dynamic memory network mentioned in here." to "Can you explain bit about input module of the dynamic memory network (DMN+) mentioned in here." on Jun 22, 2017.
In the input module, how is the information coming from the context embedded? Some papers mention that they concatenate all the words in the context, add an EOS token at the end of each sentence, feed the sequence through an RNN with GRU units, and then take the hidden state at each time step.
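The behaviour described above, a single GRU run over the concatenated word sequence that collects the hidden state at every time step, can be sketched as below. This is a hedged illustration, not the repository's implementation: the hidden size, the random embeddings, and the NumPy GRU cell are all assumptions made for the example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_hidden_states(embeddings, hidden_size, seed=0):
    """Run a single-layer GRU over a (T, d) sequence of word embeddings
    and return the (T, hidden_size) matrix of hidden states."""
    rng = np.random.default_rng(seed)
    d = embeddings.shape[1]
    # Randomly initialised gate weights: update (z), reset (r), candidate (h~).
    Wz, Uz = rng.normal(size=(d, hidden_size)), rng.normal(size=(hidden_size, hidden_size))
    Wr, Ur = rng.normal(size=(d, hidden_size)), rng.normal(size=(hidden_size, hidden_size))
    Wh, Uh = rng.normal(size=(d, hidden_size)), rng.normal(size=(hidden_size, hidden_size))
    h = np.zeros(hidden_size)
    states = []
    for x in embeddings:
        z = sigmoid(x @ Wz + h @ Uz)                 # update gate
        r = sigmoid(x @ Wr + h @ Ur)                 # reset gate
        h_tilde = np.tanh(x @ Wh + (r * h) @ Uh)     # candidate state
        h = (1 - z) * h + z * h_tilde
        states.append(h)                             # keep every time step
    return np.stack(states)

# Eight tokens (e.g. two sentences plus their EOS markers), 16-dim embeddings.
emb = np.random.default_rng(1).normal(size=(8, 16))
H = gru_hidden_states(emb, hidden_size=32)
print(H.shape)  # (8, 32): one hidden state per input token
```

In the DMN+ variant, these per-token hidden states are then typically pooled per sentence (e.g. reading the state at each EOS position) to form the facts the memory module attends over, which may explain the per-sentence processing seen in the code.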
Ask Me Anything: Dynamic Memory Networks for Natural Language Processing