nn.Recurrent passes an offset variable to :recycle and :forget, but that offset doesn't seem to be used anywhere. Am I missing something, or is the offset parameter unused?
This causes the Recurrent class to store an extra copy of the RecurrentModule, which can be extremely expensive if the module requires significant memory:
require 'rnn'
local _ = require 'moses'
local model = nn.Recurrent(nn.MulConstant(2), nn.MulConstant(3))
print('#sharedClones after init:', _.count(model.sharedClones))
local input = torch.rand(1)
model:forward(input)
model:forward(input)
print('#sharedClones after forward:', _.count(model.sharedClones)) -- 1
model:forget()
print('#sharedClones after forget:', _.count(model.sharedClones)) -- 1
model:forward(input)
model:forward(input)
print('#sharedClones after 2nd forward:', _.count(model.sharedClones)) -- 2: an extra clone, instead of reusing the one from before forget()
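One way an ignored offset can produce this behaviour is sketched below. This is an illustrative, self-contained stand-in with simplified names, not the rnn library's actual bookkeeping: if a clone is stored under an offset-adjusted key but a later lookup ignores the offset, the clone saved before forget() is never found again and a fresh one is allocated, so the table grows from 1 to 2 just like #sharedClones above.

-- Illustrative sketch only: simplified stand-ins, not the rnn library's code.
local clones = {}

local function getClone(key)
   if not clones[key] then clones[key] = {} end
   return clones[key]
end

local offset = 1
getClone(2 + offset)   -- first sequence stores the step-2 clone under key 3
getClone(2)            -- a later lookup that ignores the offset misses it

local n = 0
for _ in pairs(clones) do n = n + 1 end
print('#clones:', n)   -- 2, mirroring #sharedClones above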
Relevant code locations (commit ef98a97):
Call to :recycle: rnn/Recurrent.lua, line 132
Call to :forget: rnn/Recurrent.lua, line 136
Definition of :recycle: rnn/AbstractRecurrent.lua, line 80
Definition of :forget: rnn/AbstractRecurrent.lua, line 113
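The mismatch these four locations point at is the familiar pattern of a parameter that the caller computes and forwards but the definition never reads. A hypothetical, self-contained illustration of that pattern (the class and method names below are invented, not taken from the library source):

-- Hypothetical illustration of the call/definition mismatch; not library code.
local Base = {}
Base.__index = Base

function Base:recycle(offset)
   -- 'offset' is accepted here, yet the body never references it
   self.recycled = (self.recycled or 0) + 1
end

local Derived = setmetatable({}, {__index = Base})
Derived.__index = Derived

function Derived:step()
   local offset = 1       -- computed by the caller...
   self:recycle(offset)   -- ...and passed along, but it changes nothing
end

local d = setmetatable({}, Derived)
d:step()
print(d.recycled)         -- 1, regardless of what offset was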