All Questions

1 vote · 1 answer · 368 views

Stacked RNN to overcome vanishing gradient

Let me clarify that by vanishing gradient I don't mean the gradient becoming zero (as in a DNN with multiple sigmoidal layers), but rather the difficulty of learning long-term dependencies. So, LSTMs are well known to ...
Alberto · 1,381
1 vote · 1 answer · 360 views

Simulating data with long range dependencies

I want to evaluate how well a recurrent neural network I've created captures long-range dependencies, and the effects that altering the network has on this. I'm not entirely sure how I would go about ...
as646 · 175
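One common way to simulate data with a controllable long-range dependency is a "copy"-style task, where the target at each step is the input seen a fixed number of steps earlier, so any model must carry information across that gap. The sketch below is a minimal, hypothetical helper (the function name and parameters are not from the question), not the asker's actual setup:

```python
import random

def make_copy_task(n_samples, seq_len, n_symbols, gap, seed=0):
    """Toy dataset with an explicit long-range dependency:
    the target at step t is the input symbol from step t - gap
    (padded with 0 before the gap). Increasing `gap` lengthens
    the dependency the network must capture."""
    rng = random.Random(seed)
    xs, ys = [], []
    for _ in range(n_samples):
        x = [rng.randrange(n_symbols) for _ in range(seq_len)]
        y = [0] * gap + x[:-gap]  # shifted copy of the input
        xs.append(x)
        ys.append(y)
    return xs, ys

# Example: 4 sequences of length 10, targets lag inputs by 3 steps
xs, ys = make_copy_task(n_samples=4, seq_len=10, n_symbols=5, gap=3)
```

Varying `gap` lets you measure how prediction accuracy degrades as the dependency length grows, which is one way to compare architectures on exactly this question.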