Fixed multi-layer issue. Now it's possible to have multiple layers.
ugnelis committed Jan 29, 2018
1 parent 3f485eb commit 2b383ad
Showing 1 changed file with 6 additions and 5 deletions.
train.py
@@ -42,8 +42,8 @@
 # Hyper-parameters.
 NUM_EPOCHS = 200
 NUM_HIDDEN = 50
-NUM_LAYERS = 1
-BATCH_SIZE = 1
+NUM_LAYERS = 2
+BATCH_SIZE = 4
 
 # Optimizer parameters.
 INITIAL_LEARNING_RATE = 1e-2
@@ -101,11 +101,12 @@ def main(argv):
     sequence_length_placeholder = tf.placeholder(tf.int32, [None])
 
     # Defining the cell.
-    cell = tf.contrib.rnn.LSTMCell(NUM_HIDDEN, state_is_tuple=True)
+    def lstm_cell():
+        return tf.contrib.rnn.LSTMCell(NUM_HIDDEN, state_is_tuple=True)
 
     # Stacking rnn cells.
-    stack = tf.contrib.rnn.MultiRNNCell([cell] * NUM_LAYERS,
-                                        state_is_tuple=True)
+    stack = tf.contrib.rnn.MultiRNNCell(
+        [lstm_cell() for _ in range(NUM_LAYERS)], state_is_tuple=True)
 
     # Creates a recurrent neural network.
     outputs, _ = tf.nn.dynamic_rnn(stack, inputs_placeholder, sequence_length_placeholder, dtype=tf.float32)
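For background on the fix: [cell] * NUM_LAYERS repeats one LSTMCell object, so every layer would share a single set of weights. Layer 0 consumes the input features while the layers above consume the NUM_HIDDEN-sized output of the layer below, so one shared kernel cannot match both input shapes, and TensorFlow 1.x fails at graph-construction time. Building a fresh cell per layer, as this commit does, gives each layer its own variables. Below is a minimal, self-contained sketch of the fixed pattern; NUM_FEATURES and the dummy batch are illustrative stand-ins, not values from train.py:

import numpy as np
import tensorflow as tf  # TensorFlow 1.x; tf.contrib was removed in 2.x

NUM_HIDDEN = 50
NUM_LAYERS = 2
NUM_FEATURES = 13  # hypothetical input size, not taken from train.py


def lstm_cell():
    # A new LSTMCell per call, so each layer gets independent weights.
    return tf.contrib.rnn.LSTMCell(NUM_HIDDEN, state_is_tuple=True)


stack = tf.contrib.rnn.MultiRNNCell(
    [lstm_cell() for _ in range(NUM_LAYERS)], state_is_tuple=True)

inputs = tf.placeholder(tf.float32, [None, None, NUM_FEATURES])
seq_len = tf.placeholder(tf.int32, [None])
outputs, _ = tf.nn.dynamic_rnn(stack, inputs, seq_len, dtype=tf.float32)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    batch = np.random.rand(4, 20, NUM_FEATURES).astype(np.float32)
    print(sess.run(outputs, {inputs: batch, seq_len: [20] * 4}).shape)
    # (4, 20, 50): top-layer output at every timestep for each sequence

Swapping the list comprehension back to [lstm_cell()] * NUM_LAYERS should reproduce the original failure, since that calls the factory once and reuses the resulting cell, which is a quick way to confirm the shared-cell diagnosis.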
