Fix: refactor `getattr(keras.optimizers, optimizer)` to `keras.optimizers.get(optimizer)` in autoencoders
RollerKnobster committed Jun 24, 2024
1 parent b64dc08 commit fe3530b
Showing 2 changed files with 2 additions and 2 deletions.
2 changes: 1 addition & 1 deletion gordo/machine/model/factories/feedforward_autoencoder.py
@@ -88,7 +88,7 @@ class (e.x. Adam(lr=0.01,beta_1=0.9, beta_2=0.999)). If no arguments are

     # Instantiate optimizer with kwargs
     if isinstance(optimizer, str):
-        Optim = getattr(keras.optimizers, optimizer)
+        Optim = keras.optimizers.get(optimizer)
         optimizer = Optim(**optimizer_kwargs)

     # Final output layer
2 changes: 1 addition & 1 deletion gordo/machine/model/factories/lstm_autoencoder.py
@@ -90,7 +90,7 @@ class (e.x. Adam(lr=0.01,beta_1=0.9, beta_2=0.999)). If no arguments are

     # output layer
     if isinstance(optimizer, str):
-        Optim = getattr(keras.optimizers, optimizer)
+        Optim = keras.optimizers.get(optimizer)
         optimizer = Optim(**optimizer_kwargs)

     model.add(Dense(units=n_features_out, activation=out_func))
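The change swaps a raw attribute lookup for the framework's own resolver. A minimal, self-contained sketch of why a registry-style `get` helper is more forgiving than `getattr` for resolving an optimizer by name (this uses a hypothetical registry that returns classes, for illustration only; it is not the real Keras API):

```python
class Adam:
    """Stand-in optimizer class for the sketch."""
    def __init__(self, lr=0.001):
        self.lr = lr

# A tiny registry mapping both canonical and lowercase spellings to the
# class, mirroring how a resolver can accept identifiers like "adam"
# that a plain getattr on the module would reject.
_OPTIMIZERS = {"Adam": Adam, "adam": Adam}

def get(identifier):
    """Resolve an optimizer class from a string identifier."""
    try:
        return _OPTIMIZERS[identifier]
    except KeyError:
        # A resolver can raise a clear, domain-specific error instead of
        # the bare AttributeError that getattr would produce.
        raise ValueError(f"Unknown optimizer: {identifier!r}")

Optim = get("adam")          # tolerant of lowercase spelling
optimizer = Optim(lr=0.01)   # instantiate with kwargs, as in the patch
print(type(optimizer).__name__, optimizer.lr)
```

With `getattr(module, "adam")`, only an exact attribute name matches, so alias or case differences surface as an `AttributeError`; routing the lookup through a `get` function centralizes name resolution and error reporting.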
