Describe the bug
The optimizer cannot recognize variable bn_Conv1/beta:0. This usually means you are trying to call the optimizer to update different parts of the model separately. Please call `optimizer.build(variables)` with the full list of trainable variables before the training loop or use legacy optimizer `tf.keras.optimizers.legacy.Adam`.
To Reproduce
Steps to reproduce the behavior:
When running `lr = model.lr_find()` I get the above error, i.e.:

`LR Finder is complete, type {learner_name}.recorder.plot() to see the graph.`
Error during learning rate finding: 'The optimizer cannot recognize variable bn_Conv1/beta:0. This usually means you are trying to call the optimizer to update different parts of the model separately. Please call `optimizer.build(variables)` with the full list of trainable variables before the training loop or use legacy optimizer `tf.keras.optimizers.legacy.Adam`.'
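For context, a minimal sketch of the calls that trigger this, following the camera-trap notebook linked under "Expected behavior" below (the data path and batch size are placeholders, not my exact values):

```python
from arcgis.learn import prepare_data, FeatureClassifier

# Placeholder path to my exported image chips; substitute your own data.
data = prepare_data(r"D:\data\camera_trap_chips", batch_size=16)

# The TensorFlow backend is what routes training through fastai_tf_fit.py;
# bn_Conv1/beta:0 in the error is a MobileNetV2 batch-norm variable.
model = FeatureClassifier(data, backbone="MobileNetV2", backend="tensorflow")

lr = model.lr_find()  # raises the KeyError below
```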
error:
```
KeyError                                  Traceback (most recent call last)
Cell In[8], line 1
----> 1 lr = model.lr_find()

File D:\PythonProjects\NexICAT\propnex\Lib\site-packages\arcgis\learn\models\_arcgis_model.py:798, in ArcGISModel.lr_find(self, allow_plot)
    795     self.learn.lr_find()
    796 except Exception as e:
    797     # if some error comes in lr_find
--> 798     raise e
    799 finally:
    800     self.learn.metrics = metrics

File D:\PythonProjects\NexICAT\propnex\Lib\site-packages\arcgis\learn\models\_arcgis_model.py:795, in ArcGISModel.lr_find(self, allow_plot)
    791 with tempfile.TemporaryDirectory(
    792     prefix="arcgisTemp_"
    793 ) as _tempfolder:
    794     self.learn.path = Path(_tempfolder)
--> 795     self.learn.lr_find()
    796 except Exception as e:
    797     # if some error comes in lr_find
    798     raise e

File D:\PythonProjects\NexICAT\propnex\Lib\site-packages\arcgis\learn\_utils\fastai_tf_fit.py:634, in tf_lr_find(learn, start_lr, end_lr, num_it, stop_div, **kwargs)
    632 cb = TfLRFinder(learn, start_lr, end_lr, num_it, stop_div)
    633 a = int(np.ceil(num_it / len(learn.data.train_dl)))
--> 634 learn.fit(a, start_lr, callbacks=[cb], **kwargs)

File D:\PythonProjects\NexICAT\propnex\Lib\site-packages\arcgis\learn\_utils\fastai_tf_fit.py:370, in TfLearner.fit(self, epochs, lr, wd, callbacks)
    368 self.create_opt(lr, wd)
    369 callbacks = [cb(self) for cb in self.callback_fns] + listify(callbacks)
--> 370 tf_fit(
    371     epochs,
    372     self.model,
    373     self.loss_func,
    374     opt=self.opt,
    375     data=self.data,
    376     metrics=self.metrics,
    377     callbacks=self.callbacks + callbacks,
    378 )

File D:\PythonProjects\NexICAT\propnex\Lib\site-packages\arcgis\learn\_utils\fastai_tf_fit.py:300, in tf_fit(epochs, model, loss_func, opt, data, callbacks, metrics)
    298 except Exception as e:
    299     exception = e
--> 300     raise e
    301 finally:
    302     cb_handler.on_train_end(exception)

File D:\PythonProjects\NexICAT\propnex\Lib\site-packages\arcgis\learn\_utils\fastai_tf_fit.py:282, in tf_fit(epochs, model, loss_func, opt, data, callbacks, metrics)
    280 xb, yb = _pytorch_to_tf_batch(xb), _pytorch_to_tf(yb)
    281 xb, yb = cb_handler.on_batch_begin(xb, yb)
--> 282 loss = tf_loss_batch(model, xb, yb, loss_func, opt, cb_handler)
    283 if cb_handler.on_batch_end(loss):
    284     break

File D:\PythonProjects\NexICAT\propnex\Lib\site-packages\arcgis\learn\_utils\fastai_tf_fit.py:188, in tf_loss_batch(model, xb, yb, loss_func, opt, cb_handler)
    186 grads = tape.gradient(loss, model.trainable_variables)
    187 cb_handler.on_backward_end()
--> 188 opt.apply_gradients(zip(grads, model.trainable_variables))
    189 cb_handler.on_step_end()
    190 else:

File D:\PythonProjects\NexICAT\propnex\Lib\site-packages\arcgis\learn\_utils\fastai_tf_fit.py:673, in TfOptimWrapper.apply_gradients(self, grads_and_vars)
    671 if next_var[0] is None:
    672     continue
--> 673 opt.apply_gradients([next_var])

File D:\PythonProjects\NexICAT\propnex\Lib\site-packages\keras\optimizers\optimizer.py:1174, in Optimizer.apply_gradients(self, grads_and_vars, name, skip_gradients_aggregation, **kwargs)
   1172 if not skip_gradients_aggregation and experimental_aggregate_gradients:
   1173     grads_and_vars = self.aggregate_gradients(grads_and_vars)
-> 1174 return super().apply_gradients(grads_and_vars, name=name)

File D:\PythonProjects\NexICAT\propnex\Lib\site-packages\keras\optimizers\optimizer.py:650, in _BaseOptimizer.apply_gradients(self, grads_and_vars, name)
    648 self._apply_weight_decay(trainable_variables)
    649 grads_and_vars = list(zip(grads, trainable_variables))
--> 650 iteration = self._internal_apply_gradients(grads_and_vars)
    652 # Apply variable constraints after applying gradients.
    653 for variable in trainable_variables:

File D:\PythonProjects\NexICAT\propnex\Lib\site-packages\keras\optimizers\optimizer.py:1200, in Optimizer._internal_apply_gradients(self, grads_and_vars)
   1199 def _internal_apply_gradients(self, grads_and_vars):
-> 1200     return tf.__internal__.distribute.interim.maybe_merge_call(
   1201         self._distributed_apply_gradients_fn,
   1202         self._distribution_strategy,
   1203         grads_and_vars,
   1204     )

File D:\PythonProjects\NexICAT\propnex\Lib\site-packages\tensorflow\python\distribute\merge_call_interim.py:51, in maybe_merge_call(fn, strategy, *args, **kwargs)
     31 """Maybe invoke `fn` via `merge_call` which may or may not be fulfilled.
     32
     33 The caller of this utility function requests to invoke `fn` via `merge_call`
   (...)
     48   The return value of the `fn` call.
     49 """
     50 if strategy_supports_no_merge_call():
---> 51   return fn(strategy, *args, **kwargs)
     52 else:
     53   return distribution_strategy_context.get_replica_context().merge_call(
     54       fn, args=args, kwargs=kwargs)

File D:\PythonProjects\NexICAT\propnex\Lib\site-packages\keras\optimizers\optimizer.py:1250, in Optimizer._distributed_apply_gradients_fn(self, distribution, grads_and_vars, **kwargs)
   1247     return self._update_step(grad, var)
   1249 for grad, var in grads_and_vars:
-> 1250     distribution.extended.update(
   1251         var, apply_grad_to_update_var, args=(grad,), group=False
   1252     )
   1254 if self.use_ema:
   1255     _, var_list = zip(*grads_and_vars)

File D:\PythonProjects\NexICAT\propnex\Lib\site-packages\tensorflow\python\distribute\distribute_lib.py:2637, in StrategyExtendedV2.update(self, var, fn, args, kwargs, group)
   2634     fn = autograph.tf_convert(
   2635         fn, autograph_ctx.control_status_ctx(), convert_by_default=False)
   2636     with self._container_strategy().scope():
-> 2637         return self._update(var, fn, args, kwargs, group)
   2638 else:
   2639     return self._replica_ctx_update(
   2640         var, fn, args=args, kwargs=kwargs, group=group)

File D:\PythonProjects\NexICAT\propnex\Lib\site-packages\tensorflow\python\distribute\distribute_lib.py:3710, in _DefaultDistributionExtended._update(self, var, fn, args, kwargs, group)
   3707 def _update(self, var, fn, args, kwargs, group):
   3708     # The implementations of _update() and _update_non_slot() are identical
   3709     # except _update() passes `var` as the first argument to `fn()`.
-> 3710     return self._update_non_slot(var, fn, (var,) + tuple(args), kwargs, group)

File D:\PythonProjects\NexICAT\propnex\Lib\site-packages\tensorflow\python\distribute\distribute_lib.py:3716, in _DefaultDistributionExtended._update_non_slot(self, colocate_with, fn, args, kwargs, should_group)
   3712 def _update_non_slot(self, colocate_with, fn, args, kwargs, should_group):
   3713     # TODO(josh11b): Figure out what we should be passing to UpdateContext()
   3714     # once that value is used for something.
   3715     with UpdateContext(colocate_with):
-> 3716         result = fn(*args, **kwargs)
   3717         if should_group:
   3718             return result

File D:\PythonProjects\NexICAT\propnex\Lib\site-packages\tensorflow\python\autograph\impl\api.py:595, in call_with_unspecified_conversion_status.<locals>.wrapper(*args, **kwargs)
    593 def wrapper(*args, **kwargs):
    594     with ag_ctx.ControlStatusCtx(status=ag_ctx.Status.UNSPECIFIED):
--> 595         return func(*args, **kwargs)

File D:\PythonProjects\NexICAT\propnex\Lib\site-packages\keras\optimizers\optimizer.py:1247, in Optimizer._distributed_apply_gradients_fn.<locals>.apply_grad_to_update_var(var, grad)
   1245     return self._update_step_xla(grad, var, id(self._var_key(var)))
   1246 else:
-> 1247     return self._update_step(grad, var)

File D:\PythonProjects\NexICAT\propnex\Lib\site-packages\keras\optimizers\optimizer.py:232, in _BaseOptimizer._update_step(self, gradient, variable)
    230     return
    231 if self._var_key(variable) not in self._index_dict:
--> 232     raise KeyError(
    233         f"The optimizer cannot recognize variable {variable.name}. "
    234         "This usually means you are trying to call the optimizer to "
    235         "update different parts of the model separately. Please call "
    236         "`optimizer.build(variables)` with the full list of trainable "
    237         "variables before the training loop or use legacy optimizer "
    238         f"`tf.keras.optimizers.legacy.{self.__class__.__name__}`."
    239     )
    240 self.update_step(gradient, variable)

KeyError: 'The optimizer cannot recognize variable bn_Conv1/beta:0. This usually means you are trying to call the optimizer to update different parts of the model separately. Please call `optimizer.build(variables)` with the full list of trainable variables before the training loop or use legacy optimizer `tf.keras.optimizers.legacy.Adam`.'
```
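Judging from the bottom frames, `TfOptimWrapper.apply_gradients` in `fastai_tf_fit.py` applies gradients one variable at a time (`opt.apply_gradients([next_var])`), while Keras optimizers since TF 2.11 build their internal variable index from the first variable list they are given and raise this `KeyError` for any variable outside it. Below is a rough sketch of the two workarounds the error message itself suggests, under the assumption that the optimizer in question is the one created inside `TfLearner.create_opt` (I have not verified either against the arcgis wrapper):

```python
import tensorflow as tf

# Stand-in model from the same family as the failing variable
# (bn_Conv1/beta:0 is a MobileNetV2 batch-norm beta).
model = tf.keras.applications.MobileNetV2(weights=None)
var = model.trainable_variables[0]

# Workaround 1: register the FULL variable list with the optimizer up front,
# so later single-variable apply_gradients calls find every variable indexed.
opt = tf.keras.optimizers.Adam(1e-3)
opt.build(model.trainable_variables)
opt.apply_gradients([(tf.zeros_like(var), var)])  # no KeyError now

# Workaround 2: fall back to the legacy optimizer, which creates slot
# variables lazily per variable and tolerates piecewise updates.
legacy_opt = tf.keras.optimizers.legacy.Adam(1e-3)
legacy_opt.apply_gradients([(tf.zeros_like(var), var)])
```

Since users never construct this optimizer themselves, the real fix presumably belongs in `fastai_tf_fit.py` (calling `build()` once with all trainable variables, or pinning the legacy optimizer).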
Screenshots
If applicable, add screenshots to help explain your problem.
Expected behavior
The images should have been read and the `learning_rate` should have been found. I have used the notebook file from: https://github.com/Esri/arcgis-python-api/blob/master/samples/04_gis_analysts_data_scientists/wildlife_species_identification_in_camera_trap_images.ipynb
Platform (please complete the following information):
ArcGIS API for Python version [e.g. 1.6.2] (you can get this by typing `print(arcgis.__version__)`): 2.3.0

Additional context
The images I have used are my own. Adding some images for reference.
Kindly help.