tensorflow - Save model weights after every few forward passes

I am working on explainable AI by studying weight patterns as a function of a model's hyperparameters and input data. One thing I am looking at is how the weights evolve from randomness (their initializer values) to stability once learning has finished. Instead of saving the weights at every epoch, I would like to save them every second or third forward pass. How do I do that? Do I have to adjust the "period" parameter of the Keras model checkpoint method? If so, what is the simple formula for setting that parameter? Thanks, and have a nice day.

Answer 1

Just pass save_freq=3 when instantiating tf.keras.callbacks.ModelCheckpoint. save_freq counts training batches, so this saves at the end of every third batch; it replaces the older period argument, which is deprecated.

Quoting the documentation:

https://keras.io/api/callbacks/model_checkpoint/

save_freq: 'epoch' or integer. When using 'epoch', the callback saves the model after each epoch. When using integer, the callback saves the model at end of this many batches. If the Model is compiled with steps_per_execution=N, then the saving criteria will be checked every Nth batch. Note that if the saving isn't aligned to epochs, the monitored metric may potentially be less reliable (it could reflect as little as 1 batch, since the metrics get reset every epoch). Defaults to 'epoch'.
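
A minimal sketch of how this might look in practice, using a toy model and random data purely for illustration. The directory name, the {epoch}/{batch} placeholders in the filepath, and the .weights.h5 extension are assumptions about recent Keras versions, not details from the original question:

```python
import os

import numpy as np
import tensorflow as tf

# Toy model and random data, purely for illustration.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

x = np.random.rand(256, 8).astype("float32")
y = np.random.rand(256, 1).astype("float32")

os.makedirs("weights_snapshots", exist_ok=True)

# save_freq=3 -> a checkpoint is written at the end of every 3rd training batch.
# save_weights_only=True stores just the weights, which is all that is needed
# for tracking how they evolve; the {batch} placeholder (assumed to be filled in
# when saving at batch frequency) keeps each snapshot in a separate file
# instead of overwriting the previous one.
checkpoint_cb = tf.keras.callbacks.ModelCheckpoint(
    filepath="weights_snapshots/epoch_{epoch:02d}_batch_{batch:05d}.weights.h5",
    save_weights_only=True,
    save_freq=3,
)

model.fit(x, y, batch_size=32, epochs=2, callbacks=[checkpoint_cb])
```

With 256 samples and batch_size=32 there are 8 batches per epoch, so save_freq=3 writes a snapshot a few times per epoch; adjust save_freq to whatever granularity you want to study.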
