How to save a trained Q-Network from a DQN agent?

一馬 平田 on 31 Oct 2021
I would like to load a trained Q-network into rlQValueRepresentation. How can I save the pre-trained Q-network?
I know that a DQN agent can be saved via rlTrainingOptions, but I could not find a way to save the Q-network itself. If it is possible to save the pre-trained Q-network through rlTrainingOptions, could you please tell me how to load the Q-network back?
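One possible approach (a sketch, not a verified answer for this exact setup): rather than saving the network through rlTrainingOptions, extract the critic representation from the trained agent with getCritic, save it as a regular MAT-file, and rebuild an agent around it later. Variable and file names below (agent, trainedCritic.mat) are illustrative assumptions.

```matlab
% Sketch, assuming "agent" is a trained rlDQNAgent from the
% Reinforcement Learning Toolbox (R2021b). File/variable names are examples.

% Extract the critic (the Q-network representation) from the trained agent
critic = getCritic(agent);

% Save the critic to disk like any other MATLAB variable
save('trainedCritic.mat', 'critic');

% --- later, e.g. in a new MATLAB session ---
loaded = load('trainedCritic.mat');
critic = loaded.critic;

% Rebuild a DQN agent around the pre-trained Q-network
agentOpts = rlDQNAgentOptions;          % adjust options as needed
newAgent  = rlDQNAgent(critic, agentOpts);
```

Alternatively, if agents were saved during training via the SaveAgentCriteria option of rlTrainingOptions, loading the saved agent MAT-file and calling getCritic on it should recover the Q-network in the same way.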

Answers (0)

Products


Release

R2021b
