- https://www.mathworks.com/help/releases/R2021b/matlab/ref/save.html
- https://www.mathworks.com/help/releases/R2021b/matlab/ref/load.html
- https://www.mathworks.com/help/releases/R2021b/reinforcement-learning/ref/rl.agent.rlqagent.getcritic.html
- https://www.mathworks.com/help/reinforcement-learning/ref/rl.agent.rldqnagent.html
How to save a trained Q-Network from a DQN agent?
5 views (last 30 days)
I would like to load a trained Q-network as an rlQValueRepresentation. How can I save the trained Q-network?
I know that the DQN agent itself can be saved during training with rlTrainingOptions, but I could not find a way to save just the trained Q-network. If it is possible to save the Q-network through rlTrainingOptions, could you please tell me how to load it afterwards?
0 Comments
Answers (1)
Abhiram
on 12 Jun 2025
To save and load a trained Q-network as an rlQValueRepresentation, extract it from the agent with getCritic and save it to a MAT-file. Code snippets for saving and loading a Q-network:
% Extract Q-network from trained agent
qRep = getCritic(agent);
% Save the Q-network to a file
save('savedQNetwork.mat','qRep');
% Load the Q-network from file
load('savedQNetwork.mat','qRep');
% Rebuild a DQN agent from the loaded Q-network
% (assuming the agent options, agentOpts, are available)
agentFromLoadedQ = rlDQNAgent(qRep, agentOpts);
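Since the question mentions rlTrainingOptions: the toolbox can also save whole agents automatically during training, and the Q-network can then be extracted from a reloaded agent. A minimal sketch, assuming an existing agent and env; the "EpisodeReward" criterion, the threshold of 100, and the folder name are illustrative choices, and the saved_agent / Agent1.mat names follow the toolbox's default saving convention:
% Save the agent automatically whenever the episode reward reaches 100
trainOpts = rlTrainingOptions( ...
    'SaveAgentCriteria',"EpisodeReward", ...
    'SaveAgentValue',100, ...
    'SaveAgentDirectory',"savedAgents");
trainingStats = train(agent, env, trainOpts);

% Each saved MAT-file holds the agent in a variable named saved_agent;
% the Q-network can be extracted from the reloaded agent with getCritic
loaded = load(fullfile("savedAgents","Agent1.mat"));
qRepFromSaved = getCritic(loaded.saved_agent);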
For more information on the save, load, rlDQNAgent, and getCritic functions, refer to the MATLAB documentation pages linked above.
Hope this helps!
0 Comments