How can I display the trained network weights of a reinforcement learning agent?

Hello,
I trained a DDPG agent using the Reinforcement Learning Toolbox.
I wanted to see the trained weights in the agent, so after training finished I checked the agent variables in the workspace.
However, I couldn't find any weight values in the variables, not even in the 'agent' and 'env' variables.
I know it is possible to check network weights in the Neural Network Toolbox, but is it possible to access the weights in the Reinforcement Learning Toolbox?
What should I do?

Answers (1)

Anh Tran on 21 Feb 2020
Edited: Anh Tran on 21 Feb 2020
Hi Ru SeokHun,
In MATLAB R2019b and earlier, this is a 2-step process:
  1. Use the getActor and getCritic functions to extract the actor and critic representations from the trained agent.
  2. Use the getLearnableParameterValues function to get the weights and biases of the neural network representation.
See the code below to get the parameters of the trained actor; a similar sketch for the critic follows it. You can compare these values with those of an untrained agent. Assume you have a DDPG agent named 'agent'.
% get the agent's actor, which predicts next action given the current observation
actor = getActor(agent);
% get the actor's parameters (neural network weights)
actorParams = getLearnableParameterValues(actor);
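
The same two-step pattern applies to the critic. Below is a minimal sketch, assuming the same trained 'agent' variable from the question and a release where getLearnableParameterValues is available (R2019b or earlier); the index used in the last line is only an illustration of how to inspect one parameter array.
% get the agent's critic, which estimates the long-term reward
% for a given observation-action pair
critic = getCritic(agent);
% get the critic's parameters (neural network weights and biases),
% returned as a cell array of parameter values
criticParams = getLearnableParameterValues(critic);
% inspect one layer's parameters, e.g. the first learnable array (illustrative index)
disp(criticParams{1})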
