ExperienceBuffer has 0 Length when I load a saved agent and continue reinforcement learning training
2 views (last 30 days)
Yikai
on 5 Apr 2021
Commented: Dmitriy Ogureckiy
on 12 Jan 2023
Hi all,
I'm trying to continue training a saved agent. In the options of this saved agent, SaveExperienceBufferWithAgent is set to true. But when I load the saved_agent and inspect the ExperienceBuffer property, I notice its Length is 0. I looked for this property in the documentation, but there is no information on it. If I stop a training run and directly check the "Length" property of the agent in the workspace, it has a nonzero value.
My question is: what does this "Length" mean? If it is 0 and I continue training the saved agent as described in https://de.mathworks.com/matlabcentral/answers/495436-how-to-train-further-a-previously-trained-agent?s_tid=answers_rc1-2_p2_MLT , does training really resume with the saved agent and the saved experience buffer?
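Roughly what I am doing to check the buffer (the folder, file, and variable names below are just what I have in my setup):

% Load an agent that was saved during training
loaded = load('savedAgents/Agent100.mat');
agent  = loaded.saved_agent;
% Check how many experiences the buffer holds -- this prints 0 for me
disp(agent.ExperienceBuffer.Length)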

Yours
0 Comments
Accepted Answer
Takeshi Takahashi
on 20 Apr 2021
A Length of 0 means there isn't any experience in the buffer. I think the experience buffer wasn't saved due to this bug. Please set agent.AgentOptions.SaveExperienceBufferWithAgent = true immediately before saving the agent.
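A minimal sketch of that workflow, assuming an off-policy agent such as DQN or DDPG and that env and trainOpts are the same objects used in the original run (file and variable names are placeholders):

% Set the option right before saving so the buffer is written out with the agent
agent.AgentOptions.SaveExperienceBufferWithAgent = true;
save('agentWithBuffer.mat', 'agent');

% Later, to continue training without discarding the saved experiences:
loaded = load('agentWithBuffer.mat');
agent  = loaded.agent;
agent.AgentOptions.ResetExperienceBufferBeforeTraining = false;  % keep the saved buffer
trainingStats = train(agent, env, trainOpts);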
2 Comments
Dmitriy Ogureckiy
on 12 Jan 2023
Can I ask, are the network weights saved when the agent is saved between simulations?
More Answers (0)