How to visualize episode behaviour with the reinforcement learning toolbox?
9 views (last 30 days)
Jan de Priester
on 5 Jun 2019
Edited: Emmanouil Tzorakoleftherakis
on 15 Sep 2020
How can I create a visualization for a custom environment that shows the behaviour of the system in the environment during an episode of training? I cannot find code examples or clarifications anywhere on MathWorks of code that visualizes a system's behaviour during training episodes. I would like to achieve a visualization that looks something like the cart-pole visualizer shown on this page: https://nl.mathworks.com/help/reinforcement-learning/ug/train-pg-agent-to-balance-cart-pole-system.html?searchHighlight=cart%20pole&s_tid=doc_srchtitle.
P.S. I am trying to solve the continuous mountain car problem with a DDPG agent using the Reinforcement Learning Toolbox.
0 Comments
Accepted Answer
Emmanouil Tzorakoleftherakis
on 7 Jun 2019
Hello,
To create a custom MATLAB environment, use the template that pops up after running
rlCreateEnvTemplate('myenv')
This template defines two methods used for visualization, "plot" and "envUpdatedCallback" (the latter is called from within "plot"). Use "plot" to create the basic stationary parts of your visualization, and "envUpdatedCallback" to update the coordinates of the moving parts based on your states.
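For reference, here is a minimal sketch of how those two methods might be filled in for a mountain-car-style visualization. The class name, properties, spec limits, and the placeholder dynamics and hill geometry below are illustrative assumptions, not the defaults that rlCreateEnvTemplate generates:

```matlab
% Sketch (not the exact generated template): a custom environment whose
% "plot" draws the stationary parts and whose "envUpdatedCallback"
% redraws the moving parts. Names and dynamics here are illustrative.
classdef MyMountainCarEnv < rl.env.MATLABEnvironment
    properties
        State = [-0.5; 0];   % [position; velocity]
        Figure = []          % handle to the visualization figure
    end
    methods
        function this = MyMountainCarEnv()
            % Observation: position and velocity; action: continuous force.
            obsInfo = rlNumericSpec([2 1]);
            actInfo = rlNumericSpec([1 1], 'LowerLimit', -1, 'UpperLimit', 1);
            this = this@rl.env.MATLABEnvironment(obsInfo, actInfo);
        end
        function [Observation, Reward, IsDone, LoggedSignals] = step(this, Action)
            % Minimal placeholder dynamics so the class is complete.
            LoggedSignals = [];
            this.State(2) = this.State(2) + 0.001*Action - 0.0025*cos(3*this.State(1));
            this.State(1) = this.State(1) + this.State(2);
            Observation = this.State;
            IsDone = this.State(1) >= 0.45;
            Reward = double(IsDone);
            notifyEnvUpdated(this);   % fires envUpdatedCallback below
        end
        function InitialObservation = reset(this)
            this.State = [-0.5; 0];
            InitialObservation = this.State;
            notifyEnvUpdated(this);
        end
        function plot(this)
            % Stationary parts, created once: figure, axes, hill profile.
            this.Figure = figure('Name', 'MyMountainCarEnv');
            ax = gca(this.Figure);
            x = linspace(-1.2, 0.6, 200);
            plot(ax, x, sin(3*x), 'k', 'LineWidth', 1.5);
            hold(ax, 'on');
            xlim(ax, [-1.2 0.6]);
            envUpdatedCallback(this);   % draw the moving parts once
        end
    end
    methods (Access = protected)
        function envUpdatedCallback(this)
            % Redraw only the moving parts from the current state.
            if isempty(this.Figure) || ~isvalid(this.Figure)
                return
            end
            ax = gca(this.Figure);
            delete(findobj(ax, 'Tag', 'car'));   % clear the old marker
            p = this.State(1);
            plot(ax, p, sin(3*p), 'ro', 'MarkerFaceColor', 'r', 'Tag', 'car');
            drawnow limitrate
        end
    end
end
```

During training, each call to step ends with notifyEnvUpdated, which triggers envUpdatedCallback, so the figure animates the episode as it runs.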
5 Comments
Prashanth Chivkula
on 15 Sep 2020
And another question: where do I define a reward function in the template?
Emmanouil Tzorakoleftherakis
on 15 Sep 2020
Edited: Emmanouil Tzorakoleftherakis
on 15 Sep 2020
The error sounds self-explanatory - make sure whatever you are plotting makes sense.
The template has no separate function for rewards - if you go through the generated code, the reward is computed inside 'step'. You could factor it out into a separate function if you prefer.
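To make the placement concrete, here is a sketch of a step method with the reward computed inline. The shaping below (an action penalty plus a goal bonus, in the spirit of the continuous mountain car task) is an illustrative assumption, not anything the template generates:

```matlab
% Sketch: where the reward lives in a template-based environment.
% The dynamics update is elided; the reward terms are illustrative.
function [Observation, Reward, IsDone, LoggedSignals] = step(this, Action)
    LoggedSignals = [];
    % ... update this.State from Action (your dynamics) ...
    position = this.State(1);
    IsDone = position >= 0.45;         % reached the goal position
    % Reward is computed right here inside step, no separate function:
    Reward = -0.1 * Action^2;          % penalize large control effort
    if IsDone
        Reward = Reward + 100;         % bonus for reaching the goal
    end
    Observation = this.State;
    notifyEnvUpdated(this);            % keep any visualization in sync
end
```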
In the future please create a separate question if it's not related to the original one. Thanks!
More Answers (0)